Coincidentally, I'm also in the process of
writing an article about women in film.
Magazine has recently run two articles on this
topic -- one in its summer issue on documentaries
and another, more general, a few issues ago.
I think that feminists have had an impact on
the film industry -- mostly in supporting the
burgeoning independent film industry, where
roles for women are more honest and accurate
and where women have more room to direct and produce
and not be sexualized. Feminists have also had
an impact on individual actors, allowing them
to refuse roles they don't want and to promote
their stake in their characters. I
think it's limited and inaccurate to just look
at numbers -- directors, producers, etc. -- or even
to reduce women's advancement to how they are
or aren't sexualized -- especially since some
women want to be sexualized. I think it's deeper
than that: what is the role of film in our lives,
how well can women access the film world, and
what choices are available to them?
Hope that helps,