Media

I would like to hear your thoughts about how women's roles in society have changed and how this is reflected in film. I would also like to know whether feminist film theory has noted changes in women's roles in film from the 1950s on, and whether women are becoming more or less equal in status to men in Hollywood.

I would really appreciate your thoughts on my investigation: "As society has changed, women's roles have changed. How is this reflected in the film industry?"

Many thanks,

Olivia

Olivia,

It's really hard for me to make general comments about women in film, for two reasons: I'm not that familiar with how women were represented historically (pre-1960), and so much depends on the particular film. I would say the main advantage today is that we have more range. We haven't necessarily done away with stereotypical images of women, but we have at least expanded the images available. Women are also no longer portrayed solely as helpless; they might still be sex objects, but they are often in control. Another key difference between then and now is that women are now creators of film themselves, including some sexist films, which complicates the picture, or at least opens those creators and images up to more criticism.

Beyond looking at images, I think content is key. There are now so many more movies available, and thus a greater range of plot lines. I would say the greatest danger today is that the economics of who is represented are so out of whack: what is presented as poor is really lower middle class. I also think our overall obsession with Hollywood goes beyond who these films represent to who the filmmakers themselves are. I hope that helps.

Good luck,

- Amy