Health

Hi,

I'm a fourth-year nursing student. One of my essays for this semester is to critically discuss, from a feminist perspective, why it is important that "women exercise control over their own health." I was wondering if you could give me some advice on where to start and where I can find the most relevant information.

Thank you,

Maria


Dear Maria,

Historically, women have been left out of, or at least marginalized within, the traditional medical field. For instance, women weren't even included as subjects in medical research until the 1970s, and today it is still far too common for women to be treated with medicines and practices that were tested only on male subjects. Because of this, it has unfortunately fallen to women to take care of themselves and to ensure that their own health is taken seriously. That doesn't mean we shouldn't also lobby the traditional medical field to include women; it means we have to do both. Women's health also implies children's health, since women remain the primary caregivers, so women taking control of their own health often means taking care of children's health, too.

Good luck with your essay.


--Amy
