Feminism: Women and Women

“I need feminism because my mother prays that I marry a successful man more than she prays that I become successful” (Unknown). According to Merriam-Webster’s dictionary, feminism is the belief that women and men should have equal rights and opportunities. The feminist movement has come a long way, yet women are still viewed as beneath men. Unfortunately, we still live in a male-dominated society, where the work done by a man is seen as more significant and worthwhile than that of a woman. Society has long treated being a wife and/or mother as a woman’s most important role, continuing to cast women as caretakers and nurturers. Even when a woman works and has a career, she still carries most of the responsibility for raising the children and doing the housework. Our society tends to assign rigid gender roles, with men as providers and women as caretakers. However, women have continuously fought to change this image.
Throughout the world, every society and nation has some degree of gender inequality. Women earn only 10 percent of global income and own 10 percent of the land, even though they perform an average of 60 percent of the work. There have been immense political changes as well as economic growth in many countries, yet women are still subjected to discrimination and abuse. Regardless of the progress women have made, a wide range of gender inequality remains. In every aspect of our economy, politics, corporate