
The Role of Radical Feminism in the United States

Women have fought oppression in the past and are still fighting today. Feminism has become very popular in the United States. Equality of the sexes has been achieved in many areas, yet equality is only the beginning for radical feminists in the United States. The idea of equality of the sexes has been twisted into the fantasy of women as the dominant gender.

Feminism is defined as the belief in social, political, and economic equality of the sexes. The idea of feminism originated as early as eighteenth-century France and continued to grow "with women’s public actions to acquire individual liberties in the 19th and 20th Centuries" (The Roots Of The Word 'Feminism'). In places such as the Middle East, China, and France, feminism was a long and rigorous battle. The United States did not join the movement until 1848, when women openly began fighting for their right to vote. Still active today, feminists wish to end the concept of men as the dominant gender. However, feminism, though needed in certain areas,
In 1848, women across the U.S. were striving for a say in their government. Now, however, radical feminists in the U.S. have broken off, identifying a multitude of issues to solve and developing multiple types of feminism to solve them. "Whatever positive image the word feminist may have had, it has been tarnished by those who have made it their own, and I, for one, am content to leave the militants in full possession of the term," Dale O'Leary states in her book "The Gender Agenda" (Feminism? You Want Feminism? Which Brand Would You Like?). Radical feminists have taken hold of the term and twisted it into a misunderstood word that many individuals now find themselves arguing over. Feminism has grown in the United States, yet it has seemed to grow too