
Gender Roles of Men and Women


Gender roles are social constructs that determine how men and women are viewed, categorized, and stereotyped. In theory, gender roles seem benign: they are, after all, simply relics of cultural tradition; in practice, however, they have proved malignant and are the root of many obstacles women and men face in the United States today, confining individuals through archaic, sexist assumptions and stereotypes. At the heart of this stereotyping is the idea that true women are sensitive, nurturing, and submissive to the more aggressive, dominant men (Carlton). Gender roles and the sexism associated with them are the glue that binds gender to societal expectations about personality traits, jobs, and even colors. Despite the astounding progress women have made, equality between men and women has yet to be achieved, and there is no question that gender roles persist in today’s society, contributing to the inequality women face. The impacts of gender roles are not limited to restrictive societal roles and pressures; they also contribute to health-related, economic, and political inequality. Gender roles encourage the dominance of men and restrict the freedom of women. Awareness of gender roles and their detrimental effects is necessary for women to achieve true social and political equality in the United States.
Gender roles became increasingly rigid with the rise of the cult of domesticity, a social phenomenon of the early 1800s that emphasized the role of women as keepers of the home.
