Sexism And Gender Roles Throughout America

In recent months, few topics have gripped the nation quite like sexism and gender roles. While many strong leaders have stepped forward in the last few decades to reconcile the differences, America still has a long way to go. We present ourselves as a model for other countries, but doing so is unjust considering the imbalance of power, representation, perceived value, and respect between the genders. While many continue to deny it, sexism remains a major problem in modern American society.
Division of the genders through gender roles has been the norm of society since the beginning of humankind. In order to evaluate the current state of sexism in America, one must first study our history. From the earliest stages of human evolution, men were valued as hunters, protectors, and leaders. Gallantly, spears in hand, they would leave mates and children behind in search of meat for the clan. The whole of the family's survival rested on the hunter's shoulders, and any misstep on his part was seen as an immense failure. Women were left behind to mend the camps, cook the meals, and watch the children. Viewed as the weaker sex, they were kept at home with the "gentler" work, leaving only for simpler tasks such as gathering berries and roots. As these societies grew more sedentary and complex, the introduction of organized social structures only perpetuated these gender roles. Although the roles were clearly defined and strictly followed, both were seen as equal and necessary for survival. (CITE)
As different