Change in Women’s Gender Roles
Due to advances in technology, the discovery of new lands, and changing public opinion, people's desires and their ideas about equality are constantly changing. For thousands of years, males were considered superior to females in most cultures. In a civilized society it was often considered the duty of the man to work and provide for his family, while women were expected to stay home and take care of the children, as they were thought unable to perform most of the tasks that men did.

Population expansion into the western part of the United States marked a change in the traditional role of women in daily life. The gender roles pioneered during this time of expansion were very different from those previously defined. Gender roles began to change again at the turn of the twentieth century. Unlike the 1800s, the 1900s offered more opportunities for independent development; at the same time, however, women were once again governed by a strict set of common expectations.

Life was harsh for those who chose to travel west in nineteenth-century America. Both men and women had to share the work in order to survive, and because of this, women were given greater opportunities than they had before. On top of their traditional roles, women shared in the physical labor that was usually done by men. The first noted example came from the Native American women with whom early European trappers made contact. “The women were responsible for trapping