Gender Roles In American Society

In my opinion, it is definitely more accepted for women to assume traditionally masculine roles than for men to assume traditionally feminine roles in American society. I think this shift first began during WWI, when men were sent to war and women took over the jobs those men had held before they were drafted. This showed that women were able to leave the home and hold their own in a workplace setting, and to this day women are still in the workforce working alongside men in all types of jobs. Women are even applauded as brave when they take on the role of 'breadwinner' for the family and are successful in jobs where they are the boss, CEO, or manager. Sometimes this backfires on women, though, because some people judge them for not being at home 'taking care of their family,' or for letting other people raise their children in day-care while the mother is at work. People also call women who are in charge 'b**chy' or 'bossy,' because they do not like having a woman as their superior.

Stay-at-home dads are often made fun of, and even insulted, because the majority of society believes that it is the 'man's job' to take care of his family, so he is seen as weak if he isn't accomplishing this. Men are also demeaned if they take a position in a 'woman's job' (e.g., nurse, secretary, daycare provider), because it is not seen as a manly thing to do. Some people, especially other men, will even question these men's sexuality or 'manliness' when they hold these types of jobs. But on the flip side, some men are called 'sensitive' and 'caring' if they are a stay-at-home dad or hold a traditionally female job position, which can be a compliment.