
The Role Of Women In The Hollywood Industry


For decades, the Hollywood industry has been one of the world's most powerful exporters of films, television shows, and magazines. Over the years, the industry has gained the attention of cultures in many parts of the world and plays a profound role in shaping social culture by shifting society's views on social, political, and economic issues. However, since the industry's earliest days, male dominance in the Hollywood workplace has been evident, and it still exists in the 21st century. Men in Hollywood are considered more authoritative and influential than women. Two powerful men in mainstream Hollywood, Harvey Weinstein and Donald Trump, have both been involved in sexual harassment scandals despite their high status in the industry. Through Variety magazine, the themes of patriarchy, stereotypes, and the treatment of minority groups demonstrate how Harvey Weinstein and Donald Trump, both powerful men in Hollywood, abuse their public stature by reinforcing gender inequality. Weinstein and Trump are extremely successful in the industry, each having built a company and a brand worth billions of dollars; however, both men abused their power, fame, and wealth to sexually harass and objectify women. More than fourteen women spoke out about the years of sexual harassment Weinstein put them through, and Trump's Access Hollywood tape was released, in which he degraded women. Patriarchy played a role in the allegations against both men because they used their financial position
