Portrayal Of American Medicine

Based on the media clips in this week's module, how has the popular portrayal of American medicine evolved over the course of the 20th century? The popular portrayal of American medicine evolved greatly over the course of the 20th century in many ways. At the beginning of the century, doctors were regarded as heroes with almost magical powers to heal everyone through their knowledge. This is shown in the "Men in White" clip, where doctors are portrayed as the highest authority and take charge in the operating room. In addition, only men worked in the medical field, because being a doctor conveyed power and men were considered more powerful in that era (Not As A Stranger). As the years progressed, women joined the healthcare field as nurses and more focus