Explain How World War 1 Changed American Society

World War I drastically changed American society. Women's roles shifted as men were drafted or enlisted in the army. Racial tensions between whites and Blacks significantly increased as racism and segregation intensified. The government also began to take greater control over its people in both their social and economic lives. Three factors that most affected American society were the changing roles of women, the increase in racial tensions, and the growth of government intervention in citizens' lives.

Because of WWI, the supply of male workers severely decreased, and it became women's responsibility to replace them in society and the workplace. Women took over men's positions in factories and other occupations; approximately one million women entered the labor force over the course of the war. New jobs open to women included truck driving and work in the railroad industry.

During this period, many Blacks moved North to find work in the factories, but they were greeted with skepticism when they arrived. Whites at the time were not used to large numbers of Black workers and feared the threat they posed to their jobs. Racial violence against Blacks also increased: many were lynched by white mobs, and race riots broke out across numerous cities. In the summer of 1919, known as the "Red Summer," riots erupted in major cities such as Washington D.C., Chicago, and New York City, where both Black and white men were injured and killed. Blacks were also heavily discriminated against when they applied to join the army: the "IQ" tests were biased, with questions that favored whites, limiting Black men's ability to serve in the war. Racial tensions changed American society as the two groups grew farther apart and hostility worsened relations between them.