The Great Depression Of America


The Great Depression in America is often believed to have ended when the Japanese attacked Pearl Harbor and the US entered WWII in December 1941. While an exact end date is a matter of debate, the end of the Great Depression clearly coincides with the beginning of the war, leading many to conclude that WWII must have ended the Depression and triggered the economic recovery of the United States. Many historians believe that government and military spending restimulated the economy and that the employment generated by the war drove the recovery. However, correlation is not enough to argue causation, and a single event rarely triggers such a major economic recovery, which suggests other factors also played a role in ending the Great Depression. Some also argue that war cannot be counted as a means to economic recovery, because wars destroy wealth and give a false sense of how the economy is faring.

During the 1930s, Franklin D. Roosevelt's New Deal laid the foundation for economic recovery, and the federal government began taking a much larger role in decision making for the nation. In 1939, when WWII began, Americans certainly began to enjoy prosperity, with many pulled out of poverty, and in 1941, when the United States itself entered the war, prosperity increased further. By the end of the war, the American economy had indeed recovered, and they became the
