The United States and the American Century

Throughout the 19th century, the United States concentrated on expanding its reach into foreign markets and colonizing foreign territories. After fighting Spain for Cuba and the Philippines, and annexing Hawaii and other territories, the United States had become a minor imperial power. World War I then drew the U.S. into European affairs, but with the war over, the United States refused to join the League of Nations and withdrew once again. The 1920s were a time of wealth and prosperity for America, and the country tried to avoid further foreign entanglements; the Great Depression that followed left Americans overwhelmed and deepened the mood of isolationism.

So, when did the American Century begin? At the beginning of the 20th century, the United States was an international power grounded in economic strength, but its military influence was limited. The perceived power of the United States, and its potential as a future power, was not truly realized until after World War II and the beginning of the American Century. The phrase "American Century" describes the period in American history when the United States evolved into the world's most powerful nation. The 20th century is the American Century because the United States emerged as the dominant force in the world through economic influence, military prowess and advanced weaponry, and cultural ideals; other peoples modeled their nations after the values and virtues of the United States. First, during the 20th century the U.S. economy is the…