
Imperialism In America During The Twentieth Century


Imperialism is defined as a policy of extending a country's power and influence through diplomacy or military force. Over the nineteenth and twentieth centuries, America allowed imperialism to play an enormous role in its growth and economic success. Imperialism was not America's first resort; in fact, during World War One the American agenda more closely resembled isolationism than imperialism.

However, America eventually reached a point at which imperialism would benefit it. Over the course of a few years, Europe's economic status steadily declined and its resources became depleted, making America a key power in trade. America began adjusting to imperialist ideology in order to gain boundless resources as well as to impose
