
American Imperialism in the 1800s


During the 1800s and early 1900s, the United States became more imperialistic than it had ever been before. Some of the places America took control of later became actual states, while others went on to have good relationships with the United States. Throughout this period there was a renewed belief in manifest destiny, only now the idea reached beyond the continental United States, extending to the control or influence of other countries. The expansion of the American Navy and the victory in the Spanish-American War pushed America toward this imperialistic course. The war forced Spain to give up control of Cuba and to cede Puerto Rico, Guam, and the Philippines to the United States.

This helped our nation grow stronger, expand our naval bases, and substantially increase trade. Even though taking over these places with violence may not have been the right thing to do, I still think it was the best decision at the time. Because the United States took over these places, I think the people of those areas benefited. In the Caribbean, for one, Spain was no longer in control of those nations. The United States later made Hawaii a state, and its people benefited by becoming American citizens. American control also helped the Philippines eventually become a self-governing country; in the meantime, the U.S. watched over the islands and looked after them. The Filipino people benefited because they had never been an independent nation under Spanish rule, and the U.S. ultimately granted them their independence. So yes, the U.S. may have taken some actions that were unethical, but it gained many benefits for our nation and helped the people of other nations gain rights and freedoms they did not have before.
