American imperialism is the economic, military, and cultural influence that the United States exerts over other countries. Such influence is often closely associated with expansion into foreign territories. The concept of an American Empire was first popularized during the presidency of James K. Polk, who led the United States into the Mexican–American War of 1846 and the eventual annexation of California and other western territories via the Treaty of Guadalupe Hidalgo and the Gadsden Purchase.