
American Empire

American Empire is an informal term used to describe collectively the various colonies and territories that have been owned or governed by the United States government.

Unlike many other powerful Western nations, the United States has not historically engaged in traditional imperialist expansion or conquest. Most of America's colonies were acquired following the American victory in the Spanish-American War, in which the Kingdom of Spain agreed to cede most of her colonial possessions to the control of the United States.

Most of America's former colonies have since become independent countries, states of the American union, or self-governing commonwealths.

The following areas have at one time or another been part of the "American Empire":

Today, the term "American Empire" is sometimes used as a derogatory expression to characterize America's military and cultural presence in many nations. American Empire is a book by Andrew J. Bacevich arguing that the United States began to act like an empire after the end of the Cold War.

Some commentators and public figures instead insist that America is, and should be, an empire in every sense. The Project for the New American Century, whose members include many of today's American statesmen, is representative of this view.

See also: American imperialism, Pax Americana


