American Empire

American Empire is an informal term used to describe collectively the various colonies and territories owned or governed by the United States government.

Unlike many other powerful western nations, the United States has historically not engaged in traditional imperialist expansion or conquest. Most of America's colonies were gained following the American victory in the Spanish-American War, in which the Kingdom of Spain agreed to cede most of her colonial possessions to the control of the United States.

Most of America's former colonies have since become independent countries, states of the American union, or self-governing commonwealths.

The following areas have at one time or another been part of the "American Empire":

Today, the term "American Empire" is sometimes used as a derogatory expression to characterize America's military and cultural presence in many nations. American Empire is also the title of a book by Andrew J. Bacevich, which argues that the United States began to act like an empire after the end of the Cold War.

Some commentators and public figures instead insist that America is, and should be, an empire in every sense. The Project for the New American Century, whose members include many prominent American statesmen, is representative of this view.

See also: American imperialism, Pax Americana


