Unlike many other powerful Western nations, the United States has historically not engaged in traditional imperialist expansion or conquest. Most of America's colonies were gained following the American victory in the Spanish-American War, in which the Kingdom of Spain agreed to cede most of her colonial possessions to the control of the United States.
Most of America's former colonies have since become independent countries, states of the American union, or self-governing commonwealths.
The following areas have at one time or another been part of the "American Empire":
Today, the term "American Empire" is sometimes used as a derogatory expression to characterize America's military and cultural presence in many nations. American Empire is a book by Andrew J. Bacevich arguing that the United States began to act like an empire after the end of the Cold War.
Some commentators and public figures instead insist that America is, and should be, an empire in every sense. The Project for the New American Century, which counts many of today's American statesmen among its members, is representative of this view.