West Coast of the United States

In general, the term "West Coast" describes all of the coastal states of the Western United States; in practice, however, it usually refers strictly to California. It has also come to be called "The Coast", especially by New Yorkers.

The term has also been adopted by rap music performers to refer to a particular school of artists. The East Coast/West Coast rivalry has led to violence and much rhetoric.

See also: Geography of the Western United States



All Wikipedia text is available under the terms of the GNU Free Documentation License