West Coast of the United States

In general, the term "West Coast" denotes all of the Pacific coastal states of the Western United States: California, Oregon, and Washington. In practice, however, it is often used to refer strictly to California. The region is also called simply "The Coast", especially by New Yorkers.

The term has also been adopted by rap music performers to designate a particular school of artists; the resulting East Coast/West Coast rivalry has led to violence as well as a great deal of rhetoric.

See also: Geography of the Western United States
