West

West is most commonly a noun, adjective, or adverb indicating direction or geography.

West is the direction in which the sun sets at the equinoxes. It is one of the four cardinal points of the compass, on which it is the opposite of East and at right angles to North and South.

"The West" also often refers to Western countries. When used in this sense, it could mean anything from NATO, Europe and North America with or without Japan to whole Judeo-Christian civilisation. This meaning of the word merges with the concept of Western society.

"West" can also refer to the American West, especially during the end of the 19th century. It is also called the "Old West". Compare The Frontier.

See also: Western movie

Quotes

WEST, n. Part of the world which lies to the west (or to the east) of East. - Ambrose Bierce


