Western countries

Western countries (or sometimes the West) is a somewhat imprecisely defined term, derived from the old dualism of East (Asia) and West (Europe), now used to refer to the wealthy and industrialised countries that are the inheritors of European societies and their colonial legacies. The term is sometimes used as a synonym for Western societies.

Depending on context, the Western countries may be restricted to the founding members of NATO plus Austria, Germany, Spain, Sweden and Switzerland. A broader definition might also include Finland, Australia, New Zealand, Japan, South Korea, Taiwan, Israel and some of the more prosperous former Warsaw Pact states.

Latin America is sometimes considered part of the West and sometimes not. Mainland China, the remainder of the Middle East, India, and Russia are generally not considered part of the West.

Western countries have in common a relatively high standard of living for most citizens compared to the rest of the world. They typically also have democratic, mostly secular governments and developed bodies of law that give some legal expression to the rights of their own citizens. High levels of education and a similar, "modern" popular culture may also mark a Western or Westernized society. Militarily and diplomatically, these "Western" societies have generally been allied with one another to one degree or another since World War Two. In fact, some would argue that this alliance is the defining feature of the West and explains why Japan is usually considered Western while Ecuador is not.

More typically, the term "The West" carries a pejorative meaning, serving simply to distinguish the wealthy and dominant societies from the poorer societies that are subjugated economically, militarily, and otherwise by deliberate restraints placed on them by the wealthier ones. "The West" then becomes shorthand for "wealthy, colonial (slave-holding), Europe-descended (or allied) societies." In current use this derived meaning tends to translate as "those who control the world" or "those who seek to continue dominating others and their lands."

The term The North has in many contexts replaced the earlier usage of "the West", particularly in this critical sense. It is somewhat more coherent, because there is an absolute geographical definition of "northern countries", and this distinction happens statistically to capture most wealthy countries (and many wealthy regions within countries).

Evolution

The concept of The West has, of course, changed over time. Japan in 1955 (immediately after its occupation by the US) would be considered by most to be part of the West, while Japan in 1750 would not. Similarly, North America in 1850 would be considered part of the West, while it would not be in 1450, or even 1500, before substantial colonization had occurred.

There are ideals that some associate with the West, and many consider Western values to be universally superior. The author Francis Fukuyama argues that Western values are destined to triumph throughout the entire world.

However, many question the meaning of the notion of Western values and point out that societies such as Japan and the United States are very different. Furthermore, they point out that advocates of Western values are selective in what they count as Western: they usually include, for example, the concepts of freedom, democracy, and free trade, but not Communism and Nazism, both of which began in the West, or slavery, which reached massive levels in the West and whose history there goes back millennia. By selecting which values count as Western, one can tautologically show that they are superior, since any inferior values are by definition not Western.

A different attack on the concept of Western values comes from those who advocate Islamic values or Asian values. In this view, there is a coherent set of traits that defines the West, but those traits are inferior, usually associated with moral decline, greed, and decadence.

Historically, one of the interesting questions is how the societies associated with "the West" came to dominate the world between 1750 and 1950.


"The West" also refers to the Western United States, expecially during the period of settlement, see The Western Frontier

Compare: OECD, Group of Eight, Eastern bloc, Third World, Pacific Rim


