Perceptron

The perceptron was invented in 1957 at the Cornell Aeronautical Laboratory by Frank Rosenblatt.

The perceptron consists of one or more layers of artificial neurons; the inputs are fed directly to the outputs via a series of weights. In this way it can be considered the simplest kind of feedforward network. In each node, the sum of the products of the weights and the inputs is calculated, and if the value is above some threshold (typically 0) the neuron fires and takes the value 1; otherwise it takes the value -1.
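
A minimal sketch of such a unit, written in Python for illustration; the function name, example weights and threshold are assumptions chosen for this example, not taken from the article or from any library:

  def perceptron_output(weights, threshold, inputs):
      # Sum of the products of the weights and the inputs.
      total = sum(w * x for w, x in zip(weights, inputs))
      # Fire (+1) if the sum is above the threshold, otherwise -1.
      return 1 if total > threshold else -1

  # Example: with these weights and threshold the unit behaves like a logical AND.
  print(perceptron_output([1.0, 1.0], 1.5, [1, 1]))  # prints 1
  print(perceptron_output([1.0, 1.0], 1.5, [1, 0]))  # prints -1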

Artificial neurons with this kind of activation function are also called McCulloch-Pitts neurons or threshold neurons. In the literature the term perceptron sometimes also refers to networks consisting of just one of these units.

Perceptrons can be trained by a simple learning algorithm usually called the delta rule. It calculates the error between the calculated output and the sample output data, and uses this to adjust the weights, thus implementing a form of gradient descent.
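
A sketch of this weight update, reusing the perceptron_output function from the earlier example; the learning rate, number of passes and function names are illustrative assumptions rather than part of the rule itself:

  def train_perceptron(samples, n_inputs, rate=0.1, epochs=100, threshold=0.0):
      # samples: a list of (inputs, target) pairs, with targets of +1 or -1.
      weights = [0.0] * n_inputs
      for _ in range(epochs):
          for inputs, target in samples:
              output = perceptron_output(weights, threshold, inputs)
              # Delta rule: adjust each weight in proportion to the error
              # (target minus output) and the corresponding input.
              error = target - output
              for i in range(n_inputs):
                  weights[i] += rate * error * inputs[i]
      return weights

  # Example: learning a logical OR. The threshold is held fixed at 0 here,
  # so OR is chosen because it can be learned without a bias term.
  samples = [([0, 0], -1), ([1, 0], 1), ([0, 1], 1), ([1, 1], 1)]
  weights = train_perceptron(samples, n_inputs=2)

In practice a constant input of 1 is often appended to each sample so that the threshold itself can be learned as an extra weight; that refinement is omitted in the sketch above.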

Although the perceptron initially seemed promising, it was quickly proved that simple perceptrons could not be trained to recognise many classes of patterns. This led to the field of neural network research stagnating for many years, before it was recognised that neural networks with three or more layers had far greater processing power than simpler perceptrons.

Simple perceptrons with one or two layers are only capable of learning linearly separable patterns; in 1969 a famous monograph entitled Perceptrons by Marvin Minsky and Seymour Papert showed that it was impossible for these classes of network to learn an XOR function. They conjectured (incorrectly) that a similar result would hold for a perceptron with three or more layers.
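
The argument for XOR can be sketched in a few lines. As an illustration, assume 0/1 inputs, weights w1 and w2, and threshold t, with the unit outputting 1 exactly when the weighted sum exceeds t. Computing XOR would require

  \begin{align*}
  (0,0):\quad & 0 \le t \\
  (1,0):\quad & w_1 > t \\
  (0,1):\quad & w_2 > t \\
  (1,1):\quad & w_1 + w_2 \le t
  \end{align*}

Adding the middle two inequalities gives w1 + w2 > 2t, and since t is non-negative by the first line, this exceeds t, contradicting the last line. No choice of weights and threshold therefore works.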

The discovery in the 1980s that multi-layer neural networks did not, in fact, have these problems led to the resurgence of neural network research.

References:

  • Rosenblatt, Frank (1958). "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain". Psychological Review, 65(6), 386-408. Cornell Aeronautical Laboratory.
  • Minsky, Marvin L. and Papert, Seymour A. (1969). Perceptrons. Cambridge, MA: MIT Press.
