The logit (pronounced with a long "o" and a soft "g") of a number p between 0 and 1 is
- <math>{\rm logit}(p)=\log\frac{p}{1-p}.</math>
(The base of the logarithm function used here is of little importance in the present article, as long as it is greater than 1.) The logit function is the inverse of the "sigmoid", or "logistic", function.
If p is a probability then p/(1-p) is the corresponding odds, and the logit of the probability is the logarithm of the odds. Logits are used for various purposes by statisticians. In particular there is the "logit model", of which the simplest sort is
- <math>{\rm logit}(p_i)=a+bx_i</math>
where xi is some quantity on which success or failure in the ith of a sequence of Bernoulli trials may depend, and pi is the probability of success in the ith case. For example, x may be the age of a patient admitted to a hospital with a heart attack, and "success" may be the event that the patient dies before leaving the hospital (another instance of why the words "success" and "failure", in speaking of Bernoulli trials, should be taken with a large grain of salt). Having observed the values of x in a sequence of cases, and whether there was a "success" or a "failure" in each such case, a statistician will often estimate the values of the coefficients a and b by the method of maximum likelihood. The result can then be used to assess the probability of "success" in a subsequent case in which the value of x is known. Estimation and prediction by this method are called logistic regression.
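As a concrete sketch of such a fit, the following Python code estimates a and b by maximizing the Bernoulli likelihood (equivalently, minimizing the negative log-likelihood) with scipy.optimize; the data are fabricated purely for illustration:

 import numpy as np
 from scipy.optimize import minimize
 
 # Fabricated illustrative data: x might be age; y records 1 for "success", 0 for "failure".
 x = np.array([35.0, 42.0, 50.0, 58.0, 63.0, 70.0, 74.0, 80.0])
 y = np.array([0, 0, 0, 1, 0, 1, 1, 1])
 
 def neg_log_likelihood(params):
     a, b = params
     p = 1.0 / (1.0 + np.exp(-(a + b * x)))  # invert the logit to get each pi
     # Bernoulli log-likelihood: sum of y*log(p) + (1-y)*log(1-p)
     return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
 
 result = minimize(neg_log_likelihood, x0=[0.0, 0.0])
 a_hat, b_hat = result.x
 
 # Probability of "success" for a new case with, say, x = 65:
 print(1.0 / (1.0 + np.exp(-(a_hat + b_hat * 65.0))))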
The logit in logistic regression is a special case of a link function in generalized linear models.
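For comparison, the same model can be expressed in generalized-linear-model terms; a sketch using the statsmodels library (assuming it is available), in which the logit is the default link for the binomial family:

 import numpy as np
 import statsmodels.api as sm
 
 x = np.array([35.0, 42.0, 50.0, 58.0, 63.0, 70.0, 74.0, 80.0])
 y = np.array([0, 0, 0, 1, 0, 1, 1, 1])
 
 X = sm.add_constant(x)                                   # column of ones for the intercept a
 fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()  # logit is the default link
 print(fit.params)                                        # estimates of a and b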
The logit model was introduced by Joseph Berkson in 1944.