Likelihood principle

In statistical theory, the likelihood principle asserts that all the information a sample provides about the unknown parameters is contained in the likelihood function: the function of the unknown parameters that gives the probability of the observed sample under a known model, viewed as a function of the model's parameters.

Suppose, for example, that we have observed n independent flips of a coin which we regard as having a constant probability, p, of falling heads up. The likelihood function is then the product of n factors, each of which is either p or 1-p. If we observe x heads and n-x tails, then the likelihood function is

[itex]L(p)\sim p^x(1-p)^{n-x},[/itex]
i.e., proportional to this product.
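The coin-flip likelihood above can be sketched numerically. This is an illustrative example, not part of the original article; the values n = 10 and x = 4 and the grid of candidate p values are arbitrary choices.

```python
def likelihood(p, n, x):
    """Likelihood of heads-probability p after observing x heads in n
    independent flips, with the constant binomial coefficient omitted."""
    return p**x * (1 - p)**(n - x)

n, x = 10, 4
# Evaluate the likelihood on a coarse grid of candidate values of p.
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=lambda p: likelihood(p, n, x))
print(best)  # the maximum sits at x/n = 0.4
```

As the article on maximum likelihood discusses in more depth, the likelihood is maximized at the observed proportion of heads, x/n.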

No multiplicative constant C(n,x) is included, because only the part of the probability that involves the parameter p is relevant. (According to some accounts, the claim that only that part is relevant is precisely what the likelihood principle says. For example, in one experiment the number of successes in 10 Bernoulli trials is observed; in another, the number of trials needed to obtain four successes is observed. In either case the outcome could be four successes in ten trials. The probabilities of that outcome differ between the two experiments, but as functions of p, the probability of success on each trial, they are proportional. The likelihood principle says that the same statistical inference about the value of p should be drawn in each case.) In particular, the principle implies that it does not matter whether you planned in advance to observe n trials or simply decided to stop on a whim. The likelihood principle remains controversial.
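The proportionality claimed in the two-experiment example can be checked directly. The following sketch (not part of the original article) compares the binomial probability of four successes in ten trials with the negative-binomial probability that the fourth success arrives on the tenth trial, at a few arbitrary values of p:

```python
from math import comb

def binom_prob(p):
    # Binomial experiment: probability of exactly 4 successes in 10 trials.
    return comb(10, 4) * p**4 * (1 - p)**6

def negbinom_prob(p):
    # Negative-binomial experiment: probability that the 4th success
    # occurs on the 10th trial, i.e. 3 successes in the first 9 trials
    # followed by a success.
    return comb(9, 3) * p**3 * (1 - p)**6 * p

for p in (0.2, 0.5, 0.8):
    print(binom_prob(p) / negbinom_prob(p))  # always 2.5 = C(10,4)/C(9,3)
```

Both probabilities contain the same factor p^4 (1-p)^6; they differ only by the constant ratio C(10,4)/C(9,3), so as functions of p they are proportional, which is what the likelihood principle turns on.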

A deeper discussion of the topic is available in the article about maximum likelihood.

All Wikipedia text is available under the terms of the GNU Free Documentation License
