Bayesian analysis

Also known as: Bayesian estimation
Key People:
Thomas Bayes
David Blackwell
Related Topics:
estimation

Bayesian analysis, a method of statistical inference (named for English mathematician Thomas Bayes) that allows one to combine prior information about a population parameter with evidence contained in a sample to guide the statistical inference process. A prior probability distribution for a parameter of interest is specified first. The evidence is then obtained and combined through an application of Bayes’s theorem to provide a posterior probability distribution for the parameter. The posterior distribution provides the basis for statistical inferences concerning the parameter.
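
As a concrete illustration of this prior-to-posterior workflow, the short sketch below updates a Beta prior for an unknown binomial proportion; the prior Beta(2, 2) and the data (7 successes in 10 trials) are hypothetical values chosen only for illustration.

```python
# A minimal sketch of the prior -> evidence -> posterior workflow for an
# unknown binomial proportion, using the conjugate Beta prior. All numbers
# are illustrative, not taken from the article.

prior_a, prior_b = 2, 2      # Beta(2, 2) prior: mildly favors values near 0.5
successes, trials = 7, 10    # evidence from the sample

# With a Beta prior and a binomial likelihood, Bayes's theorem yields a
# Beta posterior whose parameters are simple counts (conjugacy):
post_a = prior_a + successes
post_b = prior_b + (trials - successes)

prior_mean = prior_a / (prior_a + prior_b)
post_mean = post_a / (post_a + post_b)

print(f"prior mean:     {prior_mean:.3f}")   # 0.500
print(f"posterior mean: {post_mean:.3f}")    # 9/14, about 0.643
```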

This method of statistical inference can be described mathematically as follows. If, at a particular stage in an inquiry, a scientist assigns a probability distribution to the hypothesis H, Pr(H)—call this the prior probability of H—and assigns probabilities to the obtained evidence E conditionally on the truth of H, Pr(E | H), and conditionally on the falsehood of H, Pr(E | −H), Bayes’s theorem gives a value for the probability of the hypothesis H conditionally on the evidence E by the formula Pr(H | E) = Pr(H)Pr(E | H)/[Pr(H)Pr(E | H) + Pr(−H)Pr(E | −H)].
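
The formula is easy to evaluate numerically. The sketch below implements it directly; the probabilities used in the example call are arbitrary illustrations.

```python
# A direct implementation of the formula above for a single hypothesis H
# and evidence E. The example probabilities are arbitrary illustrations.

def posterior(prior_h: float, pr_e_given_h: float, pr_e_given_not_h: float) -> float:
    """Return Pr(H | E) via Bayes's theorem."""
    numerator = prior_h * pr_e_given_h
    denominator = numerator + (1 - prior_h) * pr_e_given_not_h
    return numerator / denominator

# Example: Pr(H) = 0.3, Pr(E | H) = 0.8, Pr(E | -H) = 0.1
print(posterior(0.3, 0.8, 0.1))  # 0.24 / (0.24 + 0.07), about 0.774
```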

One of the attractive features of this approach to confirmation is that when the evidence would be highly improbable if the hypothesis were false—that is, when Pr(E | −H) is extremely small—it is easy to see how a hypothesis with a quite low prior probability can acquire a probability close to 1 when the evidence comes in. (This holds even when Pr(H) is quite small and Pr(−H), the probability that H is false, correspondingly large; if E follows deductively from H, Pr(E | H) will be 1; hence, if Pr(E | −H) is tiny, the numerator of the right side of the formula will be very close to the denominator, and the value of the right side thus approaches 1.)
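
This limiting behavior can be seen numerically. In the sketch below, the prior Pr(H) = 0.01 is deliberately small and Pr(E | H) = 1 (as when E follows deductively from H); the posterior climbs toward 1 as Pr(E | −H) shrinks. All values are illustrative.

```python
# Demonstrating the limit described above: a small prior Pr(H) = 0.01 with
# Pr(E | H) = 1, while Pr(E | -H) shrinks toward zero. Values are illustrative.

prior_h = 0.01
for pr_e_given_not_h in (0.1, 0.01, 0.001, 0.0001):
    numerator = prior_h * 1.0
    post = numerator / (numerator + (1 - prior_h) * pr_e_given_not_h)
    print(f"Pr(E | -H) = {pr_e_given_not_h:<8} ->  Pr(H | E) = {post:.4f}")

# The posterior rises from about 0.09 to about 0.99 as Pr(E | -H) falls.
```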

A key, and somewhat controversial, feature of Bayesian methods is the notion of a probability distribution for a population parameter. According to classical statistics, parameters are constants and cannot be represented as random variables. Bayesian proponents argue that, if a parameter value is unknown, then it makes sense to specify a probability distribution that describes the possible values for the parameter as well as their likelihood. The Bayesian approach permits the use of objective data or subjective opinion in specifying a prior distribution. With the Bayesian approach, different individuals might specify different prior distributions. Classical statisticians argue that for this reason Bayesian methods suffer from a lack of objectivity. Bayesian proponents argue that the classical methods of statistical inference have built-in subjectivity (through the choice of a sampling plan) and that the advantage of the Bayesian approach is that the subjectivity is made explicit.
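
To make the point about differing priors concrete, the sketch below shows two analysts who specify different Beta priors for the same unknown proportion, observe the same sample, and arrive at different posteriors; with enough data the two posteriors would converge. All numbers are hypothetical.

```python
# Two subjective priors, one shared sample: the posteriors differ, which is
# the source of the objectivity debate described above. Numbers are
# hypothetical illustrations.

def beta_posterior_mean(a: float, b: float, successes: int, failures: int) -> float:
    """Posterior mean of a Beta(a, b) prior after binomial data."""
    return (a + successes) / (a + b + successes + failures)

successes, failures = 12, 8   # shared evidence: 12 successes in 20 trials

skeptic = beta_posterior_mean(1, 9, successes, failures)    # prior mean 0.1
optimist = beta_posterior_mean(9, 1, successes, failures)   # prior mean 0.9

print(f"skeptic's posterior mean:  {skeptic:.3f}")   # 13/30, about 0.433
print(f"optimist's posterior mean: {optimist:.3f}")  # 21/30 = 0.700
```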

Bayesian methods have been used extensively in statistical decision theory (see statistics: Decision analysis). In this context, Bayes’s theorem provides a mechanism for combining a prior probability distribution for the states of nature with sample information to provide a revised (posterior) probability distribution about the states of nature. These posterior probabilities are then used to make better decisions.
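
A minimal decision-analysis sketch under assumed inputs appears below: two states of nature, a prior over them, a sampling model for a survey result, a payoff table, and the action chosen by maximizing posterior expected payoff. Every input (states, probabilities, payoffs) is a hypothetical illustration.

```python
# A minimal sketch of Bayesian decision analysis: revise a prior over the
# states of nature with sample information, then pick the action with the
# highest posterior expected payoff. All inputs are hypothetical.

states = ("strong_market", "weak_market")
prior = {"strong_market": 0.6, "weak_market": 0.4}

# Sampling model: Pr(favorable survey | state)
likelihood = {"strong_market": 0.8, "weak_market": 0.3}

# Posterior over the states after observing a favorable survey
evidence = sum(prior[s] * likelihood[s] for s in states)
posterior = {s: prior[s] * likelihood[s] / evidence for s in states}

# Payoff for each action under each state
payoff = {
    "invest":    {"strong_market": 100, "weak_market": -40},
    "hold_cash": {"strong_market": 10, "weak_market": 10},
}

# Expected payoff of each action under the posterior; choose the best
expected = {a: sum(posterior[s] * payoff[a][s] for s in states) for a in payoff}
best = max(expected, key=expected.get)

print(posterior)             # strong: 0.8, weak: 0.2
print(expected, "->", best)  # invest: 72.0, hold_cash: 10.0 -> invest
```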

This article was most recently revised and updated by Erik Gregersen.