A priori distribution
The probability distribution of a random variable, to be contrasted with the conditional distribution of this random variable under certain additional conditions. Usually the term "a priori distribution" is used in the following way. Let $(\Theta,X)$ be a pair of random variables (random vectors or, more generally, random elements). The random variable $\Theta$ is considered to be unknown, while $X$ is the result of an observation to be used for the estimation of $\Theta$. The joint distribution of $\Theta$ and $X$ is given by the distribution of $\Theta$ (called the a priori distribution) and the family of conditional distributions $\mathrm P_\theta$ of the random variable $X$ given $\Theta=\theta$. By the Bayes formula one can then calculate the conditional distribution of $\Theta$ given $X$ (called the a posteriori distribution of $\Theta$). In statistical problems the a priori distribution is often unknown, and even the assumption of its existence may not be sufficiently well founded. For the use of the a priori distribution, see Bayesian approach.
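As a minimal sketch of this computation (assuming, beyond the text above, that densities exist: $\pi(\theta)$ for the a priori distribution of $\Theta$ and $p(x\mid\theta)$ for $\mathrm P_\theta$), the Bayes formula gives the a posteriori density of $\Theta$ after observing $X=x$:

$$\pi(\theta\mid x)=\frac{p(x\mid\theta)\,\pi(\theta)}{\int p(x\mid\theta')\,\pi(\theta')\,\mathrm d\theta'}.$$

For instance, if $\Theta$ is uniform on $[0,1]$ a priori and, given $\Theta=\theta$, $X$ is the number of successes in $n$ Bernoulli trials with success probability $\theta$, then the a posteriori distribution of $\Theta$ given $X=x$ is the Beta distribution with parameters $x+1$ and $n-x+1$.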
A priori distribution. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=A_priori_distribution&oldid=38864