Loss function

2020 Mathematics Subject Classification: Primary: 62C05
In a problem of statistical decision making, a non-negative function indicating the loss (cost) to an experimenter given a particular state of the world and a particular decision. Let $X$ be a random variable taking values in a sample space $(\mathfrak{X},\mathcal{B},\mathsf{P}_\theta)$, $\theta \in \Theta$, and let $D = \{d\}$ be the space of all possible decisions that can be taken on the basis of an observation of $X$. In the theory of statistical decision functions, any non-negative function $L$ defined on $\Theta \times D$ is called a loss function. The value $L(\theta,d)$ at a point $(\theta,d) \in \Theta \times D$ is interpreted as the cost incurred by taking the decision $d \in D$ when the true value of the parameter is $\theta \in \Theta$.
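For example, in estimating a real-valued parameter $\theta$ by a decision $d \in D = \mathbb{R}$, two commonly used loss functions are the quadratic loss and the absolute-error loss,
$$ L(\theta,d) = (d-\theta)^2, \qquad L(\theta,d) = |d-\theta|, $$
while in testing a hypothesis, where $D = \{0,1\}$, one often uses the $0$-$1$ loss, equal to $0$ if the decision is correct and to $1$ otherwise. Together with a decision rule $\delta : \mathfrak{X} \rightarrow D$, a loss function determines the risk $R(\theta,\delta) = \mathsf{E}_\theta L(\theta,\delta(X))$, the expected loss incurred when $\delta$ is used and the true parameter value is $\theta$ (cf. Statistical decision theory).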
References
[1] A. Wald, "Statistical decision functions", Wiley (1950). Zbl 0040.36402
[2] E.L. Lehmann, "Testing statistical hypotheses", 2nd ed., Wiley (1986). Zbl 0608.62020
Comments
References
[a1] J.O. Berger, "Statistical decision theory and Bayesian analysis", 2nd ed., Springer (1985). Zbl 0572.62008