{"id":1225,"date":"2007-12-14T08:56:51","date_gmt":"2007-12-14T06:56:51","guid":{"rendered":"https:\/\/yehar.com\/blog\/?p=1225"},"modified":"2016-02-20T10:03:59","modified_gmt":"2016-02-20T08:03:59","slug":"chebyshev-type-one-sided-inequalities-for-bounded-distributions","status":"publish","type":"post","link":"https:\/\/yehar.com\/blog\/?p=1225","title":{"rendered":"One-sided Chebyshev-type inequalities for bounded probability distributions"},"content":{"rendered":"<p>2007-12-14\nChebyshev's inequality states that, for any probability distribution, at most $1\/k^2$ of the area of the probability density function lies more than $k$ standard deviations away from the mean. We can do better, if we know that the distribution is bounded and we know the bounds.<\/p>\n<p>Let $X$ be a random variable bounded by $0 \\le X \\le M$, where $M &gt; 0$. Given the first two moments, $E(X)$ and $E(X^2)$, of its probability distribution, a sharp lower bound to $P(X &lt;L)$, where $L &gt; 0$, is given by:<\/p>\n<p>$$P(X &lt; L)\\ge \\left\\{ \\begin{array}{ll} 0 &amp; \\mbox{if $E(X) &gt; L$ and $E(X^2) &lt; L E(X) + M E(X) - L M$,}\\\\\\\\ 1 - \\frac{L E(X) + M E(X) - E(X^2)}{L M} &amp; \\mbox{if $\\big (E(X) &gt; L$ and $E(X^2) \\ge L E(X) + M E(X) - L M \\big )$}\\\\ &amp; \\mbox{or $\\big(E(X) \\le L$ and $E(X^2) \\ge L E(X)\\big)$,}\\\\\\\\ \\frac{E(X)^2 - 2 L E(X) + L^2}{E(X^2) - 2 L E(X) + L^2} &amp; \\mbox{if $E(X) \\le L$ and $E(X^2) &lt; L E(X)$.} \\end{array} \\right. $$<\/p>\n<p>That's it. 
On a related note...

Let $X$ be a random variable and $L$ a constant, bounded by $0 \le X \le M$ and $0 \le L \le M$, where $M > 0$.

Let $Y$ be a random variable otherwise equal to $X$, but with the tail of $X$ exceeding $L$ clipped to $L$:

$$Y = \left\{\begin{array}{ll} X & \mbox{if $X \le L$,} \\ L & \mbox{if $X > L$.} \end{array}\right.$$

$E(Y)$ and $E(Y^2)$ cannot be determined knowing only $E(X)$, $E(X^2)$, $M$ and $L$. However, sharp upper bounds on $E(Y)$ and $E(Y^2)$ are given by:

$$E(Y) \le \left\{ \begin{array}{ll} L & \mbox{if $E(X) > L$ and $E(X^2) < L E(X) + M E(X) - L M$,} \\\\ \frac{L E(X) + M E(X) - E(X^2)}{M} & \mbox{if $\big(E(X) > L$ and $E(X^2) \ge L E(X) + M E(X) - L M\big)$ or} \\ & \mbox{$\big(E(X) \le L$ and $E(X^2) \ge L E(X)\big)$,} \\\\ E(X) & \mbox{if $E(X) \le L$ and $E(X^2) < L E(X)$,} \end{array}\right.$$

$$E(Y^2) \le \left\{ \begin{array}{ll} L^2 & \mbox{if $E(X) > L$ and $E(X^2) < L E(X) + M E(X) - L M$,} \\\\ \frac{L^2 E(X) + L M E(X) - L E(X^2)}{M} & \mbox{if $\big(E(X) > L$ and $E(X^2) \ge L E(X) + M E(X) - L M\big)$ or} \\ & \mbox{$\big(E(X) \le L$ and $E(X^2) \ge L E(X)\big)$,} \\\\ E(X^2) & \mbox{if $E(X) \le L$ and $E(X^2) < L E(X)$.} \end{array}\right.$$

Beats me how to prove these formulas, but I tried them on tens of thousands of randomly generated distributions and they always held. I arrived at the formulas by trying to think of the "worst-case" distributions (that's why these are sharp bounds). There were a few of these, corresponding to the different conditions.
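These clipped-moment bounds can also be checked numerically. The sketch below (my own transcription; note that in the middle branch the $E(Y^2)$ bound is just $L$ times the $E(Y)$ bound) repeats the kind of randomized testing described above:

```python
import random

def clipped_moment_upper_bounds(ex, ex2, M, L):
    """Sharp upper bounds on E(Y) and E(Y^2) for Y = min(X, L),
    given 0 <= X <= M, E(X) = ex and E(X^2) = ex2."""
    if ex > L and ex2 < L * ex + M * ex - L * M:
        return L, L * L
    if (ex > L and ex2 >= L * ex + M * ex - L * M) or (ex <= L and ex2 >= L * ex):
        ey = (L * ex + M * ex - ex2) / M
        return ey, L * ey  # the E(Y^2) bound is L times the E(Y) bound
    return ex, ex2

# Check against random 5-point discrete distributions on [0, M]
random.seed(1)
M, L = 1.0, 0.6
for _ in range(1000):
    points = [random.uniform(0, M) for _ in range(5)]
    weights = [random.random() for _ in range(5)]
    total = sum(weights)
    probs = [w / total for w in weights]
    ex = sum(p * x for p, x in zip(probs, points))
    ex2 = sum(p * x * x for p, x in zip(probs, points))
    ey = sum(p * min(x, L) for p, x in zip(probs, points))
    ey2 = sum(p * min(x, L) ** 2 for p, x in zip(probs, points))
    b1, b2 = clipped_moment_upper_bounds(ex, ex2, M, L)
    assert ey <= b1 + 1e-9 and ey2 <= b2 + 1e-9
```

As a sharpness example: half the mass at $0$ and half at $M = 1$ gives $E(X) = E(X^2) = 1/2$, and for $L = 1/2$ the middle branch yields $E(Y) \le 1/4$, which that distribution attains exactly.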
So I think I know the worst cases, but I don't know how to show that they truly are the worst.

Henry Bottomley has written more about <a href="http://www.se16.info/hgb/cheb.htm">Chebyshev-type inequalities</a>.