# One-sided Chebyshev-type inequalities for bounded probability distributions

2007-12-14
Chebyshev's inequality states that, for any probability distribution, at most a fraction $1/k^2$ of the probability mass lies $k$ or more standard deviations away from the mean. We can do better if we know that the distribution is bounded and we know the bounds.
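
For reference, the classical statement in symbols, for any $X$ with mean $\mu$ and finite standard deviation $\sigma$, and any $k > 0$:

$$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}.$$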

Let $X$ be a random variable bounded by $0 \le X \le M$, where $M > 0$. Given the first two moments, $E(X)$ and $E(X^2)$, of its probability distribution, a sharp lower bound on $P(X < L)$, where $L > 0$, can be given in closed form.
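
Here is a sketch of what that bound works out to via the classical two-moment argument — take this as my reconstruction, not gospel — writing $m_1 = E(X)$, $m_2 = E(X^2)$ and $\sigma^2 = m_2 - m_1^2$, and assuming $0 < L \le M$:

$$P(X < L) \;\ge\; \begin{cases} 0, & L \le \dfrac{m_1 M - m_2}{M - m_1}, \\[6pt] \dfrac{(L - m_1)(M - m_1) + \sigma^2}{L M}, & \dfrac{m_1 M - m_2}{M - m_1} \le L \le \dfrac{m_2}{m_1}, \\[6pt] \dfrac{(L - m_1)^2}{(L - m_1)^2 + \sigma^2}, & \dfrac{m_2}{m_1} \le L \le M. \end{cases}$$

The three cases correspond to three worst-case distributions: one supported entirely on $[L, M]$ (so the bound is zero), one supported on $\{0, L, M\}$, and the two-point configuration $\{m_1 - \sigma^2/(L - m_1),\ L\}$. In the last case the upper bound $M$ no longer helps, and the formula reduces to the one-sided Chebyshev (Cantelli) inequality.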

That's it. On a related note...

Let $X$ be a random variable and $L$ a constant, bounded by $0 \le X \le M$ and $0 \le L \le M$ respectively, where $M > 0$.

Let $Y$ be a random variable otherwise equal to $X$, but with the part of $X$ that exceeds $L$ collapsed to $L$:

$$Y = \min(X, L).$$
$E(Y)$ and $E(Y^2)$ cannot be determined from $E(X)$, $E(X^2)$, $M$ and $L$ alone. However, sharp upper bounds on $E(Y)$ and $E(Y^2)$ can still be given.
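
Here is a sketch of what these work out to under the same two-moment analysis — again my reconstruction, with the same notation $m_1 = E(X)$, $m_2 = E(X^2)$, $\sigma^2 = m_2 - m_1^2$:

$$E(Y) \;\le\; \begin{cases} m_1, & m_2 \le m_1 L, \\[4pt] m_1 - \dfrac{m_2 - m_1 L}{M}, & m_2 \ge m_1 L \text{ and } \sigma^2 \ge (m_1 - L)(M - m_1), \\[4pt] L, & \sigma^2 \le (m_1 - L)(M - m_1), \end{cases}$$

$$E(Y^2) \;\le\; \begin{cases} m_2, & m_2 \le m_1 L, \\[4pt] \dfrac{L\big((M + L)\,m_1 - m_2\big)}{M}, & m_2 \ge m_1 L \text{ and } \sigma^2 \ge (m_1 - L)(M - m_1), \\[4pt] L^2, & \sigma^2 \le (m_1 - L)(M - m_1). \end{cases}$$

In each case analysis the first bound is attained by a distribution supported on $[0, L]$, the middle one by a distribution supported on $\{0, L, M\}$, and the last one by a distribution supported on $[L, M]$; the adjacent cases agree on their boundaries.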

Beats me how to prove these formulas, but I tried with tens of thousands of randomly generated distributions and they always worked. The way I got the formulas was by trying to think of the "worst-case" distributions (that's why these are sharp bounds). There were a few of these, corresponding to the different conditions. So I think I know the worst cases, but I don't know how to show that these truly are the worst cases.
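
For the record, a minimal sketch of that kind of randomized check in Python, assuming the case formulas sketched above (the `y_bounds` helper is a name of my own invention that just transcribes those cases). It draws random discrete distributions on $[0, M]$, computes the exact $E(Y)$ and $E(Y^2)$, and asserts they never exceed the bounds:

```python
import random

def y_bounds(m1, m2, M, L):
    """Sharp upper bounds on E(Y) and E(Y^2) for Y = min(X, L), given
    E(X) = m1, E(X^2) = m2 and 0 <= X <= M -- the case formulas above."""
    var = m2 - m1 * m1
    if m2 <= m1 * L:                    # worst case supported on [0, L]
        return m1, m2
    if var <= (m1 - L) * (M - m1):      # worst case supported on [L, M]
        return L, L * L
    # worst case supported on {0, L, M}
    return m1 - (m2 - m1 * L) / M, L * ((M + L) * m1 - m2) / M

random.seed(1)
M = 1.0
for _ in range(50_000):
    # a random discrete distribution on [0, M]: 1-6 atoms, random weights
    xs = [random.uniform(0.0, M) for _ in range(random.randint(1, 6))]
    ws = [random.random() for _ in xs]
    ps = [w / sum(ws) for w in ws]

    m1 = sum(p * x for p, x in zip(ps, xs))
    m2 = sum(p * x * x for p, x in zip(ps, xs))

    L = random.uniform(0.0, M)
    ey = sum(p * min(x, L) for p, x in zip(ps, xs))
    ey2 = sum(p * min(x, L) ** 2 for p, x in zip(ps, xs))

    b1, b2 = y_bounds(m1, m2, M, L)
    assert ey <= b1 + 1e-9 and ey2 <= b2 + 1e-9, (m1, m2, L)

print("no counterexamples found")
```

Since the worst cases live on $[0, L]$, $\{0, L, M\}$ or $[L, M]$, distributions concentrated near those sets are the ones that push the bounds toward equality.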

Henry Bottomley has written more about Chebyshev-type inequalities.