Chebyshev's inequality states that, for any probability distribution, at most $1/k^2$ of the probability mass lies more than $k$ standard deviations away from the mean. We can do better if we know that the distribution is bounded and we know the bounds.
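As a quick sanity check (my sketch, not from the original): the $1/k^2$ bound is attained by the three-point distribution that puts mass $1/(2k^2)$ at each of $\mu \pm k\sigma$ and the rest at $\mu$.

```python
# Sketch: verify that Chebyshev's 1/k^2 bound is attained by the
# extremal three-point distribution (example numbers are mine).
k = 2.0
mu, sigma = 0.0, 1.0
# mass 1/(2k^2) at mu - k*sigma and mu + k*sigma, the rest at mu
points = [mu - k * sigma, mu, mu + k * sigma]
probs = [1 / (2 * k**2), 1 - 1 / k**2, 1 / (2 * k**2)]

mean = sum(p * x for p, x in zip(probs, points))
var = sum(p * (x - mean) ** 2 for p, x in zip(probs, points))
tail = sum(p for p, x in zip(probs, points) if abs(x - mu) >= k * sigma)

print(mean, var, tail)  # mean 0, variance 1, tail probability 1/k^2 = 0.25
```

The mean and variance come out as advertised, and the mass at or beyond $k$ standard deviations is exactly $1/k^2$, so Chebyshev cannot be improved without extra assumptions.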
Let $X$ be a random variable bounded by $a \le X \le b$, where $a < b$. Given the first two moments, $\mu = E[X]$ and $\sigma^2 = \operatorname{Var}[X]$, of its probability distribution, a sharp lower bound to $P(X \le t)$, where $a \le t \le b$, is given by:
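To see numerically that bounded support buys something (a sketch with made-up numbers — the moments $\mu = 0.5$, $\sigma^2 = 0.09$, bounds $[0, 1]$, threshold $t = 0.6$, and the restriction to two-point distributions are all my assumptions): scan the two-point distributions with the given moments and record the smallest attainable $P(X \le t)$. Without boundedness, the one-sided (Cantelli) inequality would only guarantee $0.1$ here.

```python
import math

# Sketch (made-up numbers): smallest P(X <= t) attainable by a two-point
# distribution on [0, 1] with mean 0.5 and variance 0.09, for t = 0.6.
mu, var, t = 0.5, 0.09, 0.6
sigma = math.sqrt(var)

best = 1.0
for i in range(1, 1000):
    p = i / 1000                        # mass at the lower support point
    q = 1 - p
    # these two points reproduce mean mu and variance sigma^2 exactly
    lo = mu - sigma * math.sqrt(q / p)
    hi = mu + sigma * math.sqrt(p / q)
    if lo < -1e-12 or hi > 1 + 1e-12:   # support must stay inside [0, 1]
        continue
    prob = p * (lo <= t) + q * (hi <= t)
    best = min(best, prob)

# Cantelli (no boundedness) would only guarantee P(X <= t) >= 0.1 here;
# restricting the support to [0, 1] forces a noticeably larger minimum.
print(best)
```

The search bottoms out near $0.265$, well above the Cantelli floor of $0.1$, which is the point of knowing the bounds.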
That's it. On a related note...
Let $X$ be a random variable and $c$ a constant, both bounded by $a$ and $b$, where $a < b$.
Let $Y$ be a random variable otherwise equal to $X$, but collecting the tail of $X$ exceeding $c$ to $c$:

$$Y = \min(X, c).$$
$E[Y]$ and $\operatorname{Var}[Y]$ cannot be determined knowing only $E[X]$, $\operatorname{Var}[X]$, and $c$. However, sharp upper bounds to $E[Y]$ and $\operatorname{Var}[Y]$ are given by:
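A concrete illustration of the non-determinability claim (a sketch; the numbers are mine, not from the original): two distributions on $[0, 1]$ sharing mean $0.5$ and variance $0.09$ give different values of $E[Y]$ for $c = 0.7$.

```python
# Sketch (hypothetical numbers): two distributions on [0, 1] with the same
# mean and variance but different E[min(X, c)] -- so E[Y] is not
# determined by E[X], Var[X], and c alone.
c = 0.7

# distribution A: mass 1/2 at 0.2 and at 0.8
A = [(0.2, 0.5), (0.8, 0.5)]
# distribution B: mass 2/9 at 0.05, 5/9 at 0.5, 2/9 at 0.95
B = [(0.05, 2 / 9), (0.5, 5 / 9), (0.95, 2 / 9)]

def moments(dist):
    mean = sum(p * x for x, p in dist)
    var = sum(p * (x - mean) ** 2 for x, p in dist)
    return mean, var

def e_min(dist, c):
    # expectation of Y = min(X, c)
    return sum(p * min(x, c) for x, p in dist)

print(moments(A), moments(B))    # both approximately (0.5, 0.09)
print(e_min(A, c), e_min(B, c))  # 0.45 vs. 4/9 = 0.444...
```

Both distributions match on the first two moments, yet their values of $E[Y]$ differ, so only bounds (not exact values) can be stated.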
Beats me how to prove these formulas, but I tested them with tens of thousands of randomly generated distributions, and they always held. I arrived at the formulas by trying to think of the "worst-case" distributions (which is why the bounds are sharp). There were a few of these, corresponding to the different conditions. So I think I know the worst cases, but I don't know how to show that they truly are the worst.
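The worst-case hunt described above can be sketched like this (my reconstruction, with made-up parameters, restricted to two-point distributions, so it only probes candidate worst cases): every two-point distribution with mean $\mu$ and variance $\sigma^2$ places mass $p$ at $\mu - \sigma\sqrt{q/p}$ and mass $q = 1 - p$ at $\mu + \sigma\sqrt{p/q}$, so scanning $p$ traces out candidates for the largest attainable $E[Y]$.

```python
import math

# Sketch (hypothetical parameters): scan all two-point distributions with
# fixed mean and variance on [0, 1], recording the largest E[min(X, c)].
mu, var, c = 0.5, 0.15, 0.7
sigma = math.sqrt(var)

best, best_p = -1.0, None
for i in range(1, 1000):
    p = i / 1000                        # mass at the lower support point
    q = 1 - p
    lo = mu - sigma * math.sqrt(q / p)  # these two points match mu and
    hi = mu + sigma * math.sqrt(p / q)  # sigma^2 by construction
    if lo < -1e-12 or hi > 1 + 1e-12:   # keep the support inside [0, 1]
        continue
    ey = p * min(lo, c) + q * min(hi, c)
    if ey > best:
        best, best_p = ey, p

print(best, best_p)  # maximum E[Y] is 0.4375, at p = 0.375 with lo = 0
```

With this variance the worst case pins its lower point to the support bound ($p = 0.375$, lower point at $0$), and the maximum of $E[Y]$ falls strictly below $\mu$, consistent with the idea that the extremal distributions sit at the edges of the feasible set.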
Henry Bottomley has written more about Chebyshev type inequalities.