Talk:Chernoff's inequality
I added an article on Chernoff bound, which, though a special case of the inequality here, merits special attention. Perhaps the articles could be combined in some form. CSTAR 23:08, 21 Jun 2004 (UTC)
Do they really have to be discrete random variables? (At this moment I'm too lazy to figure this out for myself, but I'll probably do so soon.) Michael Hardy 22:51, 24 Jun 2004 (UTC)
Am I right in thinking that μ is the mean of X? -- Anon
I always thought that "Chernoff inequality" refers to the inequality on a Gaussian variable ξ ~ N(0,1) and a differentiable function g such that E g(ξ) = 0, and states that in this case E[g(ξ)²] ≤ E[g′(ξ)²]. I promise to write on this subject later. Amir Aliev 11:18, 23 May 2006 (UTC)
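A quick Monte Carlo sanity check of the inequality as reconstructed above (E[g(ξ)²] ≤ E[g′(ξ)²] when E g(ξ) = 0), in Python; the test function g(x) = x³ − 3x is just an illustrative choice whose mean under the standard normal is zero, not anything from the article:

import numpy as np

# Check E[g(xi)^2] <= E[g'(xi)^2] for xi ~ N(0,1) and a mean-zero g.
# g(x) = x^3 - 3x is the Hermite polynomial H3, so E[g(xi)] = 0.
rng = np.random.default_rng(0)
xi = rng.standard_normal(1_000_000)
g = xi**3 - 3 * xi
g_prime = 3 * xi**2 - 3
lhs = np.mean(g**2)        # E[g^2]; exactly 6 for this g
rhs = np.mean(g_prime**2)  # E[g'^2]; exactly 18 for this g
print(lhs, "<=", rhs, ":", lhs <= rhs)

For this g the two sides are 6 and 18, so the inequality holds with room to spare; a simulation only illustrates the statement, of course, it proves nothing.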
Proof
Can we see a proof of this inequality? Mwilde 19:25, 5 Mar 2005 (UTC)
Mistake?
Is the inequality correct, as stated?
This article looks like a copy of Theorem 3 (page 6) of the paper
http://www.math.ucsd.edu/~fan/wp/concen.pdf
but in that paper the bound is 2exp(−k²/(4n)), with an additional n in the denominator of the exponent. So which one is correct?
~Ruksis~
- The second one, I think. Clearly it has to depend on n. Michael Hardy 20:01, 21 July 2006 (UTC)
- No. The first one is correct. It depends on n through σ, the standard deviation of the sum X. Daniel Hsu, 25 August 2006
The paper referred to is likely the source of the confusion. It says that σ² is the variance of X_i, but that doesn't make sense, since the various variables X_i could have distinct variances. Earlier versions of the Wikipedia article contain the same mistake. If all the X_i had the same variance, say because they are identically distributed, and σ² denoted the variance of (all the) X_i, then the Chernoff bound would need the n in 2exp(−k²/(4n)); in its current, more general form it does not. Peter van Rossum, 27 September 2006.
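A minimal simulation of the point above, assuming (as an illustration, not from the article) iid X_i uniform on {−1, +1}, so E X_i = 0, |X_i| ≤ 1 and Var X_i = 1. Both readings of σ then describe the same event, and the two bounds 2exp(−k²/4) (k in units of the standard deviation of the sum) and 2exp(−k²/(4n)) (k in units of the per-variable σ) agree:

import numpy as np

# X = X_1 + ... + X_n with iid X_i = +/-1, so Var(X_i) = 1 and
# sd(X) = sqrt(n). Compare the empirical tail with both forms of
# the bound discussed above.
rng = np.random.default_rng(1)
n, trials, k = 100, 100_000, 2.0
X = rng.choice([-1.0, 1.0], size=(trials, n)).sum(axis=1)

sigma_i = 1.0                   # per-variable standard deviation
sigma_X = np.sqrt(n) * sigma_i  # standard deviation of the sum

p = np.mean(np.abs(X) >= k * sigma_X)   # empirical tail probability
bound_general = 2 * np.exp(-k**2 / 4)   # k in units of sd(X)
k_iid = k * np.sqrt(n)                  # same threshold in units of sigma_i
bound_iid = 2 * np.exp(-k_iid**2 / (4 * n))
print(p, "<=", bound_general, "=", bound_iid)

With these numbers the empirical tail comes out around 0.05 and both bounds evaluate to 2e⁻¹ ≈ 0.736, so the two forms are the same bound written in different units; using the per-variable σ without the n would claim a far smaller tail probability than is actually guaranteed.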