18.44 Lecture Notes - Lecture 27: Random Variable, Independent And Identically Distributed Random Variables, Aner


Document Summary

Markov's inequality: let X be a random variable taking only non-negative values with finite mean. Then for any a > 0, P{X ≥ a} ≤ E[X]/a. Proof: consider a random variable Y defined by Y = a if X ≥ a and Y = 0 otherwise. Since X ≥ Y with probability one, it follows that E[X] ≥ E[Y] = a·P{X ≥ a}. Divide both sides by a to get Markov's inequality.

Chebyshev's inequality: if X has finite mean μ, variance σ², and k > 0, then P{|X − μ| ≥ k} ≤ σ²/k². Proof: note that (X − μ)² is a non-negative random variable and P{|X − μ| ≥ k} = P{(X − μ)² ≥ k²}, so Markov's inequality applies.

These inequalities allow us to deduce limited information about a distribution when we know only the mean (Markov) or the mean and variance (Chebyshev). Markov: if E[X] is small, then it is not too likely that X is large.
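Both bounds can be checked empirically. The sketch below (my illustration, not part of the notes; the choice of an exponential(1) sample, with mean 1 and variance 1, is an assumption) draws a large sample and confirms that the observed tail probabilities sit below the Markov and Chebyshev bounds.

```python
import random

# Illustrative check of Markov's and Chebyshev's inequalities on an
# exponential(1) sample (mean 1, variance 1). Setup is hypothetical.
random.seed(0)
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

a, k = 3.0, 2.0
p_markov = sum(x >= a for x in xs) / n             # empirical P{X >= a}
p_cheby = sum(abs(x - mean) >= k for x in xs) / n  # empirical P{|X - mu| >= k}

markov_bound = mean / a      # Markov:    P{X >= a} <= E[X]/a
cheby_bound = var / k ** 2   # Chebyshev: P{|X - mu| >= k} <= sigma^2/k^2

assert p_markov <= markov_bound
assert p_cheby <= cheby_bound
```

Note that the bounds are loose: here the true tail probability P{X ≥ 3} = e⁻³ ≈ 0.05, while Markov only guarantees it is at most 1/3, exactly the "limited information" trade-off the notes describe.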

