STA261H1 Lecture Notes - Lecture 1: Parameter Space, Bias Of An Estimator, Estimation Theory


Document Summary

Sincere thanks to Alex Stringer, the professor of the course. These notes were taken during and after his lecture, based on the lecture materials, and should not be used for any purpose other than study.

- Linearity of expectation: E(g(X)) = g(E(X)) if g is linear.
- Standard deviation is the Euclidean distance from the random variable to its mean; Var(X) = SD(X)^2 = E(X^2) - E(X)^2.
- Moment-generating function: M_X(t) = E(e^{tX}). It computes moments via E(X^k) = M_X^(k)(0), and two random variables have the same distribution (X =d Y) if and only if M_X(t) = M_Y(t).
- Chebyshev's inequality: P(|X - E(X)| > t) <= Var(X) / t^2.
- Markov's inequality: if X >= 0 with probability 1 and E(X) exists, then P(X >= t) <= E(X) / t.
- Convergence in probability: a sequence Z_n converges in probability to theta if, for every epsilon > 0, lim_{n -> infinity} P(|Z_n - theta| > epsilon) = 0.
- The average converges to the mean for large samples, i.e. the sample mean X̄_n converges in probability to E(X) (the weak law of large numbers).
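As a quick sanity check of the last three facts, here is a small simulation sketch (not part of the original notes; the Uniform(0, 1) distribution and all variable names are illustrative choices). It shows the sample mean concentrating around the true mean as n grows, and verifies empirically that the deviation probability stays below the Chebyshev bound Var(X̄_n) / t^2.

```python
import random
import statistics

random.seed(0)

def sample_mean(n):
    # Mean of n i.i.d. Uniform(0, 1) draws: true mean 0.5, true variance 1/12.
    return statistics.fmean(random.random() for _ in range(n))

# Weak law of large numbers: |X̄_n - 0.5| should shrink as n grows.
for n in (10, 1000, 100000):
    print(n, abs(sample_mean(n) - 0.5))

# Chebyshev applied to X̄_n with n = 100: Var(X̄_n) = (1/12)/100, take t = 0.1.
n, t, trials = 100, 0.1, 10000
hits = sum(abs(sample_mean(n) - 0.5) > t for _ in range(trials))
empirical = hits / trials                # observed P(|X̄_n - 0.5| > t)
bound = (1 / 12) / n / t**2              # Chebyshev upper bound ≈ 0.083
print(empirical <= bound)
```

The empirical frequency is usually far below the bound, which is the point of Chebyshev: it is a crude but distribution-free guarantee, needing only that the variance exists.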
