STATS 426 Lecture Notes - Lecture 11: Maximum Likelihood Estimation, Likelihood Function, Independent And Identically Distributed Random Variables

7 pages · 9 Mar 2016
Document Summary

We discuss (a) estimation by the method of moments and (b) maximum likelihood estimation. As before, our set-up is the following: X_1, X_2, ..., X_n are i.i.d. observations from some density or mass function f(x, θ). Suppose we know what the first k moments of X_1 look like as functions of θ. Thus, let μ_i(θ) = E_θ(X_1^i) for i = 1, 2, ..., k (we tacitly assume that these moments exist). Now, suppose that this map is invertible, in the sense that we can express θ as a function of the first k moments; let this inverse map be denoted by h (note that invertibility implies that the distribution of X_1 is completely determined by its first k moments). The μ_i's are also known as population moments. The MoM principle then says that we should estimate θ by θ̂ = h(μ̂_1, μ̂_2, ..., μ̂_k), where μ̂_i = (1/n) Σ_{j=1}^n X_j^i is the i-th sample moment.
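The recipe above can be sketched in code. As a minimal illustrative example (not taken from the notes), consider an Exponential(θ) density f(x, θ) = θ e^{-θx}: its first population moment is μ_1(θ) = 1/θ, so the inverse map is h(m_1) = 1/m_1 and the MoM estimate is θ̂ = 1 / x̄, the reciprocal of the sample mean.

```python
import random
import statistics

def mom_exponential(xs):
    """Method-of-moments estimate of theta from i.i.d. Exponential(theta) data.

    mu_1(theta) = E_theta[X] = 1/theta, so the inverse map is h(m1) = 1/m1.
    """
    m1 = statistics.fmean(xs)  # first sample moment (the sample mean)
    return 1.0 / m1            # apply the inverse map h

# Simulate data with a known theta and check the estimate recovers it.
random.seed(0)
theta_true = 2.0
data = [random.expovariate(theta_true) for _ in range(100_000)]
theta_hat = mom_exponential(data)
print(theta_hat)  # should be close to theta_true = 2.0
```

Here only the first moment is needed (k = 1); for families with more parameters one would match the first k sample moments to their population counterparts and invert the resulting system.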
