STAT 3201 Lecture Notes - Lecture 23: Marginal Distribution, Univariate, Conditional Expectation
Document Summary
Two lectures ago we introduced conditional probability distributions. Let Y1 and Y2 be two random variables. For a given value of Y2, say y2, the distribution of Y1 is univariate, so we can define expectations of Y1 given Y2 = y2 using ideas from previous lectures.

Univariate case: let Y be a continuous random variable with density f(y). What is the expected value of a function g(Y)? As before, E[g(Y)] = integral of g(y) f(y) dy over all y.

Definition: if Y1 and Y2 are jointly continuous random variables, the conditional expectation of g(Y1) given Y2 = y2 is E[g(Y1) | Y2 = y2] = integral of g(y1) f(y1 | y2) dy1, where f(y1 | y2) is the conditional density. In the case of jointly discrete random variables, the integral is replaced by a sum over the conditional probability function p(y1 | y2).

Example: let Y1 and Y2 have a given joint density function. The lecture works through three questions: What is the marginal density of Y2? What is the conditional density of Y1 given Y2 = y2? What is the conditional expectation of Y1 given Y2 = y2?

Definition: let Y1 and Y2 denote random variables. The lecture closes with an application: if p denotes the probability of observing a defective, then the number of defectives Y has a binomial distribution, assuming that a large number of items are produced on the line.
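The discrete version of the marginal/conditional/conditional-expectation chain described above can be sketched numerically. The joint probability function below is a hypothetical example chosen for illustration, not the specific density from the lecture; the three functions mirror the three questions the lecture asks.

```python
# Hypothetical joint pmf p(y1, y2) on a small grid (values sum to 1).
# This is an illustrative stand-in, not the lecture's actual example.
p = {
    (0, 0): 1/9, (0, 1): 2/9, (0, 2): 1/9,
    (1, 0): 2/9, (1, 1): 1/9,
    (2, 0): 2/9,
}

def marginal_y2(y2):
    """Marginal pmf of Y2: p2(y2) = sum over y1 of p(y1, y2)."""
    return sum(pr for (a, b), pr in p.items() if b == y2)

def conditional_pmf(y1, y2):
    """Conditional pmf: p(y1 | y2) = p(y1, y2) / p2(y2)."""
    return p.get((y1, y2), 0.0) / marginal_y2(y2)

def conditional_expectation(y2):
    """E[Y1 | Y2 = y2] = sum over y1 of y1 * p(y1 | y2)."""
    return sum(a * conditional_pmf(a, y2) for (a, b) in p if b == y2)
```

For instance, p2(0) = 1/9 + 2/9 + 2/9 = 5/9, and E[Y1 | Y2 = 0] = 0*(1/5) + 1*(2/5) + 2*(2/5) = 6/5; the continuous case replaces each sum with an integral against f(y1 | y2).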
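The closing binomial application can also be made concrete. A minimal sketch, assuming a hypothetical line producing n = 10 items with defective probability p = 0.05 (numbers chosen for illustration, not taken from the lecture):

```python
import math

def binom_pmf(y, n, p):
    """P(Y = y) for Y ~ Binomial(n, p): C(n, y) * p^y * (1-p)^(n-y)."""
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

# Hypothetical parameters: 10 items inspected, 5% chance each is defective.
n, p = 10, 0.05

# E[Y] computed from the definition of expectation; for a binomial
# this should equal n * p = 0.5.
mean = sum(y * binom_pmf(y, n, p) for y in range(n + 1))
```

Summing y * P(Y = y) over all y recovers E[Y] = n*p, tying the binomial example back to the expectation definitions earlier in the lecture.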