ECON1203 Lecture Notes - Lecture 3: Bayes Estimator, Statistical Inference, Sample Space

18 May 2018
3 - Probability
The mathematical means of studying uncertainty
Provides the logical foundation of statistical inference
Helps us make judgment calls to support decisions on basis of partial info
Review: let e ad f e to eets i saple spae, S
If e ad f are mutually exclusive, then the addition rule states
P (e f) = P(e) + P(f)
Can also write P (e f) as P (e or f)
If e ad f are not mutually exclusive, then the general addition rule states
P (e f) = P(e) + P(f) P (e f)
Can also write P (e f) as P (e and f)
Sometimes easier to work with the probability of the event (or set of events)
complementary to event e, known as "not e" or e^c
P(e^c) = 1 − P(e)
Idea of joint probability is captured in the expression P(e ∩ f), written as P(e and f)
The multiplication rule shows how this is computed
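The addition, complement, and joint-probability rules above can be checked numerically. A minimal sketch using a fair six-sided die, where the events e and f are invented here purely for illustration:

```python
from fractions import Fraction

# Hypothetical illustration: one roll of a fair six-sided die.
# e = "roll an even number", f = "roll at most 2" (both invented here).
sample_space = {1, 2, 3, 4, 5, 6}
e = {2, 4, 6}
f = {1, 2}

def prob(event):
    """P(event) when all outcomes in the sample space are equally likely."""
    return Fraction(len(event), len(sample_space))

# General addition rule: P(e or f) = P(e) + P(f) - P(e and f)
p_union = prob(e) + prob(f) - prob(e & f)
assert p_union == prob(e | f)          # agrees with counting the union directly

# Complement rule: P(not e) = 1 - P(e)
assert prob(sample_space - e) == 1 - prob(e)

print(p_union)  # 2/3
```

Using exact fractions avoids floating-point rounding when verifying identities like these.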
Conditional probability
I a S here f has alread ee osered to our or ot our, the
argial proailit that e ours a hage
The conditional probability that e ours, gie that f has ourred is:
(a) P(e | f) = P(e and f) / P(f)
(b) Similarly, P(f | e) = P(e and f) / P(e)
(c) Rearranging, P(e and f) = P(e | f) × P(f)
i. This is the multiplication rule
(d) If P(e | f) = P(e) and P(f | e) = P(f), then:
i. Conditioning has no effect
ii. e and f are said to be independent events
Baes ‘ule: P (e | f) = [ P (f | e) * P(e) ] / P(f)
(a) If e ad f are idepedet, this siplifies to: P (e | f) = P(e)
Independence
Covariance and correlation are measures of linear association or linear
dependence
(a) Dependence (and its opposite, independence) is a more general concept of
association between two variables
Probability distribution function (pdf):
P X = x ≥ 0 for all 
