CIS 140 Lecture Notes - Lecture 18: Bayes Estimator, Forgetting Curve, Conditional Probability


Document Summary

Start with a prior belief, observe data, update your beliefs: this pattern underlies learning, AI modeling, and probability. Examples: fMRI analysis, the forgetting curve, recognition memory, understanding Alzheimer's.

Conditional probability: P(x|y) = P(x and y) / P(y). Marginalization: P(x and y) + P(x and not y) = P(x). Bayes' rule: the prior is the initial estimate of the probability of a hypothesis (initial beliefs); the posterior is the estimate of that probability after seeing evidence (updated beliefs).

Probability as a degree of belief: it is based on a model and outside knowledge, not on actual counts; it expresses a judgment. Why be a subjectivist: we often need to make inferences about singular events, and we want to represent degrees of belief. Cox's axioms: if you want to measure belief with numbers, you end up with something that behaves essentially like a probability. Ex: "the theory of evolution is probably true (p = 0.99)." Under a frequentist reading this would mean we calculated the number from 99 out of 100 models of evolution, which is unrealistic.
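The prior-to-posterior update described above can be sketched numerically. This is a minimal illustration with made-up probabilities (the hypothesis H, evidence E, and all numbers below are hypothetical, not from the lecture); it combines the marginalization and Bayes' rule formulas from the notes.

```python
# Bayesian update sketch: hypothetical hypothesis H and evidence E.
# All numbers are invented for illustration.
p_h = 0.5            # prior P(H): initial belief in the hypothesis
p_e_given_h = 0.9    # likelihood P(E|H)
p_e_given_not_h = 0.2  # likelihood P(E|not H)

# Marginalization: P(E) = P(E and H) + P(E and not H)
#                       = P(E|H)P(H) + P(E|not H)P(not H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes' rule: posterior P(H|E) = P(E|H) * P(H) / P(E)
posterior = p_e_given_h * p_h / p_e

print(round(p_e, 4))        # 0.55
print(round(posterior, 4))  # 0.8182
```

Observing evidence that is more likely under H than under not-H raises the belief in H from 0.5 to about 0.82, which is the "update beliefs" step in one line of arithmetic.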
