CSE 150 Lecture Notes - Lecture 13: Arg Max, Product Rule, Subsequence

HW 5.3
Z, with z ∈ {1, 2, ..., k}: the "type" of movie-goer
        Z
      / | \
  R1  R2  …  R61
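The diagram above is a latent-class (naive Bayes) structure: given the movie-goer type Z, the ratings Ri are conditionally independent, so the joint factors as P(Z) Π_i P(Ri | Z). A minimal sketch with hypothetical toy numbers (k = 2 types, 3 binary ratings; all probabilities invented for illustration):

```python
import numpy as np

# Hypothetical toy instance of the HW 5.3 model (all numbers invented):
# k = 2 movie-goer types, 3 movies with binary ratings.
pz = np.array([0.5, 0.5])            # P(Z = z)
pr = np.array([[0.9, 0.8, 0.2],      # P(R_i = 1 | Z = z), one row per type z
               [0.1, 0.3, 0.7]])

def joint(z, ratings):
    """P(Z = z, R1..Rn) = P(Z = z) * prod_i P(R_i | Z = z)."""
    r = np.array(ratings)
    per_movie = np.where(r == 1, pr[z], 1 - pr[z])  # pick P(Ri=1|z) or P(Ri=0|z)
    return pz[z] * per_movie.prod()
```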
Review of Hidden Markov Models (HMMs)
Assumptions
1) Markov property: P(St | S1,...,St-1) = P(St | St-1)
2) Output independence: P(Ot | S1,...,ST) = P(Ot | St)
3) Time homogeneity:
   P(St+1 = s' | St = s) = P(St+k+1 = s' | St+k = s)
   P(Ot = o | St = s) = P(Ot+k = o | St+k = s)
BN: It is a polytree!
S1 → S2 → … → ST
↓     ↓         ↓
O1   O2    …   OT
Joint Distribution
P(S1,...,ST, O1,...,OT) = P(S1) Π_{t=2}^T P(St | St-1) Π_{t=1}^T P(Ot | St)
(Shorthand: S' = (S1,...,ST), O' = (O1,...,OT).)
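The factorization above can be evaluated term by term for a given state/observation pair of sequences. A minimal sketch with a hypothetical toy HMM (n = 2 states, m = 3 symbols; all numbers invented for illustration):

```python
import numpy as np

# Hypothetical toy HMM (numbers invented): n = 2 states, m = 3 symbols.
pi = np.array([0.6, 0.4])            # pi_i = P(S1 = i)
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])           # a_ij = P(S_{t+1} = j | S_t = i)
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])      # b_ik = P(O_t = k | S_t = i)

def joint_prob(states, obs):
    """P(S1..ST, O1..OT) = P(S1) * prod_t P(St | St-1) * prod_t P(Ot | St)."""
    p = pi[states[0]]                     # P(S1)
    for t in range(1, len(states)):
        p *= A[states[t - 1], states[t]]  # transition factors, t = 2..T
    for s, o in zip(states, obs):
        p *= B[s, o]                      # emission factors, t = 1..T
    return p
```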
Parameters:
aij = P(St+1 = j | St = i)   transition matrix (n × n)
bik = P(Ot = k | St = i)     emission matrix (n × m)
πi  = P(S1 = i)              initial state distribution
Key questions
1) How to compute P(O1,...,OT)?
2) How to compute the most likely hidden state sequence?
   argmax over S' of P(S1,...,ST | O1,...,OT), with n^T possible hidden state sequences
3) How to update beliefs for real-time monitoring? How to compute P(St = i | O1,...,Ot)?
4) How to learn an HMM from data? How to estimate {πi, aij, bik} from sequences of observations to maximize P(O1,...,OT)?
Note: 1-3 are problems of "inference"; 4 is estimating CPTs.
1) How to compute likelihood P(O1,...,OT)?
P(O1,...,OT) = Σ_{S1,...,ST} P(S1,...,ST, O1,...,OT)   [marginalization]
             = Σ_{S1,...,ST} P(S1) Π_{t=2}^T P(St | St-1) Π_{t=1}^T P(Ot | St)
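This marginalization, taken literally, sums the joint over all n^T hidden state sequences. A minimal brute-force sketch, using hypothetical toy parameters (invented for illustration), which is only feasible for very short sequences:

```python
import itertools
import numpy as np

# Hypothetical toy HMM (numbers invented): n = 2 states, m = 3 symbols.
pi = np.array([0.6, 0.4])            # P(S1 = i)
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])           # a_ij = P(S_{t+1} = j | S_t = i)
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])      # b_ik = P(O_t = k | S_t = i)

def likelihood_brute_force(obs):
    """P(O1..OT) by summing the joint over all n**T hidden state sequences."""
    n = len(pi)
    total = 0.0
    for states in itertools.product(range(n), repeat=len(obs)):
        p = pi[states[0]]                     # P(S1)
        for t in range(1, len(states)):
            p *= A[states[t - 1], states[t]]  # transition factors
        for s, o in zip(states, obs):
            p *= B[s, o]                      # emission factors
        total += p
    return total
```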
Efficient Recursion
P(O1,...,Ot, Ot+1, St+1 = j)
= Σ_{i=1}^n P(O1,...,Ot, Ot+1, St = i, St+1 = j)   [marginalization]
= Σ_{i=1}^n P(O1,...,Ot, St = i) P(St+1 = j | O1,...,Ot, St = i) P(Ot+1 | O1,...,Ot, St = i, St+1 = j)   [product rule]
= Σ_{i=1}^n P(O1,...,Ot, St = i) P(St+1 = j | St = i) P(Ot+1 | St+1 = j)   [CI]
= Σ_{i=1}^n P(O1,...,Ot, St = i) · aij · b_{j,Ot+1}
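The recursion above is the forward algorithm: writing α_t(j) = P(O1,...,Ot, St = j), it becomes α_{t+1}(j) = b_{j,Ot+1} Σ_i α_t(i) aij, which costs O(n²T) instead of the n^T-term sum. A minimal sketch with hypothetical toy parameters (invented for illustration):

```python
import numpy as np

# Hypothetical toy HMM (numbers invented): n = 2 states, m = 3 symbols.
pi = np.array([0.6, 0.4])            # P(S1 = i)
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])           # a_ij = P(S_{t+1} = j | S_t = i)
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])      # b_ik = P(O_t = k | S_t = i)

def forward(obs):
    """alpha[t, j] = P(O1..Ot, St = j), built by the recursion above."""
    T, n = len(obs), len(pi)
    alpha = np.zeros((T, n))
    alpha[0] = pi * B[:, obs[0]]     # base case: P(O1, S1 = j) = pi_j * b_{j,O1}
    for t in range(1, T):
        # alpha[t, j] = b_{j,Ot} * sum_i alpha[t-1, i] * a_ij
        alpha[t] = B[:, obs[t]] * (alpha[t - 1] @ A)
    return alpha
```

The likelihood from question 1 then falls out as the final marginalization, P(O1,...,OT) = Σ_j α_T(j), i.e. `forward(obs)[-1].sum()`.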