Computer Science 4442A/B Lecture Notes - Lecture 22: Machine Translation


Document Summary

Basic probability: P(x) is the probability that x is true; for example, P(baby is a boy) = 0.5 and P(baby is named John) = 0.001.

Joint probabilities: P(x, y) is the probability that x and y are both true; for example, P(brown eyes, boy) = (number of baby boys with brown eyes) / (total number of babies).

The Markov assumption: consider P(computer | instead of listening to this boring lecture, i would like to play on my). Intuitively, the probability that "computer" follows the full phrase is about the same as the probability that "computer" follows just the last few words: the probability of the next word depends most strongly on only a few previous words.

Example: the trigram approximation (n = 3). Each word depends only on the two previous words, so three words are involved in total (tri = three, gram = writing). Under this approximation, the probability of "and nothing but the truth" factors as P(and) P(nothing | and) P(but | and, nothing) P(the | nothing, but) P(truth | but, the).
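The trigram approximation above can be sketched in code. This is a minimal illustration (not from the lecture): it counts trigrams in a toy corpus and returns maximum-likelihood estimates of P(word | two previous words). The corpus string and function names are invented for the example.

```python
from collections import defaultdict

def train_trigram(tokens):
    """Return a function p(word, w1, w2) estimating P(word | w1, w2)."""
    tri = defaultdict(int)  # counts of (w1, w2, word) trigrams
    bi = defaultdict(int)   # counts of (w1, w2) contexts
    for i in range(len(tokens) - 2):
        ctx = (tokens[i], tokens[i + 1])
        tri[ctx + (tokens[i + 2],)] += 1
        bi[ctx] += 1

    def p(word, w1, w2):
        # Maximum-likelihood estimate: count(w1 w2 word) / count(w1 w2).
        ctx = (w1, w2)
        return tri[ctx + (word,)] / bi[ctx] if bi[ctx] else 0.0

    return p

corpus = "the truth and nothing but the truth so help me".split()
p = train_trigram(corpus)
print(p("but", "and", "nothing"))  # 1.0: "but" always follows "and nothing" here
print(p("and", "the", "truth"))   # 0.5: "the truth" is followed by "and" once out of twice
```

A real system would add smoothing (e.g. add-one or backoff) so that unseen trigrams do not get probability zero, and would multiply such conditional probabilities together, as in the chain above, to score a whole sentence.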
