HSS 4303 Lecture Notes - Lecture 5: Relative Survival, Joseph L. Fleiss, Systematic Review


Document Summary

Week 5: agreement and the natural history of disease. Agreement is most commonly used to determine whether a paper makes it into a systematic review. Review: reliability is the extent to which a screening test will produce the same or similar results each time it is administered. Inter-rater reliability (aka concordance) is the extent to which different raters come to the same conclusion, i.e. the variation in measurements when taken by different persons using the same method or instrument (for example, different people using the same sphygmomanometer; it is the similarity of people's results). Fleiss's kappa is a modification of Cohen's kappa that applies to multiple raters. Cohen's kappa (the only calculation we need to know): kappa is generally thought to be more robust since it takes the chance of agreement into account. Pr(e) is the probability that agreement is due to chance. If the ratio is 1, the raters are in complete agreement. Discordant data are the pairs the raters disagreed on (3, 4); concordant data are the pairs they agreed on (41, 27).
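As a minimal sketch of the Cohen's kappa calculation referenced above, the code below computes kappa = (Pr(a) - Pr(e)) / (1 - Pr(e)) for two raters making yes/no screening decisions, using the counts mentioned in the summary (agreed: 41 and 27; disagreed: 3 and 4). The function name and the assignment of the two discordant counts to particular off-diagonal cells are illustrative assumptions, not taken from the lecture.

```python
def cohens_kappa(both_yes, a_yes_b_no, a_no_b_yes, both_no):
    """Cohen's kappa for two raters: (Pr(a) - Pr(e)) / (1 - Pr(e))."""
    n = both_yes + a_yes_b_no + a_no_b_yes + both_no
    pr_a = (both_yes + both_no) / n                   # observed agreement
    a_yes = (both_yes + a_yes_b_no) / n               # rater A's marginal "yes" rate
    b_yes = (both_yes + a_no_b_yes) / n               # rater B's marginal "yes" rate
    pr_e = a_yes * b_yes + (1 - a_yes) * (1 - b_yes)  # agreement expected by chance
    return (pr_a - pr_e) / (1 - pr_e)

# Example counts from the summary; cell placement of the discordant 3 and 4 is assumed.
print(cohens_kappa(both_yes=41, a_yes_b_no=3, a_no_b_yes=4, both_no=27))  # ~0.81
```

With these counts, observed agreement Pr(a) is 68/75 (about 0.91) and chance agreement Pr(e) is about 0.52, giving a kappa of roughly 0.81, i.e. substantially better than chance.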

