PHYSICS 102 Lecture Notes - Lecture 13: Linear Discriminant Analysis, Euclidean Distance, Distance Matrix


Document Summary

Discriminant analysis requires you to know group membership in advance in order to derive the classification rule.

Hierarchical clustering: choose a statistic to quantify similarity, select a method for forming the groups, and determine how many clusters you need to represent your data. Best suited to small data sets, where solutions are easy to examine.

K-means clustering: select the number of clusters you want; the algorithm estimates the cluster means and assigns each case to a cluster. Appropriate when you know how many clusters you want and the data set is of moderate size.

Agglomerative: begins with every case as a cluster unto itself; the algorithm ends with every case in one (useless) cluster. Once a cluster has been formed it cannot be split, only combined with other clusters.

Divisive: starts with every case in one cluster and ends with every case in its own individual cluster.

Must select: a criterion for determining similarity or distance between cases, a criterion for determining which clusters are merged at successive steps, and the number of clusters you need to represent your data.
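The ideas above can be sketched in code. The following is a minimal illustration, not any particular software package's implementation: it builds a Euclidean distance matrix over the cases, then runs agglomerative clustering with single linkage (one possible choice for the "which clusters are merged" criterion; the lecture does not fix a specific linkage). Every case starts as its own cluster, the two closest clusters are merged at each step, and merges are never undone, until the chosen number of clusters remains. The example points are invented for demonstration.

```python
import math

def euclidean(a, b):
    # Euclidean distance between two cases (feature vectors).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def distance_matrix(cases):
    # Symmetric matrix of pairwise Euclidean distances between all cases.
    n = len(cases)
    return [[euclidean(cases[i], cases[j]) for j in range(n)] for i in range(n)]

def agglomerative(cases, k):
    # Agglomerative clustering: begin with every case as a cluster unto
    # itself; repeatedly merge the two closest clusters. Once formed, a
    # cluster is never split, only combined. Stop at k clusters (rather
    # than continuing to the single useless all-in-one cluster).
    clusters = [[i] for i in range(len(cases))]
    dist = distance_matrix(cases)
    while len(clusters) > k:
        # Single-linkage criterion: cluster-to-cluster distance is the
        # smallest case-to-case distance across the two clusters.
        best, best_d = (0, 1), float("inf")
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if d < best_d:
                    best_d, best = d, (a, b)
        a, b = best
        clusters[a] += clusters.pop(b)
    return clusters

points = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0)]
print(agglomerative(points, 3))  # → [[0, 1], [2, 3], [4]]
```

Swapping the `min` in the linkage step for `max` or a mean would give complete or average linkage, which is exactly the "criterion for determining which clusters are merged" choice the summary refers to.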
