COMPSCI 273A Midterm: 2013F-273-Mid

31 Jan 2019

Document Summary

Name of the person in front of you (if any): Name of the person to your right (if any): Total time is 1:15.

Consider the following set of training data, consisting of two-dimensional real-valued features and a binary class value, for a k-nearest-neighbors classifier. Positive data are shown as circles, negative as squares. (1) Sketch the decision boundary for k = 1. Show your work and justify your answer in a few sentences (2-3). (2) Sketch the decision boundary for k = 5, in the relevant part of the feature space (i.e., near the training data). For the training error rate as a function of k, indicate the error-rate values at the endpoints. When training a linear classifier with gradient descent, we decrease the maximum number of iterations performed by the algorithm.
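The endpoints of the training-error curve can be checked numerically. The sketch below is a minimal k-nearest-neighbors implementation on hypothetical toy data (the coordinates are stand-ins, not the exam's figure): at k = 1 each training point is its own nearest neighbor, so training error is 0; at k = N every prediction is the overall majority class, so training error equals the minority-class fraction.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k):
    """Predict binary labels by majority vote among the k nearest training points."""
    # pairwise Euclidean distances, shape (n_query, n_train)
    d = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    # indices of the k nearest training points for each query point
    nn = np.argsort(d, axis=1)[:, :k]
    # majority vote over the neighbors' 0/1 labels
    return (y_train[nn].mean(axis=1) > 0.5).astype(int)

# hypothetical stand-in for the exam's circles (+) and squares (-)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
              [4.0, 4.0], [5.0, 4.0]])
y = np.array([1, 1, 1, 1, 0, 0])  # 4 positives, 2 negatives

err_k1 = (knn_predict(X, y, X, k=1) != y).mean()       # each point is its own neighbor
err_kN = (knn_predict(X, y, X, k=len(X)) != y).mean()  # every vote sees all points
print(err_k1, err_kN)  # 0.0 and 2/6: the endpoints of the training-error curve
```

This matches the intuition the question is probing: increasing k smooths the decision boundary and moves the training error from 0 (k = 1) toward the majority-class baseline (k = N).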
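The last point, decreasing the maximum number of gradient-descent iterations, can also be illustrated. The following is a minimal sketch assuming a logistic-loss linear classifier and synthetic 2-D Gaussian data (both assumptions, not from the exam): capping the iteration count stops the optimizer before convergence, leaving the training loss higher than a longer run would.

```python
import numpy as np

def logistic_loss(w, X, y):
    """Average negative log-likelihood for labels y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).mean()

def train_gd(X, y, lr=0.1, max_iters=100):
    """Plain gradient descent on the logistic loss, capped at max_iters steps."""
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)  # gradient of the average logistic loss
    return w

# two well-separated Gaussian clusters (hypothetical data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
X = np.hstack([X, np.ones((len(X), 1))])  # append a constant bias feature
y = np.array([0] * 20 + [1] * 20)

loss_few  = logistic_loss(train_gd(X, y, max_iters=5),   X, y)
loss_many = logistic_loss(train_gd(X, y, max_iters=500), X, y)
print(loss_few, loss_many)  # the capped run stops at a higher training loss
```

Because the logistic loss is convex and the step size is small, each iteration decreases the loss, so a smaller iteration cap acts as early stopping: cheaper training and a less-fit (more regularized) classifier.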

