BTECH - ELECTRICAL AND ELECTRONICS ENGINEERING Study Guide - Kernel Principal Component Analysis, Dimensionality Reduction, Tesseract


Document Summary

Topics covered: the curse of dimensionality, main approaches to dimensionality reduction, PCA, PCA with Scikit-Learn, randomized PCA, and kernel PCA.

Many machine learning problems involve thousands or even millions of features for each training instance. Not only does this make training extremely slow, it can also make it much harder to find a good solution, as we will see. This problem is often referred to as the curse of dimensionality. Fortunately, in real-world problems it is often possible to reduce the number of features considerably, turning an intractable problem into a tractable one. We are so used to living in three dimensions that our intuition fails us when we try to imagine a high-dimensional space. Even a basic 4D hypercube is incredibly hard to picture in our mind (see Figure 8-1), let alone a 200-dimensional ellipsoid bent in a 1,000-dimensional space.

Figure 8-1: Point, segment, square, cube, and tesseract (0D to 4D hypercubes)

It turns out that many things behave very differently in high-dimensional space.
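The three Scikit-Learn techniques the summary names (plain PCA, randomized PCA, and kernel PCA) can be sketched as follows. The toy data, component counts, and the `gamma` value are illustrative assumptions, not taken from the guide itself:

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

# Illustrative data: 200 instances with 5 features each
rng = np.random.RandomState(42)
X = rng.randn(200, 5)

# Plain PCA: project onto the top 2 principal components
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)

# Randomized PCA: a stochastic SVD approximation, typically faster
# when the number of features is large
rnd_pca = PCA(n_components=2, svd_solver="randomized", random_state=42)
X_rnd = rnd_pca.fit_transform(X)

# Kernel PCA: applies the kernel trick (here an RBF kernel) to capture
# nonlinear structure before projecting
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.04)
X_kpca = kpca.fit_transform(X)

print(X_pca.shape, X_rnd.shape, X_kpca.shape)  # each reduced to (200, 2)
```

All three reduce the feature count from 5 to 2; they differ in how the projection axes are computed, with kernel PCA being the only one able to unfold nonlinear manifolds.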

