In this lecture, we look at the popular K-nearest-neighbors algorithm, which is an example of instance-based learning. We see how the nearest-neighbor rule carves up the instance space into Voronoi cells and can express rather complicated decision boundaries. We end with a discussion of the curse of dimensionality.
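The classification rule itself is short: to label a query point, find its k closest training points and take a majority vote. The sketch below is a minimal NumPy illustration (the function and variable names are our own, not from any of the readings):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training
    points under Euclidean distance. Minimal illustrative sketch."""
    dists = np.linalg.norm(X_train - x, axis=1)  # distance to every training point
    nearest = np.argsort(dists)[:k]              # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]             # most common label among neighbors

# Toy example: two small clusters in 2-D
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.9, 1.0], [1.0, 0.8]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1]), k=3))  # → 0 (query sits in class-0 cluster)
```

With k = 1, the regions where each training point wins are exactly the Voronoi cells mentioned above; larger k smooths the decision boundary at the cost of more local detail.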
Links and Resources
- Chapter 13 of The Elements of Statistical Learning discusses nearest neighbors. Section 5 of chapter 2 talks about the curse of dimensionality. (Contents available online)
- Chapter 8 of Tom Mitchell’s textbook
- Chapter 2 of Hal Daumé III, A Course in Machine Learning (available online)
Miscellaneous
- Flatland: A Romance of Many Dimensions, an 1884 novella that serves as a fun introduction to multiple dimensions. The link points to the Wikipedia page for the book. The full text of the book is also available online; see the links at the bottom of the Wikipedia page.