|KNN||the k nearest neighbors to a given point|
|k||the number of neighbors|
|n||the number of data points; the data is kept sorted to speed up the algorithm|
|distance||Euclidean distance, the length of the straight line connecting two points||OR a custom function that defines the distance|
|visualize||scatter plot; each point is a circle whose radius encloses a certain number of neighbors|
|KNN||a query-intensive algorithm, not a learning-intensive one|
|performance||Big O notation: O(log n) is binary search, O(1) is constant time, O(n) is linear time|
|intuition||KNN stores all the data, then performs a binary search on it when querying. Linear regression stores only the model y = mx + b. Key concepts: LEARN vs QUERY|
|1-NN (1-nearest neighbor) on a 1-dimensional sorted list, e.g. [1 2 4 7 8 9]||learning||O(1)||O(n)||KNN writes all the data to storage without learning anything, so learning time is constant, O(1), and storage space is O(n), linear in the number of data points|
|query (1 neighbor)||O(log n): a binary search to find the one nearest point||O(1) extra space|
|query (k neighbors)||O(log n) + k: a binary search to find the query's position, then read off the k items next to it in the sorted list||O(1) extra space|
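The table above can be sketched in code. This is a minimal illustration, assuming Python's standard `bisect` module for the binary search; the names `knn_1d` and `euclidean` are my own, not from the notes. "Learning" is just storing the sorted list (O(1) work, O(n) space), while querying is an O(log n) binary search plus an O(k) outward scan:

```python
import bisect
import math

def euclidean(p, q):
    """Euclidean distance: length of the straight line connecting p and q."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def knn_1d(data, query, k):
    """Return the k nearest neighbors of `query` in a sorted 1-D list.

    Binary search finds the insertion point in O(log n); then we walk
    outward from it, taking the closer of the two frontier items each
    step, for O(k) more work.
    """
    i = bisect.bisect_left(data, query)   # O(log n) binary search
    lo, hi = i - 1, i                     # frontier: left and right of the insertion point
    neighbors = []
    while len(neighbors) < k:             # O(k) outward expansion
        if lo < 0:                        # ran off the left edge
            neighbors.append(data[hi]); hi += 1
        elif hi >= len(data):             # ran off the right edge
            neighbors.append(data[lo]); lo -= 1
        elif query - data[lo] <= data[hi] - query:
            neighbors.append(data[lo]); lo -= 1
        else:
            neighbors.append(data[hi]); hi += 1
    return neighbors

# Using the example list from the notes:
print(knn_1d([1, 2, 4, 7, 8, 9], 5, 1))  # 1-NN of 5 is 4
print(knn_1d([1, 2, 4, 7, 8, 9], 5, 3))  # 3 nearest neighbors of 5
print(euclidean((0, 0), (3, 4)))         # 5.0
```

Note the LEARN vs QUERY contrast: here all the cost sits in the query, whereas linear regression pays up front to fit y = mx + b and then answers each query in O(1).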
Wednesday, February 8, 2017
Machine Learning: K Nearest Neighbors (KNN) Algorithm