• Sheikh Aman

# Introduction to the K-Nearest Neighbor Algorithm | For Beginners

Updated: Aug 10, 2020

### Topics you will learn about the KNN algorithm:

• What is the K-Nearest Neighbour (KNN) algorithm?

• Why do we need the KNN algorithm?

• When should we use the KNN algorithm?

• KNN algorithm steps. | Working of KNN. | Pseudocode for KNN algorithm.

• How to select K for the KNN algorithm.

• Advantage and Disadvantage of KNN algorithm.

• How to improve performance of KNN?

• Conclusion.

### What is the K-Nearest Neighbor algorithm?

• K-Nearest Neighbour is one of the simplest Machine Learning algorithms, and it is based on the Supervised Learning technique.

• The K-NN algorithm assumes similarity between the new case/data and the available cases, and puts the new case into the category most similar to the available categories.

• The K-NN algorithm stores all the available data and classifies a new data point based on similarity. This means that when new data appears, it can easily be classified into a well-suited category using the K-NN algorithm.

• The K-NN algorithm can be used for Regression as well as Classification, but it is mostly used for Classification problems.

• K-NN is a non-parametric algorithm, which means it makes no assumptions about the underlying data.

• It is also called a lazy learner algorithm because it does not learn from the training set immediately; instead, it stores the dataset and performs an action on it at the time of classification.

• At the training phase, the KNN algorithm just stores the dataset; when it gets new data, it classifies that data into the category most similar to the new data.

### Why do we need the K-NN algorithm?

Suppose there are two categories, Category B and Category C, and we have a new data point a1. To which of these categories does the new point belong? To resolve this sort of problem, we need the K-NN algorithm. With the help of K-NN, we can easily identify the category or class of a particular data point.

### When should we use the K-NN algorithm?

KNN is used for both classification and regression predictive problems. However, it is more widely used for classification problems in industry. To judge any technique, we generally look at 3 important aspects:

1. Ease to interpret the output.

2. Calculation time.

3. Predictive Power.

### KNN algorithm steps. | Working of KNN. | Pseudocode for KNN algorithm.

1. Load the dataset

2. Initialize K to your chosen number of neighbours.

3. For every example in the data:

• Calculate the distance between the query example and the current example from the data.

• Add the distance and the index of the example to an ordered collection.

4. Sort the ordered collection of distances and indices from smallest to largest (i.e. ascending order) by the distances.

5. Pick the first K entries from the sorted collection.

6. Get the labels of the selected K entries.

7. For regression, return the mean of the K labels

8. For classification, return the mode of the K labels
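The steps above can be sketched in plain Python. This is a minimal illustration rather than a production implementation; the function name `knn_predict` and the toy dataset are made up for this example.

```python
import math
from collections import Counter

def knn_predict(training_data, query, k, mode="classification"):
    # Step 3: compute the distance from the query to every example,
    # keeping (distance, label) pairs.
    distances = []
    for features, label in training_data:
        dist = math.sqrt(sum((f - q) ** 2 for f, q in zip(features, query)))
        distances.append((dist, label))

    # Steps 4-6: sort ascending by distance and take the first K labels.
    distances.sort(key=lambda pair: pair[0])
    k_labels = [label for _, label in distances[:k]]

    # Step 7: regression returns the mean of the K labels.
    if mode == "regression":
        return sum(k_labels) / k
    # Step 8: classification returns the mode (most common label).
    return Counter(k_labels).most_common(1)[0][0]

# Toy dataset of (features, label) pairs for two categories B and C.
data = [((1.0, 1.0), "B"), ((1.5, 2.0), "B"),
        ((5.0, 5.0), "C"), ((6.0, 5.5), "C")]
print(knn_predict(data, (1.2, 1.5), k=3))  # → B
```

The query point (1.2, 1.5) sits next to the two B examples, so two of its three nearest neighbours are labelled B and the majority vote returns B.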

### How to select K for KNN algorithm.

Now you understand the working mechanism of the KNN algorithm. At this point, the question arises: how do we choose the optimal number of neighbours, and what are its effects on the classifier or regressor? The number of neighbours (K) in KNN is a hyperparameter that you need to choose at the time of model building. You can think of K as a controlling variable for the prediction model.

Research has shown that no single number of neighbours suits all kinds of data sets; each dataset has its own requirements. With a small number of neighbours, noise will have a higher influence on the result, while a large number of neighbours makes the computation expensive. Research has also shown that a small number of neighbours gives the most flexible fit, with low bias but high variance, while a larger number of neighbours produces a smoother decision boundary, which means lower variance but higher bias.

Generally, data scientists choose an odd K if the number of classes is even. You can also check by building the model for different values of K and comparing their performance.

Below are some points to remember while selecting the value of K in the K-NN algorithm:

• There is no particular way to determine the best value for K, so we need to try several values to find the best one. The most preferred value for K is 5.

• A very low value for K, such as K=1 or K=2, can be noisy and expose the model to the effects of outliers.

• Large values for K are good at smoothing out noise, but a value that is too large may pull in points from other classes and blur the decision boundary.
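These effects can be seen concretely with a small sketch (the helper `predict` and the toy dataset below are made up for this illustration) that scores a few odd values of K using leave-one-out accuracy. On this data, a very small K chases the single noisy point, a too-large K drowns out the small "B" cluster, and the middle value does best.

```python
import math
from collections import Counter

def predict(train, query, k):
    # Majority vote among the k nearest neighbours (Euclidean distance).
    nearest = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Toy 2-class dataset with one noisy "C" sitting inside the "B" cluster.
data = [((1, 1), "B"), ((1, 2), "B"), ((2, 1), "B"),
        ((5, 5), "C"), ((5, 6), "C"), ((6, 5), "C"),
        ((1.5, 1.5), "C")]  # the noisy point

# Leave-one-out accuracy for a few odd candidate values of K.
for k in (1, 3, 5):
    hits = sum(predict(data[:i] + data[i + 1:], x, k) == y
               for i, (x, y) in enumerate(data))
    print(f"k={k}: {hits}/{len(data)} correct")
# On this toy data, k=3 scores higher than both k=1 and k=5.
```

In practice the same idea is applied with proper cross-validation over a held-out set rather than this tiny leave-one-out loop.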

### Advantage and Disadvantage of KNN algorithm.

Advantages

The training phase of K-nearest neighbor classification is much faster compared to other classification algorithms, since there is no need to train a model for generalization; that is why KNN is known as a simple, instance-based learning algorithm. KNN can be useful in the case of nonlinear data. It can also be used for regression problems: the output value for an item is computed as the average of the values of its k closest neighbours.

Disadvantages

The testing phase of K-nearest neighbor classification is slower and costlier in terms of time and memory: it requires a large amount of memory to store the whole training dataset for prediction. KNN requires scaling of the data, because it uses the Euclidean distance between two data points to find the nearest neighbours, and Euclidean distance is sensitive to magnitudes: features with high magnitudes will weigh much more than features with low magnitudes. KNN is also not suitable for high-dimensional data.

### How to improve performance of KNN?

For better results, normalizing the data to a similar scale is highly recommended; generally, normalization maps values into the range between 0 and 1. KNN is not suitable for large or high-dimensional data; in such cases, the dimensionality must be reduced to improve performance. Handling missing values will also help improve results.
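One common way to do this normalization is min-max scaling, sketched below. The helper name `min_max_normalize` and the age/salary numbers are illustrative, not from any real dataset.

```python
def min_max_normalize(rows):
    # Rescale each feature column to the [0, 1] range so that features
    # with large magnitudes (e.g. salary) do not dominate the Euclidean
    # distance over features with small magnitudes (e.g. age).
    cols = list(zip(*rows))
    mins = [min(c) for c in cols]
    spans = [max(c) - mn or 1 for c, mn in zip(cols, mins)]  # avoid /0
    return [[(v - mn) / sp for v, mn, sp in zip(row, mins, spans)]
            for row in rows]

# Hypothetical (age, salary) rows: salary dwarfs age before scaling.
data = [(25, 40000), (35, 60000), (45, 80000)]
print(min_max_normalize(data))
# → [[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]]
```

After scaling, a 10-year age gap and a 20,000 salary gap contribute equally to the distance, instead of the salary term swamping everything.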

### Conclusion

Great, here comes the end of this discussion. I will not call this a tutorial, because a healthy discussion has more effect than a tutorial. If you agree with my point, then let me know in the comment section.

Here we have learned the following things:

• What is the K-Nearest Neighbour (KNN) algorithm?

• Why do we need the KNN algorithm?

• When should we use the KNN algorithm?

• KNN algorithm steps. | Working of KNN. | Pseudocode for KNN algorithm.

• How to select K for KNN algorithm.

• Advantage and Disadvantage of KNN algorithm.

• How to improve performance of KNN?

I will always look forward to any feedback or questions from your side. Ask a question in the comment section, and I will try my best to answer. Till then, stay tuned by filling in the Subscribe form at the end of this page (header section).

Thank you!
