This tutorial demonstrates an implementation of the k-Nearest Neighbors (kNN) algorithm using the scikit-learn library.

### Importing Libraries
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier as knn
from sklearn import metrics
from sklearn import datasets

### Importing the Dataset
iris = datasets.load_iris()
data = pd.DataFrame({
    "sl": iris.data[:, 0],  # sepal length
    "sw": iris.data[:, 1],  # sepal width
    "pl": iris.data[:, 2],  # petal length
    "pw": iris.data[:, 3],  # petal width
    "species": iris.target,
})

### Splitting Train/Test Data
X = data[["sl", "sw", "pl", "pw"]]
y = data["species"]
X_tr, X_ts, y_tr, y_ts = train_test_split(X, y, test_size=0.3)

### Creating the kNN Classifier Model
KNN = knn(n_neighbors=5)
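The choice of `n_neighbors=5` above is a common default, not a rule. One way to check whether it suits the data is cross-validation; the sketch below (a minimal illustration, using the same iris dataset) scores a few candidate values of k and picks the one with the highest mean accuracy.

```python
from sklearn import datasets
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()

# Score each candidate k with 5-fold cross-validation
# and keep the one with the highest mean accuracy.
scores = {}
for k in (1, 3, 5, 7, 9):
    clf = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(clf, iris.data, iris.target, cv=5).mean()

best_k = max(scores, key=scores.get)
```

The candidate list (1, 3, 5, 7, 9) is illustrative; odd values of k are often preferred for binary problems to avoid ties, though iris has three classes.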

### Training the Model
KNN.fit(X_tr, y_tr)

### Making Predictions
y_pr = KNN.predict(X_ts)

### Evaluating Prediction Accuracy
print("Accuracy:", metrics.accuracy_score(y_ts, y_pr))
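Accuracy alone hides per-class behaviour; `metrics.classification_report` breaks the score down into precision, recall, and F1 for each class. A minimal sketch on the same kind of split (the `random_state` value and variable names here are illustrative):

```python
from sklearn import datasets, metrics
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()
X_tr, X_ts, y_tr, y_ts = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=0)

model = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
y_pr = model.predict(X_ts)

# Per-class precision, recall and F1 alongside overall accuracy.
report = metrics.classification_report(y_ts, y_pr, target_names=iris.target_names)
print(report)
```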

### Making a Prediction with Foreign Data
print(KNN.predict([[1, 5, 3.5, 6]]))
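Note that `predict` returns a numeric class label (0, 1, or 2 for iris), not a species name; `iris.target_names` maps the index back to the string. A small sketch, here fitted on the full dataset for brevity and queried with the first iris sample as the "foreign" point:

```python
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()
model = KNeighborsClassifier(n_neighbors=5).fit(iris.data, iris.target)

# predict() returns a class index; index into target_names
# to recover the human-readable species string.
pred = model.predict([[5.1, 3.5, 1.4, 0.2]])
species = iris.target_names[pred[0]]
```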

This tutorial aims to provide a simple, clear, and reusable k-Nearest Neighbors implementation, so that seasoned visitors can glance at it, understand what's going on, and reproduce it for their own use cases without losing much time.

On the other hand, it gives visitors with less Machine Learning and Python experience an opportunity to experiment with the algorithm.

If you’d like to see a step-by-step explanation of this algorithm, you can check out this tutorial.

You can also read about its history and optimization parameters, and find more examples, on the main k-Nearest Neighbor page here.