This tutorial demonstrates how to implement Linear Regression using the scikit-learn library.
### Importing Libraries
import numpy as np
from sklearn import datasets, linear_model
from sklearn.metrics import r2_score, mean_squared_error
from sklearn.model_selection import train_test_split as tts
### Importing the Dataset
X, y = datasets.load_diabetes(return_X_y=True)
X = X[:, np.newaxis, 1]  # select a single feature (column 1) to keep the regression one-dimensional
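If you are unsure which column you are selecting, you can optionally load the full dataset object (instead of just the arrays) and inspect its feature names and shape:
diabetes = datasets.load_diabetes()
print(diabetes.feature_names)  # ['age', 'sex', 'bmi', 'bp', 's1', 's2', 's3', 's4', 's5', 's6']
print(diabetes.data.shape)     # (442, 10)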
### Splitting Train/Test Data
X_tr, X_ts, y_tr, y_ts = tts(X, y, test_size=0.3, random_state=None)  # set random_state to an integer for a reproducible split
### Creating the Linear Regression Model (OLS)
linreg = linear_model.LinearRegression()
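For reference, the ordinary least squares (OLS) fit behind LinearRegression chooses the coefficient vector $w$ and intercept $b$ that minimize the residual sum of squares over the training samples:

$$\min_{w,\,b}\;\sum_{i=1}^{n}\bigl(y_i - (x_i^\top w + b)\bigr)^2$$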
### Training the Model
linreg.fit(X_tr, y_tr)
### Making Predictions
y_pr = linreg.predict(X_ts)
# print(y_pr)
### Evaluating Prediction Accuracy
print('Coefficients: \n', linreg.coef_)
print('Mean squared error: %.2f' % mean_squared_error(y_ts, y_pr))
print('Coefficient of determination: %.2f' % r2_score(y_ts, y_pr))
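If you'd like to visualize the fitted line against the test data, a minimal sketch using matplotlib (an extra dependency, not imported above) could look like this:
import matplotlib.pyplot as plt  # assumed to be installed; not part of the imports above
plt.scatter(X_ts, y_ts, color='black', label='test data')
plt.plot(X_ts, y_pr, color='blue', linewidth=2, label='fitted line')
plt.xlabel('feature value')
plt.ylabel('disease progression')
plt.legend()
plt.show()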
### Making a Prediction with Foreign Data
linreg.predict([[4.5555]])  # predict expects a 2-D array of shape (n_samples, n_features)
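Keep in mind that the diabetes features come already mean-centered and scaled, so a value like 4.5555 lies far outside the range the model was trained on. A sketch for scoring a small batch of more realistically scaled inputs (the numbers below are arbitrary and purely illustrative) could be:
new_X = np.array([[0.02], [0.05], [-0.03]])  # arbitrary values on the scaled feature range, for illustration only
print(linreg.predict(new_X))                 # one predicted target value per row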
This tutorial aims to provide a simple, clear, and reusable Linear Regression implementation, so that seasoned visitors can take a quick look, understand what is going on, and copy or reproduce it for their own use cases without losing much time.
It also gives visitors with less experience in Machine Learning and Python a chance to experiment with the algorithm. If you'd like to see a step-by-step implementation where everything is explained in detail, you can check out this tutorial.
You can also see its history, read about the optimization parameters, and find more examples on the main Linear Regression page here.