Several regression techniques are used in machine learning. The most common are listed below:
- Linear Regression
- Polynomial Regression
- Ridge Regression
- Lasso Regression
- Non-Parametric Regression
- K-Nearest Neighbor Regression
- Kernel Regression
Regression is also classified by the number of explanatory variables: simple regression uses a single explanatory variable, while multiple regression uses several.
In the next section, linear regression is discussed in detail.
Linear regression is a very popular modeling method. It involves a dependent variable and one or more independent variables. The dependent variable is continuous; the independent variables can be continuous or discrete. Linear regression identifies the relationship between the independent variables (Z) and the dependent variable by fitting a best-fit straight line, which is why the method is called linear regression.
It is represented by the equation W = mZ + c + err, where c is the intercept, m is the slope of the line, and err is the error term. This function is used to predict the value of W for a given Z. Simple linear regression has a single independent variable.
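As a sketch, the slope m and intercept c of W = mZ + c can be estimated from sample data with NumPy using the closed-form least-squares formulas (the data below is made up for illustration):

```python
import numpy as np

# Hypothetical sample data: Z is the independent variable, W the dependent one.
Z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
W = np.array([2.1, 4.0, 6.2, 8.1, 9.9])  # roughly W = 2Z + noise

# Closed-form least-squares estimates of slope m and intercept c.
m = np.sum((Z - Z.mean()) * (W - W.mean())) / np.sum((Z - Z.mean()) ** 2)
c = W.mean() - m * Z.mean()

# Predict W for a new value of Z.
prediction = m * 6.0 + c
```

With this toy data the fit comes out close to the generating line, and `prediction` gives the estimated W at Z = 6.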
Multiple linear regression has more than one independent variable. When several independent variables are present, multiple linear regression finds the best fit relating the dependent variable to all of the independent variables together.
The least squares method is used to find the fit in multiple linear regression. It minimizes the sum of the squared differences between each observed point and the fitted line. The deviations are squared before being added so that positive and negative deviations do not cancel out.
Code Snippet: Linear_Regression.py
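The listing itself does not appear in this section. As a placeholder, here is a minimal sketch of what Linear_Regression.py might look like, fitting W = mZ + c by gradient descent on the mean squared error with NumPy; the training data, learning rate, and iteration count are illustrative assumptions, not the original author's values:

```python
import numpy as np

# Illustrative training data: W is approximately 2*Z + 1 plus a little noise.
Z = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
W = np.array([1.1, 2.9, 5.2, 7.0, 8.8])

m, c = 0.0, 0.0          # initial parameter guesses
learning_rate = 0.01

for _ in range(5000):
    pred = m * Z + c
    error = pred - W
    # Gradients of the mean squared error with respect to m and c.
    grad_m = 2.0 * np.mean(error * Z)
    grad_c = 2.0 * np.mean(error)
    m -= learning_rate * grad_m
    c -= learning_rate * grad_c

print(f"fitted line: W = {m:.2f}*Z + {c:.2f}")
```

After training, m and c converge to the least-squares solution for this data. The TensorFlow install mentioned below suggests the original snippet used TensorFlow; the same loop could be written with `tf.Variable` and `tf.GradientTape`, but that version is not reproduced here.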
Instructions for Running the Code
pip install numpy
pip install tensorflow
Output of the Code Execution