Scikit-Learn
This chapter focuses on the polynomial features and pipelining tools in Sklearn.
Introduction to Polynomial Features
Linear models trained on non-linear functions of the data generally retain the fast performance of linear methods while being able to fit a much wider range of data. That is why such linear models, trained on non-linear functions of the features, are widely used in machine learning. For example, a simple linear regression can be extended by constructing polynomial features from the data.
Mathematically, the standard linear regression model for 2-D data looks like this:
Y = w₀ + w₁x₁ + w₂x₂
Now, we can combine the features in second-order polynomials, and our model will look as follows:
Y = w₀ + w₁x₁ + w₂x₂ + w₃x₁x₂ + w₄x₁² + w₅x₂²
The above is still a linear model: it remains linear in the coefficients w. Hence the resulting polynomial regression belongs to the same class of linear models and can be solved with the same techniques.
To do so, scikit-learn provides a module named PolynomialFeatures.
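Because polynomial regression is just a linear model on expanded features, the two steps are commonly chained with scikit-learn's pipelining tools. A hedged sketch, using synthetic data with hypothetical coefficients (1, 2, -1, 0.5) chosen only for illustration:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Synthetic, noiseless data from a known second-order polynomial
# (the coefficients are made up for this example)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 2))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]

# Chain feature expansion and linear regression into one estimator
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

# Predict at x1 = 1, x2 = 2; true value is 1 + 2 - 2 + 1 = 2
print(model.predict([[1.0, 2.0]]))
```

Since the true function lies inside the degree-2 model class and the data is noiseless, the pipeline recovers it almost exactly; the same fitted pipeline can then be used like any other scikit-learn estimator.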