Scikit-Learn
array([2])
For the above example, we can get the weight vector with the help of the following
Python script:
SGDClf.coef_
Output
array([[19.54811198, 9.77200712]])
Similarly, we can get the value of intercept with the help of following python script:
SGDClf.intercept_
Output
array([10.])
We can get the signed distance to the hyperplane by using
SGDClassifier.decision_function, as used in the following Python script:
SGDClf.decision_function([[2., 2.]])
Output
array([68.6402382])
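The snippets above assume an SGDClf classifier that was fitted earlier in the chapter. A minimal, self-contained sketch is given below; the training data here is an assumption for illustration, so the exact numbers printed will differ from the outputs shown above:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Hypothetical two-class training data; the chapter's original data may differ.
X = np.array([[-1., -1.], [-2., -1.], [1., 1.], [2., 1.]])
y = np.array([1, 1, 2, 2])

SGDClf = SGDClassifier(max_iter=1000, tol=1e-3, random_state=0)
SGDClf.fit(X, y)

print(SGDClf.predict([[2., 2.]]))            # predicted class label for a new sample
print(SGDClf.coef_)                          # weight vector, shape (1, 2)
print(SGDClf.intercept_)                     # intercept, shape (1,)
print(SGDClf.decision_function([[2., 2.]]))  # signed distance to the hyperplane
```

Running the three attribute accesses on a freshly fitted classifier in this way reproduces the pattern of the outputs shown above (a weight vector, an intercept, and a signed distance), even though the values depend on the training data.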
SGD Regressor
The Stochastic Gradient Descent (SGD) regressor implements a plain SGD learning
routine supporting various loss functions and penalties to fit linear regression models.
Scikit-learn provides the SGDRegressor module to implement SGD regression.
Parameters
The parameters used by SGDRegressor are almost the same as those used in the
SGDClassifier module. The difference lies in the 'loss' parameter. For the
SGDRegressor module's loss parameter, the possible values are as follows:
squared_loss: It refers to the ordinary least squares fit.

huber: SGDRegressor corrects for outliers by switching from squared to linear
loss past a distance of epsilon. The role of 'huber' is to modify 'squared_loss' so
that the algorithm focuses less on correcting outliers.

epsilon_insensitive: It ignores errors smaller than epsilon.

squared_epsilon_insensitive: It is the same as epsilon_insensitive, except that
it becomes squared loss past a tolerance of epsilon.
Another difference is that the parameter named 'power_t' has a default value of 0.25
rather than 0.5 as in SGDClassifier.