Scikit-Learn
Here, we will learn about an optimization algorithm in Sklearn termed Stochastic Gradient Descent (SGD).
Stochastic Gradient Descent (SGD) is a simple yet efficient optimization algorithm used to find the values of parameters/coefficients of functions that minimize a cost function. In other words, it is used for discriminative learning of linear classifiers under convex loss functions, such as SVM and logistic regression. It has been successfully applied to large-scale datasets because the update to the coefficients is performed for each training instance, rather than once after processing all instances.
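The per-instance update described above can be sketched as follows. This is an illustrative toy implementation of SGD with hinge loss (a linear SVM), not Scikit-learn's actual routine; the learning rate and stopping scheme are arbitrary assumptions:

```python
import numpy as np

def sgd_hinge_epoch(X, y, w, b, lr=0.01):
    """One SGD epoch for a linear SVM (hinge loss).

    The coefficients are updated after EACH training instance,
    which is what makes SGD scale to large datasets.
    """
    for xi, yi in zip(X, y):          # yi must be in {-1, +1}
        margin = yi * (np.dot(w, xi) + b)
        if margin < 1:                # inside the margin: gradient is non-zero
            w = w + lr * yi * xi
            b = b + lr * yi
    return w, b

# Tiny linearly separable example
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = np.zeros(2), 0.0
for _ in range(20):                   # a fixed number of epochs, for illustration
    w, b = sgd_hinge_epoch(X, y, w, b)
```

After training, `np.sign(X @ w + b)` recovers the labels of this toy dataset.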
SGD Classifier
The Stochastic Gradient Descent (SGD) classifier implements a plain SGD learning routine supporting various loss functions and penalties for classification. Scikit-learn provides the SGDClassifier module to implement SGD classification.
Parameters
The following table lists the parameters used by the SGDClassifier module:
Parameter                           Description

loss : str, default='hinge'         It represents the loss function to be used. The default
                                    value 'hinge' gives a linear SVM. Other options include:

                                    log: this loss gives logistic regression, i.e. a
                                    probabilistic classifier.

                                    modified_huber: a smooth loss that brings tolerance to
                                    outliers along with probability estimates.