Python scikits learn - Separating Hyperplane equation
13-06-2021
Problem
A separating hyperplane's equation is W.X + b = 0.

For a support vector machine in scikit-learn, how is the separating hyperplane derived? What do 'a' and 'w' signify?
Solution
In scikit-learn, the coef_ attribute holds the vectors of the separating hyperplanes for linear models. It has shape (n_classes, n_features) when n_classes > 1 (multi-class one-vs-all) and (1, n_features) for binary classification.

In a toy binary classification example with n_features == 2, w = coef_[0] is the vector orthogonal to the hyperplane; together with the intercept it fully defines the hyperplane.
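A minimal sketch of extracting w and the offset from a fitted linear SVM (assumes scikit-learn is installed; the toy data below is made up for illustration):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical linearly separable 2D data (n_features == 2).
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0],
              [3.0, 3.0], [4.0, 2.0], [5.0, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)

# Binary case: coef_ has shape (1, n_features).
print(clf.coef_.shape)

# w is orthogonal to the separating hyperplane w.x + b = 0;
# intercept_[0] is the offset b.
w = clf.coef_[0]
b = clf.intercept_[0]
```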
To plot this hyperplane in the 2D case (any hyperplane of a 2D plane is a 1D line), we want to find a function f as in y = f(x) = a.x + b. Here a is the slope of the line and can be computed as a = -w[0] / w[1]. The line's y-intercept (this b is not the same as the offset in W.X + b = 0) follows in the same way from intercept_: b = -intercept_[0] / w[1].
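The slope and intercept formulas above follow from rearranging w[0]*x + w[1]*y + intercept = 0 into y = a*x + b. A self-contained sketch (scikit-learn assumed; the toy data is made up):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical linearly separable 2D data.
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0],
              [3.0, 3.0], [4.0, 2.0], [5.0, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)
w = clf.coef_[0]

# Rearranging w[0]*x + w[1]*y + intercept = 0:
#   y = -(w[0]/w[1])*x - intercept/w[1]
a = -w[0] / w[1]               # slope of the separating line
b = -clf.intercept_[0] / w[1]  # y-intercept of the line

# Points on the separating line, e.g. for plotting with matplotlib:
xx = np.linspace(X[:, 0].min(), X[:, 0].max(), 50)
yy = a * xx + b
```

Every (xx[i], yy[i]) pair satisfies w.x + intercept_ = 0, so plotting xx against yy draws the decision boundary.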