Question

I was recently reading about support vector machines and how they work, and I stumbled on an article with a section titled "Maximize the distance margin."

Can anyone tell me what we have to minimize here? I wasn't able to understand the part I pasted below. What are $w$ and $m$ in the formulas given below?

Maximize the margin

For the sake of simplicity, we will skip the derivation of the formula for calculating the margin, $m$, which is $$m = \frac{2}{\|\vec{w}\|}$$

The only variable in this formula is $\vec{w}$, whose norm is inversely proportional to $m$; hence, to maximize the margin we have to minimize $\|\vec{w}\|$. This leads to the following optimization problem:

$$\min_{(\vec{w}, b)} \frac{\|\vec{w}\|^2}{2}$$

subject to $$y_i(\vec{w}\cdot\vec{x}_i+b) \ge 1, \quad \forall i = 1, \ldots, n$$

The above is the case when our data is linearly separable. There are many cases where the data cannot be perfectly classified through linear separation. In such cases, the Support Vector Machine looks for the hyperplane that maximizes the margin while minimizing the misclassifications.

Link to the article : Hackerearth


Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange