Question

I am trying to train an SVM in scikit-learn. I followed the example from http://scikit-learn.org/stable/modules/svm.html and tried to adjust it to my 3D feature vectors; the example itself ran through. While debugging I came back to the tutorial setup and found this:

from sklearn import svm

X = [[0, 0], [1, 1], [2, 2]]
y = [0, 1, 1]
clf = svm.SVC()
clf.fit(X, y)

works while

X = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
y = [0, 1, 1]
clf = svm.SVC()
clf.fit(X, y)

fails with: ValueError: X.shape[1] = 2 should be equal to 3, the number of features at training time

What is wrong here? It's only one additional dimension... Thanks, El

Solution

Your latter snippet runs fine for me:

>>> from sklearn import svm
>>> X = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
>>> y = [0, 1, 1]
>>> clf = svm.SVC()
>>> clf.fit(X, y)
SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0, degree=3, gamma=0.0,
  kernel='rbf', max_iter=-1, probability=False, shrinking=True, tol=0.001,
  verbose=False)

That error message seems like it should actually happen when you're calling .predict() on an SVM object with kernel="precomputed". Is that the case?
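For reference, here is a minimal sketch of how that kind of shape mismatch typically shows up at .predict() time rather than at .fit() time. The exact wording of the ValueError varies across scikit-learn versions, so treat the message in the comment as illustrative:

from sklearn import svm

X_train = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]  # three features per sample
y_train = [0, 1, 1]

clf = svm.SVC()
clf.fit(X_train, y_train)        # fitting on 3 features works fine

print(clf.predict([[1, 1, 1]]))  # OK: 3 features, same as at training time

try:
    clf.predict([[1, 1]])        # only 2 features -> mismatch with training data
except ValueError as exc:
    print(exc)                   # e.g. "X.shape[1] = 2 should be equal to 3, ..."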

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow