Question

Is it possible to use histogram intersection / chi-square kernels in LIBLINEAR?

My problem is that I have a feature vector of size 5000, all histogram features. I don't know how to train/test this with an SVM.

How can I train this using SVM?

LibSVM supports four types of kernels:

    0 -- linear: u'*v
    1 -- polynomial: (gamma*u'*v + coef0)^degree
    2 -- radial basis function: exp(-gamma*|u-v|^2)
    3 -- sigmoid: tanh(gamma*u'*v + coef0)

LibSVM supports a linear kernel; in that case, what is the difference between LibSVM and LIBLINEAR?

Was it helpful?

Solution

No, you can't use custom kernels in liblinear.

To do what you want to do, you'll need to use LibSVM with the "precomputed kernel" option, where you supply the Gram matrix yourself (this is described in the LibSVM README).
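If you work in Python rather than with the LibSVM command-line tools, the same idea can be sketched with scikit-learn's `SVC`, which wraps libsvm and accepts `kernel='precomputed'`. The `chi2_gram` helper and its `1 - 2*chi2` similarity form are illustrative assumptions, not LibSVM's own API:

```python
# Sketch (not the LibSVM CLI itself): scikit-learn's SVC wraps libsvm and
# accepts kernel='precomputed', so you supply the Gram matrix directly.
import numpy as np
from sklearn.svm import SVC

def chi2_gram(X, Y, eps=1e-10):
    """Pairwise 1 - 2*chi^2 similarity between rows of X and rows of Y."""
    X = X[:, None, :]                       # shape (n, 1, d)
    Y = Y[None, :, :]                       # shape (1, m, d)
    return 1.0 - 2.0 * np.sum((X - Y) ** 2 / (X + Y + eps), axis=2)

rng = np.random.default_rng(0)
X_train = rng.random((20, 5))               # toy non-negative histogram-like data
y_train = rng.integers(0, 2, 20)
X_test  = rng.random((5, 5))

K_train = chi2_gram(X_train, X_train)       # (n_train, n_train) Gram matrix
K_test  = chi2_gram(X_test,  X_train)       # (n_test, n_train): test vs. train

clf  = SVC(kernel='precomputed').fit(K_train, y_train)
pred = clf.predict(K_test)
```

Note that the test-time kernel matrix is computed between test and *training* samples, mirroring the format LibSVM expects for precomputed kernels.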

In the case of linear kernels, LibSVM and LibLinear produce similar results. The author says this:

Their predictions are similar but hyperplanes are different. Libsvm solves L1-loss SVM, but liblinear solves L2-regularized logistic regression and L2-loss SVM.

OTHER TIPS

A bit late, but it might help others: the machine-learning package scikit-learn offers at least the chi2 kernel (http://scikit-learn.org/stable/modules/generated/sklearn.metrics.pairwise.chi2_kernel.html#sklearn.metrics.pairwise.chi2_kernel).

You can use a linear SVM solver only if you explicitly map your features into a non-linear feature space first; I recommend reading:

  1. "Max-Margin additive classifiers for detection" - http://www.cs.berkeley.edu/~smaji/papers/mcd-free-lunch-iccv-09.pdf
  2. "Random features for large-scale kernel machines" - http://berkeley.intel-research.net/arahimi/papers/rahimi-recht-random-features.pdf
  3. "Efficient Additive Kernels via Explicit Feature Maps" - http://www.vlfeat.org/~vedaldi/assets/pubs/vedaldi11efficient.pdf

I recently used the chi2 kernel in LibSVM. I paste the code here; I hope it can be useful.

function [chi2_ans] = chi2_kernel(x, y)
% CHI2_KERNEL  Pairwise chi-square kernel between the rows of x and y.
%   k(u, v) = 1 - 2 * sum((u - v).^2 ./ (u + v)), with eps guarding
%   against division by zero for empty histogram bins.
    f = @(u, v) 1 - 2 * sum(((u - v).^2) ./ (u + v + eps));
    chi2_ans = zeros(size(x, 1), size(y, 1));
    for i = 1:size(x, 1)
        for j = 1:size(y, 1)
            chi2_ans(i, j) = f(x(i, :), y(j, :));
        end
    end
end

And use it:

function [acc] = singleChi2Kernel(trainData, testData, trainLabel, testLabel)

numTrain = size(trainData, 1);
numTest  = size(testData, 1);

%# compute kernel matrices between every pair of (train,train) and
%# (test,train) instances, and prepend the sample serial number as the
%# first column (required by LibSVM's precomputed-kernel format)
K  = [(1:numTrain)', chi2_kernel(trainData, trainData)];
KK = [(1:numTest)',  chi2_kernel(testData,  trainData)];

%# train and test with the precomputed kernel (-t 4)
model = svmtrain(trainLabel, K, '-t 4');
[predClass, acc, decVals] = svmpredict(testLabel, KK, model);

%# confusion matrix
%C = confusionmat(testLabel, predClass)

end

The code is from a link.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow