Question

I have implemented the computation of the original HMAX model and obtained the results at the C2 layer. Now I still have to do the tuned layer, in other words, to use osusvm.

In my project, I have two directories: one containing the training images and the other containing the test images.

Reference: lennon310's response in Training images and test images

Firstly, I would like to show you my results at the C2 layer (the result for each image is a vector). Note that I extracted only 2 prototypes in the S2 layer (in my actual project I use 256 prototypes, but for this question assume I used only 2) and four prototype sizes: [4 8 12 16]. So for each image we get 8 C2 units (2 prototypes x 4 patch sizes = 8).

C2res{1}: For the six training images:

0.0088    0.0098    0.0030    0.0067    0.0063    0.0057    
0.0300    0.0315    0.0251    0.0211    0.0295    0.0248           
0.1042    0.1843    0.1151    0.1166    0.0668    0.1134            
0.3380    0.2529    0.3709    0.2886    0.3938    0.3078           
0.2535    0.3255    0.3564    0.2196    0.1681    0.2827          
3.9902    5.3475    4.5504    4.9500    6.7440    4.4033          
0.8520    0.8740    0.7209    0.7705    0.4303    0.7687       
6.3131    7.2560    7.9412    7.1929    9.8789    6.6764 

C2res{2}: For the two test images:

0.0080    0.0132
0.0240    0.0001
0.1007    0.2214
0.3055    0.0249
0.2989    0.3483
4.6946    4.2762
0.7048    1.2791
6.7595    4.7728
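
As a quick sanity check (using the variables above), each matrix has one column per image and one row per (prototype, patch size) pair:

size(C2res{1})  %should be [8 6]: 8 C2 units x 6 training images
size(C2res{2})  %should be [8 2]: 8 C2 units x 2 test images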

Secondly, I downloaded the osu-svm MATLAB toolbox and added its path:

addpath(genpath('./osu-svm/')); %put your own path to osusvm here

useSVM = 1; %if you do not have osusvm installed you can turn this
            %to 0, so that the classifier would be a NN classifier
            %note: NN is not a great classifier for these features

Then I used the code below:

%Simple classification code
XTrain = [C2res{1}]; %training examples as columns
XTest =  [C2res{2}]; %testing examples as columns
ytrain = [ones(size(C2res{1},2),1)]; %the labels of the training set
ytest = [ones(size(C2res{2},2),1)];  %the true labels of the test set
if useSVM
  Model = CLSosusvm(XTrain,ytrain);  %training
  [ry,rw] = CLSosusvmC(XTest,Model); %predicting new labels
else %use a Nearest Neighbor classifier
  Model = CLSnn(XTrain, ytrain); %training
  [ry,rw] = CLSnnC(XTest,Model); %predicting new labels
end  
successrate = mean(ytest==ry) %a simple classification score

Is the code above correct? Why do I always get successrate = 1? I think I am wrong somewhere; please, I need help. If the code is correct, is there another way to compute this? What can I use instead of successrate in order to get more meaningful results?

Note:

The function CLSosusvm is:

function Model = CLSosusvm(Xtrain,Ytrain,sPARAMS);
%function Model = CLSosusvm(Xtrain,Ytrain,sPARAMS);
%
%Builds an SVM classifier
%This is only a wrapper function for osu svm
%It requires that osu svm (http://www.ece.osu.edu/~maj/osu_svm/) is installed and included in the path
%X contains the data-points as COLUMNS, i.e., X is nfeatures \times nexamples
%y is a column vector of all the labels. y is nexamples \times 1
%sPARAMS is a structure of parameters:
%sPARAMS.KERNEL specifies the kernel type
%sPARAMS.C specifies the regularization constant
%sPARAMS.GAMMA, sPARAMS.DEGREE are parameters of the kernel function
%Model contains the parameters of the SVM model as returned by osu svm

Ytrain = Ytrain';

if nargin<3
  SETPARAMS = 1;
elseif isempty(sPARAMS)
  SETPARAMS = 1;
else
  SETPARAMS = 0;
end

if SETPARAMS
  sPARAMS.KERNEL = 0;
  sPARAMS.C = 1;
end

switch sPARAMS.KERNEL
  case 0
    [AlphaY, SVs, Bias, Parameters, nSV, nLabel] = ...
        LinearSVC(Xtrain, Ytrain, sPARAMS.C);
  case 1
    [AlphaY, SVs, Bias, Parameters, nSV, nLabel] = ...
        PolySVC(Xtrain, Ytrain, sPARAMS.DEGREE, sPARAMS.C, 1, 0);
  case 2
    [AlphaY, SVs, Bias, Parameters, nSV, nLabel] = ...
        PolySVC(Xtrain, Ytrain, sPARAMS.DEGREE, sPARAMS.C, 1, sPARAMS.COEF);
  case 3
    [AlphaY, SVs, Bias, Parameters, nSV, nLabel] = ...
        RbfSVC(Xtrain, Ytrain, sPARAMS.GAMMA, sPARAMS.C);
end

Model.AlphaY = AlphaY;
Model.SVs = SVs;
Model.Bias = Bias;
Model.Parameters = Parameters;
Model.nSV = nSV;
Model.nLabel = nLabel;
Model.sPARAMS = sPARAMS;
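
For reference, the kernel is selected through the documented sPARAMS fields; a minimal usage sketch with illustrative (untuned) values:

%Illustrative only: train with an RBF kernel instead of the default
%linear one (KERNEL = 0). These values are example settings, not tuned.
sPARAMS.KERNEL = 3;    %case 3 in the switch above selects RbfSVC
sPARAMS.C      = 1;    %regularization constant
sPARAMS.GAMMA  = 0.1;  %RBF kernel width (example value)
Model = CLSosusvm(XTrain, ytrain, sPARAMS);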

The function CLSosusvmC is:

function [Labels, DecisionValue]= CLSosusvmC(Samples, Model);
%function [Labels, DecisionValue]= CLSosusvmC(Samples, Model);
%
%wrapper function for osu svm classification
%Samples contains the data-points to be classified as COLUMNS, i.e., it is nfeatures \times nexamples
%Model is the model returned by CLSosusvm
%Labels are the predicted labels
%DecisionValue are the values assigned by the Model to the points (Labels = sign(DecisionValue))

[Labels, DecisionValue] = SVMClass(Samples, Model.AlphaY, ...
                                   Model.SVs, Model.Bias, ...
                                   Model.Parameters, Model.nSV, Model.nLabel);
Labels = Labels';
DecisionValue = DecisionValue';
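
Note that, as documented above, Labels = sign(DecisionValue), so the magnitude of DecisionValue can be read as a per-example confidence:

[ry, rw] = CLSosusvmC(XTest, Model); %ry: predicted labels, rw: decision values
confidence = abs(rw);                %distance from the decision boundary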

Solution

Your code looks good to me.

Since you have only 2 test images, the possible success rates are limited to 0, 0.5, and 1, and even a chance-level classifier is expected to achieve 100% accuracy with 25% probability (the four equally likely correctness patterns being [0 1], [1 0], [1 1], [0 0]). You can shuffle the data, re-select 2 of the 8 images as the test set a number of times, and observe the resulting accuracies.
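
For example, a minimal sketch of that repeated re-selection (it assumes C2res, CLSosusvm and CLSosusvmC are on the path; the labels below are placeholders that you should replace with your true labels):

%Sketch only: repeatedly pick 2 of the 8 images as the test set.
X = [C2res{1} C2res{2}];          %all 8 images, features as columns
y = [1; 1; 1; 1; -1; -1; -1; -1]; %EXAMPLE labels only, not your real ones
nReps = 100;
acc = zeros(nReps,1);
for r = 1:nReps
  idx = randperm(size(X,2));      %shuffle the 8 images
  te  = idx(1:2);                 %re-select 2 images as the test set
  tr  = idx(3:end);               %train on the remaining 6
  Model  = CLSosusvm(X(:,tr), y(tr));
  ry     = CLSosusvmC(X(:,te), Model);
  acc(r) = mean(y(te) == ry);
end
mean(acc)                         %average accuracy over the repetitions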

Also try to add more images to both training and test samples.

OTHER TIPS

Machine learning makes little sense on a set of 8 images. Gather at least 10x more data, and then analyze the results. With such a small dataset any result is possible (from 0 to 100 percent), and none of them is reliable.

Meanwhile you can try to perform repeated cross-validation:

  1. Shuffle your data.
  2. Split it into two-element parts ( [1 2] [3 4] [5 6] [7 8] ).
  3. For each such part:
     a) test on it while training on the rest, e.g., train on [3 4 5 6 7 8] and test on [1 2];
     b) record the score.
  4. Repeat the whole process and report the mean score (see the sketch below).
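
A hedged sketch of this scheme, under the same placeholder-label assumption as above:

%Repeated leave-2-out cross-validation (labels are placeholders).
X = [C2res{1} C2res{2}];          %all 8 images, features as columns
y = [1; 1; 1; 1; -1; -1; -1; -1]; %EXAMPLE labels only
nRounds = 20;
scores  = zeros(nRounds, 4);
for r = 1:nRounds
  perm  = randperm(8);            %step 1: shuffle
  folds = reshape(perm, 2, 4);    %step 2: four two-element parts
  for f = 1:4                     %step 3: test on each part in turn
    te = folds(:,f);
    tr = setdiff(perm, te');      %train on the remaining six images
    Model = CLSosusvm(X(:,tr), y(tr));
    ry    = CLSosusvmC(X(:,te), Model);
    scores(r, f) = mean(y(te) == ry);
  end
end
mean(scores(:))                   %step 4: report the mean score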
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow