Question

I'm very new to machine learning. I've read about MATLAB's Statistics Toolbox functions for hidden Markov models, and I want to use them to classify a given sequence of signals. I have 3D coordinates in a matrix P of size [501x3] and I want to train a model on them. Every complete trajectory ends at a specific point, (0,0,0), where it reaches its target.

What is the appropriate pseudocode/approach for my scenario?

My Pseudocode:

  1. the 501x3 matrix P is the emission matrix, where each coordinate is a state
  2. a random NxN transition matrix (but I'm confused about how to choose it)
  3. generate a test sequence using the function hmmgenerate
  4. train using hmmtrain(sequence, old_transition, old_emission)
  5. give the final transition and emission matrices to hmmdecode, along with an unknown sequence, to get its probability (this part confuses me too)
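The steps above can be sketched with the Statistics Toolbox functions, with one caveat: those functions work on discrete symbol sequences, so the 3D coordinates must first be quantized into symbols. A minimal sketch, assuming the trajectory P has been clustered into discrete symbols with kmeans; `numStates`, `numSymbols`, and `unknownSeq` are illustrative placeholders, not values from the question:

```matlab
numStates = 6;          % a modelling choice
numSymbols = 20;        % number of discrete observation symbols

% quantize the 501x3 trajectory P into a symbol sequence
idx = kmeans(P, numSymbols);   % idx(i) is the cluster index of row i
seq = idx';                    % hmmtrain expects a row vector

% random initial guesses, with rows normalized to sum to 1
trGuess = rand(numStates, numStates);
trGuess = trGuess ./ sum(trGuess, 2);
emGuess = rand(numStates, numSymbols);
emGuess = emGuess ./ sum(emGuess, 2);

% estimate transition and emission matrices from the sequence
[estTR, estE] = hmmtrain(seq, trGuess, emGuess);

% log-probability of an unknown symbol sequence under the model
[~, logpseq] = hmmdecode(unknownSeq, estTR, estE);
```

If quantizing the coordinates throws away too much information, a continuous-observation HMM library (such as Murphy's toolbox, discussed below) avoids this step entirely.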

EDIT 1: In a nutshell, I want to classify 10 classes of trajectories, each of size [501x3], with an HMM. I want to sample 50 rows, i.e. [50x3], from each trajectory in order to build the model. I also have murphyk's HMM toolbox, which handles such sequences.


Solution 2

The task is to build and train a hidden Markov model with the following components, using murphyk's HMM toolbox as the library of choice:

  1. O = dimensionality of each observation vector
  2. Q = number of hidden states
  3. T = number of vectors in a sequence
  4. nex = number of sequences
  5. M = number of Gaussian mixture components

Demo code (adapted from murphyk's toolbox):

    O = 8;            % dimensionality of each observation vector
    T = 420;          % number of vectors in a sequence
    nex = 1;          % number of sequences
    M = 1;            % number of mixtures
    Q = 6;            % number of states

    data = randn(O, T, nex);   % synthetic training data

    % initial guess of parameters
    prior0 = normalise(rand(Q,1));
    transmat0 = mk_stochastic(rand(Q,Q));

    if 0
        Sigma0 = repmat(eye(O), [1 1 Q M]);
        % initialize each mean to a random data point
        indices = randperm(T*nex);
        mu0 = reshape(data(:, indices(1:(Q*M))), [O Q M]);
        mixmat0 = mk_stochastic(rand(Q,M));
    else
        [mu0, Sigma0] = mixgauss_init(Q*M, data, 'full');
        mu0 = reshape(mu0, [O Q M]);
        Sigma0 = reshape(Sigma0, [O O Q M]);
        mixmat0 = mk_stochastic(rand(Q,M));
    end

    % fit the model parameters with EM
    [LL, prior1, transmat1, mu1, Sigma1, mixmat1] = ...
        mhmm_em(data, prior0, transmat0, mu0, Sigma0, mixmat0, 'max_iter', 5);

    % log-likelihood of the data under the trained model
    loglik = mhmm_logprob(data, prior1, transmat1, mu1, Sigma1, mixmat1);
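To adapt this demo to the 10-class trajectory problem, one model can be trained per class. A minimal sketch, assuming each class's sampled trajectories are collected in a cell array `classData{k}` of size O-by-T-by-nex (here O = 3 for the 3D coordinates); the variable names are illustrative:

```matlab
O = 3;                % 3D coordinates
Q = 6;                % number of hidden states (a modelling choice)
M = 1;                % single Gaussian per state
numClasses = 10;

models = cell(numClasses, 1);
for k = 1:numClasses
    data = classData{k};                   % O-by-T-by-nex array for class k
    prior0 = normalise(rand(Q,1));
    transmat0 = mk_stochastic(rand(Q,Q));
    [mu0, Sigma0] = mixgauss_init(Q*M, data, 'full');
    mu0 = reshape(mu0, [O Q M]);
    Sigma0 = reshape(Sigma0, [O O Q M]);
    mixmat0 = mk_stochastic(rand(Q,M));
    [LL, prior1, transmat1, mu1, Sigma1, mixmat1] = ...
        mhmm_em(data, prior0, transmat0, mu0, Sigma0, mixmat0, 'max_iter', 10);
    models{k} = struct('prior', prior1, 'transmat', transmat1, ...
                       'mu', mu1, 'Sigma', Sigma1, 'mixmat', mixmat1);
end
```

Each entry of `models` then holds the fitted parameters for one trajectory class, ready to score new sequences against.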

OTHER TIPS

Here is a general outline of the approach to classifying d-dimensional sequences using hidden Markov models:

1) Training:

For each class k:

  • prepare an HMM model. This includes initializing the following:
    • a transition matrix: Q-by-Q matrix, where Q is the number of states
    • a vector of prior probabilities: Q-by-1 vector
  • the emission model: in your case the observations are 3D points, so you could use a multivariate normal distribution (with a specified mean vector and covariance matrix) or a Gaussian mixture model (several MVN distributions combined using mixture coefficients)
  • after properly initializing the above parameters, you train the HMM model, feeding it the set of sequences belonging to this class (EM algorithm).

2) Prediction

Next, to classify a new sequence X:

  • compute the log-likelihood of the sequence under each model, log P(X|model_k)
  • pick the class with the highest log-likelihood. That is the class prediction.
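With Murphy's toolbox, the prediction step above can be sketched as follows, assuming `models` is a cell array of per-class parameter structs (one trained model per class) and `X` is an O-by-T test sequence; both names are illustrative:

```matlab
numClasses = numel(models);
logliks = -inf(numClasses, 1);
for k = 1:numClasses
    m = models{k};
    logliks(k) = mhmm_logprob(X, m.prior, m.transmat, ...
                              m.mu, m.Sigma, m.mixmat);
end
[~, predictedClass] = max(logliks);   % class with the highest log-likelihood
```

Comparing log-likelihoods directly like this assumes equal class priors; with unbalanced classes, a log-prior term could be added to each score.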

As I mentioned in the comments, the Statistics Toolbox only implements HMMs with discrete observations, so you will have to find another library or implement the code yourself. Kevin Murphy's toolboxes (HMM toolbox, BNT, PMTK3) are popular choices in this domain.

Here are some answers I posted in the past using Kevin Murphy's toolboxes:

The above answers are somewhat different from what you are trying to do here, but they're a good place to start.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow