Question

I am trying to do the following with weka's MultilayerPerceptron:

  1. Train with a small subset of the training Instances for a portion of the epochs input,
  2. Train with whole set of Instances for the remaining epochs.

However, when I do the following in my code, the network seems to reset itself to start with a clean slate the second time.

    mlp.setTrainingTime(smallTrainingSetEpochs);
    mlp.buildClassifier(smallTrainingSet);

    mlp.setTrainingTime(wholeTrainingSetEpochs);
    mlp.buildClassifier(wholeTrainingSet);

Am I doing something wrong, or is this the way that the algorithm is supposed to work in weka?

If you need more information to answer this question, please let me know. I am kind of new to programming with weka and am unsure as to what information would be helpful.


Solution

This thread on the weka mailing list addresses a question very similar to yours.

It seems that this is how weka's MultilayerPerceptron is supposed to work: it is designed as a 'batch' learner, while you are trying to use it incrementally. Only classifiers that implement weka.classifiers.UpdateableClassifier can be trained incrementally.
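To illustrate the distinction: with an updateable classifier such as weka.classifiers.bayes.NaiveBayesUpdateable, buildClassifier is called once to initialize the model, and further training data is folded in with updateClassifier, one Instance at a time, without resetting what was already learned. A minimal sketch, assuming weka.jar is on the classpath (the toy dataset and attribute names here are purely illustrative, not from the original question):

```java
import java.util.ArrayList;

import weka.classifiers.bayes.NaiveBayesUpdateable;
import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instance;
import weka.core.Instances;

public class IncrementalDemo {

    public static void main(String[] args) throws Exception {
        // Build a tiny dataset: one numeric attribute plus a nominal class.
        ArrayList<Attribute> attrs = new ArrayList<>();
        attrs.add(new Attribute("x"));
        ArrayList<String> classVals = new ArrayList<>();
        classVals.add("yes");
        classVals.add("no");
        attrs.add(new Attribute("class", classVals));

        Instances smallSet = new Instances("toy", attrs, 0);
        smallSet.setClassIndex(1);
        addInstance(smallSet, 1.0, "yes");
        addInstance(smallSet, 2.0, "no");

        // buildClassifier is called ONCE: it initializes (and would reset) the model.
        NaiveBayesUpdateable nb = new NaiveBayesUpdateable();
        nb.buildClassifier(smallSet);

        // Additional training data is folded in with updateClassifier -- no reset.
        Instance extra = new DenseInstance(2);
        extra.setDataset(smallSet);
        extra.setValue(0, 3.0);
        extra.setValue(1, "no");
        nb.updateClassifier(extra);

        System.out.println(nb);
    }

    private static void addInstance(Instances data, double x, String cls) {
        Instance inst = new DenseInstance(2);
        inst.setDataset(data);
        inst.setValue(0, x);
        inst.setValue(1, cls);
        data.add(inst);
    }
}
```

MultilayerPerceptron does not implement this interface, so each call to buildClassifier starts training from scratch; that is why your second call appears to reset the network.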

License: CC-BY-SA with attribution
Not affiliated with StackOverflow