Question

I'm new to the neural network field (to tell the truth, I just started a few days back). I want to use a neural network in my OCR application to recognize handwritten text.

What I want to know is: is it possible to train the network after the initial training? In other words, I'm going to train a few characters in the beginning, but I want to add more characters to the network later without affecting the previously trained data. (Suppose I've created the neural network with enough output neurons for the additional characters.) If this is possible, how can I use Encog to get this done?

Thank you.


Solution

Yes and no. If you train the same neural network on new characters, the weights (θ) between the layers will certainly change to accommodate them. Since your X / Y values have changed, the cost function may also need to change to fit the new data more accurately. However, as long as your error rate stays within acceptable limits, you should have no trouble.
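If you go with this first approach, a rough sketch with Encog might look like the following (assuming, as the question says, that the network already has output neurons reserved for the new characters). The file name and the two data-loading helpers are hypothetical placeholders, and the namespaces are my assumption of the Encog .NET layout used by the sample further down; they may differ slightly between Encog versions.

using System.IO;
using Encog.ML.Data;
using Encog.ML.Data.Basic;
using Encog.Neural.Networks;
using Encog.Neural.Networks.Training.Propagation.Resilient;
using Encog.Persist;

public static class ContinuedTraining
{
    public static void RetrainWithNewCharacters()
    {
        // Load the previously trained network from disk (hypothetical path).
        var file = new FileInfo("ocr-network.eg");
        var network = (BasicNetwork)EncogDirectoryPersistence.LoadObject(file);

        // Build a data set containing BOTH the old and the new characters;
        // training only on the new ones tends to degrade the old mappings.
        double[][] input = LoadCharacterInputs();
        double[][] ideal = LoadCharacterIdeals();
        IMLDataSet trainingSet = new BasicMLDataSet(input, ideal);

        // Resume RPROP training until the error is acceptable again.
        var train = new ResilientPropagation(network, trainingSet);
        do
        {
            train.Iteration();
        } while (train.Error > 0.01);

        // Persist the updated network.
        EncogDirectoryPersistence.SaveObject(file, network);
    }

    // Hypothetical stubs: supply your real OCR training data here.
    static double[][] LoadCharacterInputs() { throw new System.NotImplementedException(); }
    static double[][] LoadCharacterIdeals() { throw new System.NotImplementedException(); }
}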

On the other hand, you could use two neural networks: one for your initial set and one for your new set of characters. Neuroph allows you to save each neural network to a file, and you can load the appropriate one based on your needs.

PS: I assume here that "characters" refers to 'A' / 'B' / 'C' and not to neural network variables such as x1 / x2 / x3 (features of the network).
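For the second approach, here is a small sketch of keeping one network per character set on disk and loading whichever one you need. The answer mentions Neuroph, but Encog offers the same facility through EncogDirectoryPersistence (used in the sample below); the file names here are hypothetical.

using System.IO;
using Encog.Neural.Networks;
using Encog.Persist;

public static class NetworkSelector
{
    // Hypothetical file names, one per character set.
    static readonly FileInfo InitialSetFile = new FileInfo("initial-chars.eg");
    static readonly FileInfo ExtraSetFile = new FileInfo("extra-chars.eg");

    // Save a trained network for one character set.
    public static void Save(BasicNetwork network, FileInfo file)
    {
        EncogDirectoryPersistence.SaveObject(file, network);
    }

    // Load whichever network matches the characters you expect to recognize.
    public static BasicNetwork Load(bool useExtraSet)
    {
        FileInfo file = useExtraSet ? ExtraSetFile : InitialSetFile;
        return (BasicNetwork)EncogDirectoryPersistence.LoadObject(file);
    }
}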

OTHER TIPS

You can save the neural net to disk and retrain it later using the TrainingContinuation class in Encog. Below is a code sample from the Encog examples.

public void TestRPROPContPersistEG()
{
    // Two identical, untrained XOR networks sharing the same training data.
    IMLDataSet trainingSet = XOR.CreateXORDataSet();
    BasicNetwork net1 = XOR.CreateUnTrainedXOR();
    BasicNetwork net2 = XOR.CreateUnTrainedXOR();

    ResilientPropagation rprop1 = new ResilientPropagation(net1, trainingSet);
    ResilientPropagation rprop2 = new ResilientPropagation(net2, trainingSet);

    // Train both networks for two iterations.
    rprop1.Iteration();
    rprop1.Iteration();

    rprop2.Iteration();
    rprop2.Iteration();

    // Pause the second trainer and capture its state.
    TrainingContinuation cont = rprop2.Pause();

    // Persist the continuation to disk and load it back again
    // (EG_FILENAME points to the .eg file used by the example).
    EncogDirectoryPersistence.SaveObject(EG_FILENAME, cont);
    TrainingContinuation cont2 = (TrainingContinuation)EncogDirectoryPersistence.LoadObject(EG_FILENAME);

    // Resume training net2 from the reloaded state.
    ResilientPropagation rprop3 = new ResilientPropagation(net2, trainingSet);
    rprop3.Resume(cont2);

    // One more iteration on each trainer; the weights should now match.
    rprop1.Iteration();
    rprop3.Iteration();

    for (int i = 0; i < net1.Flat.Weights.Length; i++)
    {
        Assert.AreEqual(net1.Flat.Weights[i], net2.Flat.Weights[i], 0.0001);
    }
}
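The assertion loop at the end checks that net2, which was paused, written to disk, reloaded, and resumed through the TrainingContinuation object, ends up with the same weights as net1, which trained straight through. In other words, pausing and persisting the training state loses nothing: you can stop, save, and later continue training exactly where you left off.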
Licensed under: CC-BY-SA with attribution