Training the network
A training epoch is one complete cycle through your training dataset; each instance of the training set should be used exactly once per epoch.
After you have looped through the dataset and calculated the deltas, you adjust the weights of the network. You can then perform a new forward pass and run another training epoch, looping through the training dataset again.
Graphical representation
An excellent graphical representation of backpropagation can be found at this link.
Single-step training
There are two approaches to training your network to perform classification on a dataset. The simpler method is called single-step or online learning. It is the method you will find in most literature, and it typically converges in the fewest epochs. As you train the network, you calculate the deltas for each layer and adjust the weights after every single instance of your dataset.
Thus, if you have a dataset of 60 instances, the weights will have been adjusted 60 times by the end of each training epoch.
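As a sketch of the idea, here is single-step learning for a single logistic unit. The network, learning rate, and toy dataset are illustrative assumptions, not part of the text; the point is that the weights are updated inside the per-instance loop.

```python
import numpy as np

# Toy dataset: 60 instances, 3 features, linearly separable labels.
# (Hypothetical example data, not from the text.)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

def forward(x, W):
    """Forward pass of a single logistic unit."""
    return 1.0 / (1.0 + np.exp(-x @ W))

def online_epoch(X, y, W, lr=0.5):
    """Single-step (online) learning: the weights are adjusted once
    per instance, i.e. 60 times per epoch for this 60-instance set."""
    for x_i, y_i in zip(X, y):
        out = forward(x_i, W)
        delta = (out - y_i) * out * (1.0 - out)  # output-layer delta
        W = W - lr * delta * x_i                 # immediate weight update
    return W

W = np.zeros(3)
for _ in range(50):                              # 50 training epochs
    W = online_epoch(X, y, W)

accuracy = ((forward(X, W) > 0.5).astype(float) == y).mean()
```

After a few dozen epochs the unit separates this toy dataset well; note that every instance triggers its own weight correction, which is what distinguishes this approach from batch training.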
Batch training
The other approach is called batch training or offline learning. This approach often yields a network with a lower residual error. When you train the network, you calculate the deltas for each layer for every instance of the dataset, then average these individual deltas and correct the weights once per epoch.
If you have a dataset of 60 instances, the weights will have been adjusted only once by the end of each training epoch.
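The batch variant can be sketched the same way. Again the single logistic unit, learning rate, and toy dataset are illustrative assumptions; the difference from single-step learning is that the deltas are accumulated over the whole dataset and the weights are corrected once, after the loop.

```python
import numpy as np

# Toy dataset: 60 instances, 3 features, linearly separable labels.
# (Hypothetical example data, not from the text.)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

def forward(x, W):
    """Forward pass of a single logistic unit."""
    return 1.0 / (1.0 + np.exp(-x @ W))

def batch_epoch(X, y, W, lr=1.0):
    """Batch (offline) learning: accumulate the deltas over every
    instance, then adjust the weights once per epoch."""
    grad = np.zeros_like(W)
    for x_i, y_i in zip(X, y):
        out = forward(x_i, W)
        delta = (out - y_i) * out * (1.0 - out)  # output-layer delta
        grad += delta * x_i                      # accumulate, no update yet
    return W - lr * grad / len(X)                # one averaged correction

W = np.zeros(3)
for _ in range(2000):                            # 2000 training epochs
    W = batch_epoch(X, y, W)

accuracy = ((forward(X, W) > 0.5).astype(float) == y).mean()
```

Because each epoch produces only a single averaged correction, batch training usually needs more epochs than online learning to reach the same error, but the averaged step is less noisy.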