The maximum number of patterns that a Hopfield network can store while keeping recall errors acceptably low is called its capacity. For a network of N neurons, capacity scales roughly as N/(2 ln N) if essentially error-free recall is required, and as about 0.138·N if a small fraction of bit errors can be tolerated; either way, storing more patterns requires adding more neurons. The composite patterns mentioned above are called mixed states: spurious energy minima that combine several of the trained patterns (spurious minima that resemble no trained pattern at all are instead called spin-glass states). When a Hopfield network is started in some state, it evolves toward a local energy minimum. Sometimes that minimum is not a trained pattern but a mixture of several of them. Mixed states usually have higher energy than the trained patterns, but if the starting state lies closer to a mixed state, the dynamics will settle there. Injecting noise into the update rule can help the network escape these spurious minima and reach the nearest trained pattern: for example, draw a random number at each update and apply the sign operation only if it exceeds some threshold, leaving the unit unchanged otherwise.
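As a minimal sketch of these ideas, the following Python/NumPy snippet trains a Hopfield network with the standard Hebbian rule and recalls a corrupted pattern with asynchronous updates. The `flip_prob` parameter is an assumed name for the noise mechanism described above (skipping the sign update with some probability); the network size, pattern count, and seed are illustrative choices, well below the ~0.138·N capacity estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Hebbian learning: W is the sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=2000, flip_prob=0.0):
    """Asynchronous updates. With probability flip_prob a unit skips
    the sign update and keeps its old value (a crude noise source)."""
    s = state.copy()
    n = len(s)
    for _ in range(steps):
        i = rng.integers(n)
        if rng.random() >= flip_prob:  # apply sign() only above threshold
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store 3 random +/-1 patterns in a 100-neuron net (far below capacity).
n, p = 100, 3
patterns = rng.choice([-1, 1], size=(p, n))
W = train(patterns)

# Corrupt pattern 0 by flipping 10 of its 100 bits, then recall it.
probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
out = recall(W, probe)

# Overlap of 1.0 means the stored pattern was recovered exactly.
overlap = (out @ patterns[0]) / n
print(overlap)
```

Setting `flip_prob` above zero during early updates and lowering it over time would give an annealing-style schedule, which is one common way to use noise to avoid shallow spurious minima.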
In conclusion, increasing the number of neurons raises the network's capacity, and adding a little noise to the updates can help it escape spurious minima, so both can help solve your problem.