One thing that's a memory hog is Solr's caches. Take a look at your solrconfig.xml file inside the "conf" dir of each of your Solr cores, and look at the value configured for caches such as:
<filterCache class="solr.FastLRUCache"
             size="100"
             initialSize="0"
             autowarmCount="0"/>
There may be multiple entries like this one. Make sure that at least autowarmCount and initialSize are set to 0. Furthermore, lower the "size" value to something small, like 100. All these values refer to the number of entries in the cache, not bytes.
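For reference, a trimmed-down cache section might look like this (the cache names and classes here mirror the ones in the stock solrconfig.xml; your file may differ):

```xml
<!-- Keep all caches small and disable autowarming to save heap -->
<filterCache class="solr.FastLRUCache"
             size="100"
             initialSize="0"
             autowarmCount="0"/>
<queryResultCache class="solr.LRUCache"
                  size="100"
                  initialSize="0"
                  autowarmCount="0"/>
<documentCache class="solr.LRUCache"
               size="100"
               initialSize="0"
               autowarmCount="0"/>
```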
Another thing that may help is configuring Solr to do hard-commits more often. Look for an entry such as:
<!-- stuff omitted for brevity -->
<autoCommit>
    <maxDocs>5000</maxDocs>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
</autoCommit>
The above settings will commit to disk each time 5000 documents have been added or 15 seconds have passed since the last commit, whichever comes first. Keep openSearcher set to false so each hard commit only flushes data to disk, without also opening (and warming) a new searcher.
Finally, look for these entries and set them as follows:
<ramBufferSizeMB>16</ramBufferSizeMB>
<maxBufferedDocs>5000</maxBufferedDocs>
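In recent Solr versions these two settings usually live under the <indexConfig> section of solrconfig.xml (older releases used <indexDefaults> instead), so in context it would look roughly like:

```xml
<indexConfig>
  <!-- Flush the in-memory index buffer once it reaches 16 MB... -->
  <ramBufferSizeMB>16</ramBufferSizeMB>
  <!-- ...or once 5000 docs are buffered, whichever happens first -->
  <maxBufferedDocs>5000</maxBufferedDocs>
</indexConfig>
```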
Now, making all these modifications to Solr at once will surely make it run a lot slower. Apply them incrementally instead, until the memory error goes away. Also, it may simply be that you need to allocate more memory to your Java process. If the machine has 4 GB of RAM, why not try your test with -Xmx2g or -Xmx3g?
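If you start Solr with the bundled Jetty launcher, the heap flags go straight on the java command line (start.jar here is the launcher in Solr's example/ directory; adjust the path to your setup):

```shell
# Start at 512 MB of heap and allow growth up to 2 GB
java -Xms512m -Xmx2g -jar start.jar
```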