Question

Can you spot the bug? This will throw a java.lang.OutOfMemoryError.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class TestThreads {

    public static void main(String[] args) {

        ExecutorService executorService = Executors.newFixedThreadPool(1);
        while(true) {
            executorService.submit(new Runnable() {
                public void run() {
                    try {
                        Thread.sleep(10);
                    } catch (InterruptedException e) {
                    }
                }
            });
        }
    }

}

The bug is that I call executorService.submit() instead of executorService.execute(), because submit() returns a Future object I'm ignoring. With execute(), this program will actually run forever.

However, one does not always have the luxury of having an execute() method, like when using a ScheduledExecutorService:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public static void main(String[] args) {
    // this will FAIL because I ignore the ScheduledFuture object
    ScheduledExecutorService executorService = Executors.newScheduledThreadPool(2);
    while(true) {
        executorService.scheduleWithFixedDelay(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(10);
                } catch (InterruptedException e) {
                }
            }
        }, 1, 1, TimeUnit.SECONDS);
    }
}

What is one supposed to do with tasks that don't return anything, only compute?

Any ideas would be greatly appreciated!

EDIT: ThreadPoolExecutor's purge() looked promising, but it only purges cancelled tasks.


Solution

The Future object that is returned is strongly referenced by the ExecutorService only until it is executed. (It is actually a FutureTask instance that delegates to your Runnable.) Once it has executed, it will be garbage collected, because the caller has no reference to it. In other words, the memory problem has nothing to do with the treatment of the Future.

If you are running out of memory, it is because the work queue has millions of tasks queued up. As with any queue, unless the average rate of consumption exceeds the average rate of production, the queue will fill up. The contents of the queue consume memory.

Use a bounded queue, which will effectively throttle task queuing, or get more memory.
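As a sketch of the bounded-queue approach: the `Executors` factory methods use an unbounded `LinkedBlockingQueue`, so you have to construct a `ThreadPoolExecutor` yourself to cap the queue. The class name and the capacity of 1,000 here are arbitrary choices for illustration; the `CallerRunsPolicy` handler makes the submitting thread run the task itself when the queue is full, rather than throwing `RejectedExecutionException`:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedQueueExample {

    public static void main(String[] args) throws InterruptedException {
        // One worker thread (like newFixedThreadPool(1)), but with a queue
        // capped at 1,000 tasks, so memory stays bounded no matter how fast
        // the submitting loop runs.
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                1, 1,
                0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<Runnable>(1000),
                // When the queue is full, run the task on the submitting
                // thread instead of throwing RejectedExecutionException.
                // This naturally slows the producer down to the consumer's pace.
                new ThreadPoolExecutor.CallerRunsPolicy());

        for (int i = 0; i < 2000; i++) {
            executor.execute(new Runnable() {
                public void run() {
                    try {
                        Thread.sleep(1);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("done, queue size = " + executor.getQueue().size());
    }
}
```

With the default `AbortPolicy` instead, a full queue makes `execute()` throw, which is the other reasonable choice if you would rather fail fast than block the producer.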

This code will run "forever":

  public static void main(String[] args) throws InterruptedException {
    ExecutorService executorService = Executors.newFixedThreadPool(1);
    while(true) {
      executorService.submit(new Runnable() {
        public void run() {
          try {
            Thread.sleep(10);
          } catch (InterruptedException e) { }
        }
      });
      Thread.sleep(12); // produce tasks slightly slower than the worker consumes them
    }
  }

The difference is not in the treatment of the resulting Future instances, but that tasks are queued at a rate at which they can be processed.

OTHER TIPS

It's not so much the Futures being returned that are your problem. The problem is that, for each Runnable you submit, the ExecutorService stores it so it can be processed later. Each time you invoke the submit method with a Runnable (or Callable), the ExecutorService pushes that task onto a worker queue, where it sits until a worker thread can pick it off. If all worker threads are busy, the ExecutorService simply leaves the task on that queue.

So your problem is that you have only one thread trying to pull off a queue that is infinitely added to by another thread. Tasks are being added much faster than the worker thread can process them.

Edit: The code example I gave, as nos alluded to, does in fact throw a RejectedExecutionException, so the mechanism for throttling would have to be slightly different if you went that route.
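One alternative throttling mechanism that blocks instead of throwing is a counting semaphore in front of submit(). This is a sketch, not the only way to do it; the class name and the limit of 100 in-flight tasks are illustrative choices:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class SemaphoreThrottle {

    public static void main(String[] args) throws InterruptedException {
        ExecutorService executorService = Executors.newFixedThreadPool(1);
        // At most 100 tasks may be queued or running at once; the submitting
        // thread blocks in acquire() instead of letting the queue grow without bound.
        final Semaphore permits = new Semaphore(100);

        for (int i = 0; i < 1000; i++) {
            permits.acquire(); // blocks while 100 tasks are already in flight
            executorService.submit(new Runnable() {
                public void run() {
                    try {
                        Thread.sleep(1);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } finally {
                        permits.release(); // free a slot once the task finishes
                    }
                }
            });
        }
        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("all tasks completed");
    }
}
```

Releasing the permit in a finally block matters: if a task throws, the slot is still freed and the producer is not permanently blocked.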

As for a better solution, as I mentioned in my comment: if you expect to fill up the ExecutorService faster than the worker threads can drain the queue, you can serialize and deserialize requests as they come in (building your own ThreadPoolExecutor), but I would make sure such a design is absolutely necessary first.

Keep in mind that once the work has been done, those Futures will be discarded and garbage collected. So if you submit one task per second and each executes in under a second, the Future itself will be removed and you will not have a memory issue. But if you submit one task per second and the worker takes three seconds per task, that will become an issue.

Edit: I profiled the heap of the program you are running, and the issue is exactly that: the FutureTask instances created by the ExecutorService sit on the worker queue until the worker thread picks them off.

Class Name                                                       | Shallow Heap | Retained Heap | Percentage 
------------------------------------------------------------------------------------------------------------
java.util.concurrent.ThreadPoolExecutor @ 0x78513c5a0            |          104 | 2,051,298,872 |     99.99% 
|- java.util.concurrent.LinkedBlockingQueue @ 0x785140598        |           80 | 2,051,298,216 |     99.99% 
|  |- java.util.concurrent.LinkedBlockingQueue$Node @ 0x785142dd8|           32 | 2,051,297,696 |     99.99% 
------------------------------------------------------------------------------------------------------------

The heap analysis goes on a bit; as you can imagine, there are many LinkedBlockingQueue$Node instances.

Licensed under: CC-BY-SA with attribution