Question

I am not a multithreading expert, but I am seeing some performance issues with my current code, which uses ExecutorService.

I am working on a project in which I need to make an HTTP call to my server and, if it takes too long to respond, time out the call. Currently the server just returns a simple JSON string.

My current requirement is 10 ms: within 10 ms the call should get the data back from the server. I assume that's possible, since it is just an HTTP call to a server within the same datacenter.

My client program and the actual servers are in the same datacenter, and the ping latency between them is 0.5 ms, so it should certainly be doable.

I am using RestTemplate to make the URL call.

Below is the code I have written, which uses ExecutorService and Callable -

import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class URLTest {

    private ExecutorService executor = Executors.newFixedThreadPool(10);

    public String getData() {
        // submit the task and wait at most 100 ms for its result
        Future<String> future = executor.submit(new Task());
        String response = null;

        try {
            System.out.println("Started..");
            response = future.get(100, TimeUnit.MILLISECONDS);
            System.out.println("Finished!");
        } catch (TimeoutException e) {
            System.out.println("Terminated!");
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        }

        return response;
    }
}

Below is my Task class, which implements the Callable interface -

import java.util.concurrent.Callable;

import org.springframework.web.client.RestTemplate;

class Task implements Callable<String> {

    // note: a new RestTemplate is created for every Task instance
    private RestTemplate restTemplate = new RestTemplate();
    private String url; // target endpoint, assigned elsewhere

    public String call() throws Exception {
        //  TimerTest timer = TimerTest.getInstance();  // line 3
        String response = restTemplate.getForObject(url, String.class);
        //  timer.getDuration();    // line 4

        return response;
    }
}

And below is my code in another class, DemoTest, which calls the getData method of URLTest 500 times and measures the 95th percentile end to end -

public class DemoTest {
    public static void main(String[] args) {

        URLTest bc = new URLTest();

        // little bit of warmup
        for (int i = 0; i <= 500; i++) {
            bc.getData();
        }

        for (int i = 0; i <= 500; i++) {
            TimerTest timer = TimerTest.getInstance(); // line 1
            bc.getData();
            timer.getDuration(); // line 2
        }

        // this method prints out the 95th percentile
        logPercentileInfo();
    }
}

With the above code as it is, I am always seeing a 95th percentile of 14-15 ms (which is bad for my use case, since the end-to-end flow is what I need to measure).

I am surprised by this. Is the ExecutorService framework adding all the latency here? Maybe because each task is submitted and the submitting thread then waits (via future.get) until the task is finished?

My main goal is to reduce the latency here as much as possible. My use case is simple: make a URL call to one of my servers with a timeout feature enabled, meaning that if the server takes too long to respond, the whole call times out. Customers will call our code from their application, which may itself be multithreaded.

Is there anything I am missing, or is there some other flavor of ExecutorService I should use? How can I improve my performance here? Any suggestions would be a great help.

Any example would be greatly appreciated. I was reading about ExecutorCompletionService, but I am not sure whether I should use that or something else.


Solution

As for your observation that you are measuring 15 ms on the outside but only 3 ms on the inside, my bet is that constructing a new RestTemplate for every task accounts for the difference. This can be fixed by refactoring.

Note that RestTemplate is a heavyweight, thread-safe object, and is designed to be deployed as an application-wide singleton. Your current code is in critical violation of this intent.
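A minimal sketch of that refactoring, sharing one RestTemplate across all tasks (the constructor and the way the URL is passed in are illustrative, not part of the original code):

import java.util.concurrent.Callable;

import org.springframework.web.client.RestTemplate;

class SharedTemplateTask implements Callable<String> {

    // a single RestTemplate instance, created once and reused by every task
    private final RestTemplate restTemplate;
    private final String url;

    SharedTemplateTask(RestTemplate restTemplate, String url) {
        this.restTemplate = restTemplate;
        this.url = url;
    }

    public String call() {
        return restTemplate.getForObject(url, String.class);
    }
}

URLTest would then hold the RestTemplate as a field (alongside the executor) and pass it to each task it submits, so the per-request construction cost disappears.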


If you need asynchronous HTTP requests, you should really use an asynchronous HTTP library such as AsyncHttpClient, which is based on Netty, which in turn is built on Java NIO. That means you don't need to occupy a thread per outstanding HTTP request. AsyncHttpClient also works with Futures, so you'll have an API you are used to. It can work with callbacks as well, which is the preferred style for the asynchronous approach.
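For illustration only, a rough sketch using the org.asynchttpclient artifact (the URL is a placeholder, and the exact builder and configuration methods vary between library versions, so treat this as a starting point rather than a drop-in replacement):

import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

import org.asynchttpclient.AsyncHttpClient;
import org.asynchttpclient.Dsl;
import org.asynchttpclient.Response;

public class AsyncClientSketch {
    public static void main(String[] args) throws Exception {
        // one shared client; it runs its own NIO event loop and connection pool
        AsyncHttpClient client = Dsl.asyncHttpClient();

        // no thread is blocked inside the client while the request is in flight
        Future<Response> future = client.prepareGet("http://myserver/data").execute();

        // same Future-based timeout handling as before
        String body = future.get(100, TimeUnit.MILLISECONDS).getResponseBody();
        System.out.println(body);

        client.close();
    }
}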

However, even if you keep your current synchronous library, you should at the very least configure a timeout on the REST client itself instead of letting the request run its course.
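For example, with the stock Spring request factory (the 10 ms values simply mirror the requirement stated in the question; tune them for your environment):

import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public class TimedRestTemplate {
    public static RestTemplate create() {
        SimpleClientHttpRequestFactory factory = new SimpleClientHttpRequestFactory();
        factory.setConnectTimeout(10); // ms allowed to establish the connection
        factory.setReadTimeout(10);    // ms allowed to wait for the response
        return new RestTemplate(factory);
    }
}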

Other tips

...then run the program again, it will start giving me a 95th percentile of 3 ms. So not sure why the end-to-end flow gives me a 95th percentile of 14-15 ms.

You are generating the tasks faster than you can process them. This means the longer you run the tests, the further behind it gets, as it keeps queuing them up. I would expect that if you made this 2000 requests you would see latencies up to 4x what you see now. The bottleneck could be on the client side (in which case more threads would help), but quite likely the bottleneck is on the server side, in which case more threads could make it worse.


The default behaviour for HTTP is to establish a new TCP connection for each request. The connection time for a new TCP connection can easily reach 20 ms even when the two machines sit side by side. I suggest using HTTP/1.1 and maintaining a persistent (keep-alive) connection.
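A sketch of one way to get connection reuse with the existing RestTemplate, by backing it with a pooled Apache HttpClient (class names are from HttpClient 4.x; the pool sizes are arbitrary assumptions):

import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

public class PooledRestTemplate {
    public static RestTemplate create() {
        // keep-alive connection pool shared across all requests
        PoolingHttpClientConnectionManager pool = new PoolingHttpClientConnectionManager();
        pool.setMaxTotal(20);
        pool.setDefaultMaxPerRoute(20);

        CloseableHttpClient httpClient = HttpClients.custom()
                .setConnectionManager(pool)
                .build();

        // later getForObject calls reuse already-established TCP connections
        return new RestTemplate(new HttpComponentsClientHttpRequestFactory(httpClient));
    }
}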

BTW, you can ping from one side of London to the other in 0.5 ms. However, getting below 1 ms reliably with HTTP is tricky, as the protocol is not designed for low latency; it is designed for use on high-latency networks.

Note: a human cannot perceive latencies below about 25 ms, and 100 ms is plenty fast enough for most web requests. It is with these sorts of assumptions that HTTP was designed.
