Question

This is not the first time I've had problems with clock times in a programming language. Basically I'm measuring how fast a function runs by calling it in a while loop. The problem is that, for some reason, the elapsed time keeps getting shorter the longer the while loop runs. Can anyone explain? Code below.

// DescriptiveStatistics comes from Apache Commons Math
import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;

DescriptiveStatistics stats = new DescriptiveStatistics();
while (true) {
    long startTime = System.nanoTime();
    executeSaxonXsltTransformation();
    long stopTime = System.nanoTime();
    long elapsedTime = stopTime - startTime;
    stats.addValue((double) elapsedTime);
    System.out.println(stats.getN() + " - " + elapsedTime + " - " + stats.getMean());
}

So after about 1,000 runs the elapsed time is 750k to 850k ns. But after about 100,000 runs it drops to 580k to 750k ns. The continual decrease is easiest to see in the running average (stats.getMean()): after 108k loops the mean is ~632k, compared to ~1 million after 3k loops. Switching to currentTimeMillis instead of nanoTime doesn't change anything.


Solution

This is entirely expected: Java's JIT optimizes code that gets run extensively, and the more it's run, the more effort the JIT puts into optimizing it.

If you're trying to benchmark, you should "warm up" the benchmark by running the method for a few seconds without timing, and only then start timing. Alternatively, you could use a library that knows how to do consistent benchmarking in Java, warming up the JIT and getting accurate measurements whether your method takes nanoseconds or seconds, such as Caliper.
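As an illustration, here is a minimal warm-up sketch built around the loop from the question. The five-second warm-up, the 10,000 measured iterations, and the executeSaxonXsltTransformation() stub are arbitrary placeholders, not prescribed values:

import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;

public class TransformBenchmark {
    public static void main(String[] args) {
        // Warm-up phase: run the method untimed for a few seconds so the JIT
        // can compile and optimize it before any measurements are recorded.
        long warmupEnd = System.nanoTime() + 5_000_000_000L; // roughly 5 seconds
        while (System.nanoTime() < warmupEnd) {
            executeSaxonXsltTransformation();
        }

        // Measurement phase: timings should now be much more stable.
        DescriptiveStatistics stats = new DescriptiveStatistics();
        for (int i = 0; i < 10_000; i++) {
            long startTime = System.nanoTime();
            executeSaxonXsltTransformation();
            long elapsedTime = System.nanoTime() - startTime;
            stats.addValue((double) elapsedTime);
        }
        System.out.println(stats.getN() + " runs, mean " + stats.getMean() + " ns");
    }

    private static void executeSaxonXsltTransformation() {
        // Placeholder for the questioner's actual Saxon XSLT transformation call.
    }
}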

OTHER TIPS

Problem is that for some reason the elapsed time keeps getting shorter the longer the while loop runs. Can anyone explain?

That's HotSpot for you: the more you run the code, the more aggressively it will optimize it. Eventually it will have done all it can, and you'll see a plateau in the results.
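One way to watch for that plateau, sketched on the questioner's loop with an arbitrary window of 1,000 runs (DescriptiveStatistics accepts a window size in its constructor, giving a rolling statistic), is to print the mean over successive windows and note when it stops shrinking:

import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;

// Rolling window over the most recent 1,000 timings; the window mean stops
// decreasing once HotSpot has finished optimizing the hot code path.
DescriptiveStatistics window = new DescriptiveStatistics(1_000);
for (int i = 1; i <= 100_000; i++) {
    long start = System.nanoTime();
    executeSaxonXsltTransformation();
    window.addValue((double) (System.nanoTime() - start));
    if (i % 1_000 == 0) {
        System.out.println(i + " runs - window mean " + window.getMean() + " ns");
    }
}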

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow