Question

For a reaction time study (see also this question if you're interested) we want to control and measure the display time of images. We'd like to account for the time needed to repaint on different users' machines.

Edit: Originally, I used only inline execution for timing, but I thought I couldn't trust it to accurately measure how long the picture was actually visible on the user's screen, because painting takes some time.

Later, I found the event MozAfterPaint. It requires a configuration change to run on users' computers, and the corresponding WebkitAfterPaint never made it into WebKit. This means I can't use it on users' computers, but I did use it for my own testing. I pasted the relevant code snippets and the results from my tests below.
I also manually checked results with SpeedTracer in Chrome.

// from the loop pre-rendering images for faster display
var imgdiv = $('<div class="trial_images" id="trial_images_' + i + '" style="display:none">' +
               '<img class="top" src="' + toppath + '"><br>' +
               '<img class="bottom" src="' + botpath + '"></div>');
Session.imgs[i] = imgdiv.append(botimg); // botimg is created earlier in the loop (not shown)
$('#trial').append(Session.imgs);

// in Trial.showImages
$(window).one('MozAfterPaint', function () {
    Trial.FixationHidden = performance.now();
});
$('#trial_images_'+Trial.current).show(); // this would cause reflows, but I've since changed it to use the visibility property and absolutely positioned images, to minimise reflows
Trial.ImagesShown = performance.now();

Session.waitForNextStep = setTimeout(Trial.showProbe, 500); // 500ms    

// in Trial.showProbe
$(window).one('MozAfterPaint', function () {
    Trial.ImagesHidden = performance.now();
});
$('#trial_images_'+Trial.current).hide();
Trial.ProbeShown = performance.now();
// show Probe etc...
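The four timestamps recorded above can then be reduced to the two duration measures I compare below. A small helper sketch (the function name is mine; the field names match the snippet above):

```javascript
// Hypothetical helper: derive the two display-duration measures from the
// timestamps recorded in the snippets above. Illustrative only, not part
// of the original study code.
function displayDurations(trial) {
    return {
        // paint-to-paint: from the paint that showed the images
        // to the paint that hid them (MozAfterPaint timestamps)
        paintToPaint: trial.ImagesHidden - trial.FixationHidden,
        // inline: between the two inline performance.now() calls
        inline: trial.ProbeShown - trial.ImagesShown
    };
}

// example with made-up timestamps (milliseconds)
var durations = displayDurations({
    FixationHidden: 1000.2,
    ImagesShown: 1001.0,
    ImagesHidden: 1503.4,
    ProbeShown: 1502.1
});
// durations.paintToPaint is ~503ms, durations.inline ~501ms
```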

Results from comparing the durations measured using MozAfterPaint and inline execution.

This doesn't make me too happy. First, the median display duration is about 30ms shorter than I'd like. Second, the variance using MozAfterPaint is pretty large (and bigger than for inline execution), so I can't simply adjust it by increasing the setTimeout by 30ms. Third, this is on my fairly fast computer, results for other computers might be worse.

[Figure: Boxplot of durations]

[Figure: Relationship of durations measured using the two methods]

Results from SpeedTracer

These were better. The time an image was visible was usually within 4 (sometimes 10) ms of the intended duration. It also looked like Chrome accounted for the time needed to repaint in the setTimeout call (so there was a ~504ms difference between the calls if the image needed repainting). Unfortunately, I wasn't able to analyse and plot results for many trials in SpeedTracer, because it only logs to the console. I'm not sure whether the discrepancy between SpeedTracer and MozAfterPaint reflects differences between the two browsers or something lacking in my usage of MozAfterPaint (I'm fairly sure I interpreted the SpeedTracer output correctly).

Questions

I'd like to know

  1. How can I measure the time it was actually visible on the user's machine or at least get comparable numbers for a set of different browsers on different testing computers (Chrome, Firefox, Safari)?
  2. Can I offset the rendering & painting time to arrive at 500ms of actual visibility? If I have to rely on a universal offset, that would be worse, but still better than showing the images for such a short duration that the users don't see them consciously on somewhat slow computers.
  3. We use setTimeout. I know about requestAnimationFrame, but it doesn't seem like we could obtain any benefit from using it: the study is supposed to be in focus for its entire duration, and it's more important that we get a +/-500ms display than a certain number of fps. Is my understanding correct?

Obviously, JavaScript is not ideal for this, but it's the least bad option for our purposes (the study has to run online on users' own computers; asking them to install something would scare some off, and Java isn't bundled with Mac OS X browsers anymore).
We're currently allowing only recent versions of Safari, Chrome, Firefox and maybe MSIE (feature detection for performance.now and the fullscreen API; I haven't checked how MSIE does yet).
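The feature detection mentioned above could be sketched like this. The function name and exact checks are my assumptions (the study's real checks may differ); window and document are taken as parameters so the sketch can be exercised outside a browser:

```javascript
// Sketch of feature detection for performance.now and the fullscreen API.
// Names and exact checks are assumptions, not the study's actual code.
function browserSupported(win, doc) {
    var root = doc.documentElement;
    var hasNow = !!(win.performance && typeof win.performance.now === 'function');
    // vendor-prefixed variants for older Firefox/WebKit/IE builds
    var hasFullscreen = !!(root.requestFullscreen ||
                           root.mozRequestFullScreen ||
                           root.webkitRequestFullscreen ||
                           root.msRequestFullscreen);
    return hasNow && hasFullscreen;
}

// in the browser: browserSupported(window, document)
```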


Solution

Because I haven't received any further answers yet but have learnt a lot while editing this question, I'm posting my progress so far as an answer. As you'll see, it's still not optimal, and I'll gladly award the bounty to anyone who improves on it.

Statistics

New results

  • In the leftmost panel you can see the distribution that led me to doubt the time estimates I was getting.
  • The middle panel shows what I achieved after caching selectors, re-ordering some calls, using more chaining, and minimising reflows by using visibility and absolute positioning instead of display.
  • The rightmost panel shows what I got after using an adapted function by Joe Lambert based on requestAnimationFrame. I did that after reading a blog post about rAF now having sub-millisecond precision too. I had thought it would only help me smooth animations, but apparently it also helps with getting more accurate actual display durations.

Results

In the final panel the mean for the "paint-to-paint" timing is ~500ms, the mean for inline execution timing scatters realistically (makes sense, because I use the same timestamp to terminate the inner loop below) and correlates with "paint-to-paint" timing.

There is still a good bit of variance in the durations and I'd love to reduce it further, but it's definitely progress. I'll have to test it on some slower computers and some Windows machines to see if I'm really happy with it; originally I'd hoped to get all deviations below 10ms.

I could also collect way more data if I made a test suite that does not require user interaction, but I wanted to do it in our actual application to get realistic estimates.

window.requestTimeout using window.requestAnimationFrame

window.requestTimeout = function (fn, delay) {
    var start = performance.now(),
        handle = {};

    function loop() {
        var delta = performance.now() - start;

        // fire once the requested delay has elapsed,
        // otherwise wait for the next frame
        if (delta >= delay) {
            fn.call();
        } else {
            handle.value = window.requestAnimationFrame(loop);
        }
    }

    handle.value = window.requestAnimationFrame(loop);
    return handle;
};
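For completeness, here is a cancellable variant of the same loop. The clearRequestTimeout name is mine, and the setTimeout fallback exists only so the sketch also runs outside a browser (where requestAnimationFrame is unavailable):

```javascript
// Self-contained, cancellable variant of the requestTimeout loop above.
// Falls back to setTimeout(cb, 16) (~60fps) outside the browser; this
// fallback is for illustration/testing, not part of the original code.
var raf = (typeof requestAnimationFrame === 'function')
    ? requestAnimationFrame
    : function (cb) { return setTimeout(cb, 16); };
var caf = (typeof cancelAnimationFrame === 'function')
    ? cancelAnimationFrame
    : clearTimeout;

function requestTimeout(fn, delay) {
    var start = performance.now(),
        handle = {};
    function loop() {
        if (performance.now() - start >= delay) {
            fn();
        } else {
            handle.value = raf(loop);
        }
    }
    handle.value = raf(loop);
    return handle;
}

// cancel a pending requestTimeout by cancelling its scheduled frame
function clearRequestTimeout(handle) {
    caf(handle.value);
}
```

Usage mirrors setTimeout/clearTimeout: keep the returned handle and pass it to clearRequestTimeout to abort before the callback fires.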

Edit:

An answer to another question of mine links to a good new article.

OTHER TIPS

Did you try recording the initial milliseconds and, when the event fires, calculating the difference, instead of using setTimeout? Something like:

var startDate = new Date();
var startMilliseconds = startDate.getTime();

// when the event is fired (the original elides the element and event
// name; someElement/someEvent below are illustrative placeholders):
someElement.addEventListener('someEvent', function () {
    console.log(new Date().getTime() - startMilliseconds);
});

Try avoiding the use of jQuery if possible. Plain JS will give you better response times and better overall performance.
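As an illustration of the plain-JS route, the jQuery show/hide calls from the question could become something like this (a sketch; it toggles visibility rather than display, as the question already suggests, to minimise reflows):

```javascript
// Plain-DOM replacement (a sketch) for the jQuery show()/hide() calls in
// the question. The helper name is mine.
function setTrialImagesVisible(el, visible) {
    el.style.visibility = visible ? 'visible' : 'hidden';
}

// in the browser:
// setTrialImagesVisible(document.getElementById('trial_images_' + Trial.current), true);
```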

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow