I’ve got another crazy-weird setTimeout/setInterval behavior that you may not know about. However, unlike my previous discovery, this one may actually be useful.

Observe the following, seemingly innocuous, code:

var count = 0;
var interval = setInterval(function(off){
  document.body.innerHTML += " " + off;
  if ( ++count == 10 )
    clearInterval( interval );
}, 100);

In particular, pay attention to the use of the first argument within the callback function. Running this code in any browser other than a Mozilla-based one gives you the expected “undefined undefined undefined …”.

However, running the code in Firefox gives you some interesting results – something like the following:

Results: 4 -8 -7 -3 6 1 -1 -3 0 0

What are these numbers? Simply, they represent how far, in milliseconds, each callback fired from the desired callback rate. In the results above the first call fired at 104ms, the second 92ms later, the third 93ms after that – and so on.
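For comparison, here's roughly how you'd measure the same drift by hand in any environment – which is exactly the bookkeeping the Firefox argument saves you. The helper names and the 100ms interval are my own illustration, not anything from the browser:

```javascript
// Manual drift measurement: record the time of each tick and compare
// the actual gap against the intended interval. This is what Firefox
// hands you for free as the callback's first argument.
function makeDriftTracker(intervalMs) {
  var last = Date.now();
  return function () {
    var now = Date.now();
    var drift = (now - last) - intervalMs; // matches Firefox's offset value
    last = now;
    return drift;
  };
}

// Pure helper: recover the actual gaps between calls from a list of
// drift values, mirroring the arithmetic above (104ms, then 92ms, ...).
function actualDelays(intervalMs, drifts) {
  return drifts.map(function (d) { return intervalMs + d; });
}
```

So feeding in the first few offsets from the results above, `actualDelays(100, [4, -8, -7])` gives `[104, 92, 93]`.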

The immediate advantage is the ability to create ultra-precise animations and renderings. Typically, in order to do this, you would have to keep your own running log of the timer calls (paying the extra overhead of that JavaScript execution). With this offset argument you can simply read the exact timer difference as reported by the browser itself.
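One way this could play out is a self-correcting timer: subtract the measured lateness from the next delay so ticks stay aligned to the ideal schedule. This is only a sketch – `preciseInterval` and `nextDelay` are hypothetical names, and the `Date.now()` fallback stands in for browsers that don't pass the offset:

```javascript
// Pure helper: how long to wait for the next tick, given how late the
// current one was. Clamped so a very late tick never yields a negative delay.
function nextDelay(intervalMs, late) {
  return Math.max(0, intervalMs - late);
}

// Self-correcting timer chain built on setTimeout. In Firefox, `off` is
// the lateness value described above; elsewhere we fall back to
// measuring against our own expected schedule.
function preciseInterval(fn, intervalMs) {
  var expected = Date.now() + intervalMs;
  function tick(off) {
    var late = (typeof off === "number") ? off : Date.now() - expected;
    fn(late);
    expected += intervalMs;
    setTimeout(tick, nextDelay(intervalMs, late));
  }
  setTimeout(tick, intervalMs);
}
```

If one tick fires 8ms late, the next is scheduled 92ms out instead of 100ms, so the drift doesn't accumulate over a long-running animation.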

I’m curious to see how this could be used. Timer quality analysis? Detecting simultaneous timer execution? Dunno – there’s definitely potential, though.