Tuesday, September 29, 2009

Measuring timeline alignment in a DAW

How to measure for unadjusted tracking/timeline misalignment in a DAW:

Generate a 'ping' on one track. (I use a test tone that goes from -inf to, say, -18 dB -- don't use a full-volume tone, in case it goes out over the speakers, which you probably shouldn't even have turned on/up for this.)

Basically you just want a tone where you can precisely identify the timeline position (to sample accuracy if you can zoom in that close), so a test tone, with its abrupt beginning, is perfect.
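If you'd rather script the ping than draw or record it in by hand, here's a rough sketch of the idea in Python (numpy plus the standard wave module): a second of silence followed by a -18 dBFS sine burst with an abrupt onset, written out as a mono 16-bit WAV you can import onto a track. The file name, frequency, and lengths are arbitrary choices for illustration, not anything a particular DAW requires.

    # Sketch: a second of silence, then a -18 dBFS sine tone with a hard onset.
    import wave
    import numpy as np

    SAMPLE_RATE = 44100          # samples per second
    TONE_FREQ = 1000.0           # Hz; any clean test frequency works
    LEVEL_DBFS = -18.0           # keep it well below full scale

    silence = np.zeros(SAMPLE_RATE)                       # 1 s of digital silence
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE              # 1 s worth of sample times
    amplitude = 10 ** (LEVEL_DBFS / 20.0)                 # -18 dBFS -> linear gain
    tone = amplitude * np.sin(2 * np.pi * TONE_FREQ * t)  # abrupt start, no fade-in
    signal = np.concatenate([silence, tone])

    pcm = (signal * 32767).astype(np.int16)               # 16-bit PCM
    with wave.open("ping.wav", "wb") as wf:
        wf.setnchannels(1)
        wf.setsampwidth(2)
        wf.setframerate(SAMPLE_RATE)
        wf.writeframes(pcm.tobytes())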

Making sure that your monitors are down and source monitoring is turned off (so as not to create the feedback loop from hell), take a cable and route the analog output of your audio interface/soundcard back into an input and (monitors down, source monitoring off) record that onto a new track.

Now you'll have two test tones on two tracks.

In a perfect world, these two tones would line up to the sample.

But in this world, unless you're otherwise compensating, it's very likely that the two tones will not have the same precise position on the timeline.

If you subtract the sample number position of the new track from that of the original, you'll have the amount of tracking misalignment your rig is throwing at that point. (In my experience, most prosumer interfaces tend to have a fairly steady alignment offset. But I did have a USB mic whose misalignment varied from session to session, forcing repeated ping tests; it was purchased for location work, anyhow, so no biggie, but it would be a TPITA if you had to work around it on a daily basis.)
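The arithmetic itself is trivial; in Python it looks like this (the onset positions below are made up for illustration -- read the real ones off your DAW's timeline at sample-level zoom):

    # Minimal arithmetic for the offset, using invented onset positions.
    SAMPLE_RATE = 44100

    original_onset = 44100        # sample position of the generated ping
    recorded_onset = 44485        # sample position of the looped-back ping

    offset_samples = recorded_onset - original_onset
    offset_ms = offset_samples * 1000.0 / SAMPLE_RATE
    print(f"misalignment: {offset_samples} samples ({offset_ms:.2f} ms)")
    # -> misalignment: 385 samples (8.73 ms)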

In previous loopback testing over a large handful of DAW-based recordists, the closest to perfect was a single sample off (this is, IIRC, also the case if one ping-loopback tests a PT HD rig). But most were more like 2-16 ms off, with a few as high as 35+ ms off. (The latter is pretty much unusable in an overdub situation without correction, as you can imagine.)

Nowadays, most of the major DAW makers include some form of tracking alignment adjustment, from a simple manually determined timing offset (IOW, you have to do the ping loopback test and the sample-position arithmetic yourself and enter the result into the alignment offset adjustment) to a completely automated ping loopback calibration that, on your command, re-pings the system, measures the unadjusted delay, and sets the alignment offset appropriately.
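Under the hood, that kind of automated calibration has to locate the recorded ping and measure its lag against the original. Cross-correlation is one common way to do it; the sketch below shows the general technique only, not any particular DAW's actual implementation, and it assumes ping and loopback are mono float arrays at the same sample rate.

    # Sketch of cross-correlation delay measurement (generic technique, not any
    # specific DAW's code). Returns the lag, in samples, at which the looped-back
    # recording best matches the generated ping.
    import numpy as np

    def measure_delay_samples(ping: np.ndarray, loopback: np.ndarray) -> int:
        corr = np.correlate(loopback, ping, mode="full")
        # "full" mode spans lags from -(len(ping)-1) up to len(loopback)-1,
        # so subtract that offset to convert the argmax index into a lag.
        lag = int(np.argmax(corr)) - (len(ping) - 1)
        return lag

Feed it the generated ping and the looped-back recording, and the returned lag is the number of samples to dial into the alignment offset adjustment.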

Sonar now has such an automated function, as I believe Cubase, Logic, and likely others do as well. In some DAWs, it may be called hardware delay compensation, since the same techniques that measure timeline alignment of just the AD/DA routing can also be used to adjust for additional latency introduced by outboard digital signal processing gear.


I'm glad to see some interest in this.

My hardware is now pretty old (the aforementioned MOTU 828mkII, with an ~8 ms turnaround/misalignment, and an even older Echo Mia, which, being a PCI interface, has a lower roundtrip of only about 4.5 or 5 ms)... and the informal testing to which I referred was mostly done several years ago, when I began realizing that other people had the same problem.

I'd like to know if it's still a widespread problem -- and I'm dying to know if many folks recognize it and take the steps to compensate for it.