Latency when recording

Hi,

I have a question about latency I am seeing in Ardour when recording guitar directly from a USB audio interface. From the documentation I’ve been able to find, using “aggressive” (low-latency) JACK settings should not be necessary when just recording (I am using hardware monitoring, so I am not concerned about monitoring latency).

Recently I’ve been using Ardour 3.3. The default jack settings on my machine are 1024 frames/period with 2 periods (theoretical latency = 46.4 ms). My understanding was this shouldn’t affect recording, but I did some very careful tests recording both direct and with a separate microphone and then aligning based on a metronome reference signal. With my default jack settings, my guitar recording has a latency of ~ 28 ms, which is noticeable. I can confirm that this latency is related to the jack settings and is not just a property of my audio interface, because when I change the jack settings to 256 frames/period (theoretical latency of 11.6 ms), the actual latency in my recording is reduced to 7 ms, which is reasonable.
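For reference, the theoretical latency figures quoted above follow directly from the buffer settings. A quick sketch of the arithmetic (a 44.1 kHz sample rate is assumed here, since it is not stated explicitly in the post):

```python
# Theoretical JACK buffering latency: frames_per_period * periods / sample_rate.
# The 44100 Hz sample rate is an assumption, not stated in the original post.

def jack_latency_ms(frames_per_period, periods, sample_rate=44100):
    """Return the theoretical buffering latency in milliseconds."""
    return 1000.0 * frames_per_period * periods / sample_rate

print(round(jack_latency_ms(1024, 2), 1))  # the default settings above
print(round(jack_latency_ms(256, 2), 1))   # the lower-latency settings
```

These reproduce the 46.4 ms and 11.6 ms figures mentioned in the post.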

My question is whether these results make sense, since I thought that the jack latency settings shouldn’t affect tracking/recording, but they clearly are on my system.

Thanks,
John

I can’t comment on the internals of JACK latency, but it looks like it’s all as expected… I use a low-latency setting while recording and more buffers while mixing, and occasionally tweak the settings to get it just right…

Latency should not be a problem during recording - if you are using hardware monitoring - i.e. irrespective of buffer / JACK settings, you should be able to record a new track in sync with existing audio, and (any) correctly designed DAW should align the newly recorded track with the existing audio. If you use software monitoring, the recorded audio should still align correctly, but obviously the latency in the monitored audio may be an issue.

@abhayadevs: Perhaps counterintuitively (as most people seem to mistakenly regard low latency as a measure of quality), I think you should always use the longest acceptable latency (rather than the shortest the system is capable of), both during recording and mixing. During recording you want to avoid xruns, because you may not be able to fix them later; during mixing, although you won’t permanently spoil a mix, you should still use the longest latency acceptable for ‘live’ mix adjustments. Possibly the only time low latency is important is when playing soft-synths live (and even that depends on the type of music and the skill of the musician).

@linuxdsp: Thanks for the reply, this is what I thought (that latency should not be a problem during recording if using hardware monitoring). However, I tested with Ardour 3.3 and jack and confirmed that there is latency when recording with a backing track (about 30 ms with my default jack settings mentioned above), and not only that, but the amount of latency is related to my jack buffer settings. All of this has nothing to do with DSP or software monitoring. I am using hardware monitoring and no signal processing on the software side. I don’t remember having this issue with Ardour 2 (although I didn’t do any careful testing like I did here). Is this possibly a known issue with Ardour 3? This seems to indicate to me that the usual advice of using the longest acceptable latency during recording may not be suitable here.

“This seems to indicate to me that the usual advice of using the longest acceptable latency during recording may not be suitable here.”
In this case the latency clearly would present a problem. The Ardour devs would be better able to clarify, but on the evidence so far I would class that as a bug in Ardour. I don't know of any other DAW which would (intentionally) misalign tracks in this way during recording.

@mcfarljm: it is not an issue with ardour3. however, the program gives you enough rope to hang yourself with, and that is, metaphorically, what you’ve probably done here. there are far more ways of messing up ardour’s view of latency and recording than there are of getting it right. i am unwilling to explain them all on a web forum. even in the manual, it is hard stuff to write. IRC is better because it is more of a dialog. check the #ardour channel (see the support page above for details) if you want to get into it.

“most people seem to mistakenly regard low latency as a measure of quality”
Agreed, setting low latency when purely recording (not overdubbing) and when mixing doesn't achieve anything useful, but the ability of a particular hardware/software setup to support low latency is a measure of something useful, just so it's there for the times when you do need it.

I take the point that one might as well set larger buffer sizes during mixing. I hadn’t thought of that, partly because I didn’t realize that low settings greatly increase the CPU load, and partly because my system seems to work happily down to less than 2 ms. I shall be tweaking my buffer settings next time I use Ardour…

In one sense the ability to achieve low latency on a particular system is a measure of quality, but only in the sense that in a well designed realtime audio system (not necessarily a PC) that is what you should expect to be able to achieve. However, a general purpose PC is not normally designed specifically for realtime audio, and in that sense I see little point in focussing on low latency as a measure of quality (especially as in a lot of cases, people don’t know exactly what they’ve done to achieve it, and when in all but a few cases it is not as ‘required’ as it may first appear). I would always favour system stability over low latency, especially in a general purpose PC architecture, and the most important thing is to know when latency is important.

“In one sense the ability to achieve low latency on a particular system is a measure of quality, but only in the sense that in a well designed realtime audio system (not necessarily a PC) that is what you should expect to be able to achieve.”
@linuxdsp: That is what I have been aiming for as well, but sometimes when I do overdubs I see sync issues, which I end up covering by editing the track.

Also, there is no single solution to this which can just be set and forgotten; we might be tweaking things for each and every session.

Did you measure the latency and set the latency in the jackd arguments? That is required before Ardour can request the latency value and use that value to adjust track placement accordingly.
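To illustrate the adjustment being described: jack_delay reports the measured round-trip latency in frames, and the usual procedure is to subtract the round trip already accounted for by the JACK buffers, then split the remainder evenly between jackd's -I (extra input latency) and -O (extra output latency) options. A sketch of that arithmetic, with a made-up measurement value (check your backend's documentation for the exact nominal figure to subtract):

```python
# Hypothetical numbers for illustration; measure your own with jack_delay.
# 'measured_frames' is the round-trip latency jack_delay reports, in frames.
# 'nominal_frames' is the round trip due to the JACK buffers themselves
# (period size x number of periods, counted here for both directions).

measured_frames = 2293            # example jack_delay reading (assumption)
nominal_frames = 2 * 256 * 2      # 256 frames/period, 2 periods, both directions

extra = measured_frames - nominal_frames  # systemic latency of the converters
per_side = extra // 2                     # split between input and output

print(f"jackd ... -p 256 -n 2 -I {per_side} -O {per_side}")
```

With those values entered, JACK can report accurate capture/playback latencies and Ardour can place recorded material correctly.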

“Also, there is no single solution to this which can just be set and forgotten; we might be tweaking things for each and every session.”

What I do is have separate jack settings configured in qjackctl, with the appropriate latency values entered for each configuration. So for example, I have a configuration for my Lexicon interface at 44.1k sample rate and certain number of buffers, a setting for my Lexicon interface at 48k sample rate and particular number of buffers, separate settings for my m-audio interface at different sample rates and numbers of buffers, etc.

So you don’t have to tweak for each session, but you have to tweak for each interface setting.

To close the loop on this regarding my original post: I was in fact using the default (0) settings for input/output latency adjustments to JACK when I made my original comparisons of misalignment between recorded material and the backing track. I used jack_delay as advised to measure the round-trip latency of my interface and made the appropriate adjustments to my JACK settings. When I repeated the measurements, the alignment was dead on, so that resolved my issue.

@ccaudle: Could you please detail your workflow/tricks for balancing latency when using outboard gear like you mentioned? Are those JACK settings for keeping the latency minimal? Is there any particular workflow you use for compensating for the latency?

Sorry, I didn’t see that reply; I must not have my settings set to email me notifications of new replies.
Anyway, when I referred to “my Lexicon” I meant a Lexicon Lambda USB audio interface, not a Lexicon processor. The Lexicon USB interface is nice because it has mic preamps, a headphone amp, and hardware monitoring; I also have an M-Audio PCI card. I just have different presets in QJackCtl for each interface, so I can switch back and forth between the Lexicon USB interface for recording and the M-Audio card for mixing.
I’m not sure there is a general solution for latency compensation for outboard effects. If you are using a multi-channel audio interface then the I/O latency settings apply to all the inputs/outputs as far as I know, but that just compensates for the audio interface itself (assuming you measured with loopback), it can’t compensate for an attached outboard processor. You could measure the loopback delay going through the outboard processor, but then the latency compensation values would be incorrect for direct inputs, or for any different outboard processor that you used. I think what you would need is something like an insert point that lets you measure or enter the latency through the send/insert path (which as far as I know does not exist).

Oh… OK. So do you use different latency settings because of the recording and playback requirements? How do the USB and PCI interfaces handle the latency? Thanks.

The different latency settings are primarily just because the latency through the different devices is different, and for any one device the latency is different for different sample rates, and for different buffer settings in JACK. So you need a setting for each sample rate and buffer size combination of each device you use.
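The need for one preset per combination can be seen by simply tabulating the buffering latency for a couple of sample rates and buffer sizes (the device's own converter latency, which also varies per device and sample rate, comes on top of these figures):

```python
# Buffering latency in ms for a few JACK settings. Converter latency
# (which differs per device and per sample rate) is additional, which is
# why each device/rate/buffer combination needs its own measured preset.
for rate in (44100, 48000):
    for frames in (256, 1024):
        periods = 2
        ms = 1000.0 * frames * periods / rate
        print(f"{rate} Hz, {frames} frames x {periods}: {ms:.1f} ms")
```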

It occurred to me later that the issue I brought up regarding latency for send/insert is really limited to a few specific cases. Rare enough it is almost not worth mentioning. If you are using a multichannel interface with analog I/O, and an analog effects processor, then the latency of the I/O interface can be compensated, so no problem. If you are using a multichannel interface and go to a digital processor which has its own A/D and D/A conversion, you can’t compensate for that, but it is no different than using that same effects processor attached to an analog mixer.

If you have a multichannel I/O with digital outputs, and connect to an effects device with digital I/O, you cannot compensate for that latency, but in principle it should be possible, which is the only reason I pointed it out. Really it isn’t different than any other setup (stand alone digital mixer, or analog mixer using digital effects devices), and would only matter for a few very special cases where you are mixing the return back with the main signal, and want it to be lined up. Reverb won’t matter, it would just add a slight additional delay to the reverb. If the signal from the return completely replaces the original signal (e.g. you send out to a compressor or EQ and the returned signal is all you want to use), then no problem. The signal is delayed a few milliseconds, probably won’t matter, but if it matters at all you can slide the track a bit and line it back up by hand. The only time I can think of where it would matter is if you use the send to go to e.g. a compressor, then bring that back in on a different channel so you can blend compressed and uncompressed together. A few people like to do that, and with digital processors you might get enough delay through that path that you can get comb filtering when you mix them back together.
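On the comb-filtering point: mixing a signal with a copy of itself delayed by t seconds produces cancellation notches at odd multiples of 1/(2t). A quick sketch of where those notches land (the 5 ms delay is an assumed example, not a measured value):

```python
# Comb filter notches when mixing a signal with a delayed copy of itself:
# cancellations occur at f = (2k + 1) / (2 * t) for k = 0, 1, 2, ...
delay_s = 0.005  # assumed 5 ms round trip through a digital outboard processor

notches = [(2 * k + 1) / (2 * delay_s) for k in range(4)]
print([round(f) for f in notches])  # first few notch frequencies in Hz
```

Even a few milliseconds of delay puts the first notches well inside the audible band, which is why blending the compressed return with the dry signal can sound hollow.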

So in short I would say it is an advanced topic, 95% of the time you don’t need to worry about it at all, but like a lot of things in audio it helps to understand the basics of the signal routing so you can guess when the corner cases might apply.

Yeah… the A/D and D/A conversions are the problem area for me too. What I am doing now is using some good plugins on Windows for mastering, and sometimes for reverb as well. As you said, the mastering signal chain is just a stereo send and return right at the end of the chain, so no issues there, and the reverb bus (so far) is not causing a problem either. Thanks for the simplified details…

Hi All,

Perhaps this is not the correct thread, but what does Ardour’s displayed latency represent? Is it the total amount of time it takes for audio to make it from the USB interface (in my case) into Ardour, or does it measure something else? Also, how does it differ from the displayed latency in JACK? Are the Ardour and JACK latencies two different measures?

Thanks.

Ardour shows (typically) half of the “through” latency (i.e. playback latency only). QJackCtl shows (a guess at) the through latency (i.e. capture latency + playback latency). There’s no particularly good justification for what Ardour does, though there are some reasons for it.