Ardour vs REAPER vs Bitwig Studio vs Tracktion

In both Ardour 6.6 and 6.7 (I assume previous versions too) I’m finding a massive difference between Ardour’s DSP/x-run count and other Linux native DAWs. My brain and fingers prefer 64 samples / 2 periods wherever possible, and I can achieve this in every DAW I’ve tried aside from Ardour. Two simple examples are sfizz + Dragonfly Hall Reverb send (native) or sforzando + Liquidsonics Cinematic Rooms send (via yabridge). In either case REAPER, Bitwig Studio and Tracktion perform flawlessly at low latencies, whereas Ardour chokes immediately with hundreds of x-runs and DSP hitting 100%. To be clear, I don’t ever run Bitwig Studio or Tracktion, but downloaded them just to check I wasn’t crazy and whether REAPER was doing some weird anticipatory fx voodoo in the 11MB it takes up on my SSD.

Anyway, I’m wondering whether a debug build would help, whether I should provide my current laptop system specs, or whether it’s just a fact of life that I simply can’t get that low a latency with those particular plugins on Ardour and on this particular system. I will say that the difference is night and day, and that my system seems highly optimized for realtime audio work based on realtimeconfigquickscan, wine-tkg-git, fsync settings and the performance in the other three DAWs.

It doesn’t sound like this is the issue but have you checked your denormal settings in Ardour just in case?

    Seablade

Hardware/distro specs?

Yes, I checked that.

Latest Manjaro Plasma with kernel 5.11.19-1 (preempt, with the threadirqs option enabled), performance governor enabled and all “good” according to realtimeconfigquickscan. Running a Behringer UMC202HD. Here are the rest of the specs: https://termbin.com/w5nqg

Here is Bitwig Studio’s DSP meter running just sfizz at 64/2:

Meanwhile with the exact same plugin Ardour is jumping between 20 and 60% DSP.

I will also say that even at higher latencies the DSP in Ardour seems to be far less stable than the other DAWs so again here is Bitwig at 128/2 running sfizz plus Liquidsonics Cinematic Rooms:

And the same in Ardour causes x-runs and spikes of 100% DSP.

I have made similar observations comparing Ardour, Reaper and Bitwig, but I came to the tentative opinion that these kinds of head-to-head comparisons may be flawed, for a few reasons:

  1. Ardour is the only DAW I know of that prominently displays the x-run count. If there are a lot of x-runs then sure, they are easy to hear, but it’s quite possible that the occasional x-run goes by in other DAWs without the user ever being notified or realizing. This might lead to the impression that Ardour is “worse” when in fact it may just be more honest. I couldn’t find any way to see a count of x-runs in Reaper or Bitwig.

  2. Different DAWs could be measuring/calculating DSP load differently; I don’t know if the closed source DAWs publish how they calculate load. I think I read somewhere on this forum that even JACK and Ardour have different DSP load algorithms, which present JACK in a slightly more flattering light.

  3. How many buffer periods is Bitwig using? In the ALSA backend setup it only gives the option to set buffer size, and I kind of just assume it’s using 2, but can’t be sure without measuring round trip latency. Reaper and Ardour both give the option to select at least 2 or 3 periods, and I found that at very small buffer sizes the extra period can be the difference between hundreds of x-runs and almost none. So who knows, maybe there are some sneaky extra buffers behind the scenes in Bitwig? (One way to check the negotiated period count from outside the DAW is sketched just after this list.)

  4. Ardour provides a tool to measure round trip latency on the audio setup dialog. I couldn’t find any built in tool to measure round trip latency in either Reaper or Bitwig. So again that makes it difficult to know for sure whether they are all running under similar time constraints.

  5. With threadirqs enabled along with the various other realtimeconfigquickscan recommendations and an RME usb interface, I found that up to a point, while Ardour’s DSP load meter might be spiking close to or even hitting 100%, there will be very few/no x-runs. I think the limit on my machine is buffer size 32 with 3 periods. I can load plugins and watch the load meter hit 99% but without “x-running”. Dropping to 2 periods at the same buffer size unleashes the torrent of x-runs. Also, I noticed Ardour’s DSP load meter does not increase linearly as I add more plugins. So while I might see the meter at 50% with one plugin, at first I think “wow that seems high” - but then I add like 10 more instances of the same plugin and hey - it’s still 50%. So… maybe no need to read too much into that number? The threadirqs option made the biggest difference here for me by the way, before enabling that I had never seen Ardour hit 100% CPU for a prolonged period without encountering x-runs.
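On point 3: one way to sanity-check the period count from outside any DAW is to read the hardware parameters ALSA actually negotiated while the device is open. A rough sketch (card/PCM numbers are placeholders, check /proc/asound/cards for yours):

    # With the DAW running and the device open, dump the negotiated parameters:
    cat /proc/asound/card1/pcm0p/sub0/hw_params
    # The output lists period_size and buffer_size, so the period count is
    # buffer_size / period_size (e.g. 128 / 64 = 2 periods).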

Anyway I would love to hear from developers who know a lot more about this. It’s a bit of a guessing game otherwise!


So the deal is that Ardour goes into the 100s of x-runs at 64/2 and there are glitches galore. It is unusable for seconds at a time, cutting out and then all of a sudden playing all the missing notes simultaneously. It’s definitely not just a DSP reporting issue; otherwise Ardour would also have glitch-free playback and recording.

I’m running 64/2 on every DAW to ensure reasonable testing comparisons.

Keep in mind that most DAWs internally run at a (much) higher buffer size and have a special path for live instruments and monitored tracks. This helps significantly on under-powered systems.

Ardour does not do this: by design the mixer works like a live mixer that always processes all signals and allows each of them to be routed externally.


Keep in mind that most DAWs internally run at a (much) higher buffer size and have a special path for live instruments and monitored tracks. This helps significantly on under-powered systems.

Oh now that is an interesting nugget of information!

This alone may explain the issues. I remember ages ago I posted about getting super low latencies on JACK using Grandorgue + Audacity but I’ve never managed that in Ardour + ALSA on any machine.

I had this on recent Ardour 6 releases, with no plug-ins. Initially I thought I had an issue with a plug-in because I kept getting drop-outs in the audio and I noticed some DSP spikes of 90% or more, so I created an empty session with just a single audio track and no plug-ins (this is on my test machine, so I actually un-installed all the plug-ins just to be sure nothing really odd was happening). I was using the ALSA back-end with a buffer of 256. I compared with Reaper using the same configuration / audio interface, and/or lower buffer sizes, and had no issues at all. I just thought it must be one of those things that Ardour does. I have been able to work around it for testing by simply increasing Ardour’s buffer size to 512. This isn’t a solution, but I’m not yet using Ardour ‘in anger’ at the moment, so to speak…

Exactly. For mixing/mastering I always increase the buffer to 512 or higher. But for harpsichord VSTi work, I have to use 64 samples or, at a push, 128. Any higher is noticeable to me, even though I’m used to playing various real pipe organs where the latency is sometimes ridiculously large due to the console being separated from the pipes or due to older cabling technology, but you get used to it after a few minutes. Strangely, playing VSTi over headphones or near-field monitors, without the usual reflection cues, does not lend itself as easily to that adjustment. More importantly, harpsichords obviously never have huge delays in the action, so it just feels weird if it’s present.

Just for clarity - it appears that the odd behaviour and DSP spikes I was seeing were artifacts of accidentally using PulseAudio instead of real ALSA. Reaper is pre-configured to automatically suspend pulse and connect directly to the sound card via ALSA. Unfortunately it seems I had Ardour connected to the ‘default’ audio interface, which is really a pseudonym for pulse pretending to be the ALSA interface. Doing the same with Reaper caused erratic behaviour in Reaper too. I’m not sure exactly what is going on with pulse, but I can only attribute the config issues to my own carelessness (and possibly to having to manage a multitude of audio / video conferencing apps on the same machine - such is the unfortunate nature of our times…). Sorry for the noise.

Ardour does the same, except more subtly: it actually negotiates with PA for access to the device chosen by Ardour’s user.

But if PulseAudio is present on your system, then the device named “default” is invariably provided by PulseAudio, and should never be used for “real work”.
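For anyone double-checking which device they have actually selected, a quick sketch (output varies per system):

    # Real hardware devices (safe choices for Ardour's ALSA backend):
    aplay -l
    # All ALSA PCM aliases; the "default" and "pulse" entries here are
    # PulseAudio plugins, not hardware, and are best avoided for real work:
    aplay -L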


To recap: on #ardour IRC we discovered that @anon60445789 had not configured IRQ thread priorities.

By default all hardware IRQ threads use priority 50. One should usually configure the soundcard’s IRQ to have precedence over other hardware (e.g. using rtirq or udev-rtirq).

Bitwig’s process priority was set to 49 and Reaper’s to 5 (both lower than the soundcard’s IRQ), while Ardour’s main thread uses 81 and its process threads 79, leading to priority inversion…
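For anyone who wants to check for the same inversion on their own system, a rough sketch (thread and binary names will vary, and this assumes a kernel with threaded IRQs):

    # Kernel IRQ threads and their realtime priorities (default is 50):
    ps -eLo rtprio,cls,comm | grep 'irq/'
    # The DAW's realtime threads, for comparison (adjust the pattern to
    # whatever "ps -e | grep -i ardour" shows on your machine):
    ps -eLo rtprio,cls,comm | grep -i ardour
    # If the DAW's audio threads sit above the soundcard's irq/NN-* thread,
    # the interrupt handler can be starved by the DAW itself.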

Some further performance improvement came from disabling audio input (which is not needed here; Ardour unconditionally grabs all physical I/O), and from switching from raw ALSA to the ALSA sequencer for MIDI I/O.


My internal sound card is an HDA Intel PCH. REAPER will not output sound without the inputs set to 0, and it turns out that changing the input to “none” also helps Ardour. With my UMC series interface, that is not necessary.


Agreed - it was just an accidental misconfiguration; I normally make sure I use ALSA directly and / or manually suspend pulse if necessary. I tried Reaper with the ‘Default’ device (i.e. the ALSA emulation provided by pulse) only as a test, to confirm whether a similar issue manifested - which it did. I don’t know enough about - or am not sufficiently interested in - pulse-audio’s internals to identify the reasons, but I’m not particularly concerned, as this is not what pulse was designed for.

Are there good instructions for how to configure IRQ thread priorities for the sound card? And what would make good values? Did @anon60445789 change the priority to something higher than 81?

Assuming Linux, there is the rtirq package. It generally needs to be configured, as it comes with a generic set of priorities, for example: RTIRQ_NAME_LIST="rtc snd usb i8042". The rtc entry is no longer needed (and may not even be there any more; I am using Kubuntu 20.04). So "snd usb" means your motherboard audio will have a higher priority than any USB audio; it also tends to mean that your HDA audio will have a higher priority than another PCI(e) card (I have an ICE-based card). I have found that being more specific, for example snd_ice usb snd_hda, will put my ICE card above the USB stuff and my on-board audio below that. This is important because I have found most HDA on-board audio has a minimum latency of 128/2, while I like my ICE card to be able to do at least 64/2 (it can do 16/2 as well). These settings are good for live synths or guitar effects.
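As a very rough sketch of what that configuration looks like (the file path and service name depend on the distro; the values below are only examples):

    # /etc/default/rtirq on Debian-based systems (other distros differ).
    # Order matters: the first name gets the highest realtime priority.
    RTIRQ_NAME_LIST="snd_ice usb snd_hda"
    RTIRQ_PRIO_HIGH=90
    # Re-apply after editing, e.g.:
    #   sudo systemctl restart rtirq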

USB 2.0 (and 1.1) is a problem on all new motherboards. The older OHCI controllers generally kept the USB interrupts separate for different physical ports (at least somewhat), so it was possible to prioritize USB bus 2 over USB bus 3 and use bus 2 for audio, leaving bus 3 for mouse, keyboard, USB drive, etc. Newer motherboards do internal USB routing for USB 1.1 and 2.0 devices, placing all of them on the same USB bus. This means your mouse movements might cause xruns (yes, I have experienced this). There are starting to be USB 3 audio devices, but most are still USB 2.0 even if they have a USB 3.0 plug. The easiest way around this is to buy a PCIe USB card to use only for audio and then prioritize that card. This can be done by finding out which IRQ your PCIe card uses (run audio and look at cat /proc/interrupts for the xhci entry with many interrupts on only one core). It would then be possible to list, for example, 20-xhci (if your PCIe card was running on IRQ 20) ahead of snd usb.
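A sketch of that lookup (the IRQ number 20 here is just an example):

    # See which xhci controller the interrupts land on while audio is running:
    grep xhci /proc/interrupts
    # If the dedicated PCIe USB card turns out to be on IRQ 20, it can be
    # listed ahead of the generic entries in the rtirq name list:
    #   RTIRQ_NAME_LIST="20-xhci snd usb"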

Laptops are more of a problem because sometimes internal bits like webcams, wifi, etc. use USB internally, so finding a clean USB port is harder or may not be possible. Perhaps Thunderbolt to USB converters may work… I don’t know, because I have not had to deal with this yet.

With Intel CPUs (I think AMD does this too), setting the performance governor may not be enough. It is not the speed of the CPU that matters so much as keeping a single speed. Setting the CPU speed to 800MHz gives me fewer xruns than ondemand running mostly over 3GHz. Intel also has Turbo Boost, which will push the CPU speed higher than what performance sets, even with performance as the governor, so it is best to turn that off. I have noticed that when the CPU increases speed there are no xruns, but on the speed step down I often see xruns. So: performance on, boost off.
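A sketch of the relevant knobs on an intel_pstate system (paths and tools differ with other frequency drivers):

    # Pin all cores to the performance governor:
    sudo cpupower frequency-set -g performance
    # Disable Turbo Boost so the clock stays flat (intel_pstate only):
    echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo
    # Verify:
    grep . /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor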

The thing to remember is that CPU performance for audio is not the same as the high throughput that the makers and media testers advertise; having more cores that run faster is not always the best thing for audio. Manufacturers all play games with the way their CPUs run to get the highest throughput rating they can within the heat dissipation allowed by the cooling. For audio that is not good, and something slower that allows audio to keep its schedule is better. This is why the last CPU I bought was an i5 rather than an i7 (back when they were both 4 core): the i5 had no hyper-threading, and with the i7 I would just have been turning it off. I have an old AudioFire 12 audio device that can run at very low latencies (16/2) for days with no xruns at all (using the FFADO JACK backend). No, I don’t use this setting for normal use, but testing at that latency gives me some assurance that use at 64/2 will also be clean.


I set it to 95 automatically by installing udev-rtirq. I believe regular rtirq sets it to 90 which is also fine. As a wise person told me, the numbers are all relative. So in my case both REAPER and Bitwig Studio were set to lower than my sound devices so all was fine. Only when trying to get super low latencies in Ardour for VSTi work did it come to light that Ardour’s rt priority was set higher than my sound devices by default.


Surely that’s the real issue - rather than people obsessing about IRQ priorities to win bragging rights for achieving the lowest latency for individual hardware devices - important though that may be if you want to spend more time tweaking settings than actually making music. I would class the original priority inversion issue as a bug in Ardour. I would expect the default application configuration to be “reliable, and gets you where you need to go” rather than tuned to win races.

(Apologies in advance for the rant, but I’m on something of a mission to try and spread the word that you really no longer need to tweak Linux in order to do audio - sure, you can, and that’s a good thing if you want to, but there is a popular misconception that you have to - and that this involves all manner of command line shenanigans. It doesn’t, and you don’t need to. A stock install will ‘get you where you want to go’ at least as well as, if not better than, other operating systems. Unfortunately, every time I mention this it seems to trigger a discussion about how you only have to edit this or that file once - it may or may not be where it was last time, you could have to change permissions, it might be systemd dependent or whatever - but once you’ve installed a custom IRQ tweak tool and optimised your kernel configuration everything will be fine… pffft - you don’t need to 🙂)

I’m currently running an experimental predictive quantum entangled negative-latency driver which has captured all the combinations of notes I’m ever likely to play during my life - simultaneously - it then simply collapses to the appropriate state required to output the selected one a matter of milliseconds before I actually play it 🙂
