My internal sound card is an HDA Intel PCH. REAPER will not output sound unless I set the inputs to 0, and it turns out that setting the input to “none” also helps Ardour. With my UMC series interface, that is not necessary.
Agreed - it was just an accidental misconfiguration; I normally make sure I use ALSA directly and/or manually suspend PulseAudio if necessary. I tried Reaper with the ‘Default’ device (i.e. the ALSA emulation provided by PulseAudio) only as a test, to confirm whether a similar issue manifested - which it did. I don’t know enough about PulseAudio’s internals - nor am I sufficiently interested - to identify the reasons, but I’m not particularly concerned, as this is not what PulseAudio was designed for.
Are there good instructions for how to configure IRQ thread priorities for the sound card? And what would make good values? Did @anon60445789 change the priority to something higher than 81?
Assuming Linux, there is the rtirq package. It generally needs to be configured, as it comes with a generic set of priorities, for example: RTIRQ_NAME_LIST=“rtc snd usb i8042”. The rtc entry is no longer needed (and may not even be there any more - I am using Kubuntu 20.04). So “snd usb” means your motherboard audio will have a higher priority than any USB audio; it also tends to mean that your HDA audio will have a higher priority than another PCI(e) card (I have an ICE-based card). I have found that being more specific - putting “snd_ice usb snd_hda”, for example - will put my ICE card higher than the USB stuff and my on-board audio below that. This is important because I have found most HDA on-board audio has a minimum latency of 128/2, and I like my ICE card to be able to do at least 64/2 (it can do 16/2 as well). These settings are good for live synth or guitar effects.
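For anyone who has not seen it, the ordering described above lives in rtirq’s config file. A sketch, assuming a Debian-style install path and the usual rtirq variable names - the module names here are examples, check your own with `lsmod | grep snd`:

```shell
# /etc/default/rtirq (path may vary by distro)
# Highest priority first: ICE PCI card, then USB audio, then on-board HDA.
RTIRQ_NAME_LIST="snd_ice usb snd_hda"
# Priority given to the first name in the list (rtirq's usual default is 90):
RTIRQ_PRIO_HIGH=90
# Each subsequent name gets a lower priority, stepped down by this amount:
RTIRQ_PRIO_DECR=5
```

With these values, snd_ice threads would run at 90, usb at 85, snd_hda at 80 - all above the kernel’s default IRQ-thread priority of 50.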
USB 2.0 (and 1.1) is a problem on all new motherboards. Older OHCI USB controllers generally kept the USB interrupts separate for different physical ports (at least somewhat), so it was possible to prioritize USB bus 2 over USB bus 3 and use bus 2 for audio, leaving bus 3 for mouse, keyboard, USB drives, etc. Newer motherboards do internal USB routing for USB 1.1 and 2.0 devices, placing all of them on the same USB bus. This means your mouse movements might cause xruns (yes, I have experienced this). USB 3 audio devices are starting to appear, but most are still USB 2.0 even if they have a USB 3.0 plug. The easiest way around this is to buy a PCIe USB card to use only for audio and then prioritize that card. This can be done by finding out which IRQ your PCIe card uses (run audio and look at cat /proc/interrupts for the xhci entry with many interrupts on only one core). So it would be possible to put 20-xhci - if your PCIe card was running on IRQ 20, for example - ahead of snd usb.
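To make the /proc/interrupts step concrete, here is a sketch of picking out the busiest xhci controller. The `sample` listing below is hypothetical - on a real system you would pipe `grep xhci /proc/interrupts` instead - and the IRQ numbers are examples only:

```shell
# Hypothetical excerpt of /proc/interrupts with two xhci controllers:
sample='           CPU0       CPU1
  20:     123456          0   IO-APIC   20-fasteoi   xhci_hcd
  21:          3          7   IO-APIC   21-fasteoi   xhci_hcd'

# Pick the xhci line with the most interrupts in the first CPU column,
# then print its IRQ number (stripping the trailing colon):
echo "$sample" | awk '/xhci/ { gsub(":", "", $1); if ($2+0 > max+0) { max = $2; irq = $1 } } END { print irq }'
```

That printed number (20 in this made-up listing) is what you would use as `20-xhci` in RTIRQ_NAME_LIST.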
Laptops are more of a problem because sometimes internal bits like webcams, wifi, etc. use USB internally, so finding a clean USB port is harder or may not be possible. Perhaps Thunderbolt-to-USB converters may work… I don’t know, because I have not had to deal with this yet.
With Intel CPUs (I think AMD does this too), setting the performance governor may not be enough. It is not the speed of the CPU that matters so much as using a single speed. Setting the CPU speed to 800 MHz gives me fewer xruns than ondemand running mostly over 3 GHz. Intel now has Turbo Boost, which will raise the CPU speed higher than performance sets, even with performance as the governor, so it is best to turn that off. I have noticed that when the CPU increases speed there are no xruns, but on speed step-down I often see xruns. So: performance on, boost off.
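For reference, “performance on, boost off” comes down to two sysfs writes. A sketch, assuming the intel_pstate driver - the paths differ with other cpufreq drivers, and both writes need root:

```shell
# Pin every core to the 'performance' governor:
for gov in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo performance > "$gov"
done

# Disable Turbo Boost so the clock is not stepped up and down
# (intel_pstate only; '1' here means turbo is OFF):
echo 1 > /sys/devices/system/cpu/intel_pstate/no_turbo
```

Note these settings do not survive a reboot; distros usually provide a service (e.g. cpupower or TLP) to apply them at boot.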
The thing to remember is that CPU performance for audio is not the same as the high throughput that the makers and media testers advertise; having more cores that run faster is not always the best thing for audio. Manufacturers all play games with the way their CPUs run to get the highest throughput rating they can within the heat dissipation their cooling allows. For audio that is not good, and something slower that allows audio to keep its schedule is better. This is why the last CPU I bought was an i5 rather than an i7 (back when they were both 4-core): the i5 had no hyper-threading, and with the i7 I would just be turning it off. I have an old Audiofire12 audio device that can run at very low latencies (16/2) for days with no xruns at all (using the FFADO JACK backend). No, I don’t use this setting for normal use, but testing at that latency gives me some assurance that use at 64/2 will also be clean.
I set it to 95 automatically by installing udev-rtirq. I believe regular rtirq sets it to 90 which is also fine. As a wise person told me, the numbers are all relative. So in my case both REAPER and Bitwig Studio were set to lower than my sound devices so all was fine. Only when trying to get super low latencies in Ardour for VSTi work did it come to light that Ardour’s rt priority was set higher than my sound devices by default.
Surely that’s the real issue - rather than people obsessing about IRQ priorities to win bragging rights for achieving the lowest latency for individual hardware devices - important though that may be if you want to spend more time tweaking settings than actually making music. I would class the original priority-inversion issue as a bug in Ardour. I would expect the default application configuration to be “reliable, and gets you where you need to go” rather than tuned to win races.
(Apologies in advance for the rant, but I’m on something of a mission to spread the word that you really no longer need to tweak Linux in order to do audio - sure, you can, and that’s a good thing if you want to, but there is a popular misconception that you have to - and that this involves all manner of command-line shenanigans. It doesn’t, and you don’t need to. A stock install will ‘get you where you want to go’ at least as well as, if not better than, other operating systems. Unfortunately, every time I mention this it seems to trigger a discussion about how you only have to edit this or that file once - it may or may not be where it was last time, you could have to change permissions, it might be systemd-dependent or whatever - but once you’ve installed a custom IRQ tweak tool and optimised your kernel configuration everything will be fine… pffft - you don’t need to.)
I’m currently running an experimental predictive quantum entangled negative-latency driver which has captured all the combinations of notes I’m ever likely to play during my life - simultaneously - it then simply collapses to the appropriate state required to output the selected one a matter of milliseconds before I actually play it 
Ardour cannot do anything about users not configuring their system correctly.
This issue happens if a user manually adds the “threadirqs” kernel boot option but does not install a tool to manage those IRQ threads. Kernel IRQ threads use a default priority of 50.
Reaper inherited the priority from JACK, which was also configured by the user (default is 70).
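For anyone wanting to check whether such an inversion exists on their own system, comparing the rtprio column of the IRQ threads against the audio processes is enough. A sketch - the listing below is hypothetical (names and numbers are examples only); on a real system you would run `ps -eLo rtprio,comm | grep -E 'irq/|jackd|ardour'`:

```shell
# Hypothetical 'ps -eLo rtprio,comm' excerpt:
sample='50 irq/126-snd_hda
70 jackd
82 ardour-audio'

# Sort by priority, highest first. If an audio thread outranks the
# sound card's IRQ thread, you have the inversion described here:
echo "$sample" | sort -rn
```

In this made-up listing the Ardour thread (82) outranks the sound card IRQ thread (50), which is exactly the problematic ordering.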
So does this mean the issue would not have arisen if a user simply installed e.g. a standard Ubuntu distro and made no custom tweaks?
(Do you see what I’m getting at here…)
It is an unfortunate fact that much of the troubleshooting on audio software forums and IRC channels ends with a similar conclusion: that it is not the software that’s the problem, but rather how the user’s distro is set up.
(from Workflow | Libre Music Production step 1: The Advantages of Choosing an Audio Orientated Linux Distribution)
It depends on the use. For much studio recording, where the monitoring is done outside the DAW, latency is not an issue. However, if the player is using an internal synth for monitoring, or guitar effects in the DAW to help them play with the proper ferocity, latency is important, and a generic Ubuntu kernel will not get many users what they want.
I would tend to agree but might call it an “oddity” versus a bug. There seems to be no reason to have Ardour’s RT priority so high compared to other DAWs. As I said, all the numbers are relative and this is why I’m able to run REAPER with RT priority 5 with absolutely zero issues. Given sound devices seem to default to 50, it makes sense to have Ardour lower than that by default. Would it make sense to have an accessible preference option for those who absolutely must have a higher value?
Right, any number between 1 and 49 would have the same result. I expect you’ve used JACK with priority of 10 (which results in the default application callback to be 5).
If you use threadirqs, you should elevate the sound card’s IRQ priority as well as the audio application callback to above 50. Otherwise you might just as well not use that feature (and not have this issue in the first place).
But if I didn’t use threadirqs, in the case of Ardour wouldn’t I still have an “inversion” between Ardour’s high RT priority and my sound device?
In that case the only process with rt privileges would be Ardour, and there’d be no issue.
However, at 64 frames / 48 kHz (~1.3 ms), IRQ scheduling of the USB ALSA driver would likely not work reliably (but 64 frames with 3 periods might).
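For reference, the arithmetic behind those figures (latency in milliseconds = frames / sample rate × 1000):

```shell
# One 64-frame period at 48 kHz:
awk 'BEGIN { printf "%.2f ms\n", 64 / 48000 * 1000 }'        # 1.33 ms

# Three 64-frame periods (the 64 * 3 case mentioned above):
awk 'BEGIN { printf "%.2f ms\n", 64 * 3 / 48000 * 1000 }'    # 4.00 ms
```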
Alas, even with everything now set up optimally with regard to udev-rtirq, and while things have improved in Ardour, I still can’t get the same level of performance as I can with REAPER et al. It is no longer night and day, but the difference is occasional audio glitches even at 128/3 versus none in REAPER at 64/2 while running what I would consider quite intensive reverb plugins. I suppose it is what it is at this stage.
There is a popular idea that you should always try to get the lowest possible latency. I always try to set the highest possible latency consistent with acceptable performance for the task. My use case is probably different from those who use mainly software instruments, in that when I do record, I’m playing / recording guitar through an amp mic’d into the DAW in a very ‘traditional’ recording setup - so I’m not overly dependent on ‘in the box’ monitoring or effects. But surely 64 samples is really in the realm of ‘move closer to the speakers’ if it makes that much difference… 
I’m wearing headphones
All I can say is that I can tell the difference between 64 and 128 when playing a harpsichord VSTi, but both are acceptable. That said, watch this space, as I believe the devs will soon look into potential optimizations based on some of their own side-by-side comparisons of DSP in REAPER vs Ardour.
As a final note… Since Ardour 6.7-124 there is now an option in Preferences > Performance to request a min cpu-dma latency. This prevents the CPU from reaching deeper idle states which can significantly improve performance.
Is that set as default, or do we have to set it manually?
It is off by default because a CPU that never enters deep sleep states runs hotter, and the system may be thermally throttled or overheat.
Furthermore, the user needs write permission on /dev/cpu_dma_latency.
The Arch Linux realtime package sets these permissions; alternatively, check out ardour/tools/udev at master · Ardour/ardour · GitHub and copy 99-cpu-dma-latency.rules to /etc/udev/rules.d/.
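For the curious, such a rule is a one-liner. This is only a sketch - the canonical version is the file in the Ardour repo linked above, and the `audio` group name here is an assumption (it presumes your user is a member of that group):

```
# /etc/udev/rules.d/99-cpu-dma-latency.rules (sketch)
KERNEL=="cpu_dma_latency", GROUP="audio", MODE="0660"
```

After copying it in place, re-plugging is not needed; `udevadm trigger` or a reboot will apply it.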