I’ve been troubled by xruns for some time. I recently got a brand new PC but was dismayed to find that I still got xruns every 10 minutes or so, even when idling (sample rate 48000, frames/period 256, periods/buffer 3).
I had already added my user account to the audio group, disabled hyperthreading, made sure the CPU scaling governor is in performance mode, and checked the IRQ configuration.
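In case it helps anyone checking the same things, commands along these lines cover the group, governor and IRQ checks (the exact sysfs path can vary between distros):

groups $USER                                               # should include "audio"
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor  # should print "performance"
cat /proc/interrupts                                       # shows how interrupts are spread across CPUs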
By chance this evening I launched Qjackctl instead of Cadence. I noticed the settings were different so I stopped jack and relaunched from Qjackctl. Result: xruns have gone.
However I find Cadence useful, so I wonder if there is any way to configure Cadence to use the QJackCtl settings. Or perhaps I should just steer clear of Cadence?
You can configure Cadence to run Jack with the same settings that QJackCtl uses. You will have to look at QJackCtl’s setup; the most important settings are Frames/Period, Sample Rate, and Periods/Buffer. In Cadence, hit the “Configure” button, make it match the settings you found in QJackCtl, and then restart the Jack server.
Thank you for your reply. I’m sorry, I forgot to mention that I tried Cadence with the settings found in QJackCtl, but it still produces xruns. The only differences were Port Max and Priority, and I adjusted those in Cadence. If I restart with Cadence, the DSP load goes up to 2% when idling (QJackCtl sits around 0.5%) and I get xruns within 10 minutes. I left my PC idling for 9 hours after starting Jack with QJackCtl, and there have been zero xruns! It seems there is some setting that Cadence is missing, but I just can’t find any differences in the GUIs.
Another thing I noticed is that QJackCtl gives a latency reading of 16 ms while Cadence gives a Block Latency reading of 5.3 ms (with Jack started by QJackCtl). Does this mean there is something wrong with the system setup?
Start a terminal and enter ‘ps axww | grep jackd’ to see which flags Cadence and QJackCtl have used to start Jack.
Maybe they’re using different abstractions for the output device and that’s what’s causing the difference in xruns.
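For example, a legacy jackd started with explicit settings shows its flags right in the command line. Purely as an illustration (your device name and numbers would differ):

/usr/bin/jackd -dalsa -dhw:0 -r48000 -p256 -n3

Here -dalsa selects the ALSA backend, -dhw:0 the device, -r the sample rate, -p the frames/period and -n the periods/buffer. A Jack started via D-Bus shows up as just “jackdbus auto”, with no flags.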
Jack itself doesn’t account for Periods/Buffer when it does its latency calculation, and that single-period figure seems to be the value Cadence reports.
QJackCtl multiplies that value by Periods/Buffer in its Latency display, which is why it shows 16 ms (3*256/48000).
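Spelling out the arithmetic with the settings from the first post (48000 Hz, 256 frames/period, 3 periods/buffer):

256 / 48000 ≈ 0.0053 s ≈ 5.3 ms  (the Block Latency Cadence shows)
3 * 256 / 48000 ≈ 0.016 s ≈ 16 ms  (the Latency QJackCtl shows)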
There are 2 (or 3) definitions of latency: input latency (the time from a signal arriving at the interface’s connectors until it is available to the CPU), output latency (the time from a signal being delivered to the interface by the CPU until it emerges from the connectors), and also “through” latency, which is sort of both of them combined, except that in practice it usually amounts to just the output latency.
Input latency doesn’t care about Periods/Buffer; output (and through) latency does.
peder: This is the output of ‘ps axww | grep jackd’ when running QJackCtl:
1919 ? SLsl 0:05 /usr/bin/jackdbus auto
2201 pts/0 S+ 0:00 grep --color=auto jackd
When running Cadence it outputs:
2288 ? SLsl 0:00 /usr/bin/jackdbus auto
2361 pts/0 S+ 0:00 grep --color=auto jackd
Being a non-techie I’m afraid I don’t understand it; I don’t know what “flags” means. Is there anything I can do?
And thank you.
paul: Thank you. I’m slowly getting to grips with what latency is about. Slowly.
Thanks. I still don’t get output like the above. I got:
simon 2879 3.5 0.9 246204 73936 ? SLsl 08:32 0:00 /usr/bin/jackdbus auto
simon 2916 1.3 0.6 638508 54040 ? S<Ll 08:32 0:00 pulseaudio --daemonize --high-priority --realtime --exit-idle-time=-1 --file=/usr/share/cadence/pulse2jack/play+rec.pa -n
simon 2926 0.0 0.0 15240 928 pts/0 S+ 08:33 0:00 grep --color=auto jack
But what I have discovered is that pulseaudio is responsible for the xruns. If I disable it in Cadence I can get down to 128 frames @ 48000 Hz with no xruns at all. If I activate it, the xruns come fast at that latency setting. I noticed that pulseaudio has a high priority in the above output. Should I reduce it?
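If you want to see the actual scheduling class and real-time priority PulseAudio and Jack are running with (just a general check, nothing Cadence-specific), something like this will do:

ps -eo pid,cls,rtprio,ni,comm | grep -E 'pulseaudio|jackdbus'

The CLS column shows the scheduling class (FF or RR means real-time) and RTPRIO the real-time priority, so you can compare the two processes directly.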
We can start Jack in two ways: either using the legacy “jackd -dalsa …” or the new and shiny “jackdbus …”, if it’s compiled with D-Bus support. That’s why we’re seeing different ‘ps’ outputs.
Assuming “jackdbus auto” uses the same settings whether it’s started by Cadence or by QJackCtl, it should behave the same way.
The settings are supposedly kept in ~/.config/jack/conf.xml
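One way to see what each frontend actually ends up with is to dump the live parameters while Jack is running and diff them. This assumes the jack_control tool that ships with Jack’s D-Bus support is installed; otherwise you can simply compare saved copies of that conf.xml:

jack_control dp > qjackctl-settings.txt    # driver parameters, with Jack started from QJackCtl
jack_control ep >> qjackctl-settings.txt   # engine parameters
(restart Jack from Cadence and repeat into cadence-settings.txt)
diff qjackctl-settings.txt cadence-settings.txt

Any line that differs is a setting one of the GUIs is changing behind the scenes.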
Maybe Cadence talks to Jack after it’s started in a way that sometimes causes xruns. The fact that the DSP load is much higher using Cadence would suggest that.