3.2 and 3.3 produce considerably more xruns than 3.1

I must admit I haven’t diff’d things out, but 3.1 is working pretty durn good, except for a few glitches. 3.2 and 3.3 xrun a lot in the same sessions, whereas my compile of 3.1 works hard without a hiccup (assuming no crashes).

Any thoughts?

Very odd behaviour.

I'm running Ardour 3.2, and now 3.3, on KXStudio, and I actually get very few xruns at lower latencies than I did on Ubuntu Studio, where I'd get obvious xruns regularly when bringing up plugin GUIs while working on projects with lots of tracks.

KXStudio seems to have the best JACK and ALSA config I have seen. On Ubuntu I had to work at 23ms latency, as anything lower produced too many xruns; on KXStudio I run at 8ms with no xruns most of the time on small projects, and that's using my internal sound card.

I'm running a hand-tooled Slackware bootable CD. With 3.0 and 3.1 I run pretty solid at 32 frames/period at 88.2k if I want to (roughly 0.36ms per period!), but my machines are old, so I opt for 128 frames and 44.1k, where 3.1 gives me no problems. The provided binaries and my own compiles of both 3.2 and 3.3 just don't run nearly as well in the exact same environment. I can't help but feel something changed in the code.

Yup, 3.2 and 3.3 are way slower for me in overall feel. I'd be willing to bet the video timeline stuff is what's hurting me. Any way to shut it off at compile time? I'd love to diff out all the code, but my skills are still rather meek and I should be doing 800 other things.

I have noticed that the GUIs are not as smooth and responsive, especially when I have more tracks running, all with plugins. My biggest project, which only has 8 tracks and a couple of buses, really starts getting sluggish, especially when moving between Ardour's main window and the mixer window, which sometimes takes 4-5 seconds to redraw all the graphics.

I don't remember it being like that in Ardour 2.x (I went straight from 2.x to 3.2).

3.2 and 3.3 are a step back in performance. I'll stick with 3.1 and hope the development team gets it back together. I guess the meter bridge updates and video support are slightly useful, but I'll take functionality and performance over bloat.

Sorry for the bitchy mood. I'm about sick of computers.

CPU usage goes up steeply (certainly not linearly) as latency / buffer size decreases; I find this is particularly noticeable below about 128 frames. That said, there is almost no reason (certainly during mixing) for such low latencies as, e.g., 32 frames at 88.2k or 96k. However, I have also noticed that Ardour just doesn't run as well now on older (but still quite powerful) systems. I hope we are not all eventually going to need multi-processor monsters with terabytes of storage just to get it to start... it does feel like that at the moment though. :)
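To put concrete numbers on the latency side of this, here is a quick sketch of the usual JACK arithmetic (assuming the standard model: one period lasts frames / sample rate, and nominal round-trip latency is frames × periods / sample rate — this ignores converter and driver latency):

```python
# Rough JACK latency arithmetic. One buffer period lasts
# frames / sample_rate seconds; nominal round-trip latency
# is frames * periods / sample_rate.

def period_ms(frames, rate):
    """Duration of one buffer period in milliseconds."""
    return 1000.0 * frames / rate

def round_trip_ms(frames, rate, periods=2):
    """Nominal round-trip latency in milliseconds."""
    return 1000.0 * frames * periods / rate

for frames, rate in [(32, 88200), (128, 44100), (256, 44100)]:
    print(f"{frames:>4} frames @ {rate} Hz: "
          f"{period_ms(frames, rate):.2f} ms/period, "
          f"{round_trip_ms(frames, rate):.2f} ms round trip (2 periods)")
```

Note that each halving of the buffer also doubles the number of process-callback wakeups per second, which is part of why the CPU overhead grows so fast below ~128 frames.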

It's not even a load issue on my machines. At 128/2 I run with few problems on 3.1, and can go down to 32 at 88.2k just for kicks. I agree it's unnecessary, but it did work, which is quite an awesome feat. There is something comforting about knowing it could be used as a live mixer (with a controller or possibly a touch screen) as well as a DAW and effects processor. The direction things seem to be going now might preclude that.

3.2 and 3.3 added little in functionality, I feel, and decreased robustness. I hope we don't start seeing monthly releases that add glam and bling but make the core "directive" weaker.

All that said, I need to get back to reading the gcc and gdb manuals and try to fix the thing instead of throwing laptops across the room. But I'm fairly confident 3.2 and 3.3 changed some of the core timing for the worse.

@otherone23: you’re arguing that the video timeline is “little in functionality” or “bloat” ?

we have a clear list of major new features we are working towards, as well as a reasonably clear picture of workflow improvements. the goal is to try to move incrementally towards them, which is necessary because some of them require major re-engineering. that does mean that some releases along the way won’t appear to feature much in the way of functionality changes. at some point, for example, the entire way that the editor’s track area is drawn and managed is going to completely change, and there is a good chance you won’t consciously notice much at all. but under the hood will be an entirely new graphics engine that opens up possibilities that we want to be able to offer.

most sluggishness with GUIs that has been investigated so far has come down to users’ video drivers, which are buggy in ways that only some of the new drawing code in 3.x is exposing (i.e. 2.x didn’t trigger the issues).

I have an issue with my system (4-core AMD at 3.2GHz, 8GB RAM at 1600MHz, and a Focusrite USB 2.0 interface): I'm getting a few xruns at 128/3 (which is below 10ms, and below 10ms is always fine and almost unnoticeable), and that's with only a MIDI track in Ardour. But I know my system is at fault to some degree, since I also get a few xruns with my RME 9652. I think the kernel and Ubuntu (Dream Studio) per se have a lot to do with that, much more than Ardour.

My point is, I never use low latency unless I'm playing a MIDI keyboard, since most of today's interfaces have their own direct monitoring. Add to that that most users wanting to eventually mix a soundtrack in Ardour will already be editing video too, which means a lot of CPU power and a lot of high-speed RAM; anyone using all that probably has (or could have) an interface with direct monitoring, and already has fairly good low-latency capability for a MIDI / live audio situation.

I've just recorded a whole song in 3.3 and it is great: the meters are great, the software is very stable, and it's much friendlier and much easier to use than the Pro Tools HD 9 I use at work. I only miss a few features from Logic and PT, but those have far more resources behind them in every way. I think Ardour is great and is headed in the right direction. I only hope Mixbus eventually gets more of the benefits of A3, like ... being based on it!

3.3 works well here, actually better than previous versions. I compiled it with the 'optimize' flag, so it's not a debuggable build; that made it much faster, with no xruns at low latency.
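For anyone wanting to try the same: Ardour 3.x builds with waf, and as far as I recall the relevant switch is `--optimize` (verify against your own source tree with `./waf configure --help`, since flag names can vary between releases):

```shell
# Build Ardour 3.x from source with optimization enabled
# (debug build disabled). Check `./waf configure --help`
# to confirm available flags for your release.
cd ardour-3.3            # example path to your unpacked source tree
./waf configure --optimize
./waf                    # compile
sudo ./waf install       # optional: install system-wide
```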

oskar48: what kernel are you running? Glad to hear it.
paul: again, sorry for the bitchiness. The video timeline may well not be bloat to a lot of folks, myself included in certain applications; but not if it means added instability.

I guess what I'm saying is that re-engineering things under the hood for enhanced features is unavoidable, but if adding a DVD player to my car cost me 3mpg and 15hp, I'd be pretty pissed. (And honestly this software is cooler than my truck, if I can get it back to its prior performance.)

I can mostly run without xruns now at 256/2, and when played alongside the input sound (analog mixer) the latency isn't too bad, but it's still nowhere near as awesome as the 32/2 I could get before. (This is/was with --optimize and a non-debuggable build.)

For sure I should be reading the gdb manual instead of having this discussion, but hopefully it's food for thought.

Something definitely changed, but then I don't trust the integrity of any computer I get near these days, having grown up not too far from Ft. Meade. Shrug.

Sometimes I just get stuck on things and bitchy, I guess. My apologies yet again. Hopefully whatever is making 3.1 run better for me will turn up.


otherone23: Linux kernel 3.3.8, jackd 0.121.3, Athlon II X4, M-Audio Audiophile 2496.
I've never worked at 32/2, but I often work at 64/2 with good results.

You are very unlikely, on any PC on any OS, to get 32-sample latency with any degree of predictability / reliability. You may stumble upon some settings that work, and that may be great, but if you change anything (anything at all), don't be surprised if you get problems. PC hardware / operating systems are simply not designed to do this. The Google Android devs have some interesting things to say about reliable audio performance here:

Most of it is equally applicable to PC audio. (Note that most of their tweaks were very specific to the hardware they were using, which highlights how unrealistic it is to expect different combinations of PC hardware to be comparable / predictable in performance / reliability.)
It might be instructive for anyone obsessed with low-latency audio to consider just how far sound travels in, e.g., 32 samples at 88200 or 96000 samples/second.
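For the curious, the arithmetic is a one-liner (a quick sketch, taking the speed of sound in air at room temperature to be roughly 343 m/s):

```python
# How far does sound travel in air during N samples at a given rate?
SPEED_OF_SOUND_M_S = 343.0  # approx., dry air at 20 °C

def distance_m(samples, rate):
    """Distance sound travels in air during `samples` at `rate` Hz."""
    return SPEED_OF_SOUND_M_S * samples / rate

print(f"32 samples @ 88200 Hz: {distance_m(32, 88200) * 100:.1f} cm")
print(f"32 samples @ 96000 Hz: {distance_m(32, 96000) * 100:.1f} cm")
```

It comes out to roughly a dozen centimetres — about the acoustic delay of moving your head slightly closer to a speaker.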

I have an Acer Aspire 7741Z-5731. Can I use Ardour on my laptop?

I was referring specifically to extremely low latency; for most normal settings, e.g. >= 128 samples, most PCs / laptops will be stable. You may have some issues with A3 on machines with limited screen height (netbooks etc.), as it may refuse to run / display the mixer window.

I'm not obsessed; it's running at 256/2 right now and it's fine. My point is, Ardour 3.0 and 3.1 both ran at 32 with no problems, other than excessive CPU usage, for no real reason beyond bragging rights (and perhaps future experiments with stringed instruments and magnetic resonance and feedback noises; latency gets exponential quickly when dealing with infinity).

Regardless, they ran great at 128, and that's where I kept it. I'm of the school of thought that says tune it as tight as it will go, then back off a few steps.

3.2 and 3.3 xrun unacceptably at 128. Something changed, and for the worse. Sure, it's good enough for a lot of, if not most, things, but it's a limiting step backwards that should be examined. At the least, a flag should be raised in case performance continues to backslide; something changed in 3.2 that keeps the system from load-balancing as well as it used to.

That's just my take, and I could always (hopefully) find it was some random library or other setting that changed on a hot day and broke all compiles past a certain point, but I've spent a lot of hours coming to no conclusion other than that something rather large changed in 3.2. I rebuild my development environment from the same script every few days, so I think a fluke there is unlikely, though it's certainly possible.

I'm not looking for arguments that 'this is good enough' or 'statistics show humans can't tell the difference' or whatnot; I'm well aware of all that. I'd just as soon stick with tape and argue that mylar sounds better than cellophane, or that earwax cylinders sound better than nosewax, if I were looking for that kind of discourse.

Just making the point that stability got worse, and of course it's exponentially worse the lower you go in latency. But multiple machines that were still quite solid at 88.2k / 32 frames (really only bound by CPU load) are now only workable at 256/44.1k; they fall apart at 128/44.1k while nowhere near peak load. Worth a note is all I'm saying (though I'll admit I was in a nasty mood the first time I said it).

Otherwise, consider it marked for posterity, and hopefully I or someone else down the road actually figures it out before we get 8 revisions in and find it worse as more features fight for THISCURRENTSLICEOFTIME.

In this world, there's always a chance it's the feds or the Chinese or Martian ghosts from the future or a nano-capacitor gone wrong or the wrong color of spray-can antenna, so I can only speak for my own experiences.

I probably record with the thing 6 hours a day and spend most of the rest of the time recompiling stuff.

On the upside, I've got a groovy little SD card that boots in about 30 seconds on any random old 32-bit laptop with a FireWire card shoved into it, and in under 3 minutes if all it's got is a CD drive. I can throw things across the room all day, then trash-pick another laptop, plug it into my rack, and be back in business 30 seconds later, with all my controllers patched in and Ardour working both as a DAW and a mixer.

fortunately my rack is presumably too heavy to heave easily.

So damn cool I hate to see it backslide! Pains my heart.

@otherone23: I completely agree, I was just making a general point about low latency, for the benefit of those who do obsessively compete in the “latency olympics” - I see quite a lot of discussions about it, and thought that it might be useful to put it into context. For the most part, for every guaranteed recommendation, there is someone who will say the same made their system worse. I realise that is not the case here - and generally I agree with the observation that A3 seems to be more heavy on system resources.

JACK with FFADO 2.1.0 here, for what it's worth. Testing has been on various 32-bit Intels.