100% CPU usage at rest

I have a fairly large, but not huge, project. As soon as I open the project, the CPU indicator spikes to 95-100% and stays there indefinitely. The computer struggles. There is no playback or anything else going on.

Should I not expect virtually zero CPU usage? Even if there are bad or heavy plugins, I would expect them to not do anything (and Ardour to enforce that behavior). Is that not how it works?

Unlike some other DAWs, Ardour does not “play games” to try to reduce DSP load. Whatever plugins you have in the session are running at all times, whether you are playing or not.

Why do we do this? We do not want users (particularly users using Ardour in live situations) to be surprised by the DSP load “spiking” when some particular combination of things is true.

The DSP load will vary a little depending on whether or not you are playing back, for a variety of subtle reasons. But in general, the DSP load is constant: it will not spike (or noticeably drop) regardless of what you’re doing.

You may be comforted by that, or frustrated, depending on your outlook and needs.

5 Likes

I’ve seen people on other forums say that such constant load is immorally wasteful or something like that, when it’s possible to know that a given signal path isn’t used at all. So I’ll (try to?) further defend that practice.

I come from the analog world, where it’s impossible to pause the execution of anything. (you can’t pause physics itself) So everything keeps running regardless (and consuming power, as a required part of normal operation: just account for it and move on), and the operator’s choice is to allow it to have an effect or not. Thus, when a channel is muted on an analog sound board / mixing console / whatever you call it:

  • That channel’s meter still works
  • Its insert loop still works, along with anything you have plugged in there
  • You can still send it to the headphones
  • Only its output to the mixing busses is disconnected, which often requires multiple mute points that all operate together from the same control signal

The very cheap end of analog consoles skimps on that functionality to make the circuitry simpler. Instead of having multiple remote-controlled mute points to kill all the outputs while the channel strip keeps running, it has a single direct mute point early in the channel processing. Those boards are annoying to run! Mostly because the headphones don’t work.

Anyway, if all of that behavior (the correct version) were to be emulated as an analog engineer expects, then it would require the DSP to calculate everything all the time, just like analog physics does. Even if it’s only to drive a post-FX meter. Because I might want to put it in my headphones to make sure that it’s still working right, before I unmute it, and because some FX take a while to stabilize on their present signal. (reverb, for a common example)
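As a minimal pure-Python sketch (hypothetical names, not Ardour’s actual code), here is what that console behavior looks like when emulated in DSP: every block is fully processed whether muted or not, and the mute only gates the send to the mix bus:

```python
def process_channel(block, insert_fx, muted):
    """One audio block through a hypothetical channel strip.

    All processing runs on every block regardless of mute; the mute only
    disconnects the send to the mix bus, like the multiple remote-controlled
    mute points on a full-featured analog console.
    """
    wet = [insert_fx(s) for s in block]            # insert loop always runs
    meter = max(abs(s) for s in wet)               # post-FX meter still works
    headphone_send = wet                           # cue/headphone feed stays live
    bus_send = [0.0] * len(wet) if muted else wet  # only the bus output is cut
    return bus_send, headphone_send, meter

# A trivial stand-in "insert effect" that halves the signal
halve = lambda s: 0.5 * s
bus, phones, level = process_channel([1.0] * 8, halve, muted=True)
# bus is silent, but the headphone feed and meter still carry the processed signal
```

Note that the per-block work is identical in both branches, which is exactly why the CPU load does not drop when channels are muted.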

It might be possible to have a “special mute point” that stops the processing between itself and the first mixing point, but that’s also a point of forgetfulness, and another source of bugs. Not what you want live.

And the logic to figure that out automatically can easily be more complex than the rest of the software combined! Again, bugs.

So generally, if some processing exists, it needs to be running. Period. A side-effect of that is a constant system load, at the maximum that it would ever be if it tried to reduce itself. This is also welcome because I can know immediately, and not find out in the middle of a gig, that my system can’t handle it.

Those who only do studio sessions might have a valid reason to only process audio while the transport is running, so their system load would effectively be binary: running or not. But that would absolutely kill any live use, which is what I do, because my transport is usually not running, and I still need all the processing.

4 Likes

I’m unsure that the analogy to the analog world is applicable here. We can carry over the good things but not the limitations if they do not apply in the digital domain.

I do not think that it is either this or that. Unless there is a technical limitation, or unless it is very difficult to implement, it could be configurable globally or per session. Work on a live session? Start one from a template configured accordingly and get the desired behavior.

Not a deal breaker for me, but I find the following annoying:

  • The fans go crazy and stay like that until the session is closed. It may seem that when I have Ardour open I should spend a good percentage of the time with the transport active, i.e. editing something, listening, etc., but this is not always the case. I find myself frequently leaving Ardour to research something, e.g. how to do this or that, watch a mixing tutorial on something specific that came up while mixing, or read the manual of a plugin. That might take anywhere from 5 minutes to an hour. Now I’d have to close Ardour so that I don’t have to bear the noise.

  • If I want to open another program for whatever reason, it’s … like … super slow …

If you want to stop the load, then going to Window > Audio/MIDI Setup and stopping the “engine” ought to reduce the DSP load to zero (nothing will be executed). If it does not, then the source of your load is not the DSP taking place inside Ardour.

THIS!

Sadly, some plugins (notably synth plugins) don’t have a constant load; often the load increases with the number of notes played.
It is a hallmark of a good plugin if the load is constant regardless of input.
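To illustrate why, here is a toy additive-synth sketch (hypothetical, not any real plugin’s code): each held note adds one oscillator to the inner loop, so the work per audio block grows linearly with polyphony:

```python
import math

def render_block(active_notes, n_samples=64, sample_rate=48000):
    """Naively render one block: one sine oscillator per held note.

    The per-block cost is proportional to len(active_notes), which is
    why a simplistic synth's CPU load rises as more notes are played.
    """
    out = [0.0] * n_samples
    for freq in active_notes:  # cost grows linearly with polyphony
        for i in range(n_samples):
            out[i] += math.sin(2 * math.pi * freq * i / sample_rate)
    return out

# Four held notes mean roughly four times the inner-loop work of one
quiet = render_block([440.0])
chord = render_block([261.6, 329.6, 392.0, 523.3])
```

A plugin with a constant load would instead always run a fixed number of voices (mixing in silence for unused ones), trading wasted work for predictability.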

1 Like

What buffersize do you use? Does increasing it (Menu > Window > Audio/MIDI Setup) help?
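For context, the buffer size sets the time budget each audio callback has: all DSP for one block must finish before the next block is due, and larger buffers give more headroom at the cost of latency. A quick back-of-the-envelope sketch (assuming a 48 kHz sample rate):

```python
def callback_deadline_ms(buffer_size, sample_rate=48000):
    """Time budget per audio callback, in milliseconds: the DSP for
    one block must finish before the next block is due."""
    return 1000.0 * buffer_size / sample_rate

# Larger buffers give each callback more headroom (but add latency):
for frames in (64, 256, 1024):
    print(f"{frames:5d} frames -> {callback_deadline_ms(frames):.2f} ms per callback")
```

So if a session's DSP takes, say, 2 ms per block, it will underrun at 64 frames but run comfortably at 256.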

Yes to an extent, but having studied the mathematics behind it, it’s amazing how many things are format-agnostic! The rules of processing signals in general, apply to both worlds. Almost universally.

Of course, that also makes sense when you consider that the analog world was already thinking for decades in terms of “perfection + noise”, and keeping those two concepts separate, even if they’re practically impossible to separate physically. Digital is no different. “Perfection + noise” is very much alive and well in the digital world, as are pretty much all of the other nitty-gritty concepts.

Once you start treating digital signals with an analog mindset, a whole world of tried-and-true techniques comes out! When the discreteness is fine enough to not matter, it’s analog! We simply have less noise now, is all.


Computers also have a nearly inescapable habit of only making it appear as if they’re doing something simple and straightforward, when they’re actually doing something far more complex under the hood, because that complexity brings huge performance gains.

Saving a file, for example, doesn’t actually write to the storage device right away; it builds up a cache in RAM that eventually gets written out. That’s where drive corruption comes from if you just yank the power without shutting down first. Eventually, things will be as you expect, and the shutdown procedure forces that to happen now, but in the meantime, you don’t have a clue! And it works a lot faster because it reads back from RAM and not the slow drive.
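For anyone curious, the usual way for a program to force that “make it real now” step is an explicit flush plus fsync; a small sketch (the path here is just a throwaway temp file):

```python
import os
import tempfile

# Writes land in the OS page cache first; flush() empties Python's
# user-space buffer into the OS, and os.fsync() asks the OS to push
# the cached data to the storage device before returning.
path = os.path.join(tempfile.mkdtemp(), "take.raw")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)
    f.flush()             # user-space buffer -> kernel page cache
    os.fsync(f.fileno())  # page cache -> storage device
# Only after fsync returns is the data reasonably safe against a power cut.
```

This is essentially what a clean shutdown does for everything at once.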

Likewise for almost everything else that the machine does. The deeper you dive into it, the wackier and more convoluted it becomes, all with eventual consistency with what the user expects. But almost never immediate.

Anyway, all of that complexity had to be designed and debugged profusely, before it became taken for granted. And because it’s not straightforward at all, few people can debug it.

It’s generally a bad idea to introduce more complexity than necessary, even if it does help. Moore’s Law made today’s system-complexity necessary, and it is pretty well bulletproof now. No need to add any more when it’s cheap enough to just buy what would have been a monster a few years ago, to just blast through the bloat and wonder what the fuss was all about.

Sure, it’s possible to produce good music with software designed for Windows 98 on a 300MHz AMD K6 and 256MB of RAM, because there really is that much room to optimize the routines…but why bother? That much optimization also tends to create unmaintainable code, which I’ve also done myself to cram a project into somewhere that it shouldn’t have fit but had to.

Again, just buy a machine that can blast through the bloat and wonder what the fuss was all about. They’re not that expensive anymore!

1 Like

Thanks, didn’t think about that, that should help.

Agree, stopping the engine like Paul suggested alleviates the annoyance; I’m happy with that.

I have a machine with i9-13980HX i.e. 24 cores/32 threads. What more can I get? I do have in my to-do list to optimize my system a bit better, but that’s not the cause of the problem.

That’s an important detail that never came out until now!

I’d say more than a bit! What are you doing?! How big is “fairly large, but not huge”?

Going along with "if it’s present, it needs to be running" and "give the full system load all the time so you know it works, right up front and not in the middle of a gig," another rule is to not have things present that you’re not using. Because they’re running too.

Do you have a bunch of unused bloat to just delete? Maybe you’re only using 5% of what you have, because you thought that the unused stuff doesn’t run and is therefore “free convenience” to keep around?

The “free convenience” thing is generally true for analog gear, since it generally doesn’t draw that much power, compared to the full rig (mostly the PA amps for a lot of what I do), and by the time you get to that level, you probably have a rack and cabling solution too, so it’s not unwieldy there either. But like analog needs a dedicated bit of hardware for each copy of each function, digital needs a dedicated bit of processing time for each copy of each function. Both can run out.


I do keep some “free convenience” stuff in some of my rigs, but it might account for maybe 10% of the total, and the system load is still low enough to easily be okay.

I didn’t forget and leave it behind; I know it’s there and why it’s there, and I still agree with it being there. I’ve seen that kind of forgetfulness cause problems too. Mostly weird, unexpected behavior that people try to fix with even more processing, forget about that too, and then swear they don’t have any and it must be the tool itself that’s acting up.

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.