CPU usage issues with VSTs, maybe GUI related?

Hi! I’m new here, and wanted to ask about an issue I noticed while testing Ardour. I did a quick search of the forums but didn’t spot any topics that seemed to cover this (correct me if I’m wrong and it has already been discussed somewhere). Some relevant stats: I’m on a 64-bit Debian testing system with JACK, running Ardour 5.12. I’m using the LinVst wrapper to run Windows VST plugins.

Now, I was recently comparing Ardour and a demo version of Bitwig Studio, and I noticed a discrepancy in CPU usage between the two. So I did some testing with a simple project: importing one wave file and putting an EQ plugin on it (the MEqualizer from MeldaProduction, to be precise). Since the plugin runs in its own Wine process, it’s easy to spot in a task manager. The CPU use is initially negligible. When I activate more visually demanding elements in the VST’s GUI, like spectrum analysis, CPU usage naturally increases a little (a couple of percent on my system; small, but noticeable). Up to this point, CPU usage seems to be identical between Ardour and Bitwig. Where it diverges is when I close the plugin’s own GUI. In Bitwig, CPU usage immediately drops back down to the base level. In Ardour, however, it continues at the same level as when the GUI was open. If I quit and re-open the same project, CPU use is again initially negligible, until I open the plugin’s GUI, after which it is once again stuck at the higher level.

So, would I be correct in assuming that even though the GUI is not visible in Ardour, it is not actually closed, but rather still being updated by the plugin’s process, whereas Bitwig actually disables the GUI while it’s not visible? If there’s some way of fully closing the plugin GUI that I’m not seeing, or some other workaround, please do correct me.

This wouldn’t be a big deal with a small number of plugins, but of course in modern music production we’re easily talking dozens of plugins. And I imagine it’s particularly an issue for GUI-intensive plugins, like analysis tools. The fact that the CPU usage is normal until the GUI is opened somewhat mitigates the issue, but it feels like this is still far from ideal behavior. I can’t really currently afford the full version of Bitwig (and I’d rather use Free Software anyway, when possible) so I really hope this issue won’t come back to bite me in the butt if I try creating a larger project in Ardour…

(Note: as I was writing this, I saw that LinVst has had some updates since I installed it last year. However, I don’t feel like this is an issue with LinVst, since the exact same plugins are working fine in Bitwig.)

I noticed exactly the same thing - I filed a bug about this a few years ago. It appears that Ardour doesn’t ‘close’ the VST plug-in UI in the way other hosts do, by calling effEditClose (or some such - I can never remember the exact API call and I don’t have the code in front of me), but merely unmaps (hides) the plug-in UI window. So, if you have a plug-in UI which regularly does quite intensive redrawing (such as a graphical EQ, FFT etc.), it could quite easily continue doing this in all the UIs / windows which have ever been opened, even if only one is visible. This is an empirical assumption as to what Ardour is doing, based on what I see happening in my plug-in code - I haven’t recently looked at Ardour’s VST code.
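The difference described above can be sketched as a toy model (plain Python, not the real VST SDK; `PluginUI`, `timer_tick` and the two host strategies below are illustrative names, not Ardour’s or Bitwig’s actual code): a UI that redraws on a timer keeps doing that work until the host actually calls its close/destroy entry point, whereas merely unmapping the window changes nothing from the plug-in’s point of view.

```python
class PluginUI:
    """Toy plug-in UI with a periodic redraw 'timer' (e.g. an FFT display)."""

    def __init__(self):
        self.exists = True
        self.redraws = 0

    def timer_tick(self):
        # The plug-in keeps redrawing as long as its UI exists; it has no
        # idea whether the host has merely unmapped the window.
        if self.exists:
            self.redraws += 1

    def edit_close(self):        # stands in for effEditClose
        self.exists = False


def run(host_destroys_on_close, ticks=100):
    ui = PluginUI()
    for _ in range(ticks // 2):
        ui.timer_tick()          # UI open: both kinds of host pay this cost
    if host_destroys_on_close:
        ui.edit_close()          # "destroy-on-close" host: UI torn down
    # else: "hide-only" host: window unmapped, UI object still alive
    for _ in range(ticks // 2):
        ui.timer_tick()
    return ui.redraws


print(run(host_destroys_on_close=True))   # 50
print(run(host_destroys_on_close=False))  # 100
```

The second run models the behaviour reported above: the hidden UI keeps burning redraw cycles for the rest of the session.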

As an additional note, this does look like it is specifically an issue with VSTs, not with plugins in general. I was testing the Calf LV2 plugins a little (their EQ plugins look fairly equivalent to the Melda VST plugin I was testing with earlier) and noted that CPU usage drops noticeably when the plugin UI is closed, as one would expect.

(Also, I posted this on the Linux sub-forum because that’s what I’m using, but I assume the VST behaviour may be the same on other platforms as well?)

I think it’s important to point out that this is not an issue with VST specifically, but appears to be an issue with Ardour’s implementation of VST (for Linux). For clarity, it’s not simply a case of VST bad, LV2 good. I don’t know if the issue affects Ardour on other OSes; I don’t test with Ardour on other OSes.

We made a deliberate design decision with our VST GUI support to NOT interpret the “close window” button (typically at the upper right of a window) as “delete the GUI and release all of its resources”. This is in keeping with the way that most applications interpret the “close window” button. We leave the GUI in existence after creation until the plugin is deleted.

Many/most other VST hosts do not do this, and instead interpret “close window” as “delete and destroy window”.

Ironically, and not particularly intentionally, we do “delete and destroy” LV2 plugin GUIs in this way.

There was a long and rather vigorous debate on these forums about this in 2016, which you are free to read.

Personally, I find the idea of using a GUI toolkit that doesn’t allow the detection of on-screen/not-on-screen rather disturbing, though not quite as disturbing as a plugin GUI that doesn’t bother to use this distinction to decide whether or not to keep doing expensive drawing-related computation.

We may or may not change this aspect of our VST support in the future, but for now there’s nothing to be done or being done about it.

Looks can be deceiving.

As a workaround you can use “Edit with generic controls” (in the context menu of the processor), which will destroy the custom GUI and show a generic control panel.


I think most toolkits - I don’t know what Melda uses, but the behaviour is evidently the same with theirs and with mine - draw to an offscreen buffer first. Typically this is just some block of memory. The actual ‘putting it in a window’ (which is the bit that doesn’t happen when the window is not shown) is most probably the least computationally intensive part (from the plug-in’s point of view).
It’s possible to go poking around and try to figure out whether the host has unmapped its window without telling us, and then try to do something about it (because, Ardour), but the need to render the UI might not depend entirely on that - which is why there is the effEditClose API call, which Ardour doesn’t appear to use.
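The drawing model described above can be sketched like this (a toy model in Python with made-up names, not any real toolkit’s API): the expensive rasterisation happens into an offscreen buffer whenever the UI’s state changes, while the cheap blit to the window only happens in response to expose events, which the toolkit stops delivering once the window is unmapped. So if the host only unmaps the window, the expensive half keeps running:

```python
class DoubleBufferedUI:
    """Toy model of a plug-in UI that draws offscreen first."""

    def __init__(self):
        self.buffer = None
        self.renders = 0   # expensive offscreen rasterisations
        self.blits = 0     # cheap buffer-to-window copies

    def on_state_change(self, state):
        # Expensive part: redraw the EQ curve / FFT into an offscreen
        # buffer (a string stands in for the pixel data). The plug-in
        # does this on every state change, with no idea whether the
        # window is currently mapped.
        self.buffer = f"frame:{state}"
        self.renders += 1

    def on_expose(self):
        # Cheap part: copy the buffer into the window. The toolkit only
        # sends expose events for windows that are actually visible.
        self.blits += 1
        return self.buffer


ui = DoubleBufferedUI()
for s in range(10):        # host has unmapped the window: no exposes
    ui.on_state_change(s)  # arrive, but the expensive rendering still ran
print(ui.renders, ui.blits)  # 10 0
```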

If it’s a deliberate design decision, then it appears to be a deliberate decision not to support a part of the API - and exactly the part that may actually be useful.

We had this discussion in 2016. To briefly reiterate:

we do call effEditClose but since its declared semantics are to destroy the GUI, we do so when we believe the GUI should be destroyed, not when the user indicates that they want to (merely) close it.

Qt and GTK [ Not that anyone should use either of them for plugin GUIs ] as well as Cocoa and the various layers of GUI stuff on Windows all provide an easy way to know that a window is mapped or unmapped. Why you would burn CPU cycles drawing when the user cannot see the result escapes me.
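The visibility-driven approach described here can be sketched in a few lines (generic names, Python as pseudocode, not any particular toolkit’s API): the toolkit’s map/unmap (show/hide) notifications flip a flag, and the periodic drawing work is gated on it.

```python
class AnalyserUI:
    """Toy analyser UI that skips drawing while its window is hidden."""

    def __init__(self):
        self.mapped = False
        self.renders = 0

    # Handlers wired to the toolkit's map/unmap (show/hide) notifications;
    # every mainstream toolkit exposes some equivalent signal.
    def on_map(self):
        self.mapped = True

    def on_unmap(self):
        self.mapped = False

    def timer_tick(self):
        # Skip the expensive drawing entirely while hidden.
        if self.mapped:
            self.renders += 1


ui = AnalyserUI()
ui.on_map()
for _ in range(5):
    ui.timer_tick()   # visible: 5 renders
ui.on_unmap()
for _ in range(5):
    ui.timer_tick()   # hidden: no further work
print(ui.renders)     # 5
```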

If I close the window, by whatever means - clicking the ‘X’ in the corner, or some other mechanism - I expect the behaviour to be consistent, especially if, from a user’s point of view, it appears consistent (e.g. the window disappears).
What you’ve created is a mechanism whereby different methods of closing the window appear to the user to do the same thing, but actually do something manifestly different - as illustrated by the fact that this issue has surfaced again as a bug. You’ve (accidentally?) created a new ‘close the window but don’t really destroy it, just hide it’ button. Which I don’t think exists in any other application.

I don’t know about other toolkits, but I just draw (offscreen) in response to any input which changes the state of the UI; that is then relayed to the user via the standard expose mechanism, which inevitably takes account of whether the window is visible - this is quite normal for this kind of toolkit design. I’m not aware of another method which would actually work reliably.
(If the UI has really been closed - e.g. by effEditClose - then no drawing takes place, since the UI no longer exists until the user opens it again.)

The “close button” under all X11, Windows and Cocoa toolkits does not, by default, cause the deletion of the window to which it is attached.

If it did there would be no way to hide a window without destroying its associated resources. That would be fairly absurd as general toolkit behaviour.

So are you telling me that, for example, if I have two browser windows open, and I close one by clicking the ‘X’ in the corner, the expected behaviour would be for that window to still exist, together with the page it contains and all the significant resources it consumes, but just hidden so I can’t see it, and have no way of re-opening it. And for this to happen for every discarded window until I close the entire application, or the system crashes because it runs out of resources? That’s plainly ridiculous, and not what happens.

Thanks for the replies, folks! Sorry if I’m poking at territory that’s already been covered (like I said, I did search the forum, but alas, searching for generic terms like VSTs and CPU usage is something of a needle-in-a-haystack situation…)

My own two cents on this is that if there’s an easy way to save some CPU, it’s probably worthwhile. (But I’m not a dev here, and it’s not my call, obviously.) In the meanwhile, at least I’m aware of this possible issue, and I know there are ways to mitigate it, should it come up.

Every other host (that I’m aware of), it seems, does what you would expect. (I think what’s being missed here is that if there are several different ways for the user to do something - such as close the UI - which appear to the user to have the same result, e.g. the UI disappears, then the implementation should be consistent. From what I can tell, in Ardour it is not.)

So which button do I press to hide a window irretrievably and leak resources? I’ve only got the normal close, min and maximize…
Of course hiding / showing or unmapping / mapping a window is a valid thing to do in some cases, which is why there is a (functional / internal) mechanism to do so, but exposing that mechanism to the user under the guise of closing the window - I don’t buy that.

I think what paul is saying is that most (if not all) toolkits I have worked with signal you (read: call an overloaded method, in OO languages/toolkits) when the user asks for “close” via the window manager (presumably the X in the corner, though my own window manager has no such X).

In response to that action, and unless you take counter measures, the window is hidden, but not destroyed, which is critical in the cases where the window was a dialog and you still need to access the contents of some fields, say like in a file dialog.

Sometimes you want to trigger a destruction of the window after handling the event, but sometimes not. At the very least, letting the toolkit destroy your object would be strange, since in an OO world the event is handled via a method on the instance…

Most toolkits have a separate destroy event, which is not triggered when “closing” the window, but rather when the toolkit is about to free all the resources. I guess it fires just before the class destructor is called in OO toolkits, unless the toolkit actually intercepts the destroy event to trigger the instance destruction and you are notified via the destructor itself.

The fact that the “destructor” is called “close” in VST is unfortunate (I don’t know VST enough to assert that effEditClose really mandates the destruction in VST, but I have no reason to doubt paul on that point, esp. since you confirm that hosts calling effEditClose actually destroy the UI).

Re windows that use the “close” button “in disguise” without actually closing themselves: most apps that have a background function do that. In particular, Transmission does that. One could argue that DSP plugins do exactly that: hide the UI while continuing to perform work nonetheless. Whether the developer chose to free the UI resources or not does not matter to the user (but the fact that the UI still gobbles oodles of CPU or memory might).

I think the issue I’m most concerned about is that there appears to be a lack of consistency - we can debate the semantics of what the application should do in response to various window manager messages on various OSes back and forth forever (so much of that is application-dependent), but that brings me back to this: what Ardour appears to do (and why the bug has appeared again) is inconsistent.
If there are several ways in which the user can accomplish the same action, then the result should be the same, especially if the feedback to the user suggests it is the same (such as closing the window, and it disappearing). What we seem to have is several ostensibly identical ways to ‘close’ the window, all of which make it disappear, but some of which continue to consume resources. That’s not what I would expect.
When you call effEditClose on the plugin, it knows the UI has been closed. At that point it has a (platform- / window-manager-independent) way to make intelligent decisions about how to free resources. If the host doesn’t call effEditClose - or sometimes decides it won’t, because the developers know best - the plug-in has to rely on some window manager / OS specific method (and on Linux, not all messages from host / parent windows filter through reliably, because X11 / window managers, host applications, GTK, Qt, user-specific configurations, compositors etc. etc.). The latter is just ugly, especially when there is a (VST) API call specifically to accomplish the required behaviour.
I’m not advocating for this to be fixed right now - I get that the Ardour developers have more important demands on their time - but it would be nice if some common sense would prevail and we didn’t have to argue points like this forever (especially as it’s essentially just a case of ‘please make it work like every other host application’).

It might also be interesting to note, from the Microsoft documentation:

“Recall that DefWindowProc executes the default action for any window message. In the case of WM_CLOSE, DefWindowProc automatically calls DestroyWindow. That means if you ignore the WM_CLOSE message in your switch statement, the window is destroyed by default.”

The default behaviour on X11 appears to be the same - clicking the close button destroys the XWindow.

The important point is that the default behaviour is not to hide the window and leak resources.
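The default-versus-override distinction in the documentation quoted above can be modelled in a few lines (a toy message dispatcher, not the real Win32 API; the message name mirrors the Windows one for readability): unless the application explicitly intercepts WM_CLOSE, the default handler destroys the window - hiding it instead is always an explicit opt-in.

```python
WM_CLOSE = "WM_CLOSE"


def def_window_proc(window, msg):
    # Default handler, per the documentation quoted above: an
    # unhandled WM_CLOSE destroys the window.
    if msg == WM_CLOSE:
        window["destroyed"] = True


def dispatch(window, msg, handler=None):
    if handler is not None and handler(window, msg):
        return               # the application handled the message itself
    def_window_proc(window, msg)


# Case 1: no handler installed - the default destroys the window.
w1 = {"destroyed": False, "hidden": False}
dispatch(w1, WM_CLOSE)

# Case 2: the application deliberately intercepts WM_CLOSE to hide
# the window instead - this never happens by accident.
def hide_instead(window, msg):
    if msg == WM_CLOSE:
        window["hidden"] = True
        return True          # swallow the message
    return False


w2 = {"destroyed": False, "hidden": False}
dispatch(w2, WM_CLOSE, hide_instead)

print(w1, w2)
```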

The other important thing to note is that effEditOpen and effEditClose provide a nice solution to the problem of who owns the plug-in window (particularly important on X11, since any attempt to reference an XWindow which no longer exists will trigger X11’s error handling procedure, which consists of deliberately crashing your application, because that’s what X11 does…).

For example

Opening the plug-in UI:

  1. Host creates the ‘parent’ window, and calls effEditOpen on the plug-in, supplying the parent window ID (which must be valid when it makes the call - otherwise it will trigger X11’s default error handler and deliberately crash the entire application, because that’s what X11 does…)
  2. Plug-in creates its UI window, and reparents it into the host supplied window.
  3. Plug-in maps or shows its window.
    The plug-in UI window is now a ‘child’ of the host application’s parent window.

Closing the UI:

The user clicks the ‘X’ in the corner, or triggers the UI to close in some other way.

  1. The host calls effEditClose on the plug-in.
  2. In response, the plug-in destroys its UI window (and frees resources, if it chooses).
  3. The host can now (safely) destroy the parent window.

This neatly avoids the issue of e.g. the host destroying its ‘parent’ window first, and taking the plug-in’s (now) child window with it, such that the plug-in could then try and destroy its own now non-existent window in response to a subsequent effEditClose, and, in X11 at least, trigger the default X11 error handler to intentionally crash the entire application, because (repeat after me…) that’s what X11 does…
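The ownership hand-off in the sequences above can be simulated in a few lines (a toy model with made-up `Window`/`Plugin` classes, not real X11 or VST code): the plug-in only ever destroys a window it still owns, because the host calls the close entry point before tearing down its parent window.

```python
class Window:
    """Toy stand-in for a native window handle."""

    def __init__(self, parent=None):
        self.alive = True
        self.parent = parent
        self.mapped = False

    def destroy(self):
        self.alive = False


class Plugin:
    def __init__(self):
        self.window = None

    def edit_open(self, parent):      # stands in for effEditOpen
        assert parent.alive           # parent ID must still be valid here
        self.window = Window(parent)  # 2. create UI, reparent into host's
        self.window.mapped = True     # 3. map / show it


    def edit_close(self):             # stands in for effEditClose
        if self.window is not None:   # destroy our own window while it
            self.window.destroy()     # still exists...
            self.window = None


# Host side, following the numbered sequences above.
plugin = Plugin()
parent = Window()                     # 1. host creates the parent window
plugin.edit_open(parent)              # 1-3. plug-in builds its child UI

# Closing: the host calls effEditClose *before* destroying its parent,
# so the plug-in never references an already-destroyed window.
plugin.edit_close()                   # 1-2. plug-in tears down its UI
parent.destroy()                      # 3. only now is the parent freed
print(plugin.window is None, parent.alive)  # True False
```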

I do not understand why you speak of inconsistencies. Do you mean comparing VST and LV2? And again, I do not see why this concerns the user at all, who should not see the difference between “hidden” and “freed” in normal circumstances. There might even be some kind of LRU cache going on, meaning that the most recently closed plugin UIs are kept around so that they can be reopened quickly, and are destroyed later if deemed unneeded.

I totally agree that effEditClose is needed for safety when the host destroys the window, but again, Ardour does call effEditClose when destroying. The disagreement is about when to destroy a UI.

BTW, I am not advocating for or against “destroy the UI as soon as it is closed”; I am just pointing out that it is not as clear-cut as you make it seem, and that it is a legitimate design decision (that Ardour might have gotten wrong), not a stupid mistake. The fact that other hosts do destroy the window might also have to do with the name of the event, which hints toward “close” but mandates “destroy” (without ever mandating that it is fired on each “close”, if I understand what paul and robin said).

Again, maybe that is a wrong decision on Ardour’s part, and consistency with other hosts is worthwhile, but the fact that the VST API leaves that leeway to hosts means that you cannot assume that every host will behave like that. And that is true for a lot of the spec. As they say, “be conservative in what you do, and liberal in what you accept”. Every host has to fight hard to cope with all kinds of strangely behaving plugins, and every plugin has to fight hard to cope with hosts.

Lastly, I don’t see why being double-buffered implies that you cannot avoid refreshing the UI when not visible. Sure, double-buffering means that if you need to redraw part of the UI that is just exposed, you just have to bitblt your buffer into the window, but you should nevertheless keep the “what is visible” info around to avoid quick-redraws of the parts that are invisible. Sometimes that is hard, but at least an “all or nothing” switch to bypass the redraw altogether when completely invisible seems important to me.

No, the inconsistency is that the user can ‘close’ the UI (i.e. make it disappear) via different mechanisms, and it seems as if the actual behaviour differs between them (though not in a way that is immediately evident to the user). E.g. in some cases Ardour will destroy the window and call effEditClose, and in others it just hides it.
If the control presented to the user purports to do a thing, and the feedback to the user is that the thing has happened, then the underlying behaviour should also be the same. I have several light switches in the room, they all look similar, are intended for the same function and they all switch the light on and off. It is not the case that one boils the kettle as well (because someone thought that might be useful).

That’s irrelevant to this discussion and is overthinking the issue. It’s really very simple: when I close the window, I expect it to close, as the default behaviour leads me to expect it should.

I’m not saying it’s a stupid mistake - how about we compromise and say it’s a stupid decision…

Unfortunately that is all too true - I spend a lot of time dealing with why host A does this, host B does that, etc., but in most cases a quick email to one of the developers will sort the issue - only with Ardour do I have to fight. (Hint: listen to the man who has 30 years’ experience in this industry and writes plug-ins every day for a living - I might know something… :slight_smile: )

It’s not that I can’t, it’s just a horrible, foul-smelling code hack if I do. As it is, I only draw when the state of the UI changes; I draw to an offscreen buffer, and when I get an expose event on the plug-in window I relay that image to the user.
If I stop drawing anything while the UI is ‘hidden’, then one effect is that when Ardour unhides it, the old state of the UI will be presented to the user before I can detect that it’s visible again and redraw my controls to their current state, etc.
The fact that this presents a brief image of the old state of the UI is a big warning that this is the wrong way to do things.
Using effEditOpen and effEditClose allows me to build the UI in its correct (current) state, and then present it to the user properly initialised. It’s simple, it’s elegant, and it works.

If I’ve read this right, are you saying that if I have, for example, a 128-track session with 4 plugins per track, and I open and then close the GUI of every plugin in the session, I will have 512 instances of plugin GUIs that I can’t see but are nonetheless using resources?

I think the inconsistency being alluded to is that there are a couple of different ways in which a plugin UI can become “no longer visible”, but each way does a different thing: one just hides, the other actually destroys. This is within a particular plugin type, not an inconsistency between LV2 and VST.

That’s exactly the issue as far as I can tell, and I think what is being referred to in the original bug. If this is a deliberate design decision by the Ardour developers, then - well, I’ve provided the long explanation of why it’s a bad idea; the short version is “Reaper doesn’t do this…” (and it also runs on Linux now too).