GL accelerated desktop or not?

I’m wondering whether running a desktop with GL acceleration and a light wm (I’m thinking Beryl skinned with fluxbox) with all the special FX turned off would increase performance for ardour or decrease it? My assumption being that if more drawing processes are being off-loaded to the GPU then there is more CPU available for ardour. Of course I could be wrong…

Older documents regarding latency suggest that running on the open-source 2D driver is better than running on a closed-source OpenGL driver (e.g. the nv driver in preference to the nvidia driver). The main reasons given were latency due to bandwidth usage on the motherboard, noise caused by poorly written drivers, and compatibility with rt-patched kernels.

I have never really had a problem with any of the above as a result of the driver and stuck with the nvidia driver as it uses less CPU resources when running multiple screens compared with the nv driver, and handles multiple screens more elegantly. IMO the handling of multiple screens was more noticeable than any performance increase or decrease.

As the PC that runs ardour is a dedicated DAW machine, I’m looking for the best possible video setup with 2 screens (without buying an expensive gaming graphics card). If it makes any difference I’m running an Asus nvidia 7600 GS 256MB (fanless, so silent). Whether that be with a proprietary driver or an open source one is not an issue.

However, as I didn’t experience problems with the proprietary driver in the past, I’m more interested in learning whether an accelerated desktop would make some kind of improvement to performance. If so, does the video card need to meet some kind of minimum spec before it is a benefit rather than a hindrance?


I also have beryl and a nearly identical video card (eVGA 7600gs 256MB), so I could do some experiments… there’s one thing that would be a problem with the accelerated desktop approach: It seems that in order to preserve compatibility with practically all X-dependent applications, the beryl system needs X to do all the actual rendering as usual. The main difference is that each window goes into an invisible buffer instead of the screen, and the actual screen is then drawn up from these using the OpenGL layer. Since this adds an extra step, I believe it’s definitely less efficient.

If GL drawing was going to improve the situation, it would probably need to occur in the toolkits such as GTK that use X (and the CPU) to draw everything into those buffers. The good news is that [someone] is working on some of that already. The other good news is that those operations are already somewhat accelerated through the XRender extension, which the closed nVidia driver seems to power pretty well.
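For what it’s worth, the closed nVidia driver of that era exposed XRender acceleration as an xorg.conf option. A minimal sketch of the relevant Device section (the Identifier is illustrative, and the default value of RenderAccel varies by driver version, so check the driver’s README):

```shell
# /etc/X11/xorg.conf -- Device section sketch for the closed nvidia driver
Section "Device"
    Identifier "NvidiaCard"            # illustrative name
    Driver     "nvidia"
    Option     "RenderAccel" "true"    # hardware-accelerate the XRender extension
EndSection
```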

However, I have very little proof of what I’m saying and am wide open to corrective comments… What I can tell you is that I fired up a second monitor once during a sort of practice session and ran Ardour without any performance issue – also without beryl, because it’s been a bit flaky :) Having the mixer and editor side-by-side was refreshing. Only the placement of many new dialogs (split across the 2 monitors, at the centre of a 2304x864 virtual screen) was annoying. This may actually have been the WM’s fault (kwin, as I was running KDE at the time). Beryl will probably give you better control over that, but in the end, Beryl uses the same or more CPU power.

Thanks for that…

What you say makes perfect sense, so it is unlikely that Beryl would increase performance as there is no decreased demand on X.

I’ll get my system up and running as usual with fluxbox and then maybe play around with a Beryl implementation later, but it doesn’t look likely this will make a difference, at least not a positive one.

BTW, I think it is your install of KDE that is having issues with menus popping up in the centre of the 2 screens. If you compile KDE yourself there may be an option for xinerama… something like ./configure --xinerama = yes… I don’t know for sure as I use Gentoo with fluxbox. Gentoo has global flags which tell the system how to compile apps, and “xinerama” is one of those flags. I’ve never really bothered with KDE but I imagine it’s probably an option not enabled rather than KDE lacking the functionality…
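On Gentoo that global flag lives in /etc/make.conf; a rough sketch of enabling it and rebuilding the affected packages (the real USE line on any given system will contain many other flags as well):

```shell
# /etc/make.conf (Gentoo) -- add xinerama to the global USE flags
USE="xinerama"

# Then rebuild anything whose USE flags changed (portage's --newuse option):
# emerge --update --deep --newuse world
```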


You’re welcome… (fwiw, as I am not a guru). And of course, I always USE=xinerama just in case I get that pair of sweet LCDs anytime soon ;)

So it’s probably down to my never actually tuning KDE for 2 monitors before trying to use it, since I activated the second on impulse. One weird thing, though: I returned to fluxbox recently and qjackctl misbehaved badly, as if some part of the xinit process was broken and the environment was incomplete. I just got lazy and went back to KDE… but I recall XFCE4 was really fast and friendly a while back, when I used Ardour 0.99.2 to start recording for two of my friends. I once used fluxbox exclusively… it was pretty sweet, and I hope it works out better for you.

Hi, I run Beryl on screen one and Metacity on a TV screen: two different X servers on one card, so GL fun on screen 1 and nice video quality on the TV. The downside is that I can’t move windows across screens.
I don’t experience much loss of stability in ardour2 when running it under an OpenGL-accelerated desktop; only the mixer can sometimes be quite heavy on the CPU…
It would be nice if I could find a way to open the mixer via a script and an xterm, so I could run the mixer on the TV (non-GL) and the main ardour window on the GL-accelerated desktop…
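As far as I know, ardour is a single X client, so its editor and mixer windows all end up on whatever display it was started on; with two separate servers you can only steer whole applications between them via the DISPLAY variable. A sketch, assuming the desktop server is :0 and the TV server is :1:

```shell
# Each X client renders to whichever server DISPLAY points at,
# so a separate application (an xterm here) can be sent to the TV:
DISPLAY=:1 xterm &

# ardour itself, with all of its windows (editor and mixer),
# stays on the display it was launched on:
DISPLAY=:0 ardour2 &
```

Splitting the editor and mixer across two servers would require them to be separate X clients, which ardour doesn’t offer.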

gentoo rocks with ardour2