Very slow display in editor window

Hello all.

I recently upgraded to Ardour 5 and am experiencing some very slow performance from the editor window. So far I’ve got as far as importing a MIDI file, setting up DrumGizmo with the Aasimonster kit, and then recording the DrumGizmo output to audio files. During the recording phase I noticed that the drawing of the waveforms looked pretty painful (3-inch blank gaps appearing, then filling in as a block), but that isn’t too big of a deal. The real problem is that when I hover over a MIDI note and scroll the mouse wheel to change its velocity, there’s a second or two of lag between me scrolling and the value changing. That is a bit of an issue practically.

I remember when A3 came out I had a similar issue which was worked around with a couple of commands (something to do with export-buggy-gradients??). I also remember at the time Paul saying the issue was something to do with the driver module not giving correct info about the card capabilities (or something similar).

I’m currently using Ardour 5.3.0, installed on AV Linux 64-bit. Graphics card is an Nvidia GeForce 9600 GT; processor is an i5 2500K (not overclocked), with 8 GB RAM.

Anyway, the questions:

  1. Is this a bug or the same type of issue as I had before?

  2. If it’s similar to before, can it be worked around with a couple of commands, like with A3?

  3. If not, does anyone have any suggestions for graphics cards that have two VGA outputs (or combined DVI/VGA outputs) and won’t cause me this grief?

Thanks.

EDIT: My graphics card is actually an Nvidia GeForce 7900 GTO - don’t know how the hell I made the mistake above. Sorry for the mix-up.

Try Preferences > GUI: “Disable Graphics Hardware Acceleration” and “Possibly improve slow graphical performance”
Note that those options are only available with binaries from ardour.org that include patches to libcairo (buggy gradients).

What driver do you use for the NVidia card? Some users reported similar issues with Nouveau which went away with the nvidia binary driver.

I had that problem and solved it by installing an Asus GeForce GT 710 based card. It’s cheap and passively cooled, so no fan noise, and it doesn’t have the slow Ardour GUI problem.

To get 2560x1440 with that card, I had to mess around with arandr to set a non-standard mode and a 30 Hz refresh rate, but it works out of the box on nouveau at all usual resolutions below that.
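For anyone wanting to do the same by hand: arandr is a GUI front end for xrandr, and adding a non-standard mode looks roughly like this. This is only a sketch - the output name HDMI-1 is an assumption, so check the output of xrandr -q for the real name on your system.

```shell
#!/bin/sh
# Add a 2560x1440 @ 30 Hz mode to an X output (sketch).
# The output name is an assumption -- run `xrandr -q` to find yours.
OUTPUT=HDMI-1

# cvt prints a CVT modeline; strip the leading "Modeline " and the quotes
MODELINE=$(cvt 2560 1440 30 | sed -n 's/^Modeline //p' | tr -d '"')
MODENAME=$(echo "$MODELINE" | cut -d' ' -f1)

xrandr --newmode $MODELINE                    # register the mode with X
xrandr --addmode "$OUTPUT" "$MODENAME"        # make it available on the output
xrandr --output "$OUTPUT" --mode "$MODENAME"  # switch to it
```

Note that xrandr changes don’t persist across reboots; arandr can save the equivalent commands as a script you run at login.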

X42 - Thanks for the suggestion. I didn’t realise that we now had these options in Ardour itself. I’ll give that a shot as soon as I get back to the man cave.

Driver-wise I am using the nouveau driver. If I recall correctly, the Nvidia binary drivers have to be patched to work with an RT kernel, so they wouldn’t be usable unless the nice guys at AV Linux have packaged one up. I think they actually have one that I tried, but for some reason it wouldn’t use my second display. I seem to recall this was part of some issues I had when I installed AVL, and I had to stick "modprobe nouveau" into /etc/rc.local to get the nouveau driver to load and give me my dual display.
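In case anyone hits the same thing, the rc.local workaround is just a one-liner added before the exit - a sketch, assuming a classic sysvinit-style /etc/rc.local like AV Linux ships:

```shell
#!/bin/sh
# /etc/rc.local -- executed at the end of each multi-user runlevel.
# Force-load the nouveau kernel module so X picks up both displays.
modprobe nouveau
exit 0
```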

Cheers

Anahata - Thanks for the suggestion. Does your card have DVI-I or DVI-D? A quick google of that card turned up DVI-D, which (if I’m correct) only carries a digital signal and can’t be readily converted to VGA. (If I’m right, DVI-I has additional pins that carry the analogue VGA signal, which means you just need a £2 adaptor plug to run VGA from it.)

Also - that looks like a wicked big heatsink on it! I’d have to see if I could even fit that into my case! Looks awesome though - much cooler looking than one with a fan on!

Cheers

The one I’m looking at on www.scan.co.uk (my supplier) says it has a 15-pin VGA D-sub connector, as well as DVI and HDMI. I’m using the HDMI. I can’t look now (posting from work) but I’ll check when I get home and make sure it is that card. It’s definitely a GeForce 710 graphics chip though.

X42 - I couldn’t find the option for “Disable Graphics Hardware Acceleration” under Preferences. I tried the “Possibly improve slow graphical performance” option and it made a bit of a difference to scrolling and waveform drawing, but it’s still pretty bad. Altering note velocities is also still badly delayed.

Anahata - had a look on scan.co.uk and it seems the card is only DVI-D. As I have two VGA monitors, I would need two D-subs, or one D-sub and one DVI-I. Thanks for the suggestion anyway. It’s worth noting that the 710 chip plays nicely, at least in that regard.

Any other suggestions?

Cheers

I have found the Gigabyte GV-N730-2GI GT730, which seems to fit the bill output-wise. If anyone has any experience with this card (or the 730 series in general) I would appreciate any info on how well it works.

Cheers

I note from reading http://nvidia.custhelp.com/app/answers/detail/a_id/221/~/what-is-the-difference-between-dvi-i-and-dvi-d%3F that the presence of a DVI-I and a D-sub connector does not necessarily mean you can run two analogue monitors:

“If your NVIDIA based graphics card features two video out connectors (ie VGA + DVI), it does not necessarily mean that it will support dual monitors at the same time.”
If the supplier claims support for two monitors, you might still have to ascertain whether that can be any combination or has to be both analog/both digital/one of each.

Also I tried a Palit card with GeForce 730 on it, and it only came up in very low-res VESA modes, though admittedly I didn’t try very hard at fixing the X config to make it work. Elsewhere the GT730 is described as supported by Xorg/nouveau, so you might have better luck.

Anahata - Thanks for the heads up - that’s not something I would have thought of. Looking elsewhere on the Nvidia FAQ, there is a post that suggests all their modern cards support dual display - but that’s from 2010 and doesn’t necessarily mean that any combination of the outputs will be possible.

I’ve noticed that none of the silent fanless cards seem to have the DVI-I port - only some of the cards with fans. And from googling around, it seems the cheaper-end cards suffer from some turbo-fan noise and don’t have fan speed controllers. People on some of the forums were talking about yanking the fans off and either replacing them with quieter ones or just sticking a big fat heatsink on. I don’t really fancy that idea too much, but it may have to be.

Cheers.