Signal routing / monitoring

Issue 1)
When Ardour is doing the monitoring I still get latency. Is there any way besides hardware monitoring? (My Tascam 224 does not support hardware monitoring under Linux.) I played with the Auto Input button - everything has latency. I do not have a realtime kernel, and in JACK "monitoring" is checked. Even if JACK does the monitoring I get latency. Cubase has this little speaker button on each track where I can get direct monitoring. I am sure I am missing something here. I bet Ardour can do that too.

Issue 2)
Each track I have to route to OUT 1 + 2 in order to experience the panner settings (left or right). The drawback is that I then have no master fader. If I use the default routing to [ardour:master/in1] and [ardour:master/in2], both speakers play the master channel but all instruments sit in the middle (mono).

Kindly help if you find time. I love the Ardour project and hope people will follow my example and donate and get a subscription.

Hans

Lowen: sorry to bother you, but I would like to use the US-428 as a complete control surface, faders included.
Did you manage to get the faders working in Ardour?

Thanks

  1. Hardware monitoring is implemented by the audio hardware and is totally outside of ardour's domain. Years ago when we first started JACK, we added support for controlling h/w monitoring through JACK (so that apps like Ardour could turn it on and off), but we found that (a) very few devices at the time supported it, (b) what was supported differed a lot from device to device, and (c) it was almost always easier to control it with an app written specifically for that device. Thus, these days, if you want h/w monitoring on the two classes of interface that are recommended (RME, and anything based on the ice1712/1724 chipsets like M-Audio's Delta series), you use a device-specific app (hdspmixer or envy24control) to control it.

Cubase is using ASIO's "direct monitoring" API, which is similar to what we tried to do with JACK. However, as you note, the Linux drivers for devices like the 224 do not include "device specific" support for things like this, and such support is necessary to have control over it from either the JACK level or some other application. Note also that the ASIO direct monitoring API really only supports a very trivial kind of direct monitoring - it may be very useful to you, but it's completely useless for more complex situations like setting up headphone monitoring mixes for musicians & performers.

What latency settings are you running JACK with?
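For reference, here's a rough sketch of how jackd's buffer settings relate to the latency you hear through software monitoring; the figures below assume 44.1 kHz and are illustrative only, not a recommendation:

  # -p = frames per period, -n = periods per buffer, -r = sample rate
  jackd -d alsa -d hw:0 -r 44100 -p 256 -n 2

  # one period of buffering:        256 / 44100      ~  5.8 ms
  # playback buffering (-p x -n):   256 * 2 / 44100  ~ 11.6 ms
  # round-trip software monitoring adds capture buffering on top of that,
  # plus whatever the converters and drivers contribute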

  2. If you route via the master, then you have to pan the master hard left+right for the individual track panner settings to be honored. My suspicion is that the master is panned to the center, which means that no matter how you pan the tracks that feed it, the signal will all end up in the center.

Paul, thank you for your prompt reply. (How do you have time for this…) My latency is around 80 ms because I use lots of plugins on existing tracks (I have to avoid the crackling). I also purchased a little Behringer interface ($25.00) which has a direct monitoring switch, and that works. I would prefer the Tascam because I navigate Ardour with it. Would it make any difference if I compiled my own realtime kernel and checked realtime in JACK?

Have a great weekend! And thanks for developing such a great program. I appreciate it.

Hans

RioExotics: My Tascam 224 does not support hardware monitoring under Linux.

Press the 'Input Monitor' button on the 224's surface, and bring up the slider for the channel you'd like to monitor. On my US-428, the 224's older and bigger brother, the Input Monitor button is located above the transport controls and immediately to the right of the Pan rotary (two buttons to the left of the shuttle wheel). I am making the fairly big assumption that the 224 acts like the 428 in this regard, and that you have us428control running, have the nrpacks=1 option to usx2y enabled (which, if you're using the transport with MMC now, you likely do), and have the MMC control loop between Ardour and the US-224 set up correctly in the ALSA MIDI connections (but then your transport wouldn't work if you didn't have that set up properly… just covering all the bases).
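If you need to check or rebuild that MMC loop from the command line rather than from qjackctl's ALSA tab, here is a rough sketch; the client:port numbers are made up, so substitute whatever aconnect -l reports on your system:

  # list ALSA sequencer clients/ports to find the US-224 control port and Ardour's MMC port
  aconnect -l

  # wire the surface to Ardour and back (example numbers only)
  aconnect 20:0 129:0    # US-224 surface -> Ardour MMC in
  aconnect 129:0 20:0    # Ardour MMC out -> US-224 surface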

I’d like to hear more about your experience with the 224 as a control surface; what you’ve found working, and what you’ve found not working.

RioExotics: Each track I have to route to OUT 1 + 2 in order to experience the panner settings (left or right)

I can't duplicate that here with my US-428; the panners on each track in Ardour move the sound within the stereo image perfectly. But I'm using the nrpacks=1 option to snd-usb-usx2y, and the Ardour masters are routed to the two separate outputs on hw:1,2 - not hw:1 or hw:1,0, but hw:1,2 - since that uses the rawusb mode, which among other things enables really low latency (on the 428, that's also how you get inputs 3 and 4 to work). I'm able to duplicate the results of http://alsa.opensrc.org/index.php/Tascam_US-224 with my US-428, and get XRUN-free overdubbing with -p64 -n4 JACK parameters.
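In other words, roughly this kind of jackd invocation; the sample rate and the -R (realtime) flag are my assumptions, so adjust to your setup:

  jackd -R -d alsa -d hw:1,2 -r 44100 -p 64 -n 4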

The author of QJackctl and Qtractor, Rui Nuno Capela, has a US-224 and is the maintainer of the us428control program in ALSA, and would be a resource to check with. See www.rncbc.org for more info.

How do I have time? http://xkcd.com/303/

And a note: the spam filter did not like my post above, and I had to do the CAPTCHA thing. So it’s possible that the next round of spam removal might think my post above is fair game…

Paul: Cubase is using ASIO's "direct monitoring" API, which is similar to what we tried to do with JACK. However, as you note, the Linux drivers for devices like the 224 do not include "device specific" support for things like this, and such support is necessary to have control over it from either the JACK level or some other application.

As far as I know, things like this are handled by us428control on Linux, which can accept MMC commands and act on the surface based on them (it's what is acting as the go-between now, shuttling messages between the surface, MMC, and Ardour). I'm not sure whether individual track monitoring is implemented as an MMC target; Rui would know.
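For reference, MMC commands are just short SysEx messages, so there is nothing US-224-specific about the framing; a sketch of the standard layout:

  # F0 7F <device-id> 06 <command> F7    (device-id 7F = broadcast to all devices)
  # F0 7F 7F 06 02 F7   MMC Play
  # F0 7F 7F 06 01 F7   MMC Stop
  # F0 7F 7F 06 06 F7   MMC Record Strobe (punch in)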

Lowen, thank you very much for your reply. Hopefully you can help me out. I got the 224 to work under Ubuntu Studio with the ALSA firmware, tools, loader etc. installed, and then I just run the us428control command and all the lights come on. Recording works, and the jog wheel and the play/pause/fwd etc. buttons work after linking the 224 via MIDI in JACK. The only thing is that the four faders and the control input (that light also comes on if I press the button) are not recognized by Ardour. I assume it has to do with nrpacks=1 (first time I've heard about that…). Please help me out and explain how I can enable the nrpacks=1 option for usx2y. Thanks a million.

Hans

Lowen, please disregard - I followed your excellent links and will try to duplicate this at home after work. Great community, people! Thank you all.

Hans

[EDITED: left an incomplete sentence]

Hmm, Ubuntu Studio… I had that installed for a while, but went back to Fedora with PlanetCCRMA for many reasons. So, I’m going on memory here, and not from having a Ubuntu box in front of me. Having said that, I’m not using the US-428 with the Fedora box…but with an AVLinux 2.0r2 box (because of the advertised ‘US-122 works out of the box’)…and that is somewhat similar to Ubuntu Studio. I’m not sure how much GMaq customized things to make it work OOTB, though.

In any case, see the web page mentioned above at alsa.opensrc.org and that will get you closer to the needed information. In my case, the nrpacks=1 option isn't in the 'normal' place; I'll take a look tonight or tomorrow when I have that box back out and see which file I edited to add the required 'options snd-usb-usx2y nrpacks=1' line… this may fix your second issue. I never ran my US-428 without the nrpacks=1 option, so I don't know whether that symptom shows up without it. And setting qjackctl to attach the hwdep device (on my laptop that was hw:1,2; if you don't have another card it would be hw:0,2), which is enabled by nrpacks=1, turns on the JACK rawusb support, which improves everything.
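Until I can confirm which file I actually edited, here is a sketch of the usual way to set a module option on a Debian/Ubuntu-style system; the file name is arbitrary, and whether your distro reads it from exactly this location is an assumption:

  # put the option where modprobe will pick it up
  echo 'options snd-usb-usx2y nrpacks=1' | sudo tee /etc/modprobe.d/usx2y.conf

  # reload the module with nothing using the card (or simply reboot)
  sudo modprobe -r snd-usb-usx2y && sudo modprobe snd-usb-usx2y

  # if the module exposes the parameter, this should now read 1
  cat /sys/module/snd_usb_usx2y/parameters/nrpacks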

I’ve not yet gotten Ardour to recognize the faders, either, but for monitoring Ardour doesn’t need to recognize them. You set the Input Monitor on (and the light comes on), then select the input you want to work with on the US-224’s surface, then move the slider for that input up (input 1 is on fader 1, etc). The master fader (assuming the 224 has one like the 428 does) needs to be up, too, in order to hear the rest of the tracks that are playing.

For the overdubbing I've been doing with my 428, I've adjusted the overdubber's mix using the input fader for the overdubber's signal, and used the master fader to bring the rest of the mix up or down; Ardour compensates for the latency between the recorded input and the output signal that is routed through the master fader. With Ardour/JACK monitoring OFF you don't hear the recorded signal coming back delayed through the master fader; you hear what the US-224/428 is routing internally from the input fader (which, by the way, does not change the input level that Ardour sees, just the monitor level).

Set your trim so you have signal on the signal LED, and bring up the fader until you hear the signal in the US-428's outputs. Ardour doesn't even need to be running for this to work, by the way - at least not with my 428. For that matter, neither does JACK; the monitoring function is controlled by the us428control program and is implemented in hardware on the US-224 or US-428. I've got monitoring set to off in qjackctl, and it doesn't affect this lower-level monitoring. It's quite similar to what envy24control does for ICE1712 cards, except that envy24control is a GUI and us428control is more of a daemon with no GUI.

I haven’t dug too deeply, but the fader bank (the 224 has four IIRC; the 428 has eight) sends MMC messages; it shouldn’t be too hard to bind those to controls in Ardour, but I haven’t made that happen yet.
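One way to see exactly what the fader bank sends before trying to bind anything in Ardour (the port name below is an example; aseqdump -l lists what your system actually calls it):

  # list ALSA sequencer ports, then dump events coming from the surface
  aseqdump -l
  aseqdump -p "US-428"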

I'm new to this discussion. Please forgive my ignorance; this is what I find interesting:

[quote]
Note also that the ASIO direct monitoring API really only supports a very trivial kind of direct monitoring - it may be very useful to you, but it's completely useless for more complex situations like setting up headphone monitoring mixes for musicians & performers.
[end quote]

But this is the area I'm looking to gain experience with - setting up headphone mixes for the musicians - and the latency basically makes this process useless.

What I've been doing is burning the mixes to a CD and playing along with them to get the next track - whereas with multitrack tape I didn't have to do this.

Thanks, and pardon my ignorance.