I am working on creating an online rehearsal/jamming space for a group I play in (due to the pandemic). I have set up a Jacktrip server on AWS and connected the participants via QJacktrip (a graphical interface for Jacktrip available on Windows). In our physical workflow, we were used to having one of us mix our instruments and voices on a mixer, and I am trying to replicate that mode of operation in the online rehearsal space. We have learned that we need it even more online, as Jack + Asio4All on Windows often leaves participants on integrated audio with no way to adjust their volume (the most common problem being that it is too low).
I experimented with ecasound until tests revealed that it does not use multiple cores and thus would not scale well, and I am aware that some folks have set up this kind of mixing using SuperCollider. However, neither approach gives us the flexibility inherent in having someone simply operate a DAW.
I am familiar with DAWs, and since Ardour is built with Jack in mind, it seems ideal for mixing, adding effects, etc., to a bunch of Jacktrip participants. However, I am not sure how to proceed and would welcome your thoughts and recommendations! Here are some ideas we are currently entertaining:
- The first approach that comes to mind is running Ardour on the server and using X-forwarding or some other remote desktop technology to use the GUI. It has the disadvantage that server resources are wasted running the GUI.
- The second approach would be having Ardour running headless in some way, and controlling it remotely. I have found some not entirely conclusive threads on whether this could be done. Is it possible to run Ardour headless? If yes, how would one go about controlling it? Is there any way to run the full GUI on a client machine, or would the interaction have to be programmed over OSC or Web-sockets? If the latter, are there any full-featured control clients available?
- It would be ideal to have a base setup, with a full mix and personal mixes for each participant, available from the get-go. Would this be possible using an Ardour project, given that Jacktrip clients may come and go during the session? Would a scripting approach using Lua be more appropriate?
At this time we are exploring available options, so any ideas are welcome!
Ardour can be built to run headless. We do not distribute a build that includes this version of the program, but if you build it yourself, it will be created and installed.
You cannot run “the full GUI” on a separate machine. You can use OSC or the still-experimental WebUI to control the headless instance, along with MIDI-based protocols.
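For anyone following along, the self-build goes roughly like this. This is only a sketch: it assumes the build dependencies are already installed, and binary and script names vary between Ardour versions.

```shell
# Sketch only: assumes build dependencies are already installed.
git clone https://github.com/Ardour/ardour.git
cd ardour
./waf configure --optimize   # see ./waf --help for more options
./waf
sudo ./waf install
# A from-source build also produces the headless version and the
# ardour6-lua interpreter in addition to the GUI; from the build
# tree, the headless version can be launched via headless/hardev.
```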
Thank you for the reply - building will not be a problem. Everything else in this project already had to be built from source, as I am running on an AWS ARM instance and most of the software the project needs is outdated in Ubuntu’s repos.
Does the Web UI you referred to cover the same functionality as the normal GUI, or only a subset of its features? I will perhaps start by trying that out.
The WebUI is still experimental, but has access to everything that OSC has access to (because it is built on top of our OSC support - basically OSC-via-websocket). What is actually presented on the pages that we already offer is fairly limited, but it’s all completely modifiable without rebuilding.
I don’t believe that regular editing GUIs can be built using a protocol like OSC, mostly because of the problems with displaying waveforms. But I’ve been wrong before in underestimating what modern computers can do. However, everything related to mixing should be accessible.
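As a concrete illustration of what mixing over OSC looks like: the commands below use `oscsend` from liblo-tools against Ardour's default OSC port (3819). The OSC paths come from Ardour's OSC documentation; strip numbering and feedback setup depend on your session, so treat this as a sketch.

```shell
# Start the transport (no arguments needed for this message)
oscsend localhost 3819 /transport_play

# Set the fader of strip 1 to -6 dB: /strip/gain takes the
# strip id (int) and a gain in dB (float), hence type tags "if"
oscsend localhost 3819 /strip/gain if 1 -6.0
```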
Also custom plugin UIs won’t be available when you use the websocket control surface.
The most reliable way is to run Ardour on the remote system (ideally even with graphics acceleration there) and then use VNC. This usually performs better than remote X with gfx acceleration on the remote machine, and it also lets you easily disconnect and reconnect.
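One way to wire that up, assuming a VNC server such as TigerVNC on the remote machine (the display number, geometry, user and host names below are all placeholders):

```shell
# On the server: start a virtual display and run the Ardour GUI on it
vncserver :1 -geometry 1600x900 -depth 24
DISPLAY=:1 ardour6 &

# On the client: tunnel the VNC port over SSH, then connect
ssh -N -L 5901:localhost:5901 user@your-server &
vncviewer localhost:5901
# Closing the viewer leaves Ardour running; just reconnect later.
```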
PS. see also Headless Mixbus
Yes. You could start it and then control it via OSC or websocket, but a more powerful way is to use the interactive Lua shell, which also offers edit operations on the timeline.
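For example, the interactive shell can also be driven non-interactively by piping a script into it. The session path and name below are placeholders, and I am assuming your build reads scripts from stdin; `load_session()` is provided by the command-line tool itself, after which the global `Session` object is available.

```shell
# Placeholders: adjust the session directory and name to your setup
ardour6-lua <<'EOF'
load_session("/home/ubuntu/sessions/jam", "jam")
print(Session:route_by_name("Master"):name())
Session:save_state("")
EOF
```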
ardour6-lua comes with Ardour.
With your indications, I have just completed the first part - compiling Ardour 6.6 on the remote server and running headless Ardour (hardev) together with the web server control surface. Most dependencies were available in suitable versions from the Ubuntu Focal repositories, except for gnome-doc-utils, jpegsrc, lv2 and libwebsockets, which I had to build from source. libwebsockets in particular required enabling LWS_WITH_GLIB and LWS_WITH_EXTERNAL_POLL before building.
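For the record, the libwebsockets build with those two options enabled boils down to something like this (paths and repository URL are illustrative; the option names are the ones mentioned above):

```shell
git clone https://github.com/warmcat/libwebsockets.git
cd libwebsockets && mkdir build && cd build
# The two CMake options Ardour's websocket surface needed in my case
cmake .. -DLWS_WITH_GLIB=ON -DLWS_WITH_EXTERNAL_POLL=ON
make -j"$(nproc)" && sudo make install
sudo ldconfig   # refresh the linker cache so Ardour's build finds it
```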
In order to enable the Web Server Control Surface without setting it in the GUI, I spent some time reading through Ardour’s source code and found that I could do it by editing the session (*.ardour) XML file. In hindsight it was kind of obvious.
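Since that attribute can be hard to find, here is the kind of edit involved. The `Protocol` name strings below are illustrative, not exact; look at the `<ControlProtocols>` section of your own `.ardour` file for the real ones, and keep a backup before editing.

```shell
# Illustrative stand-in for the <ControlProtocols> section of a
# session file; the name attributes in a real file may differ.
cat > /tmp/session-snippet.xml <<'EOF'
<ControlProtocols>
  <Protocol name="WebSockets Server" active="no"/>
  <Protocol name="Open Sound Control (OSC)" active="no"/>
</ControlProtocols>
EOF

# Flip only the websocket surface to active="yes", in place
sed -i '/WebSockets/s/active="no"/active="yes"/' /tmp/session-snippet.xml
grep 'WebSockets' /tmp/session-snippet.xml
```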
So now I have a mixer in my browser that I can use to control Ardour on the remote server. As a next step, I will try to automate creating the base timeline from the Jacktrip clients using ardour6-lua, as recommended.
If you do not need to change the session layout (tracks, busses, etc.) on the fly, you could also get away with template sessions, or tinker with the XML and re-load the session.
If you do need interaction, I suggest running ardour6-lua in screen(1) or tmux(1) on the remote server (just in case you don’t already know those tools). That lets you keep it running and re-attach to the interpreter at any later time.
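A minimal tmux incantation for that (the session name `ardour-lua` is arbitrary):

```shell
# Start the Lua interpreter in a detached tmux session so it
# survives SSH disconnects
tmux new-session -d -s ardour-lua 'ardour6-lua'

# Re-attach from any later SSH login
tmux attach -t ardour-lua
# Detach again with Ctrl-b d; the interpreter keeps running
```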
I’m new to this community – this is my first post. I got here because I built the “apt-get” version of Ardour into a Jacktrip Toolkit that I’m working on.
It builds a headless Linode virtual server with Jacktrip, Jmess and Ardour on it (along with some other stuff). I’ve been astonished by how well Ardour works in this environment – I’m going to use this rig to record a concert in a few days and am completely confident that it’ll work fine. Kudos to all of you for a wonderful DAW!
Here’s the link to the web page that gives details.
Here’s what the first part of the description has to say about it.
A Pretty Good Jacktrip Toolkit — Part 1 – Installation
This StackScript builds Jacktrip from the main Github repository. So version 1.3.0 as of this writing.
The script also installs a few supporting audio apps (Qjackctl, Ardour and Jmess) and some easier-to-use utility apps (Kate – a graphical text editor, and Nautilus – a graphical file-manager).
Installation, in four terse steps:
- Provision a Linode server with this (public) StackScript
- SSH into the server and run: ~/create_jacktrip_server.sh
- Answer “yes” to the prompt in the script
- Log into Glish and run: startxfce4
By the way, I’ve been doing the development on the cheapest Linode and after two weeks of pretty-intense work, my bill is just under a dollar. It actually delivers pretty good audio, although I’m going to jump up a few notches when I have the gang in there with me.
Feel free to give it a try – and let me know about anything I should fix.
That’s pretty cool! I ended up not using a jacktrip + Ardour setup because Ardour added some extra latency that I could not get rid of, and in my case that was a deal breaker, even though the headless setup, websocket interface and Lua scripting were all in working order (although for the latter, I found myself reading pages and pages of Ardour source code in order to achieve anything).
I am curious - do you use Ardour in your setup just for recording (and thus latency is not important, as it works just as a Jack sink) or do you run the live audio through Ardour for routing and/or effects? If the latter, are you noticing extra latency?
Best of luck!
I built this mostly to make it easier for people to use a Linode server. A lot of folks in the Jacktrip community use them, but with the command line only. So my first goal was just that – Jacktrip and Qjackctl on a GUI desktop.
I found Ardour on the search for a recording device after discovering that Audacity doesn’t work intuitively when running a Jacktrip session with the Dummy interface. Ardour worked great for recording (especially handy when using the new Broadcast option in JT 1.3.0, which delivers separate channels of more-buffered sound for a recorder to capture).
Now I’m on to the latest, which is to use Ardour in the performance – for mixing, effects, recording (both input/dry and output/mixed&FX) and monitor-mixes to the participants. The folks I’m supporting are scattered around the world, so latency isn’t really an issue – it’ll be pretty high no matter what. I think Linode servers are great for that kind of project. But yeah, not so great for playing in the pocket. Nothing’s gonna match a local server for that application.
You might read that StackScript anyway – it might be a good start on a recipe of steps required to configure a local server.