How to link Ardour with Blender

Some time ago there was a post on linking Ardour with Blender. The posts on the topic are several years old and the links are now dead. I hope that connecting these two great programs has since improved. I launched Ardour, Blender and Jack Transport as per the old instructions but am not sure where to go from there. Can anyone comment on this?

While I see some YouTube videos that demonstrate how these programs can work in tandem, there is scant detail on how they actually do it.

Can you provide some links to the old videos? AFAIK Blender uses JACK, so it should be possible to run them in parallel…

@calimerox
They are not in English and the vids are old. If you insist, here: https://www.youtube.com/results?search_query=ardour+wih+blender
“…so it should be possible to run them in parallel.” Good. How?

@fathomstory

Last I looked into this, I thought it was a plugin that had to be loaded into Blender to get it to sync to Jack. You would then sync both Blender and Ardour to Jack.

    Seablade

Based on these responses I suppose I had better contact the Blender community as well. In school I was trained in Adobe Premiere Pro and Adobe Audition. What is cool about those programs is that they link to one another when you want/need to do granular audio editing. I was thinking that Premiere Pro and Blender could work in much the same way, as the latter has a video editing mode that looks quite promising.

Oops, I meant Blender and Ardour might work in much the same way as Adobe Premiere Pro and Audition.

@fathomstory

They definitely don’t work together as well, sadly. Part of the reason has to do with interchange formats: there isn’t a well-developed open option, and even the proprietary ones can be rough once you leave the ecosystem that wrote them (Avid, etc.). If a good solution could be found it might open the door to such a thing, but Adobe, since it controls both Audition and Premiere, can implement its own end-to-end solution; that is a bit more difficult when the software is controlled by two different entities.

   Seablade

There is no special plugin needed. Provided the Blender build being used was compiled with JACK support, it’s simply a matter of setting JACK as the audio device in Blender’s preferences. Once that is done, the video editor / compositor will lock to the JACK transport.
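
If you want to check whether your particular build has it, here is a minimal sketch you can paste into Blender’s Python console; the property path is from the 2.8+ API (2.7x uses bpy.context.user_preferences instead of bpy.context.preferences), so treat it as a rough guide for your version rather than gospel.

```python
# Paste into Blender's Python console.
# Assumes Blender 2.8+; on 2.7x use bpy.context.user_preferences.system instead.
import bpy

system = bpy.context.preferences.system
print("Current audio backend:", system.audio_device)

try:
    # Selecting the JACK backend only works if this build was compiled with JACK.
    system.audio_device = 'JACK'
    bpy.ops.wm.save_userpref()  # keep the setting across restarts
    print("JACK backend selected")
except TypeError:
    print("This Blender build appears to have been compiled without JACK support")
```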

@Reuben Thanks, can you please tell me how you do that, step by step? If you can do a blog post and/or video, that would be fantastic.

First make sure JACK is active, either via Ardour or something like QjackCtl.
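
(If you want a quick way to double-check that a server is actually up, a small Python sketch like this works too; it assumes you have the python “JACK-Client” package installed via pip, which is an extra and entirely optional dependency.)

```python
# Quick probe: is a JACK server running?  Requires the "JACK-Client" pip package.
import jack

try:
    client = jack.Client("blender-link-probe", no_start_server=True)
    print(f"JACK is running at {client.samplerate} Hz, buffer of {client.blocksize} frames")
    client.close()
except jack.JackError:
    print("No JACK server found - start one via QjackCtl or launch Ardour first")
```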

In Blender:
File -> User Preferences -> System Tab -> Audio Dev dropdown -> “Jack”

Go to the screen layout preset for “Video Editing”. Hit the play button on the timeline. You should be able to observe the JACK transport slaving to Blender’s timeline.
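
If you would rather confirm the lock from the JACK side than by watching Ardour, a sketch along these lines (again assuming the python “JACK-Client” package, which is my addition and not something Blender or Ardour need) prints the transport state once a second; hit play in Blender and the frame counter should start rolling.

```python
# Watch the JACK transport while Blender (or Ardour) drives it.
# Requires the "JACK-Client" pip package and a running JACK server.
import time
import jack

client = jack.Client("transport-watch", no_start_server=True)
try:
    for _ in range(10):
        print(client.transport_state, "at frame", client.transport_frame)
        time.sleep(1)
finally:
    client.close()
```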

So it’s 2019 and I took a look at this thread.

It doesn’t seem to be relevant anymore. How would we do it now, can I ask?

Thanks Ad

Nothing has changed. Works the same as it did 3 years ago.

Hi mate, that’s good to know. I don’t want to admit this, but it seems I am not as savvy as everyone else around here.

Any chance you can point to an extremely clear walkthrough on getting this up and running? Maybe recorded with something like Lightshot or OBS and put on YouTube.

Links:

  1. Lightshot: https://app.prntscr.com/en/index.html
  2. OBS: https://obsproject.com/download

It’s always very humbling learning something new from scratch.

Thanks Ad :star_struck::star_struck::star_struck:

From what I’ve seen in another thread, you are trying to do this on Windows, and I doubt you will have much luck. This is very much a Linux feature.

And even if you installed Linux, I don’t really see what would be gained by it. Just because you can do something doesn’t mean you should. Scoring and sound design is further down the production pipeline than editing, compositing or the zillion other things within the scope of Blender.

The only utility I ever had in linking them is that using Blender’s video sequencer display in place of xjadeo offers more flexibility: it is easy to change which video track I want displayed during playback. But trying to edit in both at the same time would be very tedious.


I understand what you’re saying and can appreciate it. Thanks for the comment about xjadeo.

Fluidity would be nice, wouldn’t it?

I work in an area with fast turnaround times, so a seamless, hassle-free, quick-and-dirty workflow that still leaves time to do a reasonable mix would be great! So give me all the time I can get.

It would be interesting to hear what you have to say, not so much about how things have been done for years (that’s kinda obvious), but about how they could look in the future.

Creative workflow under pressure, right?

Ad

Hi,

I have a video editing project with multiple audio sources: the interview/dialog recorded with multiple mics, plus sound effects/field recordings and the music soundtrack. I did connect Blender to JACK, but I am not sure where Ardour fits into this.

Here is the workflow. First, I need to sync the dialog from multiple mics and adjust. Then I edit the video, and bit by bit, add my sound effects and soundtracks.

Can the dialog parts be synced by connecting Ardour and Blender via JACK? In Blender I have the audio playback from the camera, but I want to include my other mics. What would be the best way to do this? I would prefer to do the audio syncing and adjustment in Ardour.

What I hope happens is that once I edit the video in Blender, the changes will be reflected in Ardour. Is that a realistic expectation?

No, I don’t think so, if I understand what you are asking. Typically you would edit the video, then export a project edit list, and use a tool to convert that into an Ardour project in some way, so that the audio corresponding to the sections of video you kept is placed on the Ardour timeline in the correct locations.

Pretty much this.

My preferred workflow: edit the video with the dialog audio, then take it into Ardour to process the dialog and add SFX and music. I haven’t done this out of Blender in a long while, so I don’t know if there is a good tool for exporting audio sessions yet (similar to the goal of AAF/OMF), but when I did it I always just took stem outputs, and in Blender the clips were intentionally edited with handles on either end as much as possible so that all the actual audio editing was cleaned up in Ardour.
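
For the stem outputs themselves, the only built-in route out of Blender’s VSE that I know of is the audio mixdown operator. A rough sketch, run from Blender’s Python console, might look like the following; the file path and format values are placeholders, and the parameter names are from the 2.8-era operator, so check the docs for your version.

```python
# Render the scene's audio (everything audible on the VSE timeline between the
# scene's start and end frames) to a WAV file that can be imported into Ardour.
# Path and format values are placeholders - adjust to taste.
import bpy

bpy.ops.sound.mixdown(
    filepath="//dialog_stem.wav",  # "//" means relative to the .blend file
    container='WAV',
    codec='PCM',
    format='S16',
)
```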

Note that none of this process requires tying Blender to Jack or Ardour to Jack honestly.

And yes, even if you do tie them both to JACK to lock their transports together in sync, there is no way for Blender to communicate video tracks to Ardour natively at this time (and it would require a re-export from Blender anyway, taking up lots of time for every video edit).

   Seablade

That sort of defeats the whole point of linking Blender with Ardour. When you have a dialog piece recorded from different audio sources, you do need a DAW-like tool to merge them properly. The audio recorder runs continuously, say for an hour, whereas the camera has the piece recorded in chunks that don’t quite add up to an hour. That is why you ought to be able to move the audio sources and the video along a timeline to sync and adjust them. Once the audio is synced, it also needs adjusting: perhaps channel 1 is louder than channel 2, or vice versa, depending on what is going on during the recording; maybe one mic source is preferable for a few minutes, then another, and sometimes there are three or more audio sources. Blender does not seem to be a good tool for syncing and adjusting multiple audio sources. I am not sure you can even group audio streams in Blender (I asked on the Blender IRC and received no answer).

Now you are getting to the gist of it: is Blender the best tool for the job?

Don’t get me wrong, the VSE in Blender has improved significantly, but it is still best used to edit together sequences created in Blender, and most of the time for animation you are animating to the dialog, which makes this a moot point for the most part. I would edit the dialog, give it to the animators, they would animate to the dialog to match lip sync etc., and then I would mix in music and SFX.

Up until now you had just mentioned Blender and Ardour, not that you were only editing externally created video. In that case, then yes, I would edit the video with the audio recorded and then import both into Ardour to resync the audio. Ideally you would import the audio and sync it in the video editor using something like Pluraleyes, but there isn’t a great solution for that in open source or on Linux that I know of yet (though in the back of my head I remember seeing a project to solve that at one point). But honestly I wouldn’t have both open at the same time: edit the video, tell the story, export, and then edit the audio in Ardour or similar.

For the record, this is what I did for years with fast turnaround times, working with others using Final Cut Pro/Pluraleyes (the latter of which had a nasty habit of downmixing to mono for my editor at the time, but that was years ago) and other solutions for editing video, and then taking it into Ardour and Mixbus to edit the audio.

Seablade