Interesting to see this thread revitalized, and how different everyone's workflows and experiences are.
For me it was pretty much the opposite of Fathom Story's experience: Blender is a great program, but not the most convenient to work with as an NLE. For me the best way is the "classic" film workflow: edit in an NLE (Lightworks, in my case, works best), export the audio as stems/OMF/AAF to mix in a DAW (Ardour/Mixbus…), export the final mix, and render the deliveries with the NLE. Syncing up NLEs and DAWs doesn't make much sense in my opinion: you edit something in the video and the edit does not get reflected in the audio, and vice versa. If you don't want to jump back and forth, you can get an in-program workflow with Resolve, or now also with Reaper (for very basic video editing).
@calimerox If you have two channels of a single audio source (dialog), it is easy to line them up in a VSE/NLE and export. However, if your source audio has issues (because we do not always record in an ideal, pristine studio world), you sometimes do need to adjust the source audio first. It is an extra, cumbersome step, but a necessary one. If I record dialog in a studio or get a clean take, your way makes sense. Out in the field, things happen. So you adjust the source as best you can, edit, and then export for the final mixdown. Sure, Reaper is fine for the final mixdown. When I used to use Adobe CC, I loved that with Premiere and Audition open and linked, any adjustment in the former was automatically reflected in the latter. That was pretty cool.
Ahh, see, all of that I would do (and have done) AFTER editing the video, once I have taken it into a DAW. Those are things the DAW is going to be more precise and quicker at, and in most cases better. For instance, with what I can tell of your workflow: when I have done something similar, I edit, or often someone else edits the video and hands it off to me with the synced audio (in fact, if I am not the one editing the video, I tell them not to process the audio at all). I will then process the audio in a DAW much better suited for the task (though to be honest I haven't used the Fairlight page in Resolve yet).
Is this true on Linux? I know it is on Windows and Mac, just didn’t remember on Linux.
This is true of more than just Resolve and is why on any platform it is not recommended to use h.264 as your editing format but instead to convert to something more appropriate and possibly use proxy clips on top of that.
Yes, but it seems like you are missing a 'Why does it crash to start with?' step, honestly: see if there is something specific causing it. Note I am not saying 'You have to use Lightworks', but rather that you should understand why this is an issue for you.
Everything except the last step was my normal workflow for this type of material. For the last step, I would not take it back into the NLE, but rather mux the exported video and rendered audio together using FFMPEG; that just worked better for me in particular, though I understand it is not ideal for everyone. Back when I was doing these types of things weekly, Robin and I actually discussed making this possible within Ardour; it wouldn't be too difficult to implement, but it might add a level of complexity that would not be ideal for most users.
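For anyone wanting to try that same mux step, a minimal sketch with FFMPEG might look like this (the filenames are hypothetical placeholders; adjust the audio codec and bitrate to your delivery spec):

```shell
# Mux the NLE's video export with the DAW's rendered mix into one file.
# -c:v copy stream-copies the video, so the picture is not re-encoded.
ffmpeg -i edited_video.mp4 -i final_mix.wav \
       -map 0:v:0 -map 1:a:0 \
       -c:v copy -c:a aac -b:a 256k \
       -shortest delivery.mp4
```

Because the video stream is copied rather than re-encoded, this avoids a second generation of compression on the picture.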
See, I still don't understand why you feel you have to adjust in the NLE first, honestly. I can understand if you simply cannot hear it (in which case simple amplification is all that is needed, though I would remove it before exporting to the DAW myself), but beyond that I am missing something.
Yes, audio is processed after editing, as a rule. But when you have two sources and both are problematic, you need to make a new (nicer) source. If I am wrong on that, let me know.
Yes, DaVinci Resolve can import and edit H264 in Linux IF you pay.
The Panasonic GH series camera I have records only in H264. Alas.
Lightworks crashes because there is too much H264 (for the big project). If I have a little bit of H264, it says 'okay'. But past a threshold, it dies. Two hours of raw footage seems to be the threshold. H264 is the problem because the Lightworks forum says so. They say, "H264 BAD! MAKE LIGHTWORKS CRASH! IT DIE!"
If I had the hard drive space and power and a nicer camera, I would work in better formats. Alas, I have what I have and endeavor to make do. Converting the sources for the 'big project' would require more money, more processing power, more SSD, etc. etc.
The moral of the story: more money, problems go away! Hooray!
Why not send both sources in sync from the NLE to the DAW to edit and make the better sound there after you have edited video (And audio both) in the NLE?
Well I won’t disagree that Lightworks occasionally does go “LIGHTWORKS CRASH! IT DIE!” I wonder if it would do that if you weren’t using h.264 honestly, that may be why I never ran into it.
Honestly, I am guessing you are on a GH4 or GH5? Those are pretty nice cameras, if I am honest; I spent years shooting on a GH4 and currently shoot on a G7, actually a step down from it. Any of these are quite capable of generating good shots, obviously. I would just suggest an external drive to allow you to edit less compressed clips, and a proxy workflow as needed; you may not even need an SSD for this, though it will speed things up obviously.
Well, the production triangle of money, quality, and time will always apply.
@seablade “Why not send both sources in sync from the NLE to the DAW to edit and make the better sound there after you have edited video (And audio both) in the NLE?”
How do you do that? I got Ardour and Blender linked via Jack, but how about the other stuff?
Still, even with messy on-location sound (maybe with a boom, a lavalier, and a bleeding mic), it is better to edit all that later in the DAW. Cleaning up sound in an NLE is never good. The second problem is that it will be destructive: later, when you import everything into your DAW for further editing (where you have better plugins, better meters, better everything for sound), if you want to alter things, often you cannot…
I understand that you need "usable" sound for editing to work well on your video edit. In this case I would just go with the best source for the edit and mute the others. I would not recommend applying noise reduction and other surgical, destructive audio processing in the NLE…
Then, in your DAW, you can still decide: do I use the lav, or the boom, etc.? These are technical questions but also aesthetic choices, and they are choices I would not want to make beforehand in an NLE, with the image not ready, no ambient sound underneath, no sound design, etc. In a shared workflow, with an editor, sound editor, designer, and so on, I would sometimes refuse a pre-mixed dialog mix from the editor, because the mistakes made in that mix will be hard to fix later…
All of this applies if you want to use a DAW at all for sound editing. Of course, very basic stuff (sometimes sufficient, depending on the project) you can do in your NLE.
@seablade I agree: muxing the sound to the video would be best in a lot of cases and would avoid problems that could come with re-rendering. On the other hand, when you need different renders in different codecs (like one render for YouTube, one for screening, a lossless version for storage, etc.), then muxing will be more trouble, I guess…
h.264: on Lightworks this works just fine with either a proxy workflow, or by converting the sources to an editing codec like Apple ProRes with EyeFrame Converter or WinFF. This is true for essentially all editing software, as h264 is a delivery codec and very heavy on the CPU for editing. Performance will improve drastically when you do not edit in h264. The same goes for Blender, Kdenlive, Shotcut, etc.
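If anyone wants to do that conversion without EyeFrame Converter or WinFF, a rough equivalent with plain FFMPEG could look like this (the filename and profile choice are just examples):

```shell
# Transcode an h.264 source into Apple ProRes 422 in a MOV container,
# which is intra-frame only and much lighter to scrub and cut.
# -profile:v 2 = standard 422; 0 or 1 give smaller Proxy/LT variants.
ffmpeg -i camera_clip.mp4 \
       -c:v prores_ks -profile:v 2 \
       -c:a pcm_s16le \
       camera_clip_prores.mov
```

Expect the ProRes file to be several times larger than the h.264 original; that is the trade-off for easy editing.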
Edit: @Fathom Story, now I see your points 4 and 5, so you're aware of the h264 problem. In case you have the GH4: I'm sure you can record in another codec, like QuickTime or AVCHD, maybe some kind of intra codec that is nicer for editing. I don't have the camera, but you should check these options and test…
@calimerox My understanding is that QuickTime and AVCHD are 'containers', and under the hood it is still H264. I went on the FFMPEG IRC and they suggested an open-source codec that is an industry standard, but so large that my machine cannot handle it. Any conversion tends to mean significantly larger file sizes. It is stupid that I have a camera and cannot edit a video because manufacturers and editing-software developers do not communicate with one another. Now I am stuck with something that ought to be a breeze to edit, but is not. I hate them all. It seems better and wiser, based on all this input I am getting, to simply give up. What really happened is the major players just dropped support for H264 and I am left with legacy crap.
I am not cleaning up sound in an NLE (though that would be nice), rather adjusting levels and deciding which bits on which track are more useful than the others. The 'real' editing is still done in the final mix.
My understanding is that quicktime and avchd are ‘containers’ and under the hood it is still H264
I think you are right, but nonetheless NLEs sometimes behave strangely with some of them and like others more… It's worth a try…
And yes, it can be pretty frustrating, all these codecs etc. That's why I love sound!!
I'm not sure dropped support for h264 is the reason; it's generally problematic to edit with h264. To compare it with sound, it would be like editing mp3s directly in a DAW. Therefore, AFAIK, all workflows using h264 as an input format involve proxies (or some internal conversion, like Final Cut does, which is "the same").
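As a rough illustration of what generating proxies can look like outside the NLE (an assumed layout, with a `proxies/` folder created alongside the sources), something like:

```shell
# Make half-resolution, intra-frame-only proxies of every mp4 in the folder.
# Edit with the proxies, then relink/conform to the originals for the final render.
mkdir -p proxies
for f in *.mp4; do
  ffmpeg -i "$f" \
         -vf "scale=iw/2:ih/2" \
         -c:v mjpeg -q:v 5 \
         -c:a pcm_s16le \
         "proxies/${f%.mp4}_proxy.mov"
done
```

Many NLEs (Lightworks included) will generate and manage proxies for you, so this manual route is only needed if you want full control over the files.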
BTW, which version of Lightworks are you using, and with which distro? In my experience Lightworks got super stable with version 14.5 and is the best NLE out there.
Only a quick moment so I will come back later to answer more:
Not quite. MOV (commonly called 'QuickTime') is a container; AVCHD is also a container, yes, but in my experience it tends to refer to a specific h.264 encoding as well. MOV can contain multiple different encodings: h.264 is one of them, common for consumer and delivery video, but it can also contain ProRes, for instance, which is much better for editing. There are lots of options out there for editing, but generally the less compression the better, which is why editing formats take up a lot of room (though there are ways to bend that rule as well).
When Calimerox mentioned Prores he was not referring to h.264.
There are many reasons why h.264 is not commonly used as an editing format. One is that it can be fairly heavy to decode; another has to do with inter-frame compression: h.264 typically writes out a full frame (an I-frame) only every X frames (for instance every 60 frames) and just records changes in between, so to decode and edit at an arbitrary frame you may have to decode everything back to the previous full frame. It isn't a matter of camera manufacturers and editing software not getting together, but a matter of compromises made when creating the cameras. There are cameras that will record in an edit-ready format, but they are much more expensive, and generally consumers don't want to deal with the size of disks you need to record in that format for their home movies. h.264 is a compromise for cameras like the GH4: it can provide good enough visual quality for editing without needing so much space, but that doesn't mean it is a good codec to edit in.
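You can actually see that I-frame spacing for yourself. A quick way, assuming ffprobe is installed and `camera_clip.mp4` stands in for one of your files:

```shell
# List the frame types of the first chunk of a clip.
# Long runs of P/B frames between the I frames are exactly what makes
# frame-accurate cutting of h.264 expensive.
ffprobe -v error -select_streams v:0 \
        -show_entries frame=pict_type \
        -of csv=p=0 camera_clip.mp4 | head -60
```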
I'll be back later, just wrapping up my dinner break at work now, sorry.
@calimerox “Worth a try”. Wow. Ahem. I tried every video editor available for Windows and Linux. All of them. Blender is the one editor that gives me the least amount of grief for the big project.
For smaller projects, which are about 90% of the work I get, I have used Lightworks 14.5 on Ubuntu 18.04, paid edition (they had a promotion, a hundred dollars for a year), for about a year. Prior to that, it was DaVinci Resolve Lite (on Windows) for a year, and before that, Adobe CC. Meanwhile, I have tried other editors to look for a viable, reliable replacement.
So the fine people on the Lightworks forum had me inspect my footage with a little program on Linux called 'MediaInfo', and they discovered that, regardless of container, my camera outputs H264 video. Here is a pastebin of one segment: https://pastebin.com/xr3VuuFx Yes, I can re-encode it, but I will need to buy a bigger hard drive, preferably an SSD. Mind you, their marketing material says that Lightworks can handle H264. I guess only to a point: whatever the format is, Lightworks will only tolerate so much. Adobe CC used to handle the footage fine, but then they stopped (or have limited support). Why? Because they wanted to save money (they really said that), because we all know Adobe is hurting for money. We need to take up a collection for them.
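For anyone following along: checking what codec is really inside a container, regardless of the file extension, is a one-liner with either tool (`footage.mp4` here is just a placeholder):

```shell
# MediaInfo: report the real video stream format and profile.
mediainfo --Inform="Video;%Format% %Format_Profile%" footage.mp4

# Or the same with ffprobe:
ffprobe -v error -select_streams v:0 \
        -show_entries stream=codec_name,profile -of csv=p=0 footage.mp4
```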
You do not know what clients pay me. Less than minimum wage. I am being driven by pure will/madness. So getting funds to record with proper gear in proper formats and doing the right, proper, industry-standard thing (which means buying a Mac and their closed-source software) is not happening anytime soon. Lightworks, for the big project, will not happen until I can buy a bigger drive and re-encode. The best horse right now seems to be Blender. I load hours of footage on the timeline and Blender seems to be okay. So I will try, through force of will and a little more madness, to work on the project in Ardour/Blender/GIMP/Inkscape et al. Maybe I can learn/share something in the process, like leveraging the link between Ardour and Blender via Jack.
I'm not sure what there is to 'wow' about: I mentioned trying other containers than the one you mentioned, mp4, but I got no answer. It's pretty hard to give advice or solve problems when substantial info is missing, like: did you try other containers? What is your OS? Etc. I have been editing mp4s with a proxy workflow forever and it just works, so there are ways to make things work out. And it's not about budget: there are feature films edited on open-source software with cheap cameras. When your client doesn't want to pay a few bucks for a hard drive to do the project, maybe that's a bad start from the beginning. Maybe this is also the wrong forum to talk in depth about video editing…
@calimerox I have the feeling you are not reading the thread. The reason I write that is because a lot of your questions are already (repeatedly) answered.
I agree, this forum is not the place to talk about video editing.
What would be useful to know, which was my original question, is how one links Ardour with Blender via Jack. I have one part of it figured out, but it seems really limited, to the point of being useless. So perhaps some good Samaritan can show, via video (and in English, because there is a Spanish video and I cannot speak Spanish), how to make the link useful and productive…?
As has been answered above, it is not likely to do what you want.
If you don't feel like it was answered above, can you clarify exactly what you want it to do? For the most part, the most apparent things it does are twofold:

1. Lock the transports in sync at the audio-frame level, so that playback in one application is synced to playback in the other.
2. Allow audio routing between applications (assuming both are using the JACK backend).
As I said above, I don't consider the GH4 a bad camera at all; I shot with it for years professionally, and I know many people who do and who put out great work. But it does have compromises to reach the market it was made for, and for many people those are fine. Most people doing what you are doing would invest in larger hard drives, as you have apparently been told elsewhere as well as here, and use a different format for editing, so they wouldn't have the same problems you are having.
Maybe I am reading too much into the text, but it sounds like you are getting offended when no offense was intended on anyone's part.
And to confirm, yes the GH4 does record in h.264 for video, there is no way around that short of using an external recorder and larger drives.
This does not require Jack at all. The basic process would look like:
In the NLE:

1. Sync the audio clips together on a timeline.
2. Sync the video clips on top of the audio on the timeline.
3. If the NLE supports it, import that timeline into a new timeline so that it appears as a single clip. Some NLEs support this, some do not, and some do it another way, by grouping clips or whatever terminology they prefer.
4. Edit the clip into the final video; just make sure you are editing the audio as well as the video when you make cuts etc.
5. Export the video (possibly with a reference audio track) and the audio tracks separately (a stem export), so that you have one file for your finished video and one file for each audio track, all identical in length.

In the DAW/Ardour:

1. Import the video (and, if used, the reference audio track) into the timeline, typically converting it to a lower quality just for sync at this point to speed things up.
2. Import the audio files into the timeline; if you did the export from the NLE correctly, they should all line up if they all start at the same point in time (01:00:00 is a standard time to use for this, but the session start could be fine as well).
3. Confirm sync between audio and video.
4. Edit and process the audio tracks.
5. Export the final audio to a file.
6. Either in the NLE, or using separate software like FFMPEG, mux the final audio and video together to create a finished, deliverable product.
I am not offended, I am frustrated, which are separate things. Specifically, I am frustrated that our tech, from media generation to editing, is so bad that I cannot directly edit what I shoot. I used to be able to do that on, say, Adobe CC, but can no longer. It is like going to the bathroom and a new policy is in place where you first must go through a maze before you can go. Editing ought to be fun and easy, but for larger projects it is not. I feel ripped off by the manufacturer and the major corporate editing platforms. People suggest things that are not tenable for me at this time. That is why I am searching for ways to achieve the objective with the meager resources I have.
"This does not require Jack at all. The basic process would look like:" <— Ahh, okay. So I saw this video where Ardour and Blender were connected via Jack. It looked like there was some sort of syncing going on, but I am not 100% sure why. The video was in Spanish. The things you describe are what I do anyway, but I feel there is probably something cool out there that I cannot access yet.
@Fathom Story I will put it here for people who still want to follow this thread without rereading the whole thing: yes, this info about your OS and machine did come up once:
The machine I edit on is a desktop with an AMD Athlon X4 880K CPU, an RX 560 (4 GB version) graphics card, 16 GB of RAM, and SSD drives, with Ubuntu 18.04.
@calimerox I have the feeling you are not reading the thread. The reason I write that is because a lot of your questions are already (repeatedly) answered.
Interesting viewpoint. If you check the very first answer you got in this thread, it's me asking you to provide the video you talk about (and you provided, as a link, the YouTube search results for Blender and Ardour; not helpful), and you are still talking, three years later, about some mysterious video where something appears to work but no one knows what it is. I couldn't find info about your version of Lightworks, therefore I asked, so I am not sure how you get these impressions.
If you are "not offended but frustrated, which are separate things", please keep it out of the discussion anyway, as it is no fun at all to communicate that way.
I see this thread going nowhere for too long, and everything has already been answered above in depth a few times.
@calimerox I provided the version of Lightworks. I am not sure what you are going on about. Perhaps reread the thread a bit slower…? You may find the answers to your questions in more than one post.
: - )
As for offended/frustrated, that was addressed to the good Seablade, who mentioned "…it sounds like you are getting offended…", which I am not. It is an important point to clarify: it is the editing situation that is difficult to deal with (being cheated by a product that produces a bad codec, which should not have to be a situation in an ideal world), not the process of problem solving.
Back to the original query about Ardour/Blender: I saw a few more (newer) videos on the topic, in French, and it seems the link between the two programs is more for a finished product than for the editing process. That was my query: can this software link be used as part of the editing workflow? I now realize: no, it cannot, only for the end result. This is valuable knowledge.