I have arranged a couple of clips in the cue grid and put some cue markers on the edit view. Now I would like to record them to a normal audio track so I can edit them and apply some effects. My first idea was to select the cue marker track as the input for the audio track via the track tab in the routing grid, as that is how you do similar tasks in Ableton Live. Am I completely wrong here, or is there another way to achieve this?
There are some tutorial videos by Harrison about Cues. Should be the same in Ardour. This one probably answers your question: Bringing It All Together on the Cue Page | Mixbus32c V8 - YouTube
The practical answer, for now, is: do a ‘Stem Export’, select the Cue track, and check the box to re-import. That will create a new audio(*) track for the rendered clips. Now you can Mute, Disable, or Remove (delete) the source Cue track, whichever fits your needs best.
(*) If it’s a MIDI clip and you want to preserve the MIDI (not the audio), you can un-check the “apply processing” box during stem export, and you’ll get a linear MIDI track instead.
Longer-term, there are many design questions to resolve:
- don’t you really want that clip information rendered “to the same track”, preserving all the instruments, plugins, and such, so you can continue tweaking?
- if you do that ^, how do you decide whether you hear the ‘region’ versus the Cue player? Currently this is done via the input-monitor switch, but maybe you’d want one side to have ‘priority’ over the other, depending on which is playing (?)
- when printing the data to the track, don’t you really want some kind of looped region, or multiple copies of the region, rather than a single linear track that can potentially take up disk space even though it’s the same data repeated over and over?
- if you do that ^, the regions must be ‘stretched’ to match the tempo at which they were played by the Cue system, which implies we need a special kind of ‘stretchy’ region on the timeline (as seen in some other DAWs). These regions could potentially re-stretch if you changed the tempo, or if you copy/pasted them into another song with a different tempo. (Currently a MIDI region would stretch but an audio region wouldn’t.)
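To make that tempo-matching point a bit more concrete, here is a minimal sketch of the ratio math a ‘stretchy’ region would have to carry around. The struct and names below are made up purely for illustration; nothing like this exists in Ardour today:

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical sketch: how a "stretchy" audio region might compute its
// playback stretch factor when the session tempo differs from the tempo
// the clip was authored at. None of these names exist in Ardour.
struct StretchyRegion {
	double  source_tempo;    // BPM the audio was authored at (e.g. from clip metadata)
	int64_t source_samples;  // length of the source audio in samples

	// Ratio by which the audio must be time-stretched to sit at the session tempo.
	// > 1.0 means the region gets longer (the session is slower than the source).
	double stretch_factor (double session_tempo) const {
		return source_tempo / session_tempo;
	}

	// Length the region occupies on the timeline after stretching.
	int64_t stretched_samples (double session_tempo) const {
		return (int64_t) (source_samples * stretch_factor (session_tempo));
	}
};

int main ()
{
	// A 2-bar loop authored at 120 BPM, 4 seconds long at 48 kHz.
	StretchyRegion r { 120.0, 4 * 48000 };

	// Played in a 100 BPM session it must stretch by 1.2x and occupy 4.8 s.
	std::printf ("factor=%.3f samples=%lld\n",
	             r.stretch_factor (100.0),
	             (long long) r.stretched_samples (100.0));
	return 0;
}
```

And every time the session tempo (or the region’s position across a tempo change) moves, that factor has to be recomputed and the audio re-rendered, which is exactly the lossy processing discussed below.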
… as you can see, this gets pretty complicated. Ableton Live does all that (and more), but I think they incur a tradeoff in sound quality because they assume “all” regions on the arranger page are stretchy. Great for DJs, but not great for pro audio.
SO, an alternative way to approach this problem might be to add some kind of ‘trigger’ region on a track playlist, which tells that track to fire a certain clip at that time, and extend it for the length of the region. In other words, the region wouldn’t have a file directly associated with it; it would just be a metadata ‘box’ that tells clip X when to fire, when to stop, and potentially to fade in and out. But the clip launcher is doing the work of actually stretching and looping the clip in realtime. This has the ancillary benefit that you can replace that clip in the grid, and all the regions would sound different. And it draws a line of demarcation between “clean linear audio” regions and “stretchy sampled loops” regions. Maybe … (?)
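For what it’s worth, here is a rough sketch of what that metadata ‘box’ might carry. Again, these are made-up names to illustrate the idea, not an actual Ardour data structure:

```cpp
#include <cstdint>
#include <string>

// Hypothetical "trigger region": it owns no audio file of its own; it is just a
// metadata box telling the clip launcher which clip to fire, when, for how long,
// and how to fade in/out. The launcher still does the realtime looping/stretching.
// None of these names exist in Ardour; this only illustrates the idea.
struct TriggerRegion {
	std::string clip_id;          // which slot/clip in the cue grid to fire
	int64_t     start_sample;     // timeline position where the clip is launched
	int64_t     length_samples;   // how long it keeps playing (looping as needed)
	int64_t     fade_in_samples;  // optional fade-in applied by the launcher
	int64_t     fade_out_samples; // optional fade-out applied by the launcher
};

int main ()
{
	// Two "copies" of the same clip on the timeline: replace clip "A1" in the
	// grid and both regions sound different, without touching the timeline.
	TriggerRegion verse  { "A1",      0, 8 * 48000, 0, 4800 };
	TriggerRegion chorus { "A1", 960000, 8 * 48000, 0, 4800 };
	(void) verse; (void) chorus;
	return 0;
}
```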
Can you explain this? In Ableton Live, every audio region can be warped by stretching, by re-pitching, or played without stretching at all.
@dspasic: I am not an Ableton expert, and note that I said “I think …”. But anyway, anecdotal evidence leads me to believe that every track/region is constantly being resampled in Live, so it can respond to all the things that a live DJ might need to accomplish on the fly. For example: globally changing the tempo is a pretty trivial thing to do in Live.
Another example is that you can use externally linked samples at the ‘wrong’ sample rate (meaning they are being converted in realtime, which is likely lower quality than a non-realtime import). Furthermore, there are several settings in Live’s general prefs (and per-clip prefs) for sound quality. All of these facts lead me to assume that Live prioritizes sound ‘flexibility’ (pitch and time changes) over quality.
Said another way: Live makes it so trivially easy to stretch and warp your audio that the audio has almost no chance of surviving unscathed at full quality. Ardour’s linear editor, by contrast, requires explicit operations to stretch or re-pitch your audio; it’s not something you’re likely to do accidentally. None of this is a knock on Ableton; Live is not a recording/editing/engineering tool, it’s an awesome arranging and performance tool. The goals are different from Ardour’s.
I would like to understand your point of view. I don’t understand what you mean by ‘changing tempo is trivial’. If stretch is enabled for audio regions, everything remains synced when editing the global tempo. We can discuss; I’ve used Live for about 10 years.
Interpolation can be selected as the type of stretch (or disabled)…
Are we talking here about the flexibility of clip manipulation (read: cue page), the quality of the audio being played back, or stretching behaviour (when modifying the global tempo)? Or just in general?
I think everyone agrees on that; it’s a different set of tools.
I think we need a high-level roadmap for the cue page… like what we want to achieve on that page, what options we have, and how that correlates to the music we make.
Yes, this “everything remains synced when editing global tempo” is a fundamental feature of Live (for live/DJ/arranging purposes), but it necessarily degrades audio quality. Something that seems trivial to the user … “speed it up, but keep the same pitch” … requires a tremendous amount of lossy DSP processing.
The Cue page was largely originated by Paul, but I became a fan and helped him develop the idea to a ‘minimum viable product’, with these goals in mind:
- lower the barrier to entry for enthusiasts who just want to make music (make it “more fun”)
- help the user find, organize, and audition clip material (whether it’s bundled, downloadable, or already on your system) in both MIDI and audio format.
- allow users to drag & drop those clips into slots, where they can be triggered and looped
- …using an interface that is recognizably similar to the Live “session” view, to leverage any experience or expectations that someone might bring with them
- play clips at the session tempo, without all the attendant complications of doing this in the editor
- allow live performance by firing individual clips or rows of clips, for realtime experimentation
- provide ‘algorithmic’ playback (via Follow Options) for generative and semi-random arrangements
- provide Cue Markers so you can finalize an arrangement (verse/chorus/bridge…) to accompany vocals and instruments on a timeline.
At the very least, the Cue Page is a fun and inspiring replacement for the metronome when recording an idea. But if you get into the details, it is very cool.
The Cue page will continue to evolve over time. We have plans to extend the number of rows, allow copy/pasting of rows, provide an audio trimmer and (someday) a MIDI piano-roll editor, provide integrations for launchpad-style controllers, print loopy/stretchy regions to the timeline, and allow recording directly into clips.
Going forward, which of those features are most important to you, @dspasic? Or is there something else we should pursue?
Changing the global tempo can also be a trivial subject… if we want to go down that route, because everyone has a unique approach to making music.
Changing the BPM up or down by a few intervals (example: from 95 to 93) would not introduce noticeable artifacts at all… and sometimes people want that ‘degradation’ of warping… the jungle genre: the Amen break… and its variations.
You are correct about quality degradation, but only when warp is used on audio regions (and not re-pitch)… By using MIDI (which is more flexible for live stuff anyway), audio quality does not degrade under any circumstances… since it is only note data…
A cue page without the ability to:
- record data directly into a clip (be it audio or MIDI)
- directly edit (at least) MIDI data (without back-and-forth juggling with the arranger)
seems useless from my POV.
Mostly, I make ideas in the ‘cue’ or clip-launch page in Ableton/Bitwig by recording and editing, and after a while I print/move those ‘scenes’ into the arranger. I develop further in the arranger/edit page by editing or re-recording parts…
In Ardour/Mixbus the workflow seems to be the opposite in my scenario:
- you can record and edit in the arranger, and then use parts in the cue page to ‘jam’ with? That’s great, but since I’ve already got it in the arranger, why bother transferring it to the cue page and then re-recording parts back into the arranger? It looks like too much effort for the same end result.
But then again, everyone has their own way of doing certain things… so this is just my personal standpoint…
Of course, I could demonstrate my usual workflow and ideas anytime, if that might be helpful…
Anyway, Ardour/Mixbus are not going to be replaced for audio editing & mastering no matter how useless the cue page is for me :)
I have also used Ableton Live for many years, and my songs often start off in the session view for the first arrangement and move into the arrangement view (linear) for add-ons like comping of guitars and vocals. To be really useful like in Ableton, the cue page should have recording into the clips and adjustment of the audio/MIDI (trimmer), plus support for more launchpad-style controllers like the Novation Launchpad Pro MK3 and copy/paste between clips and rows. And, as my initial post here asked, you should be able to print the cue page to the linear workflow once you have made the foundation in the cue page. I managed to figure out a way to do it one track at a time, but you should be able to do it for a whole arrangement.
In the long run it would be really nice to be able to extract grooves from an audio or MIDI clip and paste them onto other clips. This would also be very useful in the edit page on regions.