Suggestions to improve MIDI workflow in Ardour v7

Please read a post before replying. The first line was “If I am editing drums”. Drums are different (in most MIDI setups) from any other instrument. With drums, each note is a different instrument. This is why drum-kit synths fan one MIDI channel out to 16 or so audio outputs. Please do not expect the same thing to apply when all the notes in a MIDI channel are the same instrument and can be easily treated with one automation track.

I am beginning to think my explanations are really bad. I have not had so many people find the wrong meaning in things I have said before.

By the way, I just had to comment that changing velocity on the 1980s QX7 was harder than anything in Ardour.

With Ardour this can actually be managed by having multiple MIDI tracks into one MIDI bus - the bus has the actual drum sampler, the MIDI tracks each for one part. I did this for most of my tracks to make the MIDI editing a little easier on my eyes (I had it broken up by snares+kicks and “tops”, which included all cymbals and toms), but I’m not sure if it’s a faster way overall to do it. I think if I had it to do again, I’d do the MIDI editing in Zrythm and then migrate the MIDI into Ardour, so I’m not editing the MIDI data in Ardour.

Again, I appreciate the suggestion. (And for the record, I will continue to use Ardour either way. I’d love to be able to do it all in Ardour, but I will keep using Muse if I have to.)

On the other hand, imagine programming some complex drums with each instrument on a different track! It’s simply not feasible.

As far as processing the different instruments separately, this is already done with plugins. DrumGizmo, for instance, has 16 outputs, separating the instruments. So, they are processed separately, but the “performance” is coming from the MIDI track.

Hmm… Conceivably, that could work… I could program the drums in a single MIDI track. Connect these to individual MIDI buses filtering each instrument. (Assuming that there is a MIDI velocity plugin, these could be automated by drawing the automation curves.) Then, connect all these individual instrument MIDI buses to another MIDI bus with a single instance of DrumGizmo to actually produce the audio.

It’s a lot of steps, but I might try someday.
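For what it’s worth, the per-instrument filtering step could be sketched like this (plain Python, nothing Ardour-specific; the event format is invented for brevity, and the note numbers are my own assumptions loosely following the General MIDI drum map):

```python
# Fan one drum MIDI stream out into per-instrument lanes by note number,
# mimicking one filtering MIDI bus per drum instrument.

# Each event: (time_in_ticks, note_number, velocity)
events = [
    (0,   36, 100),  # kick
    (0,   42, 80),   # closed hi-hat
    (480, 38, 110),  # snare
    (480, 42, 70),   # closed hi-hat
]

INSTRUMENTS = {36: "kick", 38: "snare", 42: "hihat"}

def split_by_instrument(events):
    """Fan one drum stream out into a list of events per instrument."""
    lanes = {name: [] for name in INSTRUMENTS.values()}
    for time, note, velocity in events:
        name = INSTRUMENTS.get(note)
        if name is not None:  # ignore notes outside our map
            lanes[name].append((time, note, velocity))
    return lanes

lanes = split_by_instrument(events)
print({name: len(evs) for name, evs in lanes.items()})
# {'kick': 1, 'snare': 1, 'hihat': 2}
```

Each resulting lane could then get its own velocity automation before everything is merged back into the bus that feeds the sampler.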

Thanks for the suggestions.

Simpler than your suggestion is to have a track for each drum instrument, and feed all those into a MIDI bus which runs DrumGizmo - the bus then feeds the audio data into all the fanned-out audio tracks. I’ll add this to my list of videos to make once I’m done moving.

As I said above, my problem with that is that it would make it extremely difficult to program complex drums across different tracks. My suggestion was made exactly to remedy that.

(On the other hand, another downside with that would be that there would be no visual representation of the notes in the individual instrument MIDI buses…)

There is a velocity plugin already in place. The Ardour fader when presented with MIDI will fade by setting velocity and can be automated.
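Conceptually, a velocity fade is just a gain applied to each note-on velocity. A toy illustration in plain Python (the exact clamping and rounding here are my assumptions, not necessarily what Ardour does internally):

```python
def scale_velocity(velocity, gain):
    """Scale a note-on velocity by a fader-like gain, clamped to 1..127.
    (Velocity 0 would turn a note-on into a note-off, so stay >= 1.)"""
    return max(1, min(127, round(velocity * gain)))

print(scale_velocity(100, 0.5))  # 50
print(scale_velocity(100, 2.0))  # 127 (clamped to the MIDI maximum)
print(scale_velocity(10, 0.01))  # 1  (never silenced into a note-off)
```

Automating the gain then gives a drawable velocity curve over time, which is exactly what the fader-based approach provides.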

A general reply to the thread:
The purpose of all the ideas I have floated in this thread is to try and understand what different workflows there are and how they relate to the internal workings of Ardour. The idea of taking a track, splitting it up, automating, mixing these tracks to a synth or drums and splitting it out again is not the right way to work, but it does show that Ardour has the right internals to do the job.

  • Drum and melody MIDI editing are very different from each other, and any workflow for one will not be optimal for the other. MIDI drum workflow is not at all the same as analog drum workflow. The MIDI drum creator expects all the drums to show in one place. Splitting them into different tracks is not a good idea, but being able to treat each note separately would be.

  • Plugins could make a difference. A drum velocity plugin (Lua?) that could present velocity automation for up to 32 notes, and allow naming of those lanes, could be helpful. The same plugin could have a switch from drum mode to melody or chord mode as well.

  • A working Bounce mode for MIDI would allow baking these changes into the MIDI file itself. A MIDI bounce mode that only included plugins before the audio generator would be another path. An auto-bounce that applies automation directly as edits sounds scary and could create more trouble than help, but could be a step forward too.

  • The mouse scroll wheel already changes the velocity of a hovered-over note, for single-note edits.

So yes, lots of things can be improved, but some of these workarounds can be tried to see if things like an automation lane for velocity make sense in the first place.

I wish I had answered more quickly. Unfortunately, I was unable to do so.

Any list editor could probably be tweaked to act as a musical tracker, by adding a set of conditions that constrain its use.

But Cubase’s is much more than that. It was designed to address the shortcomings inherent in a piano roll editor (viewing overlapping notes and the position of non-note MIDI events in relation to each other).

Sweet memories… I also started with Cubase on Atari. Its list editor was more complete and functional in 1990 than most list editors today.

Here are screenshots showing the list editor in Cubase VST/32 5 released in 2000 (with the open SysEx sub-editor, which has not changed much since), Cubase 6.5 released in 2013, and Cubase Pro 10 released in 2020.

The same MIDI patterns are used for the Cubase 6.5 and Cubase 10 screenshots, with the same events being muted and selected.

Some essential improvements have been added over the years:

— end note position
— all fields are now scrollable, in addition to being double-clickable and modifiable graphically with the mouse (except the Type field, but that one can be changed via the “Logical Editor”)
— colouring of events according to their nature, pitch, MIDI channel etc. (old Cubase 6.5 is better in this respect than the latest versions)
— text events (non-MIDI event for entering arbitrary text, used as personal notes in the list editor or performance instructions in the traditional score editor)

Some unfortunate losses:

— distinction between Snap and Quantize (*)
— distinction between SysEx from different manufacturers
— display of the pattern name, start and end positions (in the top bar of the window)
— information density, clarity and contrast going downhill

(*) “Snap” defines the graphical grid and position adopted by MIDI events when interacting with them graphically (of course it can be off). “Quant” defines the position taken by selected MIDI events after invocation of the Quantize command (not limited to notes). This position can be from a rigid grid or any arbitrary predefined groove. The two are now merged and it’s sad (the Cubase 6.5 and Cubase 10 screenshots show this, where the irregular grid is that of the selected groove).
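To make the Snap/Quant distinction concrete, here is a toy sketch (plain Python; the PPQ value and groove slots are invented for illustration) of a rigid snap grid versus an irregular groove quantize:

```python
PPQ = 480  # ticks per quarter note (an arbitrary but common resolution)

def snap_to_grid(pos, grid):
    """Snap: move a position to the nearest multiple of a rigid grid."""
    return round(pos / grid) * grid

def quantize_to_groove(pos, groove, cycle):
    """Quantize: move a position to the nearest slot of an arbitrary
    (possibly irregular) groove that repeats every `cycle` ticks."""
    base, offset = divmod(pos, cycle)
    nearest = min(groove + [cycle], key=lambda slot: abs(slot - offset))
    return base * cycle + nearest

# A 16th-note grid pulls tick 130 to 120; a swung groove pulls it to 160.
print(snap_to_grid(130, PPQ // 4))                       # 120
print(quantize_to_groove(130, [0, 160, 240, 400], PPQ))  # 160
```

The point of keeping the two separate is visible in the example: the same note lands in different places depending on whether a rigid grid or a groove template is in charge.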

So true.

Recording and mixing with a computer? Amazing. No one would have dared to imagine 30 years ago what is possible today.

Writing music with a computer? Eww. We have to fight the computer a little more every year.

The Cubase List Editor handles this with its horizontal succession of bars on the right.

The horizontal succession of vertical bars is a graphic representation of one aspect of the music, i.e. a score. It is indispensable, although it could indeed be improved when several notes fall at the same time (colours or grayscale?).

The same representation should be added for off velocity.

I hope that terminology from, I believe, Pro Tools (“Region”, “Session”) will not somehow limit future first-class MIDI implementation. “Region” is not really a musical term and Ardour documents are already a bit more than recording “sessions” (because of its piano roll, however rudimentary).

This will be even more the case in the future. First-class MIDI means that Ardour is intended to fully evolve from a recording software to a music composition software. Except in specific cases, recording is a part of the writing process, not the other way around.

Why do I say this? Because MIDI is currently treated by Ardour (and most DAWs that have appeared in the last two decades) like data that is recorded, as hard to see and manipulate precisely as waveforms. That’s wrong.

From the musicians’ point of view, MIDI is a score that we write. Sometimes we play it, the computer listens and writes it for us (I note that for now, Ardour “records” MIDI automation but “writes” native automation when it is conceptually the same thing).

Then, in order to work on our MIDI score, we must be able to read it, hence the importance of the way it is presented to us on the screen.

The same applies to a musical text as to a literary text: context is king. There is no point in presenting just one letter or word at a time. You have to reread the whole sentence, the whole paragraph, the whole chapter, the whole book to be able to make the right change.

While I’m at it: another very important thing that is widely misunderstood is that we need to be able to visually compare proposals with each other instantly.

A more fluid and powerful interaction model would be to single-click on any “item”, including regions, wavs, native automation and MIDI events, to display its metadata in an info line or floating window always present on the screen (even when empty — unless voluntarily closed by the user). In practice, this means that selecting an item is enough to display its info.

Use the arrow keys to go from one item to another, shift-arrow to select more than one item.

Several items can be selected and edited at the same time, relatively or absolutely (of course, when the items are different in nature, only the relevant data can be changed).

There is a lot to like about Ardour’s current floating windows (very readable and thorough), but they could be much smaller and present information horizontally rather than vertically, to hide less of what’s underneath. The ability to display one floating window per item (only regions for now) is great and could be made optional via a preference (not always desirable, because it is overwhelming when many items are selected).

Isn’t the current Ardour list editor a dedicated window? By combining a vertical and horizontal representation of the MIDI events list, a Cubase-like list editor could probably respect this policy, just saying :wink:

The usefulness of the bar graph has nothing to do with speed. It’s part of a score that you can read with a context, use to get a mental picture of your music, and then make a decision.

Indeed, they can, via MIDI plugins, which can modify live incoming or outgoing MIDI streams (BTW don’t we need an industry standard for MIDI plugins?).

But first, the MIDI score and the best way to represent it on the screen. Secondly, the MIDI plug-ins, although they are absolutely useful and should be part of a first-class MIDI.

Absolutely. Note w/ MIDI number, start and end position, length, on and off velocities, MIDI channel.

The same goes for any MIDI message.

They are not “very” different. You’ll eventually want to control the length and velocity of your drum hits, to place them alongside your melodies, chords and other MIDI messages, and you’ll want to play melodies with your 808s.

If well designed, any toolset that can help one can help the other. The truth is that MIDI is extremely homogeneous. This homogeneity is a strength to be leveraged.

Composing and producing in MIDI means writing a score, more modern and precise than a traditional score, where the interpretation and the timbre are already partly defined.

Given that you have not seen Nils’ work, I think this is a bit of an unfair comparison. From the rest of your reply, I am also going to guess that you’re not particularly skilled at the use of trackers - that’s fine, very few of us are. I have a friend who has introduced me to some of the “peak tracker” work of the 90s (music created entirely in then-current trackers) and it is frankly mind-blowing.

This (a) consumes screen real-estate (b) for some types of objects there is a huge amount of meta-data to present, which the vast majority of users neither want to see nor to have to choose what to see (c) fails completely in the case of multiple selection (d) ignores selection by other means (i.e. you don’t mean “click on something”, you mean “when something is selected”)

Vertical vs. horizontal display is a complex question. For some workflows (and on some monitor setups), not obscuring more of the timeline than necessary is primary; for others, obscuring less of the vertical axis is important. It’s not good enough to just say “add an option for this”.

Depends a lot on what “writing music with a computer” means. If generative, modular music is your thing (it’s one of mine), then we are in a golden age of “writing music with a computer” at present, and it’s only getting better.

On the other hand, there seem to be a significant number of people for whom “writing music” should always involve the western staff and notation, and any MIDI-piano-roll-list-editor approach is instantly inferior for their processes.

It is hardly indispensable, when western musical practice has not had any equivalent of this in its entire history. Notated scores do not (as a rule) mark the attack/volume/tone of individual notes, but generally show trends and vague hand-waving descriptions. Is it useful anyway? Sure. But let’s not get ahead of ourselves with claims about what is and isn’t indispensable. There are many (most, even) fine musical traditions that do not even have notation at all.

I would disagree with this. I think what is happening is that the creators of those DAWs are following a “writing model” that differs from what you’d like to see.

This is true for some kinds of literary work, and not for others. When you change the name of a character, global-search-and-replace on a word-by-word basis is precisely what you want. When you change a character from a 9 year old boy to a 90 year old woman, or the location from the coast of Ireland to the steppes of Uzbekistan, you need different tools.

What I really want to say in summary is this:

We know that Ardour is currently not a great tool for composing music with MIDI. We also know that there are huge variations in the way people want to compose music on computers, and that any claims that there is a right way to do it are nonsensical from the beginning. Rather, there are many possible variations that one could implement/use, and these would have varying appeals for different groups of composers and workflows (and music styles). The Ableton Live workflow, for example, has essentially no relationship to Cubase or other “classic” MIDI sequencers, and yet turned out to be wildly popular among a large number of (mostly younger) composers. So the question is not “what’s the right way to do this?” Instead it is closer to: what workflows do we try to support well? Which ones (if any) do we abandon? Which do we marginalize?

And finally, I’d note that to this day, I am still generally unaware of much music that I’d consider “great” that was composed using any form of direct MIDI editing. I’d be happy for examples that “correct” this experience, should you be aware of any.


Which musician(s)? Any musicians I know of write from something that produces sound in real time. I would echo:

As a musician who cannot read music in real time (dyslexic), and so doesn’t bother thinking in music as written either (who came up with that horrendous way of depicting music anyway?), writing music as MIDI notes directly into a DAW does not make music unless each note is fiddled with to correct timing. A very time-consuming and unmusical approach. Rather, input the MIDI as a recording of a performance and edit only the odd wrong notes. A good musician will have few of those, and their sense of feel will be far superior to anything done with mouse clicks.

They are… unless you are willing to accept the drum sound generator’s idea of level, EQ, effects, etc. With drums you have one MIDI channel with 16 or more instruments (or in some cases mics). Any proper editing of a drum track would really be drum tracks, because to properly deal with it you would split each note to a different MIDI track, the same as if one were recording live drums. It would make sense to be able to automate each note (instrument) separately. With a chord or melody instrument, none of this is true. Drums are totally different in MIDI than other instruments, and any composer who does not understand that will make sub-optimal music.

Everything is there. MIDI has been an industry standard since the 80s, and all plugin formats that deal with MIDI at all have a standard way of doing so. I think you more likely mean a standard way of dealing with MIDI in the GUI… by which you mean they should all do it the way you like. However, while the way you like is probably reasonable for a large number of people, I suspect an equally large number of people have made plugins with a different GUI because that makes sense to them. It works better for their workflow. There is no one correct way. There are a large number of bedroom composers who love to make music with minimal cost, with a 24-key MIDI controller and a computer. They are happy with their results, and some even receive radio play (unfortunately). None of that compares to a live performance in any way, from my point of view. There are some who would take that to mean a studio recording is superior (certainly more correct), but I prefer the musician to communicate their heart in the music.

I had stopped participating in this thread, but this sentence made me come back.

I believe what you said earlier about personal preference regarding workflows can be applied here. I don’t know you personally, nor do I know your musical tastes, but I can assure you there are great examples of music (especially electronic music) composed/written using direct MIDI editing/programming that are awesome to many people.

Of course, as I said, it is subjective, but your sentence reminded me of many musicians I have spoken to who listen to certain musical genres and consider different kinds of music inferior, or not even music at all. 100% of the time those musicians were listening to music that is mainly composed using traditional instruments (guitars, keyboards and such), music that tends to be recorded, and were considering electronic music (a super general term, but it makes sense) inferior because of the use of synths and MIDI editing (“because you don’t play an instrument, you know”).

I don’t want to attack you or even put you in that category of people/musicians, but that sentence and the ideas correlated with it feel to me a bit elitist, not very open-minded and simply wrong.

Here are some examples of music I like (I repeat: I personally like) that are made mostly using MIDI editing/programming:
Noisia - Shellshock ft. Foreign Beggars (Invidious link)
All of their music focuses a lot on sound design, and in some interviews they said they always MIDI-program their drum sets; if you listen to this song’s drop/chorus you can definitely hear why.

KOAN Sound & Asa - Starlite (Invidious link)
Their music also focuses a lot on sound design to make all sorts of sounds and drums; MIDI programming is essential here too. I chose this one of theirs because I think it is a masterpiece, using both orchestral sounds and insane synths.

These are just two examples that come to my mind right now but I can assure you there are many many more.

Sorry if I sounded rude in this post, I didn’t mean to attack anyone, just wanted to share my views.

I use Ardour a lot and I think it’s the best piece of software I use, but given the music I make, I feel that MIDI composing (not MIDI editing in general) is a little behind, as many other users have said in this thread and others. I know that not every tool is right for everyone, but Ardour is, in my opinion, close to perfect in every other sense, and it just needs (again: my opinion) some adjustments to make it awesome for MIDI composing/programming too. Please don’t consider it a class-B feature just because the music you like/make doesn’t use it, or uses it only a little: as I have, hopefully, shown you, there are many music makers, including me and all the people replying to this thread, who use it to make beautiful and fun music, sounds, whatever (maybe @unfa can help this discussion by sharing his views or pointing you to more MIDI-made music).

Have a nice day Paul, thank you for your work.


You are absolutely right. I myself come from a guitar-drums-bass-vocals background, but it would be a major issue to implement my musical ideas the traditional way: no studio, no musicians, and not enough money to rent either.
I’m not saying my ideas are great and will save the musical world; however, to express them anyway, I am moving more and more towards a mixture of MIDI and audio, with an increasing share of MIDI programming. While I can get along with the current facilities of Ardour, I would also appreciate any improvements on the MIDI side.
And remember: You do not have to be a brilliant artist to use Ardour!


I want to be very clear. I listen to A LOT of electronic music. Probably at least half of my daily listening (and I listen to music more or less all day, every day) is either 100% electronic or mostly electronic.

My remarks about the results of direct MIDI editing were specifically about nothing but direct MIDI editing. That is, using some technique to directly manipulate MIDI at the individual event level inside the program (using the mouse, keyboard, touch or some combination). They were not about using groove generators or pattern sequencers. They were not about using clips/loops in the context of a trigger-driven setup (AKAI MPC or Live, or anything in this area). They were solely about stuff done by someone sitting in front of a piano roll or an event list and working with it analogously to the way one might work with a western notation staff.

They were not about electronic music in general, which I have enjoyed since I was 14 (that’s quite a long time ago now) - perhaps not all of it, since it’s a huge genre that really doesn’t deserve a single name, but certainly huge chunks of it.

Having said all that, if either of the pieces you linked to (I listened to both of them) were in fact done mostly with direct MIDI editing, then I stand corrected.

While we’re at it, the electronic music I enjoy most is made with synths that do not use MIDI for the score, e.g. SuperCollider, Csound, etc.

I think it would be cool if we could integrate something like AlgoScore and have a more Buchla-like approach to things.

Just dreaming.


Oh my, I’d already be thankful for a single program change (and perhaps bank change) field for each MIDI track, so I could automatically set the right preset for my hardware effects instead of ending up with an export that went through the wrong master FX whenever I forgot to set it manually.

There’s lots of room for confusion there. Program changes could be considered per-track, or per-region, or neither of the above. You can trivially insert a PC into a MIDI region, but sure, if you edit or remove that region, you may lose the PC message.

The assumption that there’s a single PC for the track is common to many workflows, but not all.
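For reference, selecting a bank and program is a three-message sequence in standard MIDI 1.0 (Bank Select MSB/LSB, then Program Change); this is generic MIDI, not anything Ardour-specific. A minimal sketch in plain Python:

```python
def bank_program_select(channel, bank, program):
    """Build raw MIDI bytes: Bank Select MSB (CC 0) and LSB (CC 32),
    followed by a Program Change, on the given 0-based channel."""
    msb, lsb = (bank >> 7) & 0x7F, bank & 0x7F
    return bytes([
        0xB0 | channel, 0x00, msb,       # CC 0:  Bank Select MSB
        0xB0 | channel, 0x20, lsb,       # CC 32: Bank Select LSB
        0xC0 | channel, program & 0x7F,  # Program Change
    ])

print(bank_program_select(0, 1, 5).hex())  # b00000b02001c005
```

Whether those eight bytes live on the track, in a region, or somewhere else is exactly the design question being discussed here.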

Undisputed. But in the quite specific case of bank select and program select a single value sent with the “PLAY” MTC should definitely do the job. I never heard of any MIDI device that would change program or bank because it did not get a program change signal - and I’ve seen some pretty neurotic ones.

MTC has absolutely nothing to do with this. For most users, Ardour will typically never send MTC “Play”, and very few devices will respond to it. MIDI Song Position Pointer messages are a little more likely, but also not related in any way to PC messages.

Ardour’s not going to send a PC message that doesn’t exist, so if your device changes programs it’s because the MIDI you’re sending it contains PC messages.

Ardour remembers the most recently received bank/program change per track and replays this on session load. Perhaps there’s a bug where it’s replayed too early for the external synth to receive it? What backend do you use (Menu > Window > Audio/MIDI backend)?

As @paul mentioned, this can be overridden by adding explicit bank/patch changes in a region and playing through it.

Can you elaborate how this failed for you? How did you set the program?

Also, could you add “ACE MIDI Monitor” to the track in question and see if the event is played?

First of all, I didn’t set the program. Not via software, that is, because I couldn’t find a way to set bank or program. So I set it manually - but since effects don’t react to “Play” or “Stop” there was, of course, no feedback. I might, however, try to record MIDI next time and set the program while recording; my TCE compressor, at least, should send the CC to MIDI out.