The MIDI 1.0 Detailed Specification defines a Bar Marker message (p. 59 if you are going by PDF pages, p. 54 if you’re looking at the numbers printed at the bottom of the pages) which “specifies that the next MIDI clock received is the first clock of a measure, and thus a new bar.” I’ve configured Ardour to send out MIDI Beat Clock. MIDI Tracer and my attached device both show that the 0xF8 (Timing Clock) bytes are sent when I start the transport, but I don’t see any Bar Marker messages.
Context: I’ve been using Ardour to send messages to a device I’m developing which I’d like to be able to sync with other stuff that speaks MIDI (e.g., Ardour, a sequencer, a synth capable of sending out its BPM, etc.). As I was thinking about how to implement this, I realized that Timing Clock messages on their own don’t really provide musical information but rather raw timing information; timing data need some reference point to gain musical meaning. While it’s possible to count ticks relative to some arbitrary event (e.g., a Note On) to, say, achieve an arpeggio effect, there’s something appealing about having more structured reference points such as Bar Marker messages in the toolbox as well.
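To make the “reference point” idea concrete, here’s a rough sketch (in Python; `ClockCounter` and the rest are names I made up, not anything from Ardour or the spec) of a device counting Timing Clock bytes relative to the first Note On it sees. It assumes the standard 24 clocks per quarter note and a fixed 4/4 bar; a Bar Marker handler would work the same way, resetting the counter so the next clock starts a new measure.

```python
# Sketch: deriving musical position from raw MIDI Timing Clock bytes.
# Assumes 24 clocks per quarter note (per the MIDI 1.0 spec) and a
# known beats-per-bar. All names here are illustrative.

TIMING_CLOCK = 0xF8
NOTE_ON = 0x90  # high nibble of a Note On status byte

class ClockCounter:
    CLOCKS_PER_QUARTER = 24

    def __init__(self, beats_per_bar=4):
        self.beats_per_bar = beats_per_bar
        self.ticks = None  # unknown until we see a reference event

    def on_status_byte(self, status):
        if status & 0xF0 == NOTE_ON:
            # Use the first Note On as an arbitrary musical reference.
            # A Bar Marker, if one ever arrived, would reset ticks the
            # same way -- but with a guarantee it lands on a bar line.
            if self.ticks is None:
                self.ticks = 0
        elif status == TIMING_CLOCK and self.ticks is not None:
            self.ticks += 1

    def beat_in_bar(self):
        """0-based beat within the bar, or None before any reference event."""
        if self.ticks is None:
            return None
        return (self.ticks // self.CLOCKS_PER_QUARTER) % self.beats_per_bar
```

The point of the sketch is just that everything after the `elif` is pure arithmetic; the hard part is the `ticks = 0` line, i.e., knowing *where* in the music you are, which is exactly what Bar Marker would pin down.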
Ardour does not send Bar Marker or Time Signature messages. There is no code to do so. It could be added.
These are very, very niche messages, and it isn’t clear what would be gained if they were added. But hey, it’s all open source, and I’d accept a non-LLM-generated patch to make this possible.
I thought not. I poked around the repo for a few minutes and didn’t find a smoking gun, but as I’m unfamiliar with the codebase I thought perhaps I wasn’t looking in the right place.
So I suspected, and so you confirmed!
The benefit the author of the blog post articulates is ease of syncing gear, e.g., Ardour could drive time signature changes to external gear (to be quantized to the next bar), which can be a bit of a pain point depending on how much gear you have and how many time changes you’re working with.
Another application (I’m just spitballing now) might be to quantize some external effect such as a filter sweep over X bars, regardless of articulation or when/how many notes are played within those bars. One could use automation for something like that, but it’s a pretty different workflow and not as conducive to live performance or improvisation (like if the performer wanted to change the effect, or the value of X).
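Just to illustrate what I mean, a minimal sketch of what the device side might compute, assuming 24 clocks per quarter and that the device already knows the time signature (which is exactly what a Time Signature message would supply). The function name and parameters are mine, purely for illustration:

```python
# Sketch: a filter-sweep position quantized to a span of X bars, driven
# only by a running Timing Clock tick count -- independent of when or
# how many notes are played within those bars.

CLOCKS_PER_QUARTER = 24

def sweep_position(ticks, bars=4, beats_per_bar=4):
    """Return a 0.0-1.0 sweep position that ramps over `bars` bars
    and then wraps around to the start of the next span."""
    span = bars * beats_per_bar * CLOCKS_PER_QUARTER
    return (ticks % span) / span
```

The performer changing X then just means changing the `bars` argument; the clock keeps running and the sweep stays locked to bar lines.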
I’ll put it on my rainy day list. If we get enough poor weather, I might actually get to it!
But the post is also lamenting the fact that the external gear doesn’t support it either. So it is a bit of a chicken-and-egg situation. No hardware support … no or limited software support … no hardware support, etc. etc.
There’s also the general observation that until very recently one could say with high confidence that DAW+external hardware workflows were fading away in favor of DAW+plugins (the latter have much richer access to the “time & tempo state” of the host). However, at this point in time, it is no longer quite so clear that that’s true.
Thinking even more broadly though, Western European time signature concepts are pretty inadequate for representing rhythmic structures found elsewhere in the world, so in a way I am sort of happy that this never really took off. The idea of Indian classical musicians trying to use N/M concepts to enable external hardware to sync up with a tal is a bit depressing.
This is probably where I’m different from most people. I will make objectively ill-advised purchases for single-purpose, screen-less external devices in order to reduce the amount of time I spend in front of my do-everything laptop screen.