What is Ardour's Generative AI policy?

Hello,

I am trying to figure out what Ardour’s stance on Generative AI is.

The response in “Do the tutorials on Ardour’s YouTube channel use generative AI?” was this:

However, I notice that Ardour is now adding an MCP server in version 9.3. From “New web-based control surface and remote collaboration for Ardour”, post 2*:

On the upside, recent Ardour/git has a MCP server, which will be included with the 9.3 release, and it looks it can replace your shim.

This would appear to contradict the answer given in the first post, because an MCP server allows Generative AI to control Ardour, thereby giving it a role. In addition, recent commits include both a README.md and an ISSUES.md, which is unusual for an Ardour control surface (the other control surfaces have no markdown documents), and the latter document (ISSUES.md) appears to (but might not) be written by an AI.

I have some questions, because I am confused:

  1. If Generative AI is supposed to have no role in Ardour, why does the MCP server exist?
  2. Does this mean that LLM-generated code is not allowed, but LLM-based features are?
    • If the answer to #2 is yes, why would a full AI ban not be possible?
    • If the answer to #2 is no, would it be possible to remove the MCP server?

*can’t link due to two links per post limitation

We are still, like the US legal system, evaluating how much human involvement is required to change something from “machine generated” to “worthy of consideration”. The answer is definitely “more than none”, but it’s not clear that it is “any LLM involvement is disqualifying”.

Okay, I see that using LLMs to generate code is banned. Does this mean that LLM-based functionality is not? The policy does not appear to cover LLM-based functionality (the MCP server).

Our policy re: LLMs is based primarily on their impact on the project itself, despite the two main developers’ severe concerns about LLMs in general. At this time, we are not seeking to make a statement about LLMs by banning (or ignoring) every possible interaction between Ardour and an LLM (that could change). We are, however, seeking to avoid the problem of random individuals overwhelming our code review process with LLM-generated submissions, along with the problem of code with subtle but deep misconceptions making its way into our codebase.

So at this time, “LLM-related functionality” is acceptable.

I earnestly and respectfully ask you to consider banning anything to do with “generative” AI outright. Its issues are quite foundational and unlikely to do anything but worsen with time: its environmental impact, its effect on hardware prices, to say nothing of its ties to the war industry.

I’ve been using Ardour for around a year, supporting it financially since I made the switch and have been getting great use out of it. I would have tried to contribute directly as well, if I could code worth a damn.
I’m proud of the fact that my audio workflow is entirely FOSS and AI-free (with the extremely troubling exception of the Linux kernel itself now). I’ve recommended Ardour to others using that post of mine mentioned by the OP, as I was so happy with your answer and knew it would put others at ease as well. It’s kind of funny, because I soon realized how accusatory I came off and regretted my phrasing, especially given your response. I still wish I had chosen better words, but I even used that post to suggest Ardour for the Starlight Network’s No-AI list! I was oddly proud to be the one who got it there, too; it simply meant that much to me.

I appreciate that Robin and yourself (I assume that’s who you mean by “the two main developers”) are concerned about “gen” AI as well, but I urge you to let that bleed into Ardour’s policies for more than just copyright or quality reasons. It’s not unprofessional and would, in fact, be quite a good look for the project. Heck, even if I weren’t the sound nerd I am, I’d still throw a few dollars at the project every month just for that. Backing such projects becomes even more important as open source software grows complacent with these slop machines, so long as they’re invisible to the average onlooker.

Apologies for the lengthy message, though I hope it communicates the importance of this topic. Having switched to Linux from Windows in the first place to escape corporate theft devices, I thought I was retreating to a place predisposed against such things. Now I find myself becoming a staunch advocate for strict anti-AI policies almost out of necessity. If you want an example to work from, I can suggest the one adopted by elementary OS. I’ve been considering installing it on my music creation rig, as I want it to be as close to AI-free as possible.
I’m deeply grateful for the work you have done over the past couple of decades to bring Ardour into the world, and for continuing to grow and maintain it all this time. I’m also grateful for the kindness you continue to show me and others who are learning, no matter how basic the questions may be, and for your continued positive response to user feedback. The fact that you finally added a separate MIDI window for those who demanded it, a Tool-releasing-Fear-Inoculum moment, speaks to that. I know it means you’ll consider this as well, and (I’m not sure how to say this without it sounding like a passive threat, but) I hope you’ll make the right decision.

Thank you for reading,
cloudskater <3

AI seems to be here, and here to stay.
Not being a software developer myself, I cannot judge whether this is a good or a bad thing (beyond the risk that the code quality might be poor and hard to test), or whether it is just the fear that appears every time a new technology is introduced (a very human behaviour, of course).

For me it looks like software development now has to focus more on the specification phase, which I believe is a good thing.

Anyway, I’m very curious to hear the pros and cons of AI-supported software development.

Pro: if you know what you’re doing, AI can be a helpful tool that accelerates the tasks you’d otherwise do by hand. It can free a capable developer from mundane, repetitive, or boring work, leaving time to focus on the more important tasks.

Cons: if you don’t know what you, or the AI, are doing, you have a bullshit generator that’s good at lying, marketing itself as the best thing since sliced bread.

AI is a capable tool with lots of good use cases. Unfortunately, it’s overhyped beyond the moon and forced into a lot of use cases where it really does not belong, or forced to do tasks it is not ready to do.

I’m not into programming myself, but I can at least imagine that getting flooded with “code” is quite a problem.
It’s a problem in pretty much every other topic, too. You get a wall of “info” in the blink of an eye: some of it correct, some half-wrong, and some completely, even laughably, wrong. Before you have it sorted, before you have even started to sort it (assuming you care to), you already have five more walls of info.

(Not much interest in general, but I let an LLM write me a Lua script for Ardour in five seconds. Impressive indeed; a pity it didn’t work at all. And the same happens in forums and chatrooms: you ask something, someone hands you LLM-generated code in the blink of an eye, but it doesn’t work.)

Perhaps that rather social problem, simply getting flooded with half-wrong “info” (or code), will diminish over time. It sure is annoying right now.

Generative AI is a theft machine.
I’m not sure how it handles coding assignments, but when it comes to art, it simply snatches and assembles pieces it finds online. Use it long enough and you’ll get the idea. Anything AI-related where you interact with its server: run like hell. Being completely offline is your best bet :) .
I hate to even talk about it (it makes you look paranoid, etc.), but I caught it snatching something of mine on two occasions. Once it found the classical guitar intro I did for a student movie back in 2005, kept the notes/harmony/tempo, changed it to piano, and put it in an indie horror game trailer. It’s very unlikely to be a coincidence, because the chords/positions aren’t standard ones and the piece was created by ear. The second time it snatched the bass line, rhythm, tempo, and the title of an electronic track, and put it in a pop-sounding tune that ended up on Artlist. Now, this wouldn’t even be weird (the original idea isn’t that unusual) if the title hadn’t been snatched as well :) ; that’s simply too much coincidence.
There is Ola Englund’s video on YouTube where he shows what AI snatched from his tunes.
AI isn’t “creating” anything; that’s an illusion a lot of people don’t have the courage to admit, because it means the most powerful men on earth are outright liars and thieves, which is true, in my opinion. It’s just too much to take in; nobody wants to see that kind of sight all around them.

I certainly understand that AI is a threat in creative fields.

But, keeping the discussion to software generation, the question remains how the generated code should be classified with respect to the copyright restrictions on the training data, assuming that data was legally acquired.

And if it was, and the copyright restrictions allow it, I do not see a problem in reusing code in whatever form, be it by human or artificial intelligence.

Thinking about it, even for any other type of AI application, the real problem is not AI itself but the illegal use of training data. (I think there was a case about exactly this in Germany last year.)

AI should be a tool to support humans, help us, make us better, not replace us, separate us, isolate us…

Unfortunately, it can do both, and it will continue to do both for a while…

I think good use cases exist, at least in the sense of searching for things, with different possible objectives behind the searching.


While Python is not the same as Ardour, I’m slightly reminded of the following Python post and its chain of replies upon replies, which I followed daily for a while when it was active. I wouldn’t recommend reading all of it now, though, as one can probably make better use of their time.

Here are three quotes from that topic, from a CPython core developer, offering a perspective I thought was worth sharing.

“…I want responsible use. AI is a new tool, but brings new potential benefits as well as new potential dangers. …”

“…Because the PSF can only accept contributions that the contributor has the legal right to license, all submitted material must be free of copyright or licensing conflicts, including material produced with the assistance of AI tools. Contributors are responsible for ensuring that their submissions meet this requirement. …”

“…we already have a policy. ‘Keep the status quo exactly as is’ is a third proposal, and wins by default. …”


(For clarity, I’m against building datacenters, though, at least in their current form, with the noise pollution and the consequences for local electricity bills. Not gonna spend time thinking of every reason to despise them.)

That’s why I am very strictly against any “generative” AI, but not against machine learning in general. I could be wrong, and maybe all LLMs are unethical by design, but to the best of my understanding, machine learning can be used to help predict things, like spotting medical issues before they become life-threatening, or at least pointing doctors in a general direction.
However, because all AI currently marketed as such is built on the same foundation, one that disrespects artists and people as a whole, I see little indication that these distinctions matter right now. Once we get out of this bubble, maybe earnest people will be able to develop such technology for good, but it will take a long time if so, and whatever comes out of it will be only a distant cousin to “generative” AI at best, because “generative” AI is built from the ground up to do harm.
