This is for Ardour 8.2.0. I’ve got a session in which the Dragonfly Reverb plugin starts with distortion and pops when the session is loaded. If I manually go through each track and disable/enable it, it behaves normally. This can be tedious. Any ideas? Thanks
Which version of the reverb are you using? There was a similar problem that was fixed a few years ago (in version 3.2.3).
Version 3.2.10 is the latest version.
I also had this problem. Disabling and re-enabling the plugin fixes it, but if you have 20 tracks with Dragonfly plugins it’s a bit annoying to do that on every track. I think Dragonfly is a really great reverb, but it’s a little heavy on CPU, and with the distortion at the beginning I’m working with other reverbs now.
Instead of having separate reverb plugins on single tracks, you should really consider introducing a “reverb bus”, where you have only one reverb plugin instance (plus a high-pass EQ after it), and add aux sends to this bus on all tracks where you want reverb.
This way you can use the send fader to control how much signal gets reverbed per track, the bus fader to control the global amount of reverb added to the mix, and a mute button to double-check whether the reverb really improves the overall mix.
If you do that, make sure to disable the “dry” signal in the reverb plugin.
And you may, if needed, have multiple busses with different types of reverb (spring, room, plate).
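If you have many tracks, the aux sends could in principle be added with a Lua script as well. Here is a rough, untested sketch, assuming a bus named "Reverb Bus" already exists in the session and that route_by_name(), add_aux_send() and main_outs() are exposed in your Ardour version’s Lua bindings (check the bindings documentation before relying on it):
-- Hypothetical sketch: add a post-fader aux send to an existing "Reverb Bus" on every route.
local bus = Session:route_by_name ("Reverb Bus") -- assumes this bus was created beforehand
assert (bus and not bus:isnil (), "Create a bus named 'Reverb Bus' first")
for route in Session:get_routes ():iter () do
    -- skip the reverb bus itself and the master bus; you may want to skip other busses too
    if route:name () ~= bus:name () and route:name () ~= "Master" then
        -- place the send just before the main outs, i.e. roughly post-fader
        route:add_aux_send (bus, route:main_outs ())
    end
end
Afterwards you would still set the reverb’s dry level to 0 on the bus, as described above.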
I have the same problem and I am using 3.2.10.
I would love to have a feature in the plugin DSP window to turn plugins on and off from there.
Maybe a lua script that enables/disables all plugins would help with that.
Does anybody know such a script?
I’m ashamed to say that Lua for me is like Chinese algebra, but I did ask ChatGPT for a hand. Unfortunately it doesn’t work. If there are any of you who know this kind of thing and can debug it, here it is:
ardour {
    ["type"]    = "SessionHook",
    name        = "Reset Dragonfly Reverbs on Session Load",
    license     = "MIT",
    author      = "ChatGPT",
    description = "Disables and re-enables all Dragonfly Reverb plugins on session load"
}

function factory ()
    return function (session, config)
        print("Resetting Dragonfly Reverbs...")
        local proc = ARDOUR.LuaAPI
        for route in session:get_routes():iter() do
            for proc_index = 0, route:processors():size() - 1 do
                local p = route:processor(proc_index)
                if p:isnil() then goto continue end
                local name = p:display_name() or ""
                if name:lower():find("dragonfly") then
                    print("Resetting plugin: " .. name .. " on track: " .. route:name())
                    p:set_enabled(false, nil)
                    p:set_enabled(true, nil)
                end
                ::continue::
            end
        end
        print("Done resetting Dragonfly Reverbs.")
    end
end
Safe to say that ChatGPT has no idea what it’s doing here. It’s using methods that don’t exist and there’s no such thing as a “SessionHook”. The log should have been telling you that this script was failing with an error.
The below works, in that it does find, disable and enable all “dragonfly” reverbs. Whether it fixes the distortion I don’t know.
This must be added in the Action Hooks section of the Script Manager, if you didn’t already know.
ardour {
    ["type"]    = "EditorHook",
    name        = "Reset Dragonfly Reverbs on Session Load",
    license     = "MIT",
    author      = "Ccee",
    description = "Disables and re-enables all Dragonfly Reverb plugins on session load"
}

function signals ()
    -- run this hook when a session is set (i.e. loaded)
    return LuaSignal.Set():add ({
        [LuaSignal.SetSession] = true
    })
end

function factory ()
    return function (signal, ref, ...)
        print("Resetting Dragonfly Reverbs...")
        for route in Session:get_routes():iter() do
            -- walk every processor in this route's chain
            local i = 0
            repeat
                local p = route:nth_processor(i)
                if not p:isnil() then
                    local name = p:display_name() or ""
                    if name:lower():find("dragonfly") then
                        -- toggle the plugin off and on again
                        print("Resetting plugin: " .. name .. " on track: " .. route:name())
                        p:deactivate()
                        p:activate()
                    end
                end
                i = i + 1
            until p:isnil()
        end
        print("Done resetting Dragonfly Reverbs.")
    end
end
ChatGPT is like a teenager: it has some knowledge but errs frequently until it learns, and the detail of the prompt matters a lot; you really have to give it a lot of detail to get a reasonably useful answer. I personally prefer the knowledge and answers of a real person. AI systems are still teenagers, and to train them they are using, in many cases without permission, the work of people, professionals and authors.
In any case, the usual procedure with reverbs is to use busses (one or more) and send to that bus the channels you want.
The main reason is that all plugins consume resources, and reverbs in particular usually consume more, mostly CPU but also RAM.
Placing a reverb on every channel you want reverb on is inefficient (it uses a lot of CPU) and usually gives phase problems or other side effects; if someone on this forum is a sound engineer, they can explain it better.
Those of us who already have some gray hair and have been DJs, live technicians, sound engineers, etc. mostly did not have a lot of hardware resources available; we used to have a reverb or two at most, and that was what we had and dealt with.
On the average decent (not necessarily expensive) mixing desk there are always sends on each channel and return channels, and some desks even have built-in effects.
As already indicated, on those busses the effect is applied at 100%, that is to say the dry signal must be at 0, so only the wet (effected) signal leaves that bus.
woozl.1986, I do not know what your background is, whether you have studied anything related to sound, or whether you are a musician producing and mixing your own material or for film etc., but you are not the only one: every so often someone asks on the forum and we find that they use a reverb per channel instead of using busses.
Applying a reverb on each channel is something we strongly recommend you do not do; use busses for that function, and your CPU will thank you ;)
Since it became popular to make music at home with a PC, a sound card and a microphone, this has been quite common, especially in the past when there were not as many resources as now. That is why those who produce their own material but lack a grounding in mixing, production and the efficient use of virtual or real mixers are advised to look for videos, courses and tutorials from people who have actually studied production and mixing; with the knowledge available on the internet today, in many cases you do not need to spend a single € to learn the basics.
Greetings
Wow, thanks!
I am going to try this tomorrow.
I would use FX busses if it were possible in my work. Unfortunately not all projects can be built that way.
I am interested in why several reverbs should create more phase issues compared to a single one. Can someone explain this?
This may help explain ……
That’s just explaining basic phase cancelling, or did I miss something?
I get that these phase issues are caused by interference. I just don’t understand why the resulting signal should be different when you use multiple reverbs instead of one. Don’t the signals get processed through the same formula? I imagine it working like a delay, where the signal is the same regardless of whether you use one or multiple instances of the delay.
When slightly delayed signals (1 ms to around 40-70 ms of delay) are mixed, they can cancel each other out or cause comb filtering. So it depends on the reverb settings; it may or may not cancel.
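A minimal worked example of that mechanism (my own illustration, not specific to any particular reverb): mixing a signal with a copy of itself delayed by a time t scales each frequency f by 2 * |cos(pi * f * t)|, so some frequencies are boosted and others cancel completely, which is the comb-filter effect:
-- Sketch: gain of (signal + delayed copy) at a few frequencies, for a 0.5 ms delay.
-- gain(f) = 2 * |cos(pi * f * delay)|; 2 means +6 dB, 0 means full cancellation.
local delay = 0.0005 -- seconds
for _, f in ipairs ({ 250, 500, 1000, 2000 }) do
    local gain = 2 * math.abs (math.cos (math.pi * f * delay))
    print (string.format ("%5d Hz -> gain %.2f", f, gain))
end
-- 1000 Hz comes out as 0.00 (the delayed copy is 180° out of phase and cancels),
-- 2000 Hz as 2.00 (back in phase, +6 dB); a reverb adds many such delayed copies at once.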
This one will also enable all previously disabled dragonfly plugins, right?
I am not sure if this is what dave owl meant; I thought it was something less obvious. It is definitely right that different reverbs will give a weird room impression, especially when delaying them…
Hi, you can use reverb on both busses and channels. If you want to use a reverb only on one channel and not the others, that is a creative decision, just like applying delay or distortion to one channel and not the others. What I commented on before, as a reference, is that if you want to apply reverb to all tracks, the optimal way to do it is with busses, not putting 20 reverbs on 20 channels, for the reasons I explained earlier.
If you place a reverb on a bus, DELETE the reverbs you have on the channels (except, as a creative decision, on a specific channel if you wish), and create sends from the channels to the corresponding bus.
The sends can be pre- or post-fader, depending on what we want to achieve: if the send is pre-fader you adjust its level to taste; if it is post-fader, the amount sent varies with the setting of the channel fader.
I’m not a sound engineer; I’ve been a DJ and sound technician, and surely someone will explain it better than me, but I’ll try…
When you have to mix a certain number of channels, few or many, what matters is the range and intensity of the frequencies each instrument generates on each track. That is why it is advisable to EQ the tracks, looking at the dominant frequencies and at the ones the instrument does not generate, so that no frequencies leave that track other than the ones that make the particular instrument sound good.
For example, a bass and a bass drum both generate low and mid frequencies, and you have to adjust them so they do not “disturb” each other (with sidechaining, for example). But equalizing both is not simply a matter of applying a low-pass filter, because for a bass or a bass drum to be heard among the other instruments it also needs some of the higher frequencies those instruments generate (to a lesser extent), especially considering the devices on which music is usually heard today: small phones, tablets and speakers that cannot reproduce low frequencies.
Each instrument has frequency ranges that are more intense and others that are less so, and that is what we control with equalizers. At the link below you can download a plugin that includes some interesting functions: Releases · ZL-Audio/ZLEqualizer · GitHub
These dominant (or not) frequencies of the different instruments also affect reverbs, because we use reverbs to simulate spaces. If you have been to a live concert you will have noticed that it does not always sound good in every part of the venue; normally in some spots you hear a lot of bass and in others hardly any. This is caused by many possible variables: the sound systems used, their position and height, the frequency range of the speakers, etc. Sound systems at concerts can be really complex nowadays (in these cases it is not the use of reverb that causes the problems; it is only one of many possible factors).
But leaving concerts aside: in a DAW, in this case Ardour of course ;)), when you start adding signals, each track with its effects (the treatment we have given it), you end up with an amalgam of sound that has to be controlled, so that everything is heard but nothing is excessive, no instrument cancels out another, and no frequencies are uncontrollably amplified.
It is recommended that the reverbs you use have adjustable filters, as Dragonfly Reverb does, to control to a certain extent the frequencies coming out of the reverb and those reaching it from the sends. Reverb creates a sense of space, but if you overdo it and bathe everything in it you can blur the mix, especially if there is an excess of mid frequencies.
To those who do not have basic knowledge of, say, mixing and production, I highly recommend searching YouTube and other platforms, and of course websites directly (it is not all about watching videos); there are free and paid resources for learning the skills needed to mix well, with sense and sensitivity.
There are a number of plugins that can help a lot in detecting phase problems and other issues, and you may know their creator ;)) x42 Meter Collection
I will leave this link, in which known problems are discussed and how Dave Rat’s company solves them. Although they talk about the specific system they use, the video is interesting for the concepts it explains about phase cancellation and other problems; I recommend watching it, even though it is not specifically about reverb.
Live Sound: How To Make Every Seat
Forgive such a long text; I wrote it, not an AI, and I am sure I have got something wrong in the writing, especially considering that I am dyslexic… Greetings.
Yes, it will. To only reset the plugins that are currently enabled, change this line…
if name:lower():find("dragonfly") then
…to…
if name:lower():find("dragonfly") and p:active() then
Unfortunately it does not help with the problem. Maybe it is running too early in the startup process?
The log says that all the reverbs were reset, so I guess it must have done what it’s supposed to.