ARDOUR Live Interface Touchscreen and Hardware Controls

Example: Open Stage Control UI/UX for ARDOUR

My Use-Case
Near-field computer vision augmented audio analysis for predictive note onset, or, less technically, motion-capture computer vision augmentation for existing polyphonic guitar and bass MIDI controllers.

The computer vision system described is entirely hidden and electroacoustic: a phased-array ultrasound sensor that tracks hand and finger geometry to create a musical context for standard MIDI pickup and hardware output, in much the same way computer vision is commonly used with a camera for reading sign language.

In my implementation the instrument learns the unique expression of the individual musician, responding with increasing speed and fidelity.

I have two versions, a Linux guitar and a Windows IoT (real-time OS) guitar, which makes OSC/ARDOUR effectively the instrument's user interface.
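
To make the OSC-as-interface idea concrete, here is a minimal sketch of driving ARDOUR over OSC from Python. It assumes ARDOUR's OSC control surface is enabled on its default UDP port (3819) and that the python-osc package is installed; the addresses are from ARDOUR's documented OSC API.

```python
# Minimal sketch: control ARDOUR over OSC from Python.
# Assumes ARDOUR's OSC control surface is enabled (default UDP port 3819)
# and python-osc is installed (pip install python-osc).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 3819)  # ARDOUR's default OSC port

# Start the transport.
client.send_message("/transport_play", [])

# Set the gain of strip 1 to -6 dB.
client.send_message("/strip/gain", [1, -6.0])

# Mute strip 2 (1 = mute, 0 = unmute).
client.send_message("/strip/mute", [2, 1])
```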

The knobs and switches on my partscaster test mules are absolute-position encoder actuators. In default mode they provide presets for pickup volume and tone: emulation of different capacitor types for tone, individual pickup volume, and master volume.
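
As an illustration of that default mode, here is a hypothetical sketch of mapping an absolute encoder position to a preset slot. The preset table, capacitor values, and 10-bit encoder range are illustrative assumptions, not the actual firmware.

```python
# Hypothetical sketch of the default-mode preset logic: an absolute-position
# encoder selects a tone preset that emulates a capacitor value and sets
# per-pickup and master volumes. All names and values are illustrative.

TONE_PRESETS = [
    # (label, emulated capacitor in farads, neck volume, bridge volume, master)
    ("vintage", 0.047e-6, 0.8, 0.6, 0.9),
    ("modern",  0.022e-6, 0.7, 0.8, 0.9),
    ("bright",  0.010e-6, 0.9, 0.9, 1.0),
]

def preset_for(encoder_value: int, encoder_max: int = 1023):
    """Map an absolute encoder reading onto one of the preset slots."""
    slot = min(encoder_value * len(TONE_PRESETS) // (encoder_max + 1),
               len(TONE_PRESETS) - 1)
    return TONE_PRESETS[slot]

label, cap, neck, bridge, master = preset_for(512)
print(f"{label}: C={cap:.0e} F, neck={neck}, bridge={bridge}, master={master}")
```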

My proposal is currently slated to be an OpenCV + PyTorch-based system, inspired in part by the GuitarML developer/user community launched by Keith Bloemer and friends.
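
For a sense of the shape of such a system, here is a minimal sketch of an OpenCV capture loop feeding a PyTorch model that outputs an onset probability. Everything here is a placeholder: the network is untrained, the 64x64 input size is arbitrary, and a webcam stands in for the ultrasound front end.

```python
# Minimal sketch of an OpenCV + PyTorch loop: grab frames, run a small
# network, and emit an onset prediction. Model, weights, and input size
# are placeholders; VideoCapture(0) stands in for the actual sensor.
import cv2
import torch
import torch.nn as nn

class OnsetNet(nn.Module):
    """Placeholder classifier: one frame in, onset probability out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.LazyLinear(1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

model = OnsetNet().eval()
cap = cv2.VideoCapture(0)  # stand-in for the ultrasound front end

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (64, 64))
    x = torch.from_numpy(small).float().div(255).view(1, 1, 64, 64)
    with torch.no_grad():
        p_onset = model(x).item()  # untrained here, so the value is noise
    if p_onset > 0.9:
        print("predicted note onset")  # in practice: send OSC/MIDI ahead of the pick

cap.release()
```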
