Seamlessly automate your audio-visual setup! This open-source framework uses the Open Sound Control protocol to integrate audio mixer consoles, OBS, PTZ cameras, and more. Perfect for live production enthusiasts, streamers, and tech tinkerers.
I originally made it to meet our own needs, then open-sourced it: we needed to move a PTZ cam based on the stage/pulpit mute states on our X32, but it's capable of way more. Let me know what you guys think!
Cheers!
> We needed to move a PTZ cam based on the stage/pulpit mute states on our X32, but it is capable for way more.
PTZ - pan/tilt/zoom camera, that much I understood. The rest? Uh… can I get an ELI5 please?
Even though I’m clearly not in the target demographic, I’m eager to learn more.
Edit: ok, clicked through to GitHub, now I (kinda) got what it’s for :)
Starred!
I’ve been playing with hooking up a MIDI controller to my OBSBot Tail Air PTZ camera and OBS.
The config, filters, and triggers look similar to my prototypes.
I’ve been wondering if there’s any sort of prior art or standards here from other domains like lighting consoles or workflow systems.
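To make the "config + filters + triggers" idea concrete, here's a rough sketch of how my prototypes work: a config table maps MIDI CC numbers to camera/OBS controls, a filter step drops unmapped events, and a mapping step rescales the 0–127 CC range. All the addresses and ranges here are made-up placeholders, not anything from a real device:

```python
# Hypothetical config: which MIDI CC number drives which control, and its range.
TRIGGERS = {
    1: ("/camera/pan",  (-170.0, 170.0)),  # mod wheel -> pan angle (degrees)
    2: ("/camera/tilt", (-30.0, 90.0)),    # CC2 -> tilt angle (degrees)
    7: ("/obs/volume",  (0.0, 1.0)),       # channel volume -> fader position
}

def cc_to_command(cc_number: int, cc_value: int):
    """Filter + map one MIDI CC event (value 0..127) to an (address, float) pair.

    Returns None when no trigger is configured for that CC (the 'filter' step).
    """
    trigger = TRIGGERS.get(cc_number)
    if trigger is None:
        return None
    address, (lo, hi) = trigger
    # Rescale MIDI's 0..127 into the target control's range.
    scaled = lo + (hi - lo) * (cc_value / 127.0)
    return address, round(scaled, 3)
```

The resulting (address, value) pairs can then be shipped to whatever the camera or OBS actually listens on; the interesting design question is whether the trigger table lives in code or in a config file.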
I did an explainer video on OSC covering the concepts. I do think it's the easiest network protocol for just hooking different things together: https://www.youtube.com/watch?v=0uOR2idKvrM
Could you have some sort of tee, so instead of dedicating a channel you multiplexed a second signal (outside the audible range) onto the same channel? Or perhaps send a really low frequency signal over the cable sheath?
Just spitballing.
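Spitballing along with you: the out-of-band-tone half of this is doable in a few lines. Here's a sketch that mixes a quiet 19 kHz pilot tone (above most program material at a 48 kHz sample rate, though borderline for young ears) into an audio block, and detects it on the other end with a Goertzel filter. The rates and amplitude are illustrative assumptions:

```python
import math

SAMPLE_RATE = 48_000
PILOT_HZ = 19_000  # assumed: above typical program content, below Nyquist

def add_pilot(samples, amplitude=0.05):
    """Mix a quiet pilot tone into an audio block (floats in -1..1)."""
    return [
        s + amplitude * math.sin(2 * math.pi * PILOT_HZ * n / SAMPLE_RATE)
        for n, s in enumerate(samples)
    ]

def pilot_power(samples):
    """Goertzel filter: energy at the pilot frequency, to detect the flag."""
    w = 2 * math.pi * PILOT_HZ / SAMPLE_RATE
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
```

Thresholding `pilot_power` per block would give you a one-bit side channel on an existing audio run; whether the mixer's converters and any compression in the path preserve 19 kHz is the real question.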
Very nice! The ability to get novel interactions out of connected devices is something I think is just starting to bloom.
Unrelated, I have an old Axiom midi controller that I'd like to reprogram to use with GarageBand. I'm not sure where to start, but I'm thinking of using Go or Rust. Do you have any pointers on how to get started?
OSC (Open Sound Control) is just awesome. It's basically a lightweight protocol on top of UDP packets. It's not hard to roll your own implementation if there isn't one for your platform. It's lacking a lot of features you'd need for a scalable system, but when you just need a few systems to send realtime messages to each other, it's tough to beat.
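To show how little there is to it: an OSC message is just a null-padded address string, a type tag string, and big-endian arguments, fired off in a UDP datagram. A minimal encoder, roughly following the OSC 1.0 spec (the `/ch/01/mix/fader` address below is only an X32-style example):

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build a binary OSC message: padded address, type tags, big-endian args."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode("ascii"))
        else:
            raise TypeError(f"unsupported OSC argument type: {type(a)}")
    return _pad(address.encode("ascii")) + _pad(tags.encode("ascii")) + payload

def send_osc(host: str, port: int, address: str, *args) -> None:
    """Fire-and-forget over UDP, which is how most OSC gear expects it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(osc_message(address, *args), (host, port))
```

That's the whole protocol for simple cases; no handshake, no framing, no delivery guarantees, which is exactly why it's both easy and unsuitable for anything that needs reliability.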
I've used it a lot for the original designed use-case (sending parameter updates between controllers and music synths), but also a bunch of other things (sending tracking information from a python computer vision script to a Unity scene).
The amount of neat things my synth friends do with control voltage, including controlling lighting, has me dreaming of a world where knobs and levers and UIs generally are way more customizable than they are now.