I've been working on a free, open-source macOS app for just that - https://nottawa.app Hoping to release in the next couple of months!
The UI has been greatly improved since I recorded the original demo on the site - the real thing is MUCH better now. Same base idea: chain together shaders, videos, or webcams, then drive their parameters via an audio signal, BPM, oscillator, MIDI board, or manual sliders.
The beta link on the site isn't really worth trying yet - if you're interested in getting on the TestFlight just shoot me a message at joe@nottawa.app. Would love some HN feedback :)
The code isn’t anything to write home about, it’s in C++ leveraging OpenFrameworks and OpenGL. I’m an iOS and macOS dev, but after the initial release I’ll get started on porting to Windows and Linux. OF generally works well multi-platform so I’m hoping it won’t be too hairy.
I’m specifically targeting the non-technical artist/creator market, ideally with optional macOS App Store distribution. I’ve been involved in the live visuals scene in NYC a bit, and something I commonly heard was that musicians and DJs wanted visual accompaniment that just works out of the box. TouchDesigner etc. are incredibly powerful, but generally out of reach for non-technical folks.
I’ve contracted a great artist from Upwork who’s been making presets that will be included. There should ideally be as little friction as possible for a user to go from first launch to live, audio-reactive visuals.