TidalConduit is a live coding editor suite for sound production and performance. It is built around SuperCollider and TidalCycles, extending both with custom MIDI, sound and video engines. It enables the creation of dynamic, interlocking patterns that can be interpreted as sound or visuals in real time, and any external data source can be used to influence these patterns.
TidalConduit is a native macOS application, written entirely in SwiftUI and Metal.
main interface, with external visualizer window
TidalConduit v1.0: 10/05/2024
visualizer is written in 100% SwiftUI
current features
- Automatic startup and hosting of GHCi and SuperCollider subprocesses
- Automatic multi-channel audio routing
- Quick actions that inject SuperCollider code into the audio server, e.g., to re-route effects
- Autocompletion and tagging system for snippets, library functions, sample names and more
- Sample browser with musical key tagging and harmony recommendation
- AudioKit output stage with bus effects
- Quick dispatch to external multi-channel recording software with routing pre-configured
- Internal visualizer module that allows injecting, e.g., WebGL into an embedded WebView
- External visualizer module that renders the code on top of programmatic visuals (SwiftUI Canvas API) or GIFs fetched from an are.na channel
- Procedural composition tools, derived from an extension of my project amsel
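The first two features above boil down to hosting interpreter processes and feeding them code over stdin. The sketch below illustrates that idea in plain Foundation; the `InterpreterHost` type and its method names are illustrative assumptions, not TidalConduit's actual API, and `/bin/cat` stands in for `sclang`/`ghci` so the example runs anywhere.

```swift
import Foundation

/// Minimal sketch of hosting an interpreter subprocess (e.g. sclang or GHCi)
/// and injecting code into it over stdin. Names are hypothetical.
final class InterpreterHost {
    private let process = Process()
    private let stdinPipe = Pipe()
    private let stdoutPipe = Pipe()

    init(executable: String, arguments: [String] = []) {
        process.executableURL = URL(fileURLWithPath: executable)
        process.arguments = arguments
        process.standardInput = stdinPipe
        process.standardOutput = stdoutPipe
    }

    func start() throws { try process.run() }

    /// Send one line of code to the interpreter's stdin (a "quick action").
    func inject(_ code: String) {
        stdinPipe.fileHandleForWriting.write(Data((code + "\n").utf8))
    }

    /// Close stdin and collect everything the process wrote to stdout.
    func finish() -> String {
        stdinPipe.fileHandleForWriting.closeFile()
        let data = stdoutPipe.fileHandleForReading.readDataToEndOfFile()
        process.waitUntilExit()
        return String(decoding: data, as: UTF8.self)
    }
}

// Demo with /bin/cat as a stand-in interpreter: it echoes injected code back.
let host = InterpreterHost(executable: "/bin/cat")
try host.start()
host.inject("s.reboot")              // e.g. a SuperCollider quick action
print(host.finish(), terminator: "") // cat echoes the injected line
```

A real host would keep stdin open for the whole session and stream stdout asynchronously; the one-shot read here just keeps the sketch short.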
Currently, TidalConduit is very opinionated and tailored to my own way of working with sound, visuals and TidalCycles. In the future, if there is interest, I would like to make it more customizable and configurable so that others can use it as well.
Algorithmic pattern creation and analysis is an extremely interesting field. Live coding brings an intimacy to the process: the direct responses such a system gives allow one to experience rhythms, pulses and n-dimensional patterns on a more emotional and primal level.
are.na update channel
Follow updates on this are.na channel: