‘Constellation’ is a generative/procedural MIDI composition application that provides near-realtime visual feedback on both current and upcoming events. The aim is that future optimisations will allow the app to run reliably enough to function as a live-coding music application, enabling both improvised composition and live performance alongside accompanying musicians.
Recently, there has been a resurgence of interest in generative music through the development of smartphone apps. iOS apps such as Björk’s Biophilia, Kraftwerk’s Kling Klang and Brian Eno’s Bloom have pioneered this trend.
While these apps are fascinating and fun to use, few are of significant use to composers and musicians, either imparting too much of their own character and sonic imprint on the resultant music or not offering enough control over the sounds, pitches or chord structures used. They are more app-as-album than creative tools in their own right.
‘Constellation’ is an attempt to build a generative and procedural music application which functions as both a useful composition tool and live performance environment with visual feedback.
The application is essentially a modular composition tool, which allows the creation of multiple interlinked nodes. Each node is assigned one of a number of predefined commands, along with its parameters; the node processes incoming data and passes the result on to any connected nodes. This data is then used to render MIDI data for playback and to drive the visual feedback. As changes are made to the commands and parameters, the MIDI data and visuals update, and as the composition plays, the visuals indicate which notes are playing.
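The node-and-connection model described above can be sketched in Java, the language the app is built in. This is a minimal illustration under assumed names: the actual commands, class names and data types in ‘Constellation’ are not documented here, so `Node`, `transpose` and `repeat` are purely hypothetical examples of a command-bearing node forwarding pitch data downstream.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a node graph: each node carries a command and a
// parameter, transforms incoming pitch data, and forwards the result to
// any connected nodes. All names are illustrative, not from the real app.
public class NodeGraphSketch {

    static class Node {
        final String command;
        final int parameter;
        final List<Node> outputs = new ArrayList<>();

        Node(String command, int parameter) {
            this.command = command;
            this.parameter = parameter;
        }

        // Link this node's output to another node's input.
        void connect(Node target) {
            outputs.add(target);
        }

        // Apply this node's command to the incoming MIDI pitches,
        // then pass the result on to every connected node in turn.
        List<Integer> process(List<Integer> pitches) {
            List<Integer> result = new ArrayList<>();
            for (int p : pitches) {
                switch (command) {
                    case "transpose":           // shift each pitch by the parameter
                        result.add(p + parameter);
                        break;
                    case "repeat":              // emit each pitch 'parameter' times
                        for (int i = 0; i < parameter; i++) result.add(p);
                        break;
                    default:                    // unknown command: pass through
                        result.add(p);
                }
            }
            for (Node next : outputs) {
                result = next.process(result);
            }
            return result;
        }
    }

    public static void main(String[] args) {
        Node transpose = new Node("transpose", 7); // up a perfect fifth
        Node repeat = new Node("repeat", 2);       // double each note
        transpose.connect(repeat);
        // C major triad in, transposed then doubled on the way out.
        System.out.println(transpose.process(List.of(60, 64, 67)));
        // prints [67, 67, 71, 71, 74, 74]
    }
}
```

Re-running `process` whenever a command or parameter changes is one simple way the downstream MIDI data and visuals could stay in sync with edits, as the application description suggests.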
It can be used:
- To generate new melodies, rhythms and song structures according to a user-created system.
- To experiment with new structures, scales, rhythms, melodic contours and compositions within a given framework quickly and easily.
- To generate harmonies and melodies which fit a given chord pattern or as an aid to further develop existing ideas.
- As a visualisation and learning tool for music theory study and analysis.
Electronic music often struggles to maintain interest in a live setting, relying on live visuals which may not have a significant connection to the actual music. There is not necessarily anything wrong with this approach, but my aim with this application is to make the arrangement space work equally well as an accurate visual representation of the composition and as a source of interesting, dynamic and meaningful visuals for the audience. The audience sees exactly what the performer sees, removing the usual barrier of the laptop screen.
Lo-fi green-on-black star-charts, vector graphics and DOS-style commands evoke the 80s sci-fi computer display aesthetics found in films such as RoboCop, Blade Runner, WarGames, Tron, Videodrome and The Fly. Given the themes the field of generative music touches on - the roles of humanity and technology in creativity, AI, determinism, machine learning and neural networks - this seems wholly appropriate.
Currently, the application is in beta. It has been developed in Processing/Java to ensure fast development and relative ease of porting. The aim is eventually to port the app to the iOS platform, where it would greatly benefit from touch-screen input and wireless MIDI via CoreMIDI.
More information about the application and a downloadable beta is available from:
Examples of music composed using this application can be found among other work submitted here and at: