Coda is a generative musical instrument and MIDI system coded in JavaScript, mixing music theory, mathematics, and design.


THE PROJECT

I set out to create a system that generates dynamic musical compositions from sets of rules and user input, driven by my interest in applied mathematics, music, generative art, and creative coding.

The first idea was to create a mini-game with a dynamic, responsive soundtrack that would evolve with the player's actions, but it soon became evident that the generative sound system deserved all the attention, and it became the main focus of the project.

 

GENERATED PIECES

These two pieces were generated in real time by the engine, using different synth stacks for output and different embedded logic.

 

EARLY VERSIONS

An early version featuring characters and random sound

CODA key picker

PRECURSORS & HISTORY

I have always had a close relationship with music: from playing with toy electric pianos as a kid, to studying classical guitar, to playing bass guitar in a band for seven years. As my musical taste widened, I became interested in different approaches to music creation and expression, and discovered a fascinating side of it through the work of artists such as Brian Eno and Björk, who developed generative processes and non-standard instruments as a basis for some of their work. Another clear influence for this project was the interactive pieces and video games created by Toshio Iwai.

Pythagoras' harmonic theory laid the mathematical foundation of Western music theory, which J.S. Bach later used to "generate" some of his work by applying arbitrary rules on top of it. This mathematical foundation, coupled with rule sets, makes music generation a perfect fit for computational systems, as Ada Lovelace anticipated in her notes on the Analytical Engine:

"Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."

ADA LOVELACE

TECHNOLOGY

One of the most challenging and interesting parts of this project was finding the right tools to bring the idea to life, combining its different pieces (image display, music generation routines, music theory, hardware input and output) into a coherent and functional system, all of it in JavaScript. For this, I made extensive use of different open-source libraries and other programs:

  • Heartbeat.js became the backbone of the project. This library enables experimental MIDI input and output routines across different browsers. It also provided CODA with a main clock (or metronome) that drives and synchronizes the rest of the elements (see the sketch after this list).
  • Teoria.js is a library that calculates scale degrees, chords and scales. It made it easy to get the right notes to play over the dynamic chord progressions.
  • Pixi.js is a 2D graphics library with WebGL support. I wanted to experiment with different drawing routines and create complex graphic representations without sacrificing performance.
  • Ableton Live acted as the main software synth stack and mixing desk, translating the MIDI messages into actual live sound.
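
As a rough illustration of how these pieces fit together (a hedged sketch, written against the plain Web MIDI API rather than Heartbeat.js's actual interface), the core loop boils down to a steady clock that decides which notes to play and ships them out as MIDI messages for Ableton Live to render:

    // Hedged sketch of the clock + MIDI wiring, using the plain Web MIDI API
    // instead of Heartbeat.js. Ableton Live renders whatever arrives here.
    navigator.requestMIDIAccess().then((midi) => {
      const output = midi.outputs.values().next().value;  // first available MIDI output
      const bpm = 110;
      const tickMs = (60 / bpm) * 1000;                    // one beat per tick

      setInterval(() => {
        const note = 60 + Math.floor(Math.random() * 12);  // placeholder note choice
        output.send([0x90, note, 100]);                    // note on, channel 1
        setTimeout(() => output.send([0x80, note, 0]), tickMs * 0.9);  // note off
      }, tickMs);
    });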

Aiming to make the experience as straightforward as possible and to minimize the amount of input involved (still with the video game idea in mind), I decided to use a simple, non-standard controller as the input device. Inspired by the classic PONG game, I settled on a PowerMate knob, which acts as a circular slider.
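
In code terms the knob can be reduced to a single wrapping value; the sketch below is hypothetical (how the PowerMate's events reach the browser is not shown) but captures the "circular slider" idea:

    // Hypothetical handler: fold the knob's signed rotation deltas into a
    // circular position in [0, 1) that the instruments can read.
    let position = 0;

    function onKnobTurn(delta) {                         // delta: signed ticks per event
      position = ((position + delta / 96) % 1 + 1) % 1;  // assuming 96 ticks per full turn
    }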


APPROACH

The system works by abstracting music into relationships between notes. First of all, it needs a main key (this first version doesn't introduce key changes) from which it picks the proper scale degrees:
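
For example, using teoria (calls as shown in its documentation; treat the snippet as illustrative), the seven diatonic degrees of a key can be derived directly from the tonic:

    // Illustrative: derive the diatonic degrees of D major with teoria.
    const key   = teoria.note('d4');
    const scale = key.scale('major');

    console.log(scale.simple());                    // [ 'd', 'e', 'f#', 'g', 'a', 'b', 'c#' ]
    console.log(scale.notes().map(n => n.midi()));  // the same degrees as MIDI numbers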


Those degrees are used to build chord progressions, from which the program builds the structure of the song, establishing it beforehand or in real time following state logic and chance:
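
A minimal version of that idea (not CODA's actual rule set) is a small state machine over the degrees, where each degree lists the degrees it is allowed to move to and chance picks among them:

    // Illustrative state machine over scale degrees (not CODA's actual rules):
    // each degree lists the degrees it may move to; chance picks the next one.
    const transitions = {
      1: [4, 5, 6, 2],   // I    -> IV, V, vi, ii
      2: [5, 7],         // ii   -> V, vii°
      4: [5, 1, 2],      // IV   -> V, I, ii
      5: [1, 6],         // V    -> I, vi
      6: [2, 4],         // vi   -> ii, IV
      7: [1],            // vii° -> I
    };

    function nextDegree(current) {
      const options = transitions[current] || [1];
      return options[Math.floor(Math.random() * options.length)];
    }

    // Build an eight-chord progression starting on the tonic.
    const progression = [1];
    while (progression.length < 8) {
      progression.push(nextDegree(progression[progression.length - 1]));
    }
    console.log(progression);   // e.g. [ 1, 6, 2, 5, 1, 4, 5, 1 ]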

The program's logic, coupled with the user input, drives the behavior of the different instruments, picking the appropriate notes to play over the current chord:
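
A hedged sketch of that note-picking step: chord tones are favoured, while the user input nudges the melody out into the wider scale (names and weights here are illustrative, not CODA's):

    // Illustrative note picker: favour chord tones, let the knob (0..1)
    // push the melody out into the wider scale.
    function pickNote(chordTones, scaleTones, knob) {
      const useChordTone = Math.random() > knob * 0.5;  // higher knob = more excursions
      const pool = useChordTone ? chordTones : scaleTones;
      return pool[Math.floor(Math.random() * pool.length)];
    }

    // e.g. over a D major chord, inside a D major key (MIDI note numbers):
    const next = pickNote([62, 66, 69], [62, 64, 66, 67, 69, 71, 73], 0.3);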

Different instruments and lines can each follow their own logic, the way a band, a string quartet or an orchestra would. Different musician "personalities" and user input can affect the lines, creating unique variations in melody and rhythm:
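
One way to picture those "personalities" (parameter names are mine, not CODA's) is as a small set of tendencies each line consults on every clock tick:

    // Illustrative "musician": density decides how often it plays,
    // jumpiness how far it is willing to leap between notes.
    function createMusician({ density, jumpiness }) {
      let last = 60;
      return {
        onTick(chordTones) {
          if (Math.random() > density) return null;      // rest this tick
          const target = last + Math.round((Math.random() - 0.5) * 12 * jumpiness);
          last = chordTones.reduce((best, t) =>          // snap to nearest chord tone
            Math.abs(t - target) < Math.abs(best - target) ? t : best);
          return last;
        },
      };
    }

    const bass = createMusician({ density: 0.4, jumpiness: 0.2 });
    const lead = createMusician({ density: 0.9, jumpiness: 0.8 });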


POSSIBILITIES

The MIDI nature of the project means that sound can not only be rendered by a software synthesizer like Ableton Live, but also be routed directly into external MIDI synths and instruments, making it possible to get analog input and output.

One of the most interesting aspects of the project is being able to gather this MIDI data and apply it to visualizations for live performances or, as originally intended, connect it to a game or any other system to generate relevant sound and music on the fly, synchronized to the action.

I also see this project as a first step towards building a set of non-standard musical instruments, and I am considering recording an album based on them.

CODA was first introduced at "The IT Show" exhibition, organized by School Of Ma, in December 2014.