Fractal Music, Interactions of Sound, Light, and Movement

[Image: fractal source image]

This page provides links to eight examples of sound sequences created with Dick's Fractal Music Workbench in 1988 and 1989. The melodic and rhythmic content of the recordings was generated entirely by the Fractal Music Workbench program from selected parametric inputs and source images. The base tempi and the synthesizer and sampler voicings used to realize the songs were chosen by hand for each recording.

NB: Most of the links to sound files are non-functional for now. I'm working on it!

Fractal Music Experiments 1986 - 1989

In creating the program that produced these compositions, it was important to ensure that the mechanisms mapping the content of visual displays into sonic material be inherently musical at their core. The metaphor that inspired these formulas was the image of Theseus wandering the Minotaur's maze. The text of the article published in the fractal-arts journal Amygdala, which explains in detail the algorithms used to generate these compositions, is reproduced below.

Two methods for deriving musical parameters from chaotic dynamics

  1. Follow a border.
    Display a plot of a Mandelbrot set blow-up or a Julia set. Black pixels should be used exclusively to represent points inside the Mandelbrot set, or internal to a connected Julia set: those points which never "dwell" (escape). This ensures there will always be a well-defined border on the screen between the lit and the unlit areas. Select a point on this border as the starting point for the note finder, input a pace or metronome value, and seed the tune with a first note. The finder proceeds to wander the screen, staying on the black points and always keeping a lit point immediately to his left, moving four steps per metronome count. For each step the finder takes, an event occurs. The event's nature depends on the relationship between the last point visited by the finder and the current one, according to these rules: if the current step moved counterclockwise relative to the origin, the event is a silence; otherwise, the event is a sounding interval. For sounds, if the current step moved the finder away from the origin, the interval is a downward one, making a note lower than the last sounded; if the finder moved toward the origin, the interval moves upward. Intervals are selected from an array of 15 equal-tempered possibilities, not necessarily all different, using the color value of the nearest lit point as an index. (A code sketch of this walk follows the list.)
  2. Track some trajectories.
    Begin with a group of points arranged in a circle anywhere on the complex plane. Using each of these points as a C and initial Z, iterate Z -> Z^2 + C until all the points have dwelled or cycled, or until memory runs out, saving the points of the trajectories and keeping track of the maximum and minimum modulus (distance from the origin) for each trajectory. Assign to each trajectory one of four voice functions: Sound/Silence, Pitch, Duration, or Volume. Sort them so that longer trajectories (usually those which never dwelled or cycled) are used for Pitch, and shorter ones for Sound/Silence. Map the range of modulus values for each trajectory into an array of values appropriate to the trajectory's assigned voice function, e.g., { 0, 1 } for Sound/Silence or { G, A, B, C, D, E, F#, G } for Pitch. Finally, traverse all the trajectories again simultaneously, evaluating their voice functions. Each iteration produces an event of silence or sound for some duration at some pitch and volume. At the end of each of the shorter trajectories, loop back to its beginning; at the end of the longest trajectory, end the piece. (A sketch of this method follows the list.)
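
To make the border-following rules concrete, here is a minimal sketch in Python, written for this page rather than taken from the Fractal Music Workbench itself. The escape-time grid, the 15-entry interval table, and the MIDI-style note numbers are illustrative assumptions; only the left-hand wall-following walk, the counterclockwise-silence rule, and the toward/away interval direction follow the description above.

import cmath

# Headings in counterclockwise order: east, north, west, south
# (screen y grows downward, so "north" is -y).
DIRS = [(1, 0), (0, -1), (-1, 0), (0, 1)]

def pixel_to_complex(x, y, width, height, center, scale):
    # Map a screen pixel to its point on the complex plane.
    return center + complex((x - width / 2) * scale, (height / 2 - y) * scale)

def mandelbrot_grid(width, height, center, scale, max_iter=60):
    # Escape-time image: 0 means "inside the set" (a black pixel); otherwise
    # the value is the iteration count at escape, used below as the pixel's color.
    grid = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            c = pixel_to_complex(x, y, width, height, center, scale)
            z = 0j
            for i in range(1, max_iter + 1):
                z = z * z + c
                if abs(z) > 2.0:
                    grid[y][x] = i
                    break
    return grid

def border_notes(grid, start, width, height, center, scale, seed_note=60, steps=64):
    # Walk the border keeping a lit pixel on the finder's left and return a list
    # of events: None for a silence, otherwise a MIDI-style note number.
    intervals = [1, 2, 2, 3, 3, 4, 5, 5, 7, 7, 8, 9, 10, 12, 12]  # assumed 15-entry table
    x, y = start
    heading = 0                      # index into DIRS
    note = seed_note
    events = []
    z_prev = pixel_to_complex(x, y, width, height, center, scale)
    for _ in range(steps):
        # Left-hand rule: prefer turning left onto a black pixel, else go
        # straight, else turn right, else turn back.
        for turn in (1, 0, -1, 2):
            h = (heading + turn) % 4
            nx, ny = x + DIRS[h][0], y + DIRS[h][1]
            if 0 <= nx < width and 0 <= ny < height and grid[ny][nx] == 0:
                heading, x, y = h, nx, ny
                break
        else:
            break                    # the finder is boxed in
        z = pixel_to_complex(x, y, width, height, center, scale)
        # A counterclockwise step (argument increased) is a silence.
        if cmath.phase(z * z_prev.conjugate()) > 0:
            events.append(None)
        else:
            # The lit pixel on the finder's left stands in for "the nearest lit
            # point"; its color indexes the interval table.
            lh = (heading + 1) % 4
            lx, ly = x + DIRS[lh][0], y + DIRS[lh][1]
            color = grid[ly][lx] if 0 <= lx < width and 0 <= ly < height else 1
            step = intervals[color % len(intervals)]
            # Away from the origin: interval goes down; toward it: it goes up.
            note += -step if abs(z) > abs(z_prev) else step
            events.append(note)
        z_prev = z
    return events

Any black pixel with a lit neighbor will do as a starting point; one easy way to find one is to scan a row of mandelbrot_grid(...) for the first black pixel whose right-hand neighbor is lit.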
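
A comparably hedged Python sketch of the trajectory-tracking method follows. The circle of seed points, the cycle-detection tolerance, and the pitch/duration/volume tables (the G scale rendered as MIDI note numbers) are assumptions chosen for illustration; the iteration of Z -> Z^2 + C, the modulus-range mapping, the sort that gives longer trajectories to Pitch, and the looping of shorter voices under the longest one follow the description above.

import cmath

def trajectory(c, max_iter=256, escape=2.0, eps=1e-9):
    # Orbit of z -> z^2 + c starting at z = c, stopping when the point escapes
    # ("dwells"), revisits an earlier point ("cycles"), or max_iter is reached.
    z, seen, orbit = c, [], []
    for _ in range(max_iter):
        orbit.append(z)
        if abs(z) > escape or any(abs(z - w) < eps for w in seen):
            break
        seen.append(z)
        z = z * z + c
    return orbit

def quantize(value, lo, hi, table):
    # Map a modulus in [lo, hi] onto one entry of a voice's value table.
    if hi <= lo:
        return table[0]
    i = int((value - lo) / (hi - lo) * (len(table) - 1))
    return table[max(0, min(i, len(table) - 1))]

def track_trajectories(n_points=4, center=complex(-0.2, 0.7), radius=0.05):
    # Four seed points arranged in a circle, one per voice function.
    seeds = [center + radius * cmath.exp(2j * cmath.pi * k / n_points)
             for k in range(n_points)]
    orbits = sorted((trajectory(c) for c in seeds), key=len)

    # Shortest orbit drives Sound/Silence, longest drives Pitch; the tables
    # below are illustrative stand-ins, not the original program's.
    tables = [
        ("sound",    [0, 1]),
        ("volume",   [40, 60, 80, 100]),
        ("duration", [1, 2, 3, 4]),
        ("pitch",    [55, 57, 59, 60, 62, 64, 66, 67]),  # G A B C D E F# G
    ]
    voices = []
    for orbit, (role, table) in zip(orbits, tables):
        moduli = [abs(z) for z in orbit]
        voices.append((role, table, moduli, min(moduli), max(moduli)))

    # Traverse all trajectories in lockstep; shorter ones loop back to their
    # start, and the piece ends with the longest trajectory.
    events = []
    length = max(len(v[2]) for v in voices)
    for t in range(length):
        event = {}
        for role, table, moduli, lo, hi in voices:
            event[role] = quantize(moduli[t % len(moduli)], lo, hi, table)
        events.append(event)
    return events

if __name__ == "__main__":
    for e in track_trajectories()[:8]:
        print(e)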
