[livecode] Live coding of visualized interactive soundscapes

From: justonium <justinorthrop_at_gmail.com>
Date: Fri, 25 Sep 2015 12:07:55 +0200 (CEST)

Copied and pasted from my blog:

Here's an image for reference:

http://imgur.com/ZzIj1FV

You are looking at a sketch of part of a song. Every arrow is a Tanscript bond, and every other object is a Tanscript node. The tiny circles are musical notes. The entire graph is both a Tanscript program and a song. When the interpreter interprets the program, the song is played. The interpreter can be thought of as being like a record player, and the Tanscript program a record.

Each note's pitch is encoded in its color. A note is a subtype of class action, and its implementation contains a sleep call; the note wakes up again after its duration has passed. Note durations are not visualized in this picture, but they should be in an actual implementation. When a note is done playing, the interpreter comes to life again, walks to the next note, and executes/plays it.
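Tanscript isn't published code, so here is a rough Python sketch of the record-player idea described above (all names are hypothetical): a note is a node with a pitch, a duration, and a bond to the next node, and the interpreter walks the bonds, sleeping for each note's duration before moving on.

```python
import time

class Note:
    """A hypothetical note node: pitch as a MIDI number, duration in
    seconds, and a bond to the next node in the graph."""
    def __init__(self, pitch, duration=0.0, next_note=None):
        self.pitch = pitch
        self.duration = duration
        self.next_note = next_note

def interpret(node, play=print):
    """Walk the chain of bonds like a record player: play each note,
    sleep for its duration, then wake and move to the next node."""
    while node is not None:
        play(node.pitch)
        time.sleep(node.duration)  # the note's sleep call
        node = node.next_note
```

Here `play` stands in for whatever actually produces sound; passing a list's `append` makes the sketch testable without audio.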

The large circles and the square are templates for groups of notes. (They are also currently subtypes of class note.) The notes that they contain can be hidden to make the code more readable, but all of them are shown in expanded form here. When the interpreter executes/plays one of these templates, it walks into its contents, then back out again and to the next note that the template has a bond to.
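As a sketch of the template idea (again hypothetical names, not Tanscript's real implementation): a template is itself a note subtype holding a bond to its first inner note, and the interpreter recurses into the contents before continuing along the template's own bond.

```python
class Note:
    """A plain note: pitch plus a bond to the next node."""
    def __init__(self, pitch, next_note=None):
        self.pitch = pitch
        self.next_note = next_note

class Template(Note):
    """A reusable group of notes (itself a subtype of Note here).
    Playing it walks into its contents, then back out to whatever
    the template is bonded to."""
    def __init__(self, first_inner, next_note=None):
        super().__init__(None, next_note)
        self.first_inner = first_inner

def interpret(node, play):
    while node is not None:
        if isinstance(node, Template):
            interpret(node.first_inner, play)  # walk into the contents...
        else:
            play(node.pitch)
        node = node.next_note  # ...then on to the next bonded node
```

Because a template only holds a reference to its contents, the same phrase can be shared by several templates, which is where the reuse comes from.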

Templates are re-usable, which is powerful for composing music. There is more power here than in the copy-paste functionality of traditional midi editors, because a template can contain context-sensitive conditional nodes! Which brings us to another part of the diagram. See that V-shaped symbol with "T" and "F" written on it? That's a conditional node. A conditional node is like a railroad track junction. The condition can depend upon state variables in the program or, as in this case, on a constant value that can be toggled by the listener! Yes, the listener can navigate the structure of a song as it plays, and flip the railroad track junctions to determine which parts of the song, or rather, interactive soundscape, are played. Note that one branch of the junction loops back to a previous part of the song.
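A junction like the one described can be sketched as a node that routes the interpreter down one of two branches, with a flag the listener can flip mid-song (names hypothetical; the `max_notes` cap exists only so the loop-back branch terminates in this sketch).

```python
class Note:
    def __init__(self, pitch, next_node=None):
        self.pitch = pitch
        self.next_node = next_node

class Junction:
    """A conditional node: routes the interpreter down the True or
    False branch. `value` can be flipped by the listener mid-song."""
    def __init__(self, true_branch, false_branch, value=False):
        self.true_branch = true_branch
        self.false_branch = false_branch
        self.value = value

def interpret(node, play, max_notes=16):
    # max_notes guards the sketch against the loop-back branch
    # playing forever; a real interpreter would just keep going.
    while node is not None and max_notes > 0:
        if isinstance(node, Junction):
            node = node.true_branch if node.value else node.false_branch
        else:
            play(node.pitch)
            node = node.next_node
            max_notes -= 1
```

With the flag False the interpreter plays straight through to the ending; flipping it True sends the interpreter back to the earlier part of the song, looping until the listener flips it again.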

One note can trigger another note to play in addition to the one that it is bonded to. You can see this type of triggering connection at the top of the square template, leading to another template. When this happens, the new note begins to run as a separate Tanscript program, simultaneously with the Tanscript program that was already playing (which continues to play as well).

If you look to the right of the square template, you will see another triggering connection to a green rectangle. The green rectangle is a function that plays the template it points to in a modified manner. This particular one shifts every note in the template up by a fixed number of half-steps.
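The half-step shift can be sketched as a transform applied at play time rather than a copy of the notes, which is what makes it compose with template reuse (hypothetical names again):

```python
class Note:
    def __init__(self, pitch, next_note=None):
        self.pitch = pitch
        self.next_note = next_note

def interpret(node, play, shift=0):
    """Play a chain of notes, optionally transposed by a fixed number
    of half-steps, without modifying the original notes."""
    while node is not None:
        play(node.pitch + shift)  # MIDI pitches: +1 per half-step
        node = node.next_note
```

Because the shift is applied on the way out, the same template can be played plain in one place and transposed in another.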

In addition to users selecting condition values in an interactive soundscape, composers can edit any part of a soundscape as it is playing. As long as the interpreter doesn't reach a dead end, the soundscape will continue to play as the composer writes.

The assignment of sounds to instructions is also useful for sonifying arbitrary programs. This is also a very important feature for when I program and write music while blindfolded.
-- 
Read the whole topic here: livecode:
http://lurk.org/r/topic/2NUr3eI1QuRJaNMomXZscP
Received on Fri Sep 25 2015 - 10:02:44 BST

This archive was generated by hypermail 2.4.0 : Sun Aug 20 2023 - 16:02:23 BST