Re: [livecode] toplap

From: Amy Alexander <amy_at_plagiarist.org>
Date: Fri, 22 Oct 2004 13:22:25 -0700 (PDT)

On Fri, 22 Oct 2004, alex wrote:

a> > I understand your skepticism, but I like to believe and sincerely
a> > hope that reading and understanding the code CAN truly enhance the
a> > musical experience (you do too, I am sure) - at least in the same way
a> > of watching/appreciating performances of traditional musical
a> > instruments.
a>
a> I don't think that's a fair comparison - you can't read or understand a
a> traditional musical instrument.
a>

i think this is a really interesting comment, because there are a couple of
ways to look at this. from one perspective i agree, but i usually find
myself saying just the opposite when explaining my interest in livecoding
performance to someone who says, "but it's of no use to someone who can't
understand the code!"

to me what's interesting about watching someone perform a traditional
(mechanically-operated) instrument is that even if you don't play that
instrument or know anything about it, you still understand how it makes
music. you can see/hear the causality of the performer's motions and the
music you hear, and that's what makes it exciting. performances in which
things get more kinetic are usually the most exciting, and that is played
out in both visuals and sound.

on the other hand, a person who plays that instrument will have a more
sophisticated level of appreciation for the performance. but, in some
cases, the knowledgeable audience member will be so far inside the process
they won't be able to enjoy the performance viscerally at all.

what i personally find most interesting about livecoding performance
usually is not trying to understand the code that's being typed. what i
find most interesting is the kinetic relationship between what i see on
the screen and what i hear. i like slub a lot, for example, because even though it
may not be considered pure livecoding, there's a lot of action on the
screen, and a lot of clear relationships between the onscreen activity and
the sound produced - so i get a sense of the *motion* of the processes.
with livecoding that involves longer bits of code being developed
onscreen, more of the "innards" of the process are shown but
paradoxically, i get less of a sense of that process as a piece of music
in motion.

so the tricky thing for me with livecoding becomes balancing the
presentation of internal kineticism (the motion within the process that
motivated the performer to livecode in the first place) with external
kineticism (a cause-effect relationship between the visual and the aural
aspects of the performance).

this of course assumes that we're trying to work within the model of how
mechanical instruments are performed and the audience's experience with
that model. it's also possible to try to move livecoding away from that
model just as early films eventually made the move away from documentation
of stage plays toward a medium-specific model using montage, camera
angles, time compression, etc.... and just as software art tries to move away
from the display-centric model of video and media art. in that case, the
challenge becomes figuring out how the audience will come to relate to the
new model, whether there first needs to be a gradual transition and
development of increased literacy of the medium, or whether it can happen
right away.

my approach with the thingee has been decidedly/admittedly non-purist,
especially since i don't much trust myself to type long bits of code
correctly onstage, or even figure out where my typos are quickly in a live
situation. so i've just gone for the overall effect - combining livecoding
with quickly-executed parameterized commands, and even some foot-stomps
through FIFOSY (FIFOSY Is Foot-Operated Software Ya-know)... plus some
actual livecode that acts on whatever text is clicked, so that the
onscreen action of clicking can cause something immediate to happen, while
still being part of a livecode process (though the process itself may not
really be obvious enough live).

btw, i did actually write a livecode FIFOSY implementation into
the thingee, wherein you can assign an arbitrary snippet of livecode to a
square on the dancepad, and then when you hit that square with your foot
again, it will spit the code back onto the screen, which causes it to
be executed. but it was too rough to try to do at the aarhus show, and i'm
not sure i'll ever use it because it's pretty difficult to get people to
understand any causality with a maneuver like that...
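
here's a rough, hypothetical sketch in python of the kind of mapping i mean -
the names and the way pad events arrive are just stand-ins for illustration,
not what the thingee actually does:

# hypothetical sketch: each dancepad square holds a livecode snippet.
# a stomp echoes the snippet back onscreen (here just printed) and then
# executes it. all names here are illustrative stand-ins.

pad_bindings = {}  # square number -> livecode snippet (a string)

def bind_square(square, code):
    # assign an arbitrary snippet of livecode to a square on the pad
    pad_bindings[square] = code

def on_stomp(square):
    # when that square is hit, spit the code back onto the screen...
    code = pad_bindings.get(square)
    if code is None:
        return
    print(code)            # stand-in for typing it into the visible editor
    exec(code, globals())  # ...which causes it to be executed

# toy usage:
bind_square(3, "print('boom: pretend a sample just got triggered')")
on_stomp(3)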

oh well, foot in, foot out...

-_at_


 
Received on Fri Oct 22 2004 - 20:22:30 BST
