[livecode] physicality and live coding

From: Nick Collins <nc272_at_cam.ac.uk>
Date: Tue, 26 Sep 2006 16:53:49 +0100

[warning- one thing below in particular is ethically dubious but
interesting to suggest...]

I thought I might set down a few ideas I had the other day concerning this
recurring issue of the physical.

Not to anticipate too much I hope, but my current conclusion remains that
live coding and conventional instrumental control are simply different, and
should be celebrated for that.

But as with all such sweeping categorisations, perhaps there is murky
artistic fun to be had in the middle?

Physical results after coding
Errors -> physical punishment
Beethoven's father would strike his hands with a ruler if he made mistakes
while practising. I suggest electric shocks applied to the programmer,
triggered by syntax errors or bugs and graded by seriousness (the more
serious the bug, the greater the pain). A full system crash would be matched
with death for the programmer, whether from a loaded pistol or a drop from a
great height as a trapdoor opens, thus incorporating a real concert tightrope.

Physical coding
Tangible computing, sign languages, jumping
The performer dictates a program in sign language. The performer plays with
some tangible computing interface. The performer jumps around a symbol mat,
and so on.

Physical data as an input
Posture -> sonification
The data to be sonified is the position of the live coder at their desk, as
they unconsciously slump, fidget, fail to move an eyelid and so on.
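
By way of a toy illustration (entirely hypothetical - the sensor values, the
mapping and the file name are all invented), the posture stream could be
mapped straight onto pitch and loudness, along these lines:

import math
import struct
import wave

SR = 44100                 # audio sample rate in Hz
HOLD = 0.5                 # seconds of sound per posture reading

# Invented posture data: (lean from upright in degrees, fidget energy 0..1).
# In practice this would come from a chair sensor or a webcam tracker.
readings = [(2.0, 0.05), (5.0, 0.10), (12.0, 0.40), (3.0, 0.02), (20.0, 0.80)]

frames = []
phase = 0.0
for lean, fidget in readings:
    freq = 220.0 + 20.0 * lean           # slumping pushes the pitch up
    amp = 0.2 + 0.6 * min(fidget, 1.0)   # fidgeting pushes the loudness up
    for _ in range(int(SR * HOLD)):
        phase += 2.0 * math.pi * freq / SR
        frames.append(int(amp * 32767 * math.sin(phase)))

with wave.open("posture.wav", "w") as out:
    out.setnchannels(1)
    out.setsampwidth(2)
    out.setframerate(SR)
    out.writeframes(b"".join(struct.pack("<h", s) for s in frames))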

Direct physical control
Typing notes is trivial, but uninteresting for a live coder (it is very
interesting for a pianist; I'm not against note control!). This is the only
exemplar of note-level control with live coding; the rest is score-level,
as described in many sources on interactive music systems (see David
Wessel's work for instance). We can argue about the score-code analogies,
but I don't see any way around the failure to specify note by note with
direct feedback control.

There are levels of abstraction that don't have a physical analogue, and
this is a fundamental brick wall we shouldn't beat ourselves up against. It
is an inherent 'price' of live coding that directness is exchanged for
greater abstract power.

One remaining speculation:
However, I think we can deal with automaticity, just not note-level
control. Musicians learn to automate many physical actions because they
otherwise could not control everything at once (this is why they show
highly developed cerebellums in neuro-imaging studies!). There is no reason
that a machine assistant could not help us to automate coding tasks. Think
of going beyond auto-completion into auto-prediction. I might train up a
system on my August live coding exercises, then let it try to anticipate
what I will type far ahead. Of course, this is a blue-sky scheme which will
be semantically lacking for the usual reasons of advanced AI and so on. But
it is one avenue of investigation for interesting live coding work, and
could remain an optional component of a performance system, to be grabbed
when you need to speed up your responses. You could train up over many
rehearsals with acoustic musicians, linking automated audio analysis to
code snippets... getting gradually faster?
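
As a back-of-an-envelope sketch (a toy bigram model over whitespace-separated
tokens, nothing like a real semantic predictor - the training snippets below
are invented), it might look like:

from collections import Counter, defaultdict

def train(sessions):
    """Count which token follows which across all the logged sessions."""
    model = defaultdict(Counter)
    for text in sessions:
        tokens = text.split()
        for a, b in zip(tokens, tokens[1:]):
            model[a][b] += 1
    return model

def predict(model, typed, n=4):
    """Greedily extend the last typed token with its most likely successors."""
    suggestion = []
    token = typed.split()[-1]
    for _ in range(n):
        followers = model.get(token)
        if not followers:
            break
        token = followers.most_common(1)[0][0]
        suggestion.append(token)
    return suggestion

# Invented training data: transcripts of two August rehearsals.
august = [
    "pattern = shuffle(scale); play(pattern, tempo=140); play(pattern, tempo=70)",
    "pattern = shuffle(scale); play(pattern, tempo=140); stop()",
]

model = train(august)
print(predict(model, "pattern = shuffle(scale);"))
# e.g. ['play(pattern,', 'tempo=140);', 'play(pattern,', 'tempo=140);']

A real version would of course need far more context than one token, and
some way of learning which suggestions the performer actually accepts in
rehearsal.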

[the last paragraph is very speculative, and I'd also suspect such
artificial automaticity might undermine the interesting side of live coding
where your conscious algorithmic thought is the focus. Nothing wrong with
practising fast thinking, however, and there are surely some mathematical
automations possible - back to practice again!]