Re: [livecode] help - "future directions in live coding"

From: Andrew Brown <algorithmicmusic_at_gmail.com>
Date: Tue, 5 Jul 2011 10:38:21 +1000

Interestingly, I'm doing a solo live coding performance this Friday in
Auckland and have been grappling with a similar balance during
preparation, between automated functional processes and gestures on
mapped controllers. I agree with Andrew S that the back and forth
between controller and code can be disruptive, but I've found in this,
my first really 'timbral' live coding work, that a combination of
evolving processes and more immediate gestures sorts itself out as you
play with it, and the more appropriate centre of control seems to
become clear during rehearsal. We'll see how it all goes, of course -
it may be a disaster as I get totally confused :)

Cheers,

Andrew B




On 5 July 2011 10:10, Andrew Sorensen <andrew_at_moso.com.au> wrote:

> Yes, I think flow is the primary problem. Both my flow (in a
> Csíkszentmihályi sense) and the music's flow. This is probably the main
> reason that I don't use MIDI controllers much any more, and also why I'm
> not super keen on GUI components in my code (sliders, dials and the
> like) - although I'm all for GUI overlays (http://www.vimeo.com/25699729).
> The trouble I've found with manual gesture controls (as opposed to
> computational gestures) is that they're great when you're manipulating
> them but then you stop ...
>
> For example, you're sweeping a filter cutoff with a MIDI controller -
> which works great, of course - except that you have to stop adjusting
> the MIDI controller to go back to more typing. This back-and-forthing
> between the controller and the computer keyboard just never worked well
> for me. Of course, you could press a button and record a "gestural
> loop", but this is a somewhat circular argument. Instead I've tried to
> focus on abstracting out the gesture control just as much as the "note"
> control. I very rarely type out a melody either; it's almost always
> generative. If I do type something out longhand it will usually be a
> 3- or 4-note cell - which is easy to type and easy to grow into larger
> structures.
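>
> A minimal sketch of what I mean by a computational gesture, in
> Impromptu-style Scheme. Here set-cutoff! is a placeholder for whatever
> sets the synth parameter, not actual Impromptu API; callback and now
> are the standard Impromptu scheduling calls:
>
>    ;; a temporal recursion nudges the cutoff on each tick, so no
>    ;; hand needs to stay on a dial while I get back to typing
>    (define (sweep time cutoff)
>       (set-cutoff! cutoff)                ; placeholder parameter-setter
>       (if (< cutoff 10000.0)              ; stop once the filter is open
>           (callback (+ time 4410) 'sweep  ; ~0.1s later (44.1kHz samples)
>                     (+ time 4410)
>                     (* cutoff 1.05))))    ; exponential upward sweep
>
>    (sweep (now) 200.0)                    ; kick it off, hands free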
>
> I generally play solo these days though, and I imagine that if you're
> working with other performers a lot of these issues (primarily continuity)
> would be less of a concern.
>
>
>
>
> On 05/07/11 09:35, Sam Aaron wrote:
>
>> On 4 Jul 2011, at 23:06, Andrew Sorensen wrote:
>>
>>> An early version of Impromptu had a *midi* button along its button
>>> bar. When you pressed the *midi* button and started playing on an
>>> attached midi keyboard, the relevant midi data (pitch and relative
>>> timing information) was written directly into the text buffer at the
>>> current cursor position as valid s-expressions. You could then use
>>> that data for whatever structural purposes you required (chords,
>>> melodies, etc.). I didn't end up using it much, so I got rid of it at
>>> some point, but you can still see the midi button in a picture of the
>>> Impromptu IDE in my 2005 paper.
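>>>
>>> To give a flavour, playing a short ascending phrase would drop
>>> something along these lines at the cursor (the format shown here is
>>> illustrative, not an exact record of what Impromptu emitted):
>>>
>>>    ;; MIDI note numbers paired with relative onsets in samples
>>>    '((60 0) (64 11025) (67 22050) (72 33075))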
>>>
>> Wow, cool. Out of interest, why do you think you didn't end up using
>> it? Was it too cumbersome? Did it not fit into your performance
>> workflow at the time? Do you see yourself ever using such a mechanism
>> again?
>>
>> Sam
>>
>> ---
>> http://sam.aaron.name
>>
>
>
Received on Tue Jul 05 2011 - 00:39:04 BST
