Re: [livecode] livecoding and gestural control

From: Georg Essl <gessl_at_umich.edu>
Date: Mon, 2 Dec 2013 12:08:44 -0500

I think this is a very interesting area and I really enjoyed Marije's and
Konstantinos's work!

At the last NIME we discussed some related ideas that fall into the
intersection of gestures and live coding, though in the context of live
coding mobile devices. The associated piece has since been performed
(and a demo version of it took 3rd place at the LIVE 2013 workshop), and
perhaps we'll get a chance to show it at the coming NIME (depending on how
the reviewers feel about it).

The idea is that we live code the instrument/interface directly. Of course
the instrument/interface can be acted on with musical gestures, but those
gestures are not stable per se; they are critically contingent on the progression
of the live coding performance. The piece used two live coders and a
separate instrument performer, performing together. Sang Won Lee is really
the driving force behind this. Check out his blog post for more of the
reasoning behind it, or the NIME paper:

http://echobin.wordpress.com/2013/04/04/improvisation-on-a-live-coded-mobile-musical-instrument-using-urmus/
http://web.eecs.umich.edu/~gessl/georg_papers/NIME13-MobileLiveCoding.pdf

- Georg


On Sun, Dec 1, 2013 at 11:37 AM, Charlie Roberts <bigbadotis_at_gmail.com> wrote:

> Nice work Marije! I especially enjoyed the sounds in the six to eight
> minute range. I had similar thoughts to Konstantinos and hope to hear more
> about the incorporation of gestural control / devices into live coding
> practice from anyone on the list. I've seen other performers do it (Sam
> Aaron jumps to mind) and I'm curious about the motivation for it.
>
> Transparency was mentioned, but this could also be achieved in a
> collaborative performance, as in Konstantinos's second video. I assume
> there's something attractive about moving between modalities for performers
> and am hoping someone can articulate what they find appealing about it.
>
> For what it's worth, I also share the impulse to bring more embodiment into
> my live coding performances... I'm just not sure I trust the impulse in the
> absence of a collaborator (or collaborators).
>
> Any pointers to papers touching on this would also be appreciated.
>
>
> On Sun, Dec 1, 2013 at 6:12 AM, Konstantinos Vasilakos <
> k.vasilakos_at_keele.ac.uk> wrote:
>
>>
>> On 1 December 2013 13:57, Marije Baalman <nescivi_at_gmail.com> wrote:
>>
>>> Do you have any footage of your work along the same fashion?
>>
>>
>> Hi, I have done some performances elaborating on the same strategies
>> (live hard-coded mapping). One is an audio-based representation of this
>> method: https://soundcloud.com/konstantinos_p_vasilakos/formations
>> Basically, this shows the huge potential to create variations of
>> structure within the musical context by changing the relationships of the
>> controller with the performance environment, and/or by manipulating the
>> sound synthesis bits themselves in order to change the morphology of the sound.
>>
>> The other is a performance I did with some dancers, jointly with
>> another laptop performer, Shelly Knotts, which you can see here:
>> http://www.youtube.com/watch?v=2Pk1nmIAoQs
>> This one basically elaborated on manipulating the signal of the Wii
>> and, again, on changing the mappings (range specs included).
>>
>> There will be some thorough documentation of these, and more on
>> manipulative mapping in real time with live coding, in due course as part of my PhD.
>>
>>
>> Thanks
>>
>> --
>> Best
>> K.
>>
>
>


-- 
Georg Essl
gessl_at_umich.edu
(734)-730-0791
Assistant Professor
Electrical Engineering & Computer Science and Music
University of Michigan
Received on Mon Dec 02 2013 - 17:09:32 GMT
