Re: [livecode] help - "future directions in live coding"

From: Jeff Rose <rosejn_at_gmail.com>
Date: Tue, 5 Jul 2011 01:28:27 +0200

Oh cool, good to know. If only Impromptu had been open source then history wouldn't have to repeat itself... ;-)

On Jul 5, 2011, at 12:06 AM, Andrew Sorensen wrote:

> Hi Jeff,
>
> An early version of Impromptu had a *midi* button along its button bar. When you pressed the *midi* button and started playing on an attached midi keyboard, the relevant midi data (pitch and relative timing information) was written directly into the text buffer at the current cursor position as valid s-expressions. You could then use that data for whatever structural purposes you required (chords, melodies, etc.). I didn't end up using it much, so I got rid of it at some point, but you can still see the midi button in a picture of the Impromptu IDE in my 2005 paper.
>
> Something that aa-cell used in the early days was *live* symbols fed from MIDI controllers. You could type one of these *live* symbols into your code to stream live midi information from a controller (we used to use BCR2000s). This was useful for both continuous actions (feeding synth params) and more discrete ones (pitch ranges for chord generation). I think AB still does this a bit, but I haven't bothered for quite a few years now. Not really sure why.
>
> Cheers,
> Andrew.
>
>
> On 05/07/11 03:23, Jeff Rose wrote:
>> One thing I've been thinking about lately is integrating small bits of live instrumentals with live-coding. For example, one thing I find tedious when live-coding is listing out sequences of notes or chords. This is something you pretty much have to think out in advance or else spend too much time on in the moment. Instead, I was thinking it would be nice to have a midi keyboard next to me, with a mechanism to hit a hot-key or something, and then as I play notes on the keyboard have their text representation pasted into the current buffer or into a variable that's easily accessed from the code. Similarly, I think this could tie in with other aspects of a live-coding session, where you could do things like feed a markov model with a short riff, or adjust synth parameters with knobs on the keyboard. I haven't started on anything yet, but hopefully this gets the idea across. It could be interesting to integrate more than just one person too, but that's for even further into the future :-)
>>
>> Good luck with the talk.
>>
>> -Jeff
>>
>>
>> On Jul 3, 2011, at 9:38 PM, alex wrote:
>>
>>> Hi all,
>>>
>>> I'm giving a talk called "future directions in live coding" at dorkbot
>>> newcastle tomorrow evening. I thought I'd mention live coding,
>>> TOPLAP, differentiate between the better-known practitioners, visual
>>> live coding, livecoding with a gamepad, algorithmic dance,
>>> instrumental/livecoding hybrid performances, collaborations with live
>>> performers and all the usual stuff. A lot of this is already
>>> happening though...
>>>
>>> So I'm wondering what other people think the future directions in live
>>> coding might be. Any thoughts/ideas? Anyone doing anything new that
>>> they'd like Newcastle to know about? :) I will credit appropriately
>>> of course.
>>>
>>> Cheers,
>>>
>>> alex
>
>
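P.S. The MIDI-to-text mechanism Andrew describes could be sketched roughly like this. This is a hypothetical Python illustration only (Impromptu itself is Scheme-based, and the `(pitch, onset_time)` event format and `events_to_sexpr` name are assumptions, not Impromptu's actual code):

```python
def events_to_sexpr(events):
    """Turn (pitch, onset_time) MIDI note events into an s-expression
    string of pitches interleaved with relative inter-onset gaps,
    suitable for pasting into a text buffer at the cursor."""
    if not events:
        return "()"
    pitches = [p for p, _ in events]
    onsets = [t for _, t in events]
    # Relative timing: the gap between each pair of successive onsets.
    gaps = [round(b - a, 3) for a, b in zip(onsets, onsets[1:])]
    parts = []
    for i, p in enumerate(pitches):
        parts.append(str(p))
        if i < len(gaps):
            parts.append(str(gaps[i]))
    return "(" + " ".join(parts) + ")"

# Three notes of a C-major arpeggio played half a second apart:
print(events_to_sexpr([(60, 0.0), (64, 0.5), (67, 1.0)]))
# -> (60 0.5 64 0.5 67)
```

The resulting string is plain code text, so once it lands in the buffer you can edit it like any other expression, which is the whole point of the idea.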
Received on Mon Jul 04 2011 - 23:28:55 BST

This archive was generated by hypermail 2.4.0 : Sun Aug 20 2023 - 16:02:23 BST