Re: [livecode] live coding with EarSketch

From: alex <alex_at_slab.org>
Date: Tue, 6 Oct 2015 11:18:21 +0100

Yes, this is a really interesting issue that comes up from time to
time: how to transition from the old to the new. Julian has some
thoughts concerning this, and I find the crossfades in JITLib very
interesting.

I think I already mentioned recently that in Tidal I've been working
on a library of transitions. Generally I make changes at the start/end
of rhythmic cycles (the sam); evaluations are quantised to, I think,
an eighth of a cycle.

Example transitions (a rough sketch of the general idea follows the list):
  histpan - each new pattern comes in on the left channel, and through
successive evaluations gets pushed towards the right channel until it
disappears
  anticipate - the previous pattern becomes increasingly intense
(through the use of a comb filter) until it drops out to the new pattern
  clutch - events in the previous pattern are gradually dropped, at
random, as events in the new one are brought in
  mortal - the new pattern comes in immediately, but gradually dies away
  wait - the old pattern stops immediately, leaving silence for a bit
before the new pattern comes in
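
To make that a bit more concrete, here is a rough sketch of the general
idea in Python. This is not Tidal code (the real transitions are Haskell
functions over Tidal patterns), and the event-list representation and
helper names here are made up purely for illustration:

import random

# Toy model: an "event" is a (time_in_cycles, sound_name) pair, and a
# "pattern" is a function from a cycle number to the list of events in
# that cycle. This is not how Tidal represents things; it is just enough
# to show the shape of a transition like the 'clutch' described above.

def clutch(old, new, start_cycle, length=4):
    """Fade from `old` to `new` over `length` cycles, starting at
    `start_cycle`, i.e. the changeover is quantised to a cycle boundary."""
    def pattern(cycle):
        if cycle < start_cycle:
            return old(cycle)
        progress = (cycle - start_cycle) / length  # 0.0 at the sam, 1.0 when done
        if progress >= 1.0:
            return new(cycle)
        # Drop old events and bring in new ones at random, in proportion
        # to how far through the transition we are.
        kept_old = [e for e in old(cycle) if random.random() > progress]
        kept_new = [e for e in new(cycle) if random.random() < progress]
        return sorted(kept_old + kept_new)
    return pattern

# Two trivial patterns, and a transition that kicks in at cycle 8.
drums = lambda c: [(c + 0.0, "bd"), (c + 0.5, "sn")]
hats = lambda c: [(c + i / 4, "hh") for i in range(4)]
merged = clutch(drums, hats, start_cycle=8)
print(merged(9))  # mostly drum events with a few hats mixed in

The real versions work on Tidal's pattern representation rather than
plain event lists, but the shape is the same: snap the changeover to the
sam, then interpolate between the old and new patterns over a few cycles.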

It's pretty strange to live code how to transition between live code edits.

I guess you have an extra challenge here, Jason, because you are
generating finite sequences which end, and your lovely polyrhythms are
creating discontinuities at the loop point. It would be nice to code
an infinite timeline and visualise a 'window' of that (there's a rough
sketch of what I mean below), but I think that would not work for you?
Treating conventional imperative control flow as musical time seems to
put certain limits on the notion of musical time. It'd be interesting
to hear your thoughts on this!
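
By a 'window' I mean something like the following. Again, this is only a
hypothetical Python sketch with invented names; it has nothing to do with
EarSketch's actual API:

import itertools

# The timeline is a lazy, infinite stream of (start_beat, clip_name)
# events, and the DAW-style view only realises the slice it is currently
# drawing.

def infinite_timeline():
    """Yield events forever: a clip on every beat, alternating two sounds."""
    for beat in itertools.count():
        yield (float(beat), "clip_a" if beat % 2 == 0 else "clip_b")

def window(events, start, width):
    """Collect just the events whose start falls in [start, start + width)."""
    visible = []
    for t, clip in events:
        if t >= start + width:
            break  # events arrive in time order, so we can stop here
        if t >= start:
            visible.append((t, clip))
    return visible

# Draw (or audition) beats 16..31 without materialising the whole timeline.
print(window(infinite_timeline(), start=16, width=16))

Whether that maps onto EarSketch is exactly the question, though: your
scripts run top to bottom and produce a finite timeline that loops, so the
window would presumably have to come from re-evaluating the script rather
than from a lazy stream like this.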

alex

On 5 October 2015 at 01:38, Jason Freeman <mail_at_jasonfreeman.net> wrote:
> thanks all for the comments.
>
> _at_Patrick: we’ve talked internally about swapping in the new code at a key timeline point, yes. I did something like this in an old environment of mine (LOLC) where you could schedule things to start at the next beat, measure, or hypermeasure (in that case, every 4 measures), or to start after the end of the last scheduled item. With EarSketch, we’re still not sure whether something like this would be the best approach or whether a quick crossfade between old and new would work better (or perhaps some combination of the two). Even if we wait until significant timeline points, a sudden change is still going to sound discontinuous sometimes.
>
> —Jason
>
>> On Oct 4, 2015, at 3:18 PM, Amy Alexander <amy_at_plagiarist.org> wrote:
>>
>> Muy cool! And another thumbs-up for the DAW-style visualization. I think
>> this will be hugely useful for the uninitiated and intimidated, as it's a
>> big help in getting past the livecoding "WTF factor" as well as the "I get
>> the idea in abstract but I don't really relate to it or picture myself
>> doing it" factor. I want to learn EarSketch! :-) (I'm checking out the
>> tutorials now...)
>>
>> -Amy
>>
>> On Fri, Oct 2, 2015 at 2:24 PM, alex <alex_at_slab.org> wrote:
>>
>>> I really enjoyed the piece, and the analysis-driven effect was a nice
>>> surprise at the end. Thanks for sharing, Jason! It's interesting that
>>> the visualisation helps you keep aware of the pace of change; I think
>>> this is one of the key skills of live coding performance and
>>> improvisation: counting while coding.
>>>
>>> Patrick, Julian made something in SuperCollider that let you schedule
>>> code changes ahead of time, although I don't think events were
>>> visualised in this way.
>>>
>>> Here's another python live coding environment by Ryan Kirkbride, fresh
>>> from Yorkshire:
>>> https://www.youtube.com/watch?v=mNTp_ECpxBY
>>>
>>> I've always felt it strange that there aren't more Python live coding
>>> environments; the language seems like it should be well suited to it.
>>> Has anyone done live coding with the IPython notebook?
>>>
>>> cheers
>>>
>>> alex
>>>
>>> On 2 October 2015 at 22:14, Patrick Borgeat <patrick.borgeat_at_gmail.com>
>>> wrote:
>>>> Hi Jason,
>>>>
>>>> looks great! From my perspective as a SuperCollider live coder I really
>>>> like that you are able to *see* ahead of time instead of just listening
>>>> *in time*. From my performer perspective, this might be something I would
>>>> like to have too, especially as I can still reshape the future time I was
>>>> able to see.
>>>>
>>>> I think a cool feature could be to quantize changes to the timeline, e.g.
>>>> run the code and compute a future timeline, but only swap the new timeline
>>>> in at the next bar (or every 4 bars). It could also be interesting to be
>>>> able to preview the results of code as an overlay on the currently
>>>> sounding timeline, so that I can inspect the new timeline visually while
>>>> the old audio timeline is still playing (until I decide that I want to
>>>> hear the new one).
>>>>
>>>> Cool stuff! Cheers,
>>>> Patrick
>>>>
>>>> 2015-10-02 22:17 GMT+02:00 Jason Freeman <mail_at_jasonfreeman.net>:
>>>>
>>>>> Hi all,
>>>>>
>>>>> Some of you know about EarSketch (http://earsketch.gatech.edu), the
>>>>> browser-based coding environment that includes a Python / JavaScript API,
>>>>> sound library, and DAW-style view for algorithmic composition. We’ve been
>>>>> developing it for the last few years at Georgia Tech, and it’s primarily
>>>>> targeted towards students in intro computer science courses, with an eye
>>>>> towards increasing engagement and participation in computing by
>>>>> populations traditionally underrepresented in computer science.
>>>>>
>>>>> We’ve recently begun exploring the potential of EarSketch as a live
>>>>> coding environment too. Last week, I performed a live coding set in
>>>>> concert with EarSketch for the first time and wanted to share a
>>>>> screencast with you:
>>>>>
>>>>> https://www.youtube.com/watch?v=5ThWr3stq9M
>>>>>
>>>>> We still have some work to do in making live coding smoother in
>>>>> EarSketch, and I still have some practice to do as a performer, but… I
>>>>> wanted to share because I think the unusual structure of the environment
>>>>> has interesting implications for live coding, i.e.:
>>>>>
>>>>> * all the music is based on DAW-like operations: placing audio files on
>>>>> a multi-track timeline, splicing them, adding effects, etc.
>>>>> * time is organized as a DAW timeline that loops
>>>>> * the results of code execution are visualized (for both the live coder
>>>>> and the audience) in a DAW-style display
>>>>>
>>>>> Hope you find these ideas interesting!
>>>>>
>>>>> Best,
>>>>> —Jason



-- 
http://yaxu.org/
-- 
Read the whole topic here: livecode:
http://lurk.org/r/topic/6eteZYQkOKEN0CJfN0eGqQ
To leave livecode, email livecode_at_group.lurk.org with the following email subject: unsubscribe
Received on Tue Oct 06 2015 - 10:18:34 BST

This archive was generated by hypermail 2.4.0 : Sun Aug 20 2023 - 16:02:23 BST