[livecode] live coding with EarSketch

From: Jason Freeman <mail_at_jasonfreeman.net>
Date: Fri, 2 Oct 2015 16:17:35 -0400

Hi all,

Some of you know about EarSketch (http://earsketch.gatech.edu), the browser-based coding environment that includes a Python / JavaScript API, sound library, and DAW-style view for algorithmic composition. We’ve been developing it for the last few years at Georgia Tech, and it’s primarily targeted towards students in intro computer science courses, with an eye towards increasing engagement and participation in computing by populations traditionally underrepresented in computer science.

We’ve recently begun exploring the potential of EarSketch as a live coding environment too. Last week, I performed a live coding set in concert with EarSketch for the first time and wanted to share a screencast with you:

https://www.youtube.com/watch?v=5ThWr3stq9M

We still have some work to do to make live coding smoother in EarSketch, and I still have some practice to do as a performer, but I wanted to share it because I think the unusual structure of the environment has interesting implications for live coding. In particular:

* all the music is based on DAW-like operations: placing audio files on a multi-track timeline, splicing them, adding effects, etc.
* time is organized as a DAW timeline that loops
* the results of code execution are visualized (for both the live coder and the audience) in a DAW-style display
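To give a flavor of what those DAW-like operations look like in code, here is a minimal sketch of an EarSketch-style Python script. The calls fitMedia(), setEffect(), setTempo(), init(), and finish() are part of the EarSketch API; the clip and effect names are hypothetical placeholders, and the stub definitions below exist only so the sketch runs outside the EarSketch environment (inside EarSketch these functions are provided for you).

```python
# Stubs standing in for the EarSketch runtime, so this sketch runs standalone.
# They simply record each operation on a timeline list.
timeline = []  # ("tempo", bpm) or (track, clip, start_measure, end_measure) or (track, effect, param, value)

def init():
    timeline.clear()

def setTempo(bpm):
    timeline.append(("tempo", bpm))

def fitMedia(clip, track, start, end):
    # Place an audio clip on a numbered track from one measure to another,
    # exactly the "files on a multi-track timeline" operation described above.
    timeline.append((track, clip, start, end))

def setEffect(track, effect, parameter, value):
    # Attach an effect parameter to a track.
    timeline.append((track, effect, parameter, value))

def finish():
    pass

# A tiny script in the usual EarSketch shape (clip names are made up):
init()
setTempo(120)
fitMedia("DRUM_LOOP_1", 1, 1, 9)   # drums on track 1, measures 1-9
fitMedia("BASS_LINE_1", 2, 1, 5)   # bass on track 2, measures 1-5
setEffect(1, "FILTER", "FILTER_FREQ", 800)
finish()
```

In a live coding context, re-running a script like this rebuilds the looping timeline, and the DAW-style display shows the audience the resulting arrangement.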

Hope you find these ideas interesting!

Best,
—Jason
-- 
Read the whole topic here: livecode:
http://lurk.org/r/topic/5kYLYw3lYMBKIOfyIrgzg4
Received on Fri Oct 02 2015 - 20:17:47 BST

This archive was generated by hypermail 2.4.0 : Sun Aug 20 2023 - 16:02:23 BST