From: alex wrote:
> I think this is working now. So!
>
> Hello,
>
> I spent the morning on toplap.org, and now we have a nicer wiki (a
> 'usemod' one) and a mailman mailing list.
>
> I also invited Amy Alexander into the project as we discussed... Maybe
> we should hold back from any full-blown discussion until we hear from
> her (shouldn't be longer than a day).
>
> Now I need to find a haircut
>
> alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

From: alex

> anyway, should we consider more graphical programming languages to be
> 'pure' live coding? is a syntax/ascii interface a requirement or is
> it more the way of working? what is significant for live coding
> then? where do we draw the line? some things to have in mind for the
> manifesto...

Is patching programming? James McCartney has something to say about this:

http://www.ai.mit.edu/~gregs/ll1-discuss-archive-html/msg03610.html

I think the answer has to be yes, but it's a different flavour of programming from textual languages. Luckily, Programming and Patching share the same initial letter so we can interchange them (as you can see from the toplap wiki, the words behind the TOPLAP acronym aren't fixed, amy).

Limiting ourselves to ASCII (or even UTF8) seems arbitrary. But we don't necessarily want to let people construct sequences in cubase and call that live programming!

One idea we had was to have TOPLAP ratified programming languages and programming methods, and a procedure that people would follow to get their particular programming/performance environment ratified.

alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Sun, 22 Feb 2004 19:48:31 +0000
From: Nick Collins

> I drafted an abstract for a RUNME paper, obviously fully open to
> discussion. It might also help us get grounded? (I don't presume I've
> got this right first time!) Are there some wiki instructions, how do I add
> a new page to the wiki for the abstract?

There are some here:

http://www.usemod.com/cgi-bin/wiki.pl?TextFormattingRules

I think the main thing that tripped me up with this one is that to make a link you do [[this]] rather than the [this] which you might be used to. It also turns 'wiki words' into links... A wiki word being a word with two capital letters in it and no spaces, such as NickCollins.

alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Sun, 22 Feb 2004 22:24:51 +0100
Subject: Re: [livecode] ok!
From: Fredrik Olofsson

> I think this is working now. So!
>
> Hello,

splendid. great work alex. not totally comfortable with wikis myself so excuse me totally filling up the changed history log.

nick: "Alex's comment on code running in the mind of the programmer (from the OS paper)"

and, iirc, in the audience too! reading that for the first time really triggered me. think the manifesto should mention dancing too as a kind of code digestion/processing.
_f

#| fredrikolofsson.com klippav.org |#

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Mon, 23 Feb 2004 16:29:57 -0800 (PST)
From: Amy Alexander

Hello

a>
a> Amy joined us -- welcome Amy! Her participation is limited by her lack
a> of time, but that's probably true of all of us to some extent. We're
a> not under any time pressure anyway.
a>

howdy all! yes, i may have to lurk most of the time due to current realities, but i am excited to join and will try to keep up as much as i can.

some uber-brief intro to my interest in live coding performance. i spent most of my growing-up years doing musical performances of various kinds. eventually i wound up in film, video (real-time and not), computer animation, unix systems administration, and media art projects made through programming of various kinds. my first sys-admin performances were by accident, because i used to get absorbed in what i was doing and type very fast and suddenly look up to find all the animators in the lab where i worked grinning at me.

anyway, being a leisure-oriented lazy person, i eventually got interested in how computers were changing pop culture. i came to the conclusion that while computers could be used for cool things like music and video, they'd brought the business paradigm with them into pop culture. so you could go to a club and hear some great laptop techno music, but it could look just like they were doing database programming. quite a shift from sweaty guitar bands. so cool pop/leisure culture gets more boring; but on the other hand, geek culture seems to get continually more cool and mainstreamed. (case modding, slashdot, e.g.) so anyway, somehow in all this i became a geek vj, trying to make club culture "cool" by being an extra cool geek (tongue-in-cheek, more or less.) so one thing i do in shows is try to be very performative about my typing. (this is more or less explicit in various projects.)

so, i'm very interested in the performative aspects of live-coding. live-coding can invert the usual electronic music performance assumptions i think, because it's now coding-as-improv, i.e. live composition, rather than software-usage as performance. (of course, there are places in between those two extremes, which alex and frederick have already mentioned.) but anyway, seems like a good way in which a "geek approach" to computers in pop culture can provide an ultimately less-nerdy alternative to the traditional ways this integration has taken place.

so... some initial questions from me:

1) is this all about music? (e.g., i work from the visuals side of things. then again, i'm not specifically live coding, either... )

2) are we interested in the performative aspects of live coding, the conceptual aspects, or both? in other words, is the audience always made aware of the live coding process? how? are the performers' displays shown on screens? do we see their fingers typing?

2a) if we're interested in the audience's experience of the coding performance - how much of this is about typing vs. coding? i remember from alex's impromptu 5-minute coding performance at transmediale last year, the audience was very excited to watch alex's cursor fly around in emacs. fortunately, alex is not only a whiz at coding, but also moves like a dancer in emacs. so now that's 3 things - coding well live, typing skill, and a particularly visual facility with a text editor. do we separate those issues? for the audience? for ourselves?
what about lowly vim users like me? ;-) ... and, how to address these issues without degenerating into emacs/vi wars?

3) is live coding inclusive or competitive? i.e., does one need to be the fastest, most proficient coder to perform live? if so, it might prove programming prowess but exclude people with interesting ideas and a feel for live performance, but who aren't as quick at the programming end of things. or, can both approaches be accommodated, for example, through various types of languages?

a> > 'pure' live coding? is a syntax/ascii interface a requirement or is
a> > it more the way of working? what is significant for live coding
a> > then? where do we draw the line? some things to have in mind for the
a> > manifesto...
a>
a> Is patching programming?
a>

4) from this question, as well as the issue of cubase sequences, it seems to me that we've hit on the idea that the distinction between programmer and user is really more of a continuum than a line in the sand. and after all, programmers are users anyway, using C++, perl, various libraries, etc.

a>
a> Limiting ourselves to ASCII (or even UTF8) seems arbitrary. But we
a> don't necessarily want to let people construct sequences in cubase and
a> call that live programming!
a>

yeah, i'd also vote for keeping things as loose as possible, and then providing some rationale for whatever admittedly-subjective line we decide to draw (and that the line can be crossed for particular situations - what if someone constructs a turing-complete language out of cubase sequences? ;-) )

a> One idea we had was to have TOPLAP ratified programming languages and
a> programming methods, and a procedure that people would follow to get
a> their particular programming/performance environment ratified.
a>

i'm not sure i follow the rationale for this? it sounds a little like it could create a feeling of an exclusive club to me, like it could discourage people from working in new directions... but maybe i missed something?

sorry if any/all of this is stuff that's been discussed already! i know it can be weird when new people join a group and then bring up the same things that have been gone over before... hoping to catch up soon! :-)

cu,
-@

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Tue, 24 Feb 2004 11:57:42 +0100
From: Julian Rohrhuber

> 1) is this all about music? (e.g., i work from the visuals side of things.
> then again, i'm not specifically live coding, either... )

I would suggest it is not all about music (as probably music is not all about music either). My personal interest is to get away from the idea of production - so if music is taken as a product then I'd even say it is not about music.

> 2) are we interested in the performative aspects of live coding, the
> conceptual aspects, or both?
> in other words, is the audience always made
> aware of the live coding process? how? are the performers' displays shown
> on screens? do we see their fingers typing?

I use live-coding mostly for film sound, so there is no 'performance' as such. But when playing for an audience or with an audience, I would try to show the code. I was a bit unhappy about our last performance at betalounge in Hamburg - they showed fingers typing as a maximum - why should this be so very interesting? But again this is my personal view.

> 2a) if we're interested in the audience's experience of the coding
> performance - how much of this is about typing vs. coding?

this is a good point.
I would hope it is about live coding - the problem as I see it is about understanding, semantics.

> i remember from
> alex's impromptu 5-minute coding performance at transmediale last year,
> the audience was very excited to watch alex's cursor fly around in emacs.
> fortunately, alex is not only a whiz at coding, but also moves like a
> dancer in emacs. so now that's 3 things - coding well live, typing skill,
> and a particularly visual facility with a text editor. do we separate
> those issues? for the audience? for ourselves? what about lowly vim users
> like me? ;-)

I'm a 2 - 3 finger typer and I'm totally disinterested in typing speed apart from practical considerations. I find it fascinating to be quick, but who would judge a speech by its speed, for example?

> ... and, how to address these issues without degenerating
> into emacs/vi wars?
>
> 3) is live coding inclusive or competitive? i.e., does one need to be the
> fastest, most proficient coder to perform live?

this would be the thing I hated most and would be the furthest possible away from the idea I had about it in the first place. Mostly I dislike the outcome of combinations of sports and music.

> if so, it might prove
> programming prowess but exclude people with interesting ideas and a feel
> for live performance, but who aren't as quick at the programming end of
> things. or, can both approaches be accommodated, for example, through
> various types of languages?

generally I would say that live coding doesn't even have to be specifically interesting or in the center of attention. For me it is another way to think about language and computers, and another way to interfere with sound - it is good if it is visible what's going on, but not for the sake of showing off, but for being able to understand it and to have a conversation about it and with it.

--
.

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

From: Adrian Ward

> I'm a 2 - 3 finger typer and I'm totally disinterested in typing speed
> apart from practical considerations. I find it fascinating to be
> quick, but who would
> judge a speech by its speed, for example?

Hear hear. If I may digress slightly, and bring in another related consideration we mustn't lose sight of:

There is considerable snobbery amongst coders, even amongst friends. There are too many misconceptions, and too much rivalry between skill sets. There were raised eyebrows in the audience in Hamburg when it was revealed that I use REALbasic to make music. Later, over food, there was considerable surprise that whilst Alex is the "perl one" and I was the "REALbasic one", I was the one who wrote the slub sound engine in C. Heavens above! I know how to use gcc as well as a mouse. How did that happen? Coders are the worst at this. Not even stroppy prima donna artists do it.

Amy asked if we should be inclusive or competitive. I suggest we need to be inclusive (fairly obviously given my experience), but let's be careful that we don't endorse or promote any particular practice, since it'd be easy for that to be misconstrued in the eyes of a community of coders who are conditioned to judge by tools alone. My line is that it's not the tools that are of interest, but how you use them.

Note that this differs somewhat from the "graphical programming/patching" debate.

For my next project, I'll be writing a raytracer in Visual Basic. The core renderer will be implemented using DOS batch files, though.
-- Adrian Ward, Signwave UK http://www.signwave.co.uk/ - +44(0)7711 623498 2nd Floor North, Rutland House, 42-46 New Road, London, E1 2AX, UK. _______________________________________________ livecode mailing list livecode@hiddenorg http://toplap.org/cgi-bin/mailman/listinfo/livecode Subject: Re: [livecode] foo From: alex Hear hear. If I may digress slightly, and bring in another related > consideration we mustn't lose sight of: > There is considerable snobbery amongst coders, even amongst friends. > There are too many misconceptions, and too much rivalry between skill > sets. There were raised eyebrows in the audience in Hamburg when it was > revealed that I use REALbasic to make music. Later, over food, there > was considerable surprise that whilst Alex is the "perl one" and I was > "REALbasic one", I was the one who wrote the slub sound engine in C. > Heavens above! I know how to use gcc as well as a mouse. How did that > happen? Coders are the worse at this. Not even stroppy prima donna > artists do it. Is REALbasic really frowned upon that much? I guess it's a Macintosh culture thing. I think the same unfounded snobbery can be found upon other artistic mediums however - for example painting, watercolours and so on (we both know the disrespect that the various flavours of fine arts student discord to each other in badly managed academic situations...). I guess the problem is that a medium becomes embedded in a culture, and gets dumb stereotypes attached. Then you get the conformists reinforcing the stereotypes. > Amy asked if we should be inclusive or competitive. I suggest we need > to be inclusive (fairly obviously given my experience), but let's be > careful that we dont endorse or promote any particular practice, since > it'd be easy for that to be misconstrued in the eyes of a community of > coders who are conditioned to judge by tools alone. My line is that > it's not the tools that are of interest, but how you use them. I agree. How can we define "live coding" in these terms? We could take the runme.org approach, of challenging what we disagree with by taking it to ridiculous levels. So having a page for officially ratified TOPLAP live coding performance systems and having thousands of languages in there. Anyway - we are talking about performance, and for me an essential part of being audience to a musical performance (for example) is seeing the movements that make the music. It connects me physically with the music. If I went to see a violinist and he turned his back on me so I couldn't see his fingers move, I'd wonder why I didn't just buy a CD instead. So in relation to live coding, movement needs to be visible. So seeing a mouse being moved or a keyboard being tapped at does help. But even better is seeing the processes move. Nick and Fred's joint visual + audio performances seem like an ideal. But letting your screen be seen, so that the movements within the interface can be seen exposes more of the process. The audience might not understand any more of the process (watching a guitarist perform doesn't help a non-musician learn about chord structures), but they are nevertheless allowed to witness movements of musical processes. The only alternative in my opinion is to have a dance floor, so that the audience adds the movements themselves. 
The non-alternative is to instead sell recordings so that the audience can listen however they like (I don't believe people really want to sit listening to music in expensive theatres at some random point within a stereo field sitting next to sniffing, coughing strangers).

alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Tue, 24 Feb 2004 18:21:42 +0100
Subject: Re: [livecode] foo
From: Fredrik Olofsson

> We could take the runme.org approach, of challenging what we disagree
> with by taking it to ridiculous levels. So having a page for officially
> ratified TOPLAP live coding performance systems and having thousands of
> languages in there.

was it you alex or julian who brought up apple's new garage band program as a good opposite pole http://www.apple.com/ilife/garageband/ it sure is ugly and scares me. crap music will be flooding. ableton live is another good case study i think.

alex:
> But letting your screen be seen,
> so that the movements within the interface can be seen exposes more of
> the process. The audience might not understand any more of the process
> (watching a guitarist perform doesn't help a non-musician learn about
> chord structures), but they are nevertheless allowed to witness
> movements of musical processes.

defining live-coding will probably be difficult and might take time. but couldn't we try to break down the problem and see what we can agree upon at the moment? dare i suggest we all can agree with alex above on a notion of playing with open cards (in practice... project our screens during performance or if that not possible at least not hide away behind the screen looking 'very serious'). trying to share and communicate by visualising, in whatever way, what's going on in the computer and our minds?

i got another thought though... if you project any sequencer program it will be instantly obvious what's going on. on-screen knobs and faders bore you and after a few minutes you'll lose interest - maybe in the resulting music too. it's hard to be a virtuoso in ableton live. rather, isn't the thing about live coding the curiosity of how it really works? at least slub makes my brain spin trying to figure it out. i just love the expectation of seeing a command executed and listening for its effect in the music. or which chunk of unreadable terminal output belongs to which sounds. i'm not proposing making things more difficult than needed but isn't it a trick of the trade to keep some secrets, arouse curiosity (and thereby dialogue) and not let the audience grasp the whole working process at once?

amy:
> sorry if any/all of this is stuff that's been discussed already! i
> know it
> can be weird when new people join a group and then bring up the same
> things that have been gone over before... hoping to catch up soon! :-)

nono, you didn't miss much. we just got started really.

googling for 'live coding' brought this up http://www.eagle-software.com/snobol_review.htm a witness account of a delphi live coding happening.

"If you thought that programming was always a personal, selfish pleasure, then you should have seen this: a British audience (correction: a British audience of programmers ) shouting out and competing with each other and actually cheering and bursting into spontaneous applause "

is this something to strive for? the [delphi] guru at the keys and a [delphi] audience that can appreciate all twists and turns and contribute. sounds a little romantic to my ears.
_f
(also slow typing)

#| fredrikolofsson.com klippav.org |#

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Wed, 25 Feb 2004 01:27:28 +0100
From: Julian Rohrhuber

On 24 Feb 2004, at 10:57 am, Julian Rohrhuber wrote:
>
>> I'm a 2 - 3 finger typer and I'm totally disinterested in typing
>> speed apart from practical considerations. I find it fascinating to
>> be quick, but who would
>> judge a speech by its speed, for example?
>
> Hear hear. If I may digress slightly, and bring in another related
> consideration we mustn't lose sight of:
>
> There is considerable snobbery amongst coders, even amongst friends.
> There are too many misconceptions, and too much rivalry between
> skill sets. There were raised eyebrows in the audience in Hamburg
> when it was revealed that I use REALbasic to make music.

actually renate said to me (raising eyebrows) that this is really done in Visual Basic. she found it very nice that this language, known rather from business informatics, could be used to make music.

> Later, over food, there was considerable surprise that whilst Alex
> is the "perl one" and I was the "REALbasic one", I was the one who wrote
> the slub sound engine in C. Heavens above! I know how to use gcc as
> well as a mouse. How did that happen? Coders are the worst at this.
> Not even stroppy prima donna artists do it.

hm, reminds me a bit of being in india, being the "angresi" (english), and in northern australia, being the "balanda" (hollander - dutch)... I think the idea of "level of abstraction" in programming languages should be abandoned as far as possible, as well as the "value" of a certain language.

> Amy asked if we should be inclusive or competitive. I suggest we
> need to be inclusive (fairly obviously given my experience), but
> let's be careful that we don't endorse or promote any particular
> practice, since it'd be easy for that to be misconstrued in the eyes
> of a community of coders who are conditioned to judge by tools
> alone. My line is that it's not the tools that are of interest, but
> how you use them.

I would suggest not to use tools at all, or not to see algorithms as tools at least.

--
.

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Wed, 25 Feb 2004 09:19:52 +0000
From: Nick Collins

> I would suggest not to use tools at all, or not to see algorithms as tools
> at least.

julian, could you expand on this a little please?

Am I misrepresenting you if I say:

you want to explore programming languages as thought empowerers and mindsets in performance.

Because conventional fixed GUIs allow no abstract thought, generalisation, meta-statements, interpreted languages are a better medium for tackling such issues as performance art?

> I think the idea of "level of abstraction" in programming languages should be
> abandoned as far as possible, as well as the "value" of a certain language.

Can we all agree that there are many potential languages (and many that would be intriguing to use, even if supposedly less optimised to audio etc...), and no one optimal live coding solution?

Are we allowed to discuss languages though in terms of their abstractions/representations - this is surely critical to their use and appreciation. Whilst in no way denigrating the artistic choice of a particular set of constraints on a live coding performance through a particular medium...
_______________________________________________ livecode mailing list livecode@hiddenorg http://toplap.org/cgi-bin/mailman/listinfo/livecode Date: Wed, 25 Feb 2004 09:31:44 +0000 From: Nick Collins > I would suggest not to use tools at all, or not to see algorithms as tools >> at least. > >julian, could you expand on this a little please? > >Am I misrepresenting you if I say : > >you want to explore programming languages as thought empowerers and >mindsets in performance. yes, this I think is an important point. In performance would mean that what is in action is not the application (the tool) but the grammar and the language (the algorithm). What I try to describe is a perpective more than a thing in itself. >Because conventional fixed GUIs allow no abstract thought, >generalisation, meta-statements, interpreted languages are a better >medium for tackling such issues as performance art? well I would not find it wrong to count GUI activities as performance art. UI is a metaphor of acting and of worlds so why not apply this term to artistic treatment of windows and such things like buttons and mouse clicks - this is just another example that for artistic work a distinction cannot be made between a good and a bad medium in general. >> I think the idea of "level of abstraction" in programming language should >be >>abandoned as far as possible, as well as "value" of a certain language. > >Can we all agree that there are many potential languages (and many >that would be intriguing to use, even if supposedly less optimised >to audio etc...), and no one optimal live coding solution? optimum has no general meanings, so I would always agree to this. I find it important though that we have to ratify the acceptance of a certain language first. (: >Are we allowed to discuss languages though in terms of their >abstractions/representations - this is surely critical to their use >and appreciation. of course. I was pointing at a very common paradigm that builds a world order from low-level to high-level languages, which I find has its right but is not that useful as soon it comes to semantics and artistic programming. >Whilst in no way denigrating the artistic choice of a particular set >of constraints on a live coding performance through a particular >medium... I have a high respect for people who play the trumpet and try to get violin timbre out of their instrument.. -- . _______________________________________________ livecode mailing list livecode@hiddenorg http://toplap.org/cgi-bin/mailman/listinfo/livecode Date: Thu, 26 Feb 2004 23:51:25 +0000 From: Nick Collins splendid. great work alex. not totally comfortable with wikis myself > so excuse me totally filling up the changed history log. Same here! I just removed the 'member organisations' section from the front page of the wiki. Not sure if we're ready for that yet... Lets just contribute as individuals for now. > nick: > "Alex's comment on code running in the mind of the programmer (from the > OS paper)" > and, iirc, in the audience too! reading that for the first time really > trigged me. Yes, it's an important point I think. > think manifest should mention dancing too as a kind of code > digestion/processing. Yes. Perhaps if movement of processes/interfaces are not visible then there must be room for the audience to dance, to make their own movements. Either that or there should be complete darkness. 
alex _______________________________________________ livecode mailing list livecode@hiddenorg http://toplap.org/cgi-bin/mailman/listinfo/livecode Subject: Re: [livecode] foo From: alex so, i'm very interested in the performative aspects of live-coding. It would be good to define the different aspects of live coding that we find on the website as some point. It would make it easier to think about it all. > live-coding can invert the usual electronic music performance assumptions > i think, because it's now coding-as-improv, i.e. live composition, rather > than software-usage as performance. I think there could be a subtle but important point here. A programmer/musician/artist often needs quite a particular and individual environment in which to work. For example I like to think about what to program while I'm dancing, decide how I'm going to program it in the bath, and actually program it in quiet and comfortable surroundings such as my studio or sofa. The idea of going to a club, getting on stage, creating a source file and beginning to compose the next 40 minutes of entertainment is kind of ridiculous. So I think a more practical approach is to start with a pre-prepared composition, but instead of building in interfaces and parameters, you change the variables and structures in the code directly, and modify the composition over time to produce (for example) musical progression. As the code is edited it immediately gets interpreted and the sound changes. This seems to be how Julian works with his jitlib. So you can start off with fairly static composition and make changes to it over time to turn it into a performance. > 1) is this all about music? (e.g., i work from the visuals side of things. > then again, i'm not specifically live coding, either... ) No, I think the visual and other mediums are just as relevant, within the context of time-based performance. > 2) are we interested in the performative aspects of live coding, the > conceptual aspects, or both? in other words, is the audience always made > aware of the live coding process? how? are the performers' displays shown > on screens? do we see their fingers typing? As I think I said replying to a later mail; if there is any kind of visual spectacle as part of a sonic performance or vice-versa, then they should be closely related. Seeing what is causing (or caused by) the sound (or the process that makes the sound) brings us closer to the sound, and so the visual becomes part of the musical experience. Of course someone could say they have unrelated sound and visuals in their performance, that this somehow works, and that no-one should tell them otherwise. Fair enough - our job at TOPLAP is not to decide what is a perfect audio, visual or a/v performance, but to define what constitutes a TOPLAP performance. Ie, a particular type of performance that we think is interesting, worthwhile and should happen more often. I don't think we should be trying to define the perfect performance in general terms. > 2a) if we're interested in the audience's experience of the coding > performance - how much of this is about typing vs. coding? i remember from > alex's impromptul 5-minute coding performance at transmediale last year, > the audience was very excited to watch alex's cursor fly around in emacs. > fortunately, alex is not only a whiz at coding, but also moves like a > dancer in emacs. so now that's 3 things - coding well live, typing skill, > and a particularly visual facility with a text editor. do we separate > those issues? 
for the audience? for ourselves? what about lowly vim users > like me? ;-) ... and, how to address these issues without degenerating > into emacs/vi wars? In my experience, typing is a good way of engaging the audience in certain contexts but not others... For example an audience which is sitting down drinking wine might find it engaging to watch an editor being expertly used. I don't think projecting the editor above a dancefloor encourages people to dance however! It's hard to read when you're jumping up and down, so a lot of people just stand and watch. That's a shame; when you're dancing the processes really become alive around your movements. You don't need to see anything. However, typing is central to a lot of programming activity so should be flagged up as a useful performative tool. > 3) is live coding inclusive or competitive? i.e., does one need to be the > fastest, most proficient coder to do perform live? if so, it might prove > programming prowess but exclude people with interesting ideas and a feel > for live performance, but who aren't as quick at the programming end of > things. or, can both approaches be accomodated, for example, through > various types of languages? I agree with Ade that it shouldn't be competitive in general, although I heard about 'competitive poetry' recently - if poetry performance can be competitive then why not programming? I feel that a live coding performance requires some deep understanding of the code that is being performed with. If the performer was the one who programmed the code then they will already have this deep understanding. So yes, we can be inclusive to a point, but we can only include programmers. > 4) from this question, as well as the issue of cubase sequences, it seems > to me that we've hit on the idea that the distinction between programmer > and user is really more of a continuum than a line in the sand. and after > all, programmers are users anyway, using C++, perl, various libraries, > etc. Agreed. I think we have to draw that line in the sand somewhere though, and accept that others will draw it in different places. > yeah, i'd also vote for keeping things as loose as possible, and then > providing some rationale for whatever admittedly-subjective line we decide > to draw (and that the line can be crossed for particular situations - > what if someone constructs a turing-complete language out of cubase > sequences? ;-) ) Indeed! > a> One idea we had was to have TOPLAP ratified programming languages and > a> programming methods, and a procedure that people would follow to get > a> their particular programming/performance environment ratified. > i'm not sure i follow the rationale for this? it sounds a little like it > could create a feeling of an exclusive club to me like it could discourage > people from working in new directions... but maybe i missed something? If the procedure to get things ratified were easy and open, then it's not an exclusive club. > sorry if any/all of this is stuff that's been discussed already! i know it > can be weird when new people join a group and then bring up the same > things that have been gone over before... hoping to catch up soon! :-) Like Frederick said, we only just started! 
alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Subject: Re: [livecode] foo
From: alex

> I would suggest it is not all about music (as probably music is not
> all about music either)
> My personal interest is to get away from the idea of production - so
> if music is taken as a product then I'd even say it is not about music.

But of course that's a big 'if', the word music is applied to many things. For example we can have paintings which contain musical forms (such as those by Paul Klee and M. K. Ciurlionis). For me that kind of music is what is interesting to explore. So live programming can be about this kind of music, but only as one option to explore.

I do think it's also interesting to think about music as a product, to sometimes try to make a 'track' that people can listen to over and over, and get to know as something that is unchanging but that still sounds different every time. But performance isn't about such a product and so perhaps that shouldn't concern us here.

Maybe we should call ourselves TOPLaP, and not include the 'audio' bit in the explanation. We could use 'art' or 'artistic' there, but these words can be problematic.

> I use live-coding mostly for film sound, so there is no 'performance' as
> such. But when playing for an audience or with an audience, I would
> try to show the code. I was a bit unhappy about our last performance
> at betalounge in Hamburg - they showed fingers typing as a maximum -
> why should this be so very interesting? But again this is my personal view.

Agreed. It's like when an expert guitarist is playing and they focus the camera on the right hand plucking away, when all the cool stuff is being done by the left hand. In this case the interest isn't in the fact that the fingers are typing away, but in what they are typing. You generally can't see what is being typed by watching the keyboard, you have to watch the screen. Fredrik's problems were particularly annoying: they kept projecting the MAX objects instead of their output.

> generally I would say that live coding doesn't even have to be specifically
> interesting or in the center of attention.

Agreed!

> For me it is another way to think
> about language and computers, and another way to interfere with sound - it
> is good if it is visible what's going on, but not for the sake of showing off,
> but for being able to understand it and to have a conversation about it
> and with it.

Yes! Beautifully put.

alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Subject: Re: [livecode] foo
From: alex

> was it you alex or julian who brought up apple's new garage band
> program as a good opposite pole http://www.apple.com/ilife/garageband/
> it sure is ugly and scares me. crap music will be flooding. ableton
> live is another good case study i think.

It might well have been me. I'm not sure if we need to be negative in directly criticising these applications. I'd feel uneasy about it for a start - "some of my best friends use ableton live." I don't think I could agree that loop-based live performances are necessarily bad. I quite like loops.

> defining live-coding will probably be difficult and might take time.
> but couldn't we try to break down the problem and see what we can agree
> upon at the moment?

Yes!

> dare i suggest we all can agree with alex above on a notion of playing
> with open cards (in practice...
> project our screens during performance
> or if that not possible at least not hide away behind the screen
> looking 'very serious'). trying to share and communicate by
> visualising, in whatever way, what's going on in the computer and our
> minds?

Perhaps we can agree on "exposing the movement of the process" to the audience, or allowing them to expose it for themselves (by dancing).

> i got another thought though...
> if you project any sequencer program it will be instantly obvious
> what's going on. on-screen knobs and faders bore you and after a few
> minutes you'll lose interest - maybe in the resulting music too. it's
> hard to be a virtuoso in ableton live.
> [...] isn't the thing about live coding the curiosity of how it really works?
> at least slub makes
> my brain spin trying to figure it out.

You probably don't manage to work it out though! Just have fun trying. We've played to riotous dancefloors in amsterdam, berlin, paris, barcelona and london without projecting our screens and people have really got into the music as sound.

> i just love the expectation of seeing a command executed and listening
> for its effect in the music.
> or which chunk of unreadable terminal output belongs to which sounds.
> i'm not proposing making things more difficult than needed but isn't it
> a trick of the trade to keep some secrets, arouse curiosity (and thereby
> dialogue) and not let the audience grasp the whole working process
> at once?

But I think there's a danger there of making a performance that only programmers can appreciate. That's a bit like making art for art critics, a bit self-defeating in my opinion.

> googling for 'live coding' brought this up
> http://www.eagle-software.com/snobol_review.htm a witness account of a delphi
> live coding happening.
> "If you thought that programming was always a personal, selfish
> pleasure, then you should have seen this: a British audience
> (correction: a British audience of programmers ) shouting out and
> competing with each other and actually cheering and bursting into
> spontaneous applause "

Hee!

alex
(used to type faster until the RSI came along)

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Sat, 28 Feb 2004 21:31:54 +0100
From: Julian Rohrhuber

a> I'm writing my own text editor in Perl, so I can interpret the code on
a> the fly.

cool! will you open source it? vj übergeek could have some fun developing that into other sorts of applications, i spect...

a>
a> http://www.finseth.com/craft/Chapter-9.html
a>

cool stuff! i got overexcited and saw:

o |\ The quick --- | \ red fox | | / jumps over / \ |/ the lazy user display buffer

and assumed it was some slub-ish type thing in which you could type ascii art people into the text editor and it would generate sentences based on the image, like some special kanji-esque interface. but no such luck. :-)

-@

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Subject: Re: [livecode] text editing
From: alex

> cool! will you open source it?
> vj übergeek could have some fun developing
> that into other sorts of applications, i spect...

It'll be GPL'd although it's not ready for publication just yet... Here's a sneak preview though:

http://slab.org/lc.tar.gz

alex

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Wed, 07 Apr 2004 09:14:28 +0100
From: Nick Collins

> I saw Ron Kuivila's Forth software crash and burn onstage in
> Amsterdam in 1985, but not before
> making some quite interesting music.
>
> The performance consisted of typing.
>
> Best regards,
>
> C
>
> Roads, C. 1986. "The second STEIM symposium on
> interactive composition in live electronic music."
> Computer Music Journal 10(2): 44-50.

"Yeah, in Forth. David Anderson and I had a complete preemptively scheduled multi-threaded language running on OS7 with Forthmacs (this is the forth that is now part of open boot used to configure mac's)

RJK

> it seems that nick collins has found out that
> you are one of the grandfathers of live coding.
> If I knew I would have asked you when we wrote the paper about live
> coding.
> did you program in forth?
> -- "

--
.

_______________________________________________
livecode mailing list
livecode@hiddenorg
http://toplap.org/cgi-bin/mailman/listinfo/livecode

Date: Thu, 6 May 2004 08:45:55 +0100 (BST)
Subject: paper
From: "alex"

> last Friday 7th at the Tank, NY, they finally had two projectors. So
> klipp av (Fredrik and I) could finish our little tour with the av cut
> stuff, plus a side projection of my SC3 live coding setup. No great shakes
> as live programming, much to improve, but I finally feel like I met some
> possible manifesto conditions...(projection, live coded (simple cut
> function) algorithmic manipulation, no backup)

Congratulations! I am yet to achieve this milestone, but am working towards it... Hopefully will get there for a gig in Pescara later this month.

By the way, who could be in London on the 17th July? That's the date set for the London Placard headphone festival. 20 minute performances over 14 hours, where the audience listens on headphones. It's going to be really nice, would be good to get some TOPLAP performers there!

alex

Subject: [livecode] ramble
From: alex wrote:
>
> Hi,
>
> Tonight I've been trying to write something, anything, for this read_me
> paper. Here's what I came up with, a kind of introduction. Some of the
> ideas in it might be useful anyway. I also fiddled with the wiki, and
> typed some possible section titles in. Anyway, it's gotten late, so
> here's the text without further ado:
>
>
> There are many similarities between software and music. For example,
> both exist as a set of instructions to be interpreted and executed to
> produce a temporal form. I play this music I've scored, I run this
> software I've hacked together; I breathe life into my work.
>
> The forms of software and music have been intertwined, not only by
> musicians exploring their ideas as software processes but also by
> programmers exploring their ideas as music processes.
Software > environments such as CSound, MAX/MSP and SuperCollider are new > playgrounds in which we may all explore the processes of music; > describing and manipulating processes that make the music, rather than > the music itself. > > Terms such as "generative music" and "process music" have been > invented and appropriated to describe this new approach to > composition. But the focus of this text is not to explore this > approach, but to explore how such music should be performed. > > ... > > Goodnight! > > alex > From: Adrian Ward There's something funny with this mailing list manager. Ta. A long time ago (or, about 3 months to the day) Nick asked me if I could come up with a logo for toplap. I've got a tentative design up on the wiki so please do have a look and let me know what you all think. I hope you like it's simplicity. http://toplap.org/wiki?TOPLAPLogo Cheers, -- Adrian Ward, Signwave UK http://www.signwave.co.uk/ - +44(0)7711 623498 2nd Floor North, Rutland House, 42-46 New Road, London, E1 2AX, UK. Subject: Re: [livecode] ramble From: alex wrote: > > On 14 May 2004, at 10:20 pm, alex wrote: > >> There's something funny with this mailing list manager. > > Ta. > > A long time ago (or, about 3 months to the day) Nick asked me if I could > come up with a logo for toplap. I've got a tentative design up on the > wiki so please do have a look and let me know what you all think. I hope > you like it's simplicity. > > http://toplap.org/wiki?TOPLAPLogo > > Cheers, > > > -- > Adrian Ward, Signwave UK > http://www.signwave.co.uk/ - +44(0)7711 623498 > 2nd Floor North, Rutland House, 42-46 New Road, London, E1 2AX, UK. > > Subject: [livecode] runme and things From: alex Thought of a few comments that might help develop this... and thinking about > what we talked about the other evening (from memory). > > > There are many comparisions to be made between software and music. > > For example, both exist as a set of instructions to be interpreted and > > executed to produce a temporal form. I play this music I've scored, I > > run this software I've hacked together; I breathe life into my work. > > Maybe the relationship between the score and the music could be expanded. > Adorno thought the score more important than the music played (in 'On the > Fetish Character in Music and the Regression of Listening'). How does the > relationship between the code and the execution of that code play out? You > could describe this quite closely in relation to the live element. (although > clearly this applies to the conventional score and not the more experimental > type - such as in Cage's work). This seems to relate to composition too. > Some aspects are predetermined and some are improvised or at least > interpreted. There is a slippage between the score and the individual > interpretation of that score. There's a reductive tendency to see the use of > computers as deterministic - but the live component tries to undermine this > view. Live programming tries to reintroduce these interpretative, > improvisational and unpredictable elements. > > > Indeed, some musicians explore their ideas as software processes, > > often to the point that a software becomes the essence of the music. > > At this point, the musicians may also be thought of as programmers > > exploring their code manifested as sound. This does not reduce their > > primary role as a musician, but complements it, with unique > > perspective on the composition of their music. > > Maybe the craft aspect is important in this. 
Not all musicians take the > extreme view that they should make the instruments they play but they > certainly build a close and intimate relationship to them. There is also the > link of the tool to the user and the task performed. Florian Cramer explains > in the case of the typewriter this is important as it breaks down the false > distinction between the writing and the tool with which the writing is > produced, and in terms of the computer between code and data. I think there > is something in this logic that relates to the production of music, in that > the relationship of the code and the data or the score and the music are > brought closer together - perhaps more than with conventional music where > the distinction is emphasised (as with the Adorno comment). > > Does that follow? > > > Terms such as "generative music" and "processor music" have been > > invented and appropriated to describe this new perspective on > > composition. Much is made of the alleged properties of so called > > "generative music" that separate the composer from the resulting work. > > Brian Eno likens making generative music to sowing seeds that are left > > to grow. Much is made of how we may give up control to our processes, > > leaving them to "play in the wind." This is only one approach to > > combining software with music, one that this paper wishes to counter > > quite strongly. We advocate the humanisation of generative music, > > where code isn't left alone to meander through its self-describing > > soundscape, but is hacked together, chopped up and moulded to > > make live music. > > But then, paraphrasing the famous Eno quote on generative music is that > people will not listen to music the same way twice. Again by keeping laptop > music live this is either emphasised or the statement is made too obvious to > be useful. > > geoff Subject: [livecode] another ramble From: alex Here's another unedited ramble. With this I'm trying to describe my a> live code editor, which has just been given the name of feedback.pl. i've been thinking off and on about what it might be like to do something like this with visuals, and somehow the above two lines from alex provided the magical inspiration. so last night i threw together a rough quick funny thingee that's a livecode editor and vj instrument - both at the same time. you type the code on the screen, and it actually executes the code as you type it, but all the visuals are made of the onscreen code too (sort of like my cyberspaceland project, but with code, and much simpler). though i'm experimenting with including the ability to pull in the whole desktop sometimes as well. and like cyberspaceland, it generously abuses the concept of feedback (but in this case, the whole thing's a feedback loop anyway... this is somehow reassuring because the concept of video feedback in digital vj tools always disturbs me a little - since computers have no optics they can't "see" their own output like in the original video feedback with cameras. i like awake nights worrying about that.) of course the devil is in the details and i should really find some time make The Thingee a bit less rough than it is, and actually get good enough at performing it to make it entertaining to watch and not just a conceptual one-liner... time is not one of my assets these days, but hopefully i'll have enough time in the coming weeks/months to get this going at least on a basic level... 
if so, would be cool if there's some time within the livecode-related activities at the r&d camp - if you think it fits in, i would be honored to do some quickee performance of it... in keeping with my new "foot-operated software" crusade, i may also add an option where you can do the live coding with your feet on a DDR pad. (this would have the added benefit of bringing the world one step closer to the elusive goal of dancing about architecture.) i think i've actually figured out a warped hack for doing that, but i'd really have to be much less tired than i am now before i could attempt such a feet . -@ (you may stop groaning now.) Subject: Re: vislivecode slightly ot rambles (was Re: [livecode] another From: alex i've been thinking off and on about what it might be like to do something > like this with visuals, and somehow the above two lines from alex provided > the magical inspiration. so last night i threw together a rough quick > funny thingee that's a livecode editor and vj instrument - both at the > same time. you type the code on the screen, and it actually executes the > code as you type it, but all the visuals are made of the onscreen code too > (sort of like my cyberspaceland project, but with code, and much simpler). Amy, you just brightened up my day! Can't wait to see this. Definitely must be part of the runme/dorkbot citycamp. > though i'm experimenting with including the ability to pull in the whole > desktop sometimes as well. and like cyberspaceland, it generously abuses > the concept of feedback (but in this case, the whole thing's a feedback > loop anyway... Live coding does open up some interesting experiments in feedback loops. Which is good, because it seems to me that the whole point of performance are feedback loops of one kind to another... Between the performer and the audience, between the performer and the environment (accoustics, etc), between the performer and her creation and so on... alex From: "Dave Griffiths" On Wed, 19 May 2004, alex wrote: > > a> Here's another unedited ramble. With this I'm trying to describe > my a> live code editor, which has just been given the name of > feedback.pl. > > i've been thinking off and on about what it might be like to do something > like this with visuals, and somehow the above two lines from alex provided > the magical inspiration. so last night i threw together a rough quick > funny thingee that's a livecode editor and vj instrument - both at > the same time. you type the code on the screen, and it actually > executes the code as you type it I'd love to see this - I have a project here: http://www.pawfal.org/Software/fluxus/ which runs scheme scripts to read audio input and make realtime animations. I guess this is a bit different to your idea in that the movement is coming from audio, but the "scenes" you build are script driven. The actual script interface is in a seperate window to the renderer currently. At the moment it sort of resembles the maya mel script editor in that you write code, and highlight it to execute it in fragments. That way you can build up a palette of function calls and bits of code and execute them with the mouse and a keypress. The whole time you're doing this, the renderer is doing it's thing, so it's usable live. There's a rather old screenshot here: http://www.pawfal.org/Software/fluxus/images/using1.jpg > but all the visuals are made of the onscreen code too One day I'd like to make the code editor a quake style console thingy, so the code is visible to all, on top of the 3D. 
Figuring out the interface for this is a bit more tricky, but I guess it could work in a similar fashion.

> in keeping with my new "foot-operated software" crusade, i may also
> add an option where you can do the live coding with your feet on a
> DDR pad.

This sounds interesting - feet are an oft neglected input... :)

cheers,
dave
.................................
www.pawfal.org/nebogeo

Date: Thu, 20 May 2004 13:14:49 +0200
From: Julian Rohrhuber

>> in keeping with my new "foot-operated software" crusade, i may also
>> add an option where you can do the live coding with your feet on a
>> DDR pad.
>
> This sounds interesting - feet are an oft neglected input... :)

and it demonstrates again how dense the relationship is between programming and weaving. Maybe it won't take a long time before we have programmers' riots.

--
.

From: Adrian Ward

> and it demonstrates again how dense the relationship is between
> programming and weaving. Maybe it won't take a long time before we have
> programmers' riots.

Surely it'd be the Luddites (non-programmers) rioting?

That would be quite a gig.

From: "Dave Griffiths"

> >> in keeping with my new "foot-operated software" crusade, i may also
> >> add an option where you can do the live coding with your feet on a
> >> DDR pad.
> >
> > This sounds interesting - feet are an oft neglected input... :)
>
> and it demonstrates again how dense the relationship is between
> programming and weaving.

weaving as performance art? my mum used to have this massive floor loom in our house. I think she was the first generative art programmer I met :)

dave
.................................
www.pawfal.org/nebogeo

Date: Thu, 20 May 2004 14:14:23 +0200
From: Julian Rohrhuber

On Thu, 20 May 2004 13:14:49 +0200, Julian Rohrhuber wrote
>> >> in keeping with my new "foot-operated software" crusade, i may also
>> >> add an option where you can do the live coding with your feet on a
>> >> DDR pad.
>> >
>> > This sounds interesting - feet are an oft neglected input... :)
>>
>> and it demonstrates again how dense the relationship is between
>> programming and weaving.
>
> weaving as performance art?

maybe there should be attempts to make performance weaving more interesting by somehow freeing the performer from having to start at one end and moving line by line. Maybe we should consider sewing - it also uses foot control.

> my mum used to have this massive floor loom in our house. I think she was the
> first generative art programmer I met :)

yes I agree that generative art programming performance is primarily a domestic activity.

--
.

Date: Thu, 20 May 2004 14:38:28 +0200
From: Julian Rohrhuber

On 20 May 2004, at 12:14 pm, Julian Rohrhuber wrote:
>
>> and it demonstrates again how dense the relationship is between
>> programming and weaving. Maybe it won't take a long time before we
>> have programmers' riots.
>
> Surely it'd be the Luddites (non-programmers) rioting?
>
> That would be quite a gig.

if programming becomes very easy then the salaries of a programmer will fall - then we'll have programmers' riots, like that one:

Subject: [livecode] paper authors
From: alex

a> Amy, you just brightened up my day! Can't wait to see this. Definitely
a> must be part of the runme/dorkbot citycamp.
a>

cool! i hope to get its visuals more varied and a bit more controllable by then, so it can be at least mildly entertaining for a few minutes. right now it has a frequent tendency to look a bit too much like a bad knockoff of a mark napier piece.
but hopefully won't be too much work to give it a bit more variety. and i can't wait to see feedback.pl in action! a> Live coding does open up some interesting experiments in feedback a> loops. Which is good, because it seems to me that the whole point of a> performance is feedback loops of one kind or another... Between the a> performer and the audience, between the performer and the environment a> (acoustics, etc), between the performer and her creation and so on... a> hmm, that really pushes/questions the definition of a feedback loop doesn't it! everything could be a feedback loop - feedback between a person and the world... kind of existential... but this brings me to a couple of philosophical/practical live coding questions i ran into while working on the Thingee. Thingee is written in lingo, so at first i started just typing lingo commands on the screen for what i wanted to do. then i noticed, the commands were in some cases uncomfortably long; i had a tendency to mistype, forget syntax, etc. so i did what i would do in a "normal" application of course - write my own routines. then i can type shorter commands. ok, so now, is that still "live coding"? because by this line of thinking, i could just assign a keypress rather than typing the name of my routine. then it wouldn't seem like live coding at all. so this reminds me of the fact that using and programming are really a continuum when you get down to it... every language is a piece of software running on top of machine language; every programmer is a user of the language in which they're programming... so where do we draw the line? number of keystrokes seems not to be a productive way to think about it. i looked through the wiki and the paper draft, and sort of found this all being questioned but not quite answered, which is probably a Good Thing, because in my gut i think there's not an answer. the gist seems to be writing algorithms in live performance: but then again, we're almost always starting from algorithms that exist. for example, if i want an object to start continuously rotating, i could write in lingo sprite(3).rotation = sprite(3).rotation + 10 but typing "rot" or "rot 3" or "rot 3 10" would be preferable. can i predefine a "rot" routine or do i have to actually write it while i'm on stage? requiring it to be written on stage seems to me to privilege the existing language (lingo) too much as the basis from which everything must spring. and "rotation" is a pre-existing algorithm anyway. this example may be too simple to illustrate the dilemma effectively, but imagine it was something more complex, involving screengrabs, a for loop, and some other transformations perhaps... in my gut i feel like this is like the "what is interactivity" question from a few years ago. and the answer is, "admit there isn't any, but don't let that stop you from doing it." with interactivity there was a myth that the artist was giving away control of the piece to the user - but in fact, the artist was still in control. with live-coding, there is a myth that the performer is in control of the software/music/images - but in fact, he's still being controlled. and yet, once we see through the myth, there's nothing inherently bad about interactivity per se, and now that all the controversy's died down, it's often used quite well and without much fanfare in projects even by folks who decried its existence - just for what it is - giving the user some physical participation in a project without claiming to give him "control."
the same might be true with live coding - perhaps it's important to recognize that live coding just moves the program along the scale of participation between user and CPU, but there's many levels of control in between. (and that even at the "user" level, it's possible to take an unexpected degree of control, just as at the programmer level it's more than possible to lose it.) what do you think? i also have a more directly practical question, but i think i will put that in reply to nick's post... -@ Date: Thu, 20 May 2004 14:40:36 -0700 (PDT) From: Amy Alexander DG> I'd love to see this - I have a project here: DG> http://www.pawfal.org/Software/fluxus/ which runs scheme scripts to read audio DG> input and make realtime animations. DG> cool! i will check it out once i'm in front of a linux console! DG> The actual script interface is in a seperate window to the renderer currently. DG> At the moment it sort of resembles the maya mel script editor in that you DG> write code, and highlight it to execute it in fragments. That way you can ok, so here's my 2-part question, for you and alex and everyone: 1) how do you go about telling the interpreter "execute now" when live coding? dave, you're using highlighting? does it execute as soon as you highlight? or you have to hit RETURN or something? i'm for now using the semicolon (which lingo does not use) to indicate to lingo "execute this now!" but i'm wondering if there's a better way than using a character or even RETURN within the "editor" - since any of these are subject to typos, and then i have to delete it to edit the line to make it do something else, and sometimes i forget and it tries to execute an incomplete line of code. i realize this will vary with the livecoding methodology being used, but just curious how others are doing this. 2) validation - i make typos when livecoding, and then the illegal commands generate a nasty crash or at least an error in performance! i may just integrate those into the performance, but, curious how others handling this problem? are you having some code validation go on before execution, or... ??? thx! -@ Date: Thu, 20 May 2004 14:47:35 -0700 (PDT) From: Amy Alexander JR> and it demonstrates again how dense the relationship is between JR> programming and weaving. Maybe it won't take a long time and we have JR> programmers riots. JR> Subject: Re: vislivecode slightly ot rambles (was Re: [livecode] another From: alex 1) how do you go about telling the interpreter "execute now" when live > coding? dave, you're using highlighting? does it execute as soon as you > highlight? or you have to hit RETURN or something? I have Perl recompile the whole file I'm editing all the time. State is preserved - it's just interpreting it as an extra bit of code, except old versions of sub-routines get replaced with real ones. I have two modes... One is to have the interpreter run over the code every time I type any key, or every time the running program edits its own code. This mode is of course completely impractical as you have to think of the effect of every single keypress. It also uses a lot of CPU, so I can't keep many running at a time without getting audio drop-outs... This is because my programs tend to edit their code a lot, putting in comments to let me know what's going on. The other mode triggers a recompile every time I hit ctrl-x. This seems to work for me. > 2) validation - i make typos when livecoding, and then the illegal > commands generate a > nasty crash or at least an error in performance! 
i may just integrate > those into the performance, but, curious how others handling this problem? > are you having some code validation go on before execution, or... ??? My code first tries to compile it into a scratch area first, so that if the program isn't syntactically correct it leaves the old version running. alex Subject: Re: [livecode] ramble From: alex Maybe the relationship between the score and the music could be expanded. > Adorno thought the score more important than the music played (in 'On the > Fetish Character in Music and the Regression of Listening'). How does the > relationship between the code and the execution of that code play out? I don't know Adorno's argument, but... On one hand, sourcecode exists as a set of instructions, and running code exists as the sourcecode being processed. When the sourcecode is being processed within a stable operating system, the results can be pre-determined. Every time you run the code, you get the same results. When live coding, when you describe an instruction, it is carried out immediately. In linguistic terms, this suggests the language of the live programmer is performative. In general life, if you say that the cat is blue, you don't make it so. When live coding, by declaring that $cat = "blue";, then $cat == "blue"; immediately becomes true. But then, perhaps this is an illusion, because I have power over a certain scope. I might tell myself that cats are blue and believe it. I might tell my computer the same thing, it has no reason to doubt me, I am its true master. But if I try to tell someone elses computer that cats are blue, I would fail - I probably don't have enough authority over that computer to make it so. But then (again), any performative use of language is a matter of power within a scope. A king can say "my shoes must be here," but that will only become true if a loyal subject is within earshot. A jury may say "You are guilty" in a court of law, but they lose that power once they leave. So, to read code is to execute it. Saying one is more important than the other is to draw a false separation. And unread code is less useful than code that has been read. > You > could describe this quite closely in relation to the live element. (although > clearly this applies to the conventional score and not the more experimental > type - such as in Cage's work). I'm not sure if there is such a difference as far as we're concerned. Experimental scores of this era seem very similar to computer programs. > This seems to relate to composition too. > Some aspects are predetermined and some are improvised or at least > interpreted. There is a slippage between the score and the individual > interpretation of that score. There's a reductive tendency to see the use of > computers as deterministic - but the live component tries to undermine this > view. Live programming tries to reintroduce these interpretative, > improvisational and unpredictable elements. ... to performance, yes. These three elements that you mention seem essential to music performance, and live programming allows them to be brought to the fore. > Maybe the craft aspect is important in this. Not all musicians take the > extreme view that they should make the instruments they play but they > certainly build a close and intimate relationship to them. There is also the > link of the tool to the user and the task performed. 
Florian Cramer explains > in the case of the typewriter this is important as it breaks down the false > distinction between the writing and the tool with which the writing is > produced, and in terms of the computer between code and data. I think there > is something in this logic that relates to the production of music, in that > the relationship of the code and the data or the score and the music are > brought closer together - perhaps more than with conventional music where > the distinction is emphasised (as with the Adorno comment). > > Does that follow? Yes, I think so. > But then, paraphrasing the famous Eno quote on generative music is that > people will not listen to music the same way twice. Again by keeping laptop > music live this is either emphasised or the statement is made too obvious to > be useful. I think the latter. To say that the same piece of music is ever heard twice is to belittle the role of the listener who listens differently with each play. alex Subject: RE: [livecode] ramble Date: Fri, 21 May 2004 07:38:47 +0100 From: "Geoff Cox" wrote: > > Who is thinking of adding something to the read_me paper? > > Olga from read_me is asking for a list of authors urgently. Ade + > Julian - I guess you'll be adding something at some point? Anyone else? > > alex > > Date: Fri, 21 May 2004 10:41:27 +0100 From: Nick Collins but typing "rot" or "rot 3" or "rot 3 10" would be preferable. can i > predefine a "rot" routine or do i have to actually write it while i'm on > stage? requiring it > to be written on stage seems to me to privilege the existing language > (lingo) too much as the basis from which everything must spring. and > "rotation" is a pre-existing algorithm anyway. There is freedom for the live coder to determine the language of their discourse. The continuum from low to high level abstraction is part of that. Appreciating algorithms is an attempt for an educated audience to share the feeling of code running in their minds (to paraphrase Alex) as they also hear/see results. Connoisseurial reaction may vary based on whether you spontaneously create some algorithm variant in fine detail or call MySplendidShortcut(3.14158) As for the implicit controls of language assumptions, and the wrangling of control, Julian has spoken poetically about this, and I think he should be credited with anticipating these issues by calling an event 'changing grammars'. oops, hope that didn't sound too dictatorial; I admit all your points Amy, and these continuums of leeway are the aesthetic territory of future professional live coding critics... Date: Fri, 21 May 2004 10:54:34 +0100 From: Nick Collins NC> There is freedom for the live coder to determine the language of their NC> discourse. The continuum from low to high level abstraction is part of that. NC> Appreciating algorithms is an attempt for an educated audience to share the NC> feeling of code running in their minds (to paraphrase Alex) as they also NC> hear/see results. Connoisseurial reaction may vary based on whether you NC> spontaneously create some algorithm variant in fine detail or call NC> NC> MySplendidShortcut(3.14158) NC> NC> As for the implicit controls of language assumptions, and the wrangling of NC> control, Julian has spoken poetically about this, and I think he should be NC> credited with anticipating these issues by calling an event 'changing NC> grammars'.
NC> NC> NC> oops, hope that didn't sound too dictatorial; I admit all your points Amy, NC> and these continuums of leeway are the aesthetic territory of future NC> professional live coding critics... NC> NC> NC> NC> NC> From: "Dave Griffiths" ok, so here's my 2-part question, for you and alex and everyone: > > 1) how do you go about telling the interpreter "execute now" when > live coding? dave, you're using highlighting? does it execute as > soon as you highlight? or you have to hit RETURN or something? Yep, you have to hit F5 - in maya it's the enter key on the keypad, but on a laptop (which I have) that's more tricky :) If no code is selected, hitting F5 will execute the whole script BTW. Also in fluxus you can set a bit of code that's called each frame - if you make this a function call, you can modify the function, highlight and execute the scheme function definition again, and it will update live. I also made a simple gui (sorry ;)) that just consists of buttons that execute code snippets (again, usually function calls, but it can be whatever you want) Again, you can build this simple gui live, or load it from prebuilt scripts. > i'm for now using the semicolon (which lingo does not use) to > indicate to lingo "execute this now!" but i'm wondering if there's a > better way than using a character or even RETURN within the "editor" > - since any of these are subject to typos, and then i have to delete > it to edit the line to make it do something else, and sometimes i > forget and it tries to execute an incomplete line of code. i realize > this will vary with the livecoding methodology being used, but just > curious how others are doing this. As long as you don't stop running the visuals, script errors should not be a problem. > 2) validation - i make typos when livecoding, and then the illegal > commands generate a > nasty crash or at least an error in performance! i may just > integrate those into the performance, but, curious how others > handling this problem? are you having some code validation go on > before execution, or... ??? No crashes. Well, that's the idea anyway - I keep everything really simple, I have a few issues with the physics library (which is superb in other respects: http://www.ode.org/) but I've got them mostly under control now. For script errors the lower area of the window spews them out. with scheme it's mostly mismatching (((()))) so I added parentheses highlighting in the editor which makes it miles easier. cheers, dave ................................. www.pawfal.org/nebogeo From: "Dave Griffiths" On Thu, 2004-05-20 at 22:40, Amy Alexander wrote: > > 1) how do you go about telling the interpreter "execute now" when live > > coding? dave, you're using highlighting? does it execute as soon as you > > highlight? or you have to hit RETURN or something? > > I have Perl recompile the whole file I'm editing all the time. > State is preserved - it's just interpreting it as an extra bit of > code, except old versions of sub-routines get replaced with real ones. > > I have two modes... One is to have the interpreter run over the code > every time I type any key, or every time the running program edits > its own code. This mode is of course completely impractical as you > have to think of the effect of every single keypress. It also uses > a lot of CPU, so I can't keep many running at a time without getting > audio drop-outs... This is because my programs tend to edit their > code a lot, putting in comments to let me know what's going on. 
whooo, your programs talk to you via their own comments? that makes my head hurt :] > The other mode triggers a recompile every time I hit ctrl-x. This seems > to work for me. > > > 2) validation - i make typos when livecoding, and then the illegal > > commands generate a > > nasty crash or at least an error in performance! i may just integrate > > those into the performance, but, curious how others handling this problem? > > are you having some code validation go on before execution, or... ??? > > My code first tries to compile it into a scratch area first, so that > if the program isn't syntactically correct it leaves the old version > running. I worried a lot about doing something like this, but in practice it hasn't been much of a problem; if there is a syntax error in a script the interpreter aborts the code there and it's quite easy to sort out. The only thing that seems to go wrong is if the code generates a mismatching (push) or (pop) trashing the renderer's state stack - but this often produces visually interesting ;) results anyway... cheers, dave ................................. www.pawfal.org/nebogeo From: "Dave Griffiths" Dave, you didn't design this for performance and you > haven't used this live, you're just claiming you could and would > like to soon? : ) it is very much designed for live use (and live coding) - and I would love to do so at some point. dave ................................. www.pawfal.org/nebogeo Date: Fri, 21 May 2004 15:22:30 +0100 From: Nick Collins My code first tries to compile it into a scratch area first, so that > if the program isn't syntactically correct it leaves the old version > running. DJ auditioning for live coders... If I ever get a soundcard with spare outputs I'll set up preview mode for new live coded audio objects before they get switched into the main output. At the moment I can control if they turn up with fader down, pause on. But Alex already seems to be heading into the territory of new spawned processes run in a controlled space before commitment...I look forward increasingly to seeing this beast in action... From: "Dave Griffiths" On Thu, 2004-05-20 at 22:40, Amy Alexander wrote: > > 1) how do you go about telling the interpreter "execute now" when live > > coding? dave, you're using highlighting? does it execute as soon as you > > highlight? or you have to hit RETURN or something? > > I have Perl recompile the whole file I'm editing all the time. State is > preserved - it's just interpreting it as an extra bit of code, except > old versions of sub-routines get replaced with real ones. > > I have two modes... One is to have the interpreter run over the code > every time I type any key, or every time the running program edits its > own code. This mode is of course completely impractical as you have to > think of the effect of every single keypress. It also uses a lot of > CPU, so I can't keep many running at a time without getting audio > drop-outs... This is because my programs tend to edit their code a lot, > putting in comments to let me know what's going on. *snip* This sounds like the best call for me to make some noise. :) I'm spending a lot of time at the moment trying to work out a nice way of handling reparsing after every keystroke. One hope of this is that a chain could be established to avoid having to make a key press to refresh. The aim would be to reduce the effect of each key press because: XXXX only disturbs the lexer when a character changes. The lexer only disturbs the parser when a token is changed.
The parser only disturbs YYY when the parse tree is error free. In this way errors don't get propagated along the compiler pipeline and hopefully edits which don't change tokens will get done locally by the lexer and nothing else will be disturbed. I have not got any good test data (any suggestions for per-character edit logs of source files?) but the current code would seem to be reasonably fast, of course parsing seems to me like the easy part of incremental compilation. Cheers, Rob. Date: Fri, 21 May 2004 20:14:05 +0200 From: Julian Rohrhuber On Tue, 2004-05-18 at 22:38, geoffcox wrote: >> Maybe the relationship between the score and the music could be expanded. >> Adorno thought the score more important than the music played (in 'On the >> Fetish Character in Music and the Regression of Listening'). How does the >> relationship between the code and the execution of that code play out? the idea of a fetish is very much connected to the idea of having total control over an isolated entity. I wonder if the separation of score and music played has in itself a certain fetishism. Most composers hear the music when they write it and many musicians see the score when they improvise. A certain fetishism is perhaps in the eurocentric concepts behind the ideas connected to tonality and in this respect the notation system has its role for sure. Also regarding the role of musician and composer (maybe not dissimilar to those of theoretician and practitioner) I would consider this. >On one hand, sourcecode exists as a set of instructions, and running >code exists as the sourcecode being processed. When the sourcecode is >being processed within a stable operating system, the results can be >pre-determined. Every time you run the code, you get the same results. > >When live coding, when you describe an instruction, it is carried out >immediately. In linguistic terms, this suggests the language of the >live programmer is performative. In general life, if you say that the >cat is blue, you don't make it so. When live coding, by declaring that >$cat = "blue";, then $cat == "blue"; immediately becomes true. this reminds me of the discussion that was going on in parisian surrealist circles about film. For Luis Buñuel, for example, the art of film was the perfect way of showing the dream logic of the real world. So what you say about assignment could lead to a view towards programming that shows this activity as an artform connected to the unconscious. I suggest cadavre exquis circles in code, as well as écriture automatique des programmes.. >But then, perhaps this is an illusion, because I have power over a >certain scope. I might tell myself that cats are blue and believe it. >I might tell my computer the same thing, it has no reason to doubt me, I >am its true master. But if I try to tell someone elses computer that >cats are blue, I would fail - I probably don't have enough authority >over that computer to make it so. > >But then (again), any performative use of language is a matter of power >within a scope. A king can say "my shoes must be here," but that will >only become true if a loyal subject is within earshot. A jury may say >"You are guilty" in a court of law, but they lose that power once they >leave. yes the idea of a certain environment that represents a certain causality which is somehow arbitrary but consistent in itself is very much related to programming. I always wonder how to make this causality appear and be perceivable in a good way. >So, to read code is to execute it.
Saying one is more important than >the other is to draw a false separation. And unread code is less useful >than code that has been read. > >> You >> could describe this quite closely in relation to the live element. (although >> clearly this applies to the conventional score and not the more experimental >> type - such as in Cage's work). > >I'm not sure if there is such a difference as far as we're concerned. >Experimental scores of this era seem very similar to computer programs. >> This seems to relate to composition too. >> Some aspects are predetermined and some are improvised or at least >> interpreted. There is a slippage between the score and the individual >> interpretation of that score. There's a reductive tendency to see the use of >> computers as deterministic - but the live component tries to undermine this >> view. Live programming tries to reintroduce these interpretative, >> improvisational and unpredictable elements. There is an interesting aspect in this: people think live activity is not deterministic. So the causes and effects of what happens in music are very important to how the music is perceived. Generally a very common question confronting an artwork has become less 'what should it express?' than 'is this happening by chance or on purpose?'. But this is bound to lead to a much broader discussion. >... to performance, yes. These three elements that you mention seem >essential to music performance, and live programming allows them to be >brought to the fore. > >> Maybe the craft aspect is important in this. Not all musicians take the >> extreme view that they should make the instruments they play but they >> certainly build a close and intimate relationship to them. There is also the >> link of the tool to the user and the task performed. Florian Cramer explains >> in the case of the typewriter this is important as it breaks down the false >> distinction between the writing and the tool with which the writing is >> produced, and in terms of the computer between code and data. I think there >> is something in this logic that relates to the production of music, in that >> the relationship of the code and the data or the score and the music are >> brought closer together - perhaps more than with conventional music where >> the distinction is emphasised (as with the Adorno comment). >> >> Does that follow? > >Yes, I think so. well somehow I'd like to go one step further - what's left of data when there is a program? Going into the details of how data is treated in computer language would be revealing probably. >> But then, paraphrasing the famous Eno quote on generative music is that >> people will not listen to music the same way twice. Again by keeping laptop >> music live this is either emphasised or the statement is made too obvious to >> be useful. > >I think the latter. To say that the same piece of music is ever heard >twice is to belittle the role of the listener who listens differently >with each play. -- . Date: Fri, 21 May 2004 20:20:34 +0200 From: Julian Rohrhuber Hi all, > >A coupla questions... you are touching sensitive points.. :) >Where do you stand on the use of genetic programming for performance? Is this >under the same category as live coding - the difference being that you grow >the code rather than write it? I would say that this is just simply a special type of algorithm. So it is one extreme of live coding - doing nothing and just letting the algorithm develop. As long as it is all readable...
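[Editor's note: a crude sketch, in Perl, of the "grow the code rather than write it" extreme discussed here. A seed expression is mutated at random, auditioned, and kept or thrown away by the performer; real genetic programming keeps whole populations and fitness measures, and every name below is invented for illustration.]

  #!/usr/bin/perl
  # hypothetical sketch of growing an expression instead of typing it
  use strict;
  use warnings;

  my $expr = 'cos($t * 0.5)';   # seed expression in $t (time)

  sub mutate {
      my ($e) = @_;
      my @ops = (
          sub { "cos($_[0])" },
          sub { '(' . $_[0] . ' + ' . sprintf('%.3f', rand()) . ')' },
          sub { '(' . $_[0] . ' * ' . sprintf('%.3f', rand()) . ')' },
      );
      return $ops[ int rand @ops ]->($e);
  }

  for my $generation (1 .. 5) {
      my $candidate = mutate($expr);
      # audition the candidate over a few time steps
      my @sample = map { my $t = $_ / 10; eval $candidate } 0 .. 9;
      print "$generation: $candidate -> @sample\n";
      $expr = $candidate;   # in performance, keep it only if it sounds or looks good
  }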
>Also, what is the definition of code? Is it text only, what about graph based >languages? (max/msp, pd, modular synths - or even UML?) They can often be >appreciated more by a non programmer audience, and look pretty to boot. this is a difficult question. There can be no simple answer to it and I think it shows that language is a riddle. -- . Date: Fri, 21 May 2004 16:56:12 -0700 (PDT) From: Amy Alexander I would encourage Amy to get involved- if you don't have time, I can try to NC> work in your comments from this list without corrupting them and you could NC> double check. It is great to have a visual live coding perspective, and NC> you're challenging our assumptions; I particularly enjoyed the 20/5/04 NC> 2.26pm posting. NC> NC> I will make a draft attempt at describing some live coding projects soon, NC> hoping to take in ChucK, the Hamburg symposium, Alex's new system, my draft NC> setup and I hope Amy will write about hers too (you already did on this NC> list so that can be quickly adapted, plus add a screenshot...). We can also NC> attempt to link in further projects- fluxus etc. Dave, you didn't design NC> this for performance and you haven't used this live, you're just claiming NC> you could and would like to soon? : ) NC> NC> Oh yes, and a manifesto draft- even if people don't have time to be NC> contributing authors, they might like to be signatories... NC> NC> NC> --On Thursday, May 20, 2004 8:30 pm +0100 alex wrote: NC> NC> > NC> > Who is thinking of adding something to the read_me paper? NC> > NC> > Olga from read_me is asking for a list of authors urgently. Ade + NC> > Julian - I guess you'll be adding something at some point? Anyone else? NC> > NC> > alex NC> > NC> > NC> NC> NC> NC> NC> Subject: Re: [livecode] questions... From: Dave Griffiths >Hi all, > > > >A coupla questions... > > you are touching sensitive points.. :) > > >Where do you stand on the use of genetic programming for performance? Is this > >under the same category as live coding - the difference being that you grow > >the code rather than write it? > > I would say that this is just simply a special type of algorithm. So > it is one extreme of live coding - doing nothing and just letting the > algorithm develop. Ah, I think this is the distinction, it can still be interactive - you can choose code that produces good results manually, while you perform, "navigate the geneome space" if you want to be academic about it... > As long as it is all redeable... Genetic programs should be shown more, they tend to get hidden away by the software. This makes quite a good sound: (cos(+(cos(+(cos(+(*(- 0.929524(*(- 0.746627 0.741314) 0.972336))(+(*(cos(*(+ 0.246092 0.244990) 0.263769))(+ 0.938006 0.140775)) 0.685194))(+(cos(* 0.648392(+ 0.582452 0.313122)))(cos(*(+ 0.410632 0.126564) 0.194440)))))(cos(-(-(cos time)(-(+(+ 0.773449 0.903293) 0.364460)(-(* 0.544917 0.167748) 0.263252)))(cos(* 0.046184 time))))))(*(*(cos(cos(-(+(*(- 0.002302(* 0.361310 0.805768)) 0.276971)(- 0.945598 0.616702))(-(* 0.730980 0.238658)(+ 0.423594(+ 0.544588 0.245835))))))(+(+ 0.783156(- 0.526238 0.581044))(cos(cos(*(cos 0.665828) 0.177759)))))(-(-(- 0.088104(cos 0.752380))(+(*(+ 0.564310 0.202989) 0.718946)(cos 0.545804)))(*(cos(*(* 0.129965(* 0.877471 0.799488)) time))(+(-(*(- 0.168888 0.531057) 0.691737)(* 0.790665 0.200985)) 0.965454)))))) > >Also, what is the definition of code? Is it text only, what about graph based > >languages? (max/msp, pd, modular synths - or even UML?) 
They can often be > >appreciated more by a non programmer audience, and look pretty to boot. > > this is a difficult question. There can be no simple answer to it and > I think it shows that language is a riddle. Good answer :) I agree. cheers, dave -- ................................. www.pawfal.org/nebogeo Subject: Re: [livecode] paper authors From: alex oof, i've lost track of the thread already! but sure, i'm happy to > contribute if it can be useful.. your comment-weaving proposal sounds fine > to me if you don't mind, as i seem to be getting a wee bit behind on the > treadmill of things i've committed to get done. (in fact, the whim to > suddenly throw together a visual-livecoding thingee was born mostly out of > the fact that i was so burnt out on committments, my brain just had to go > into truancy mode for a while... come to think of it, most of my projects > have started that way. :-) ) Heh! So shall I combine the authors of the last article (Ade, Julian, Nick and I), with those others active on the list (Amy, Dave, Geoff)? Quite a few authors! Olga needs to know by the end of the weekend. It's just to put a list of papers up on the site and in promotion - we can change the list of authors later as we wish. Having signatories as well as authors sounds a good idea. This will be quite a historical document I think. I edited the name of the paper to "Live algorithmic performance and a temporary organisation for its promotion." It was "Live audio programming ..." before, but clearly we need to include visual programming to now! We should probably have a few alternative names for it... Right, I'm off to Pescara for my first live coding gig! I haven't had any time to prepare, so I don't know what's going to happen. Exciting! Alex Date: Sat, 22 May 2004 13:34:42 +0200 From: Julian Rohrhuber > >Where do you stand on the use of genetic programming for >>performance? Is this >> >under the same category as live coding - the difference being that you grow >> >the code rather than write it? >> >> I would say that this is just simply a special type of algorithm. So >> it is one extreme of live coding - doing nothing and just letting the >> algorithm develop. > >Ah, I think this is the distinction, it can still be interactive - you >can choose code that produces good results manually, while you perform, >"navigate the geneome space" if you want to be academic about it... what I meant is that I would try to avoid to define live coding by a concept of interactivity that is widely criticised. Like a book is an extremely interactive medium as you can read it in so many ways. So I'd quote Alex on this, as he wrote: >To say that the same piece of music is ever heard >twice is to belittle the role of the listener who listens differently >with each play this means that interactivity itself cannot be the means to define live coding. as this makes it hard then to define live coding then, I would suggest two properties that, so to say, can be of help for orientation (this is totally open to dicussion). - a program which does not appear as a tool for something else - a program that is being read and written while using it and where the readability/changeability is relevant. >> As long as it is all redeable... > >Genetic programs should be shown more, they tend to get hidden away by >the software. 
> >This makes quite a good sound: > >(cos(+(cos(+(cos(+(*(- 0.929524(*(- 0.746627 0.741314) >0.972336))(+(*(cos(*(+ 0.246092 0.244990) 0.263769))(+ 0.938006 >0.140775)) 0.685194))(+(cos(* 0.648392(+ 0.582452 0.313122)))(cos(*(+ >0.410632 0.126564) 0.194440)))))(cos(-(-(cos time)(-(+(+ 0.773449 >0.903293) 0.364460)(-(* 0.544917 0.167748) 0.263252)))(cos(* 0.046184 >time))))))(*(*(cos(cos(-(+(*(- 0.002302(* 0.361310 0.805768)) >0.276971)(- 0.945598 0.616702))(-(* 0.730980 0.238658)(+ 0.423594(+ >0.544588 0.245835))))))(+(+ 0.783156(- 0.526238 0.581044))(cos(cos(*(cos >0.665828) 0.177759)))))(-(-(- 0.088104(cos 0.752380))(+(*(+ 0.564310 >0.202989) 0.718946)(cos 0.545804)))(*(cos(*(* 0.129965(* 0.877471 >0.799488)) time))(+(-(*(- 0.168888 0.531057) 0.691737)(* 0.790665 >0.200985)) 0.965454)))))) oh, nice. it took me quite a while to find the "time" argument.. >> >Also, what is the definition of code? Is it text only, what about >>graph based >> >languages? (max/msp, pd, modular synths - or even UML?) They can often be >> >appreciated more by a non programmer audience, and look pretty to boot. >> >> this is a difficult question. There can be no simple answer to it and >> I think it shows that language is a riddle. > >Good answer :) >I agree. > >cheers, > >dave > >-- >................................. www.pawfal.org/nebogeo -- . Date: Sat, 22 May 2004 13:54:31 +0200 Subject: Re: vislivecode slightly ot rambles (was Re: [livecode] another From: Fredrik Olofsson you type the code on the screen, and it actually executes the > code as you type it, but all the visuals are made of the onscreen code > too hi amy, rest, curious to see your setup. checking those screenshots as sending this. let me brag a little below (not)... i did a disasterous gig exactly a year ago in the south of sweden using something similar. one and only time i had my now defunct supercollider2 redFrik livecoding framework running which was er... more about routing sounds through effects, changing tempo, fading in/out things with a syntax interface - not true livecoding. it just looked cool. in the background i had my max/nato videopatch and it grabbed a small area of the screen. whatever was in that area got passed through 2-3 effects, scaled up to fullscreen and sent to the beamer. the trashing/pixelating video effects were in turn controlled by parameters in the music (iirc tempo and drumfile loopingpoints (bbcut info)). so the code was visually transformed in sync with the sound. it was a big event and i screwed up totally. didn't rehears due to coding to the last minute and my brain went into schimpansee mode as always when performing. couldn't keep all things in my head controlling both video and music so both stagnated for long periods of time while i tried to think clear. add to that the projector was crap and my video effects possibly a little too brutal so the mess was totally unreadable :-( me still shivers when thinking about it. so i figured i need to start rehearse livecoding on a daily basis as any instrument and also fully automate certain things (like the video part) and that's where i'm still at. my framework did have one nice feature tho. a command like filling a pattern with 60% random hits could be undone with the revert n steps command. o.send("/punkBassdrum/patAmp",16,0.6); //fill with new o.send("/punkBassdrum/patAmp",\revert,1); //go back to previous if bad or similar filling a pattern with random pitches between 20 and 60. with revert i could jump back to good sounding patterns. 
just needed to remember which index was good sounding -sigh. o.send("/punkBassdrum/patPitch",16,20,60); o.send("/punkBassdrum/patPitch",\revert,1); no cure for the halting problem but having safe positions, in this case good sounding settings, to retreat back to as you branch out i think is vital. like a mountain climber at regular intervals drilling for firm stops. and these couldn't be prepared in advance - they're made up as you go. _f #| fredrikolofsson.com klippav.org |# Date: Tue, 25 May 2004 12:34:22 +0100 From: Nick Collins what's our eventual paper format by the way, Alex? We just keep it as the > swiki version, or I can prepare a PDF of the completed thing for > permanence... olga just sent through a word document with submission guidelines, which I shall forward. Date: Tue, 25 May 2004 11:56:50 +0000 From: f Subject: [livecode] q for a hi alex, care to elaborate a little on your new setup? using supercollider these days? how did your recent first concert turn out? what's new since slub msg system? curious _f Subject: Re: [livecode] q for a From: alex care to elaborate a little on your new setup? using supercollider > these days? how did your recent first concert turn out? what's new > since slub msg system? I wanted to get more into synthesis and so bought a couple of curtis roads books, began to learn SuperCollider and wrote a Perl library for sending OSC commands to it, with timestamps for nice precision. So I'm just using it to render the sound, like you suggested to me in Hamburg. I read some of the curtis roads book while on holiday but haven't got that far, so I'm using a simple and dodgy patch for synthesis that I don't understand. I managed to write a sampler patch for loading and playing wavs after being surprised not to find a demo patch for that. It works nicely although I haven't worked out how to limit the number of simultaneous voices yet so I can push it a bit too far sometimes. I can share the source when I get home if you're interested. I'll write a report of my performance in Pescara soon. alex Subject: [livecode] [Fwd: read_me text format & specs] From: alex A, D, F, please check and approve and edit or > whatever your sections; I've just taken them from your mailings and > done a tiny bit of punctuation and editing, trying to preserve all > the great character and excitement of those posts. I've modified the fluxus section slightly, so it sounds a bit less email-ish. Feel free to edit it more. > Any other discussion; you're all strangely quiet, I get worried... I'm busy writing decidedly non-live code :( cheers, dave ................................. www.pawfal.org/nebogeo Date: Wed, 26 May 2004 12:07:38 -0400 From: Ge Wang Frederick's problems were particularly annoying, they kept projecting >the MAX objects instead of the output of them. may be that those vj's were ahead of us all... worshipping the syntax/recipes instead of the actual output. "anyone can see what this is gonna produce" so the actual output isn't interesting! "show us the nifty stuff!". code fetishism. /me duck ;-) _f Date: Thu, 27 May 2004 16:43:41 +0200 From: Julian Rohrhuber Alex's section to sort, I love your spiel on feedback.pl from the >list but please decide how you want it! (plus screenshot) > >manifesto signatories, feedback > >Any other discussion; you're all strangely quiet, I get worried... I am very short of time and my eyes fall close every minute currently. I would like to make additions, but I have no idea where this is productive.
for me I see a slight problem in the definition of the term live programming. I have never thought that it is restricted to performance situations. I wouldn't mind at all, as 'live coding' could be set against 'interactive programming' or such terms, but as our organisation is dedicated to live programming, it does matter somehow. I think the performer-audience separation as something I would try to break up by live programming, and not reassure it. Anyway, if you can give me a hint where additions to the paper are welcome, please tell me. best, Julian -- . Date: Thu, 27 May 2004 20:32:46 +0100 From: Nick Collins performer-audience separation as something I would try to break up by > live programming, and not reassure it. Please could you give some examples if your eyes fall open; you've spoken before of playing off stage. where is the importance of 'now' in such actions? Does just listening always lead to someone asking about the nature of what they hear and get us back into trouble? interactive programming solo is prototyping/practice? Composition, not improvisation under real-time constraints. I don't want to cause schism but I would fear a diluting of message if we did an either or case on performance perhaps; depends how couched. Please feel free to suggest anything to resolve this! (I've tried already to avoid out and out 'look at the virtuoso typer' stuff but it's hard not to talk up the intellectual appreciation of algorithm. Any other tacks we could take?) n Date: Thu, 27 May 2004 13:20:31 -0700 (PDT) From: Amy Alexander interactive programming solo is prototyping/practice? Composition, not NC> improvisation under real-time constraints. NC> NC> I don't want to cause schism but I would fear a diluting of message if we NC> did an either or case on performance perhaps; depends how couched. Please NC> feel free to suggest anything to resolve this! NC> just a thought; i've had similar feelings about the improvisational "realtime" nature of coding at home at times, which audiences for a work (performance work or not) can't experience, and my actual livecoding performance, if i perform the Thingee, probably won't be at all like that - it will be more concerned with outward performance than inward... alex's "hacking sound in context" paper i think captures that vibe well... i'm rambling cause in a rush, but i mean: if we explain the visceral immediacy of programming that occurs when programming non-performatively as a starting off point, that may help cover this aspect of livecoding while also providing readers an important insight, esp. to non-programmers, into why we think livecoding is so meaningful. i will also try to make some edits soon to my section. shall we also note where our sections are derived from email discussion? i would prefer that people not think i would write in such a rambling, incoherent manner for a formal paper (even though that may also be true :-). btw, how are we doing on time? i'm trying to take it easy on typing these next couple days - arm's a bit sore from tetanus shot... -@
No, of course, we can note that the large part of the field studies is from discussion on the TOPLAP mailing list. And please edit as you want anyway. Date: Thu, 27 May 2004 20:28:07 -0700 (PDT) From: Amy Alexander NC> > btw, how are we doing on time? NC> NC> weekend to go. We'll compile a finished thing on Monday I guess. NC> ok, i've made some edits and added 2 small screengrabs... but if 2's a problem, we can trade for one (i can include it at that resolution or higher, whichever works better). i also added a bit in my section about my thoughts on the user->author continuum... not sure if everyone will agree with what i've said, but i've tried to clearly indicate that it's my opinion only and not everybody's... but if anyone thinks it should be changed/clarified please let me know! NC> NC> > shall we also note where our sections are derived from email discussion? NC> i would prefer that people not think i would write in such a rambling, NC> incoherent manner for a formal paper (even though that may also be true NC> :-). NC> NC> I like the loose style and fuck the formality. I prefer the strong NC> character shining through... NC> agreed, i've been known to write actual papers that way too... just wasn't sure in this case if we should clarify why it changes degrees of formality when it gets to the email stuff... NC> No, of course, we can note that the large part of the field studies is from NC> discussion on the TOPLAP mailing list. NC> sounds good to me! Subject: Re: [livecode] finishing paper From: alex I am very short of time and my eyes fall close every minute currently. > I would like to make additions, but I have no idea where this is productive. > > for me I see a slight problem in the definition of the term live programming. > I have never thought that it is restricted to performance situations. > I wouldn't mind at all, as 'live coding' could be set against > 'interactive programming' or such terms, but as our organisation is > dedicated to live programming, it does matter somehow. I think the > performer-audience separation as something I would try to break up by > live programming, and not reassure it. This seems key, I think we need to make this point in the paper. Can you explain more? I think you mean that live programming isn't just what you do when you have an audience, but also making music alone. That the sense of engaging with live programming is the same as engaging with live electricity. I think this is subtly different to the sense of performing live to an audience, which also applies, but you can work on electricity alone. Is this a distinction you would make, or do I misunderstand? How can live coding help break performer-audience separation? alex From: "Dave Griffiths" How can live coding help break performer-audience separation? I think the choice to use live programming is simply an interface choice - scripts are user interfaces as much as anything else. More importantly though, they are a very rigorous description of what is going on, more so than a load of graphics - and a bit more honest. The reason that it might be better for a performer-audience situation may be due to this honesty - not patronising the audience and hiding the process away behind a ton of graphics, and not having that stupid secrecy that seems to be so common in computer art generally - like by giving away the details of the processes used, the artists may lose something.
The processes used in computer art are just as individual and unique as the result (sadly often more interesting:), and they should be open for all to see. A performance that is open and honest in this way is a story, and much more engaging for it - I lose count of the number of bands I've seen with laptops "live" and come away feeling a little empty and cheated. cheers for listening to my little rant :) dave Date: Fri, 28 May 2004 15:59:37 +0200 From: Julian Rohrhuber On Thu, 2004-05-27 at 15:43, Julian Rohrhuber wrote: >> I am very short of time and my eyes fall close every minute currently. >> I would like to make additions, but I have no idea where this is productive. >> >> for me I see a slight problem in the definition of the term live >>programming. >> I have never thought that it is restricted to performance situations. >> I wouldn't mind at all, as 'live coding' could be set against >> 'interactive programming' or such terms, but as our organisation is >> dedicated to live programming, it does matter somehow. I think the >> performer-audience separation as something I would try to break up by >> live programming, and not reassure it. > >This seems key, I think we need to make this point in the paper. Can >you explain more? I think you mean that live programming isn't just >what you do when you have an audience, but also making music alone. >That the sense of engaging with live programming is the same as engaging >with live electricity. I think this is a subtley different to the sense >of performing live to an audience, which also applies, but you can work >on electricity alone. Is this a distinction you would make, or do I >misunderstand? I would say that doing live programming alone, I am the audience and programmer in one, which is not a trivial unity, but a quite heterogeneous one, in which the language, my expectations, my perceptions, errors and my poetic and/or programming style play their own roles. The situation with one or more persons as audience, or, not to forget, as coperformers, is an interesting extension of this situation, but it is by no means primary. There are many specific problems of this larger interaction, such as readablility performance style etc. which are well worth discussing, but I would not constrain interactive programming to this specific public situation. For the film "Alles was wir haben" this was the situation that was responsible for how the whole idea worked and it was what I did while working on jitlib. I could describe the compositional work I did more in details, but maybe this is already a general description of a type of situation which I find very interesting in itself. > >How can live coding help break performer-audience separation? In my experience it allows a very spontaneous interaction with the music in combination with talking about the music. This means I can, while playing, talk with others about what they hear, think, etc. or about other topics even and as the sound keeps playing I can react to these conversations by changing the code. Then I am not in the situation of the one who is looked at (especially as it is not my body movements which are so interesting, hmhm..) but the code / sound relation is in the center of attention. Of course, one step further, in networked live coding there is a flow of code between all participants which can, to different degrees, interact and contribute. Nobody knows who does what anyways. -- . 
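[Editor's note: a minimal sketch of the "flow of code between all participants" Julian describes, assuming plain UDP and Perl. The address, the port and the total trust between players are invented for illustration; this is not jitlib or anyone's actual networked setup.]

  #!/usr/bin/perl
  # hypothetical sketch: share the snippet you just ran with a co-performer,
  # and run whatever snippets arrive from them
  use strict;
  use warnings;
  use IO::Socket::INET;

  our $tempo;   # shared state that incoming snippets may poke

  my $port = 57200;
  my $in  = IO::Socket::INET->new(Proto => 'udp', LocalPort => $port)
      or die "listen: $!";
  my $out = IO::Socket::INET->new(Proto => 'udp',
                                  PeerAddr => '192.168.0.12',   # a co-performer
                                  PeerPort => $port)
      or die "peer: $!";

  $out->send('$tempo = 140;');   # an edit travels as plain source text

  while (1) {
      my $snippet;
      next unless defined $in->recv($snippet, 8192);
      print "received: $snippet\n";
      eval $snippet;                             # run whatever arrived
      warn "ignored broken snippet: $@" if $@;
  }

Whether you would really eval untrusted strings from the network mid-gig is another question, but it does make "nobody knows who does what" quite literal.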
Subject: [livecode] David Toop From: alex I got some feedback from david toop about the live coding article. He >enjoyed it: > >"... the paper in Organised Sound shone out as something original and >genuine, rather than an extension of computer music's rather bankrupt >academic history." nice. that sounds like a text on the book cover printed in gold. -- . Date: Fri, 28 May 2004 19:40:06 +0200 From: Julian Rohrhuber I think the >> performer-audience separation as something I would try to break up by >> live programming, and not reassure it. > >Please could you give some examples if your eyes fall open; you've >spoken before of playing off stage. > >where is the importance of 'now' in such actions? Does just >listening always lead to someone asking about the nature of what >they hear and get us back into trouble? > >interactive programming solo is prototyping/practice? Composition, >not improvisation under real-time constraints. > >I don't want to cause schism but I would fear a diluting of message >if we did an either or case on performance perhaps; depends how >couched. Please feel free to suggest anything to resolve this! I think the term 'performance' is not needed in a definition. * Live coding is the activity of writing (parts of) a program while it runs. It thus deeply connects the algorithmic causality with the resulting perceptions and by deconstructing the idea of the temporal dichotomy of tool and product it allows to bring into play code as an artistic process. * well, just a try. >(I've tried already to avoid out and out 'look at the virtuoso >typer' stuff but it's hard not to talk up the intellectual >appreciation of algorithm. Any other tacks we could take?) ______________________________ I've added a description to the readme paper, check it out whether you like it. -- . Date: Fri, 28 May 2004 18:27:22 +0000 From: f Subject: Re: [livecode] finishing paper >weekend to go. We'll compile a finished thing on Monday I guess. hm, wondering if my text has any relevance whatsoever. it's crap and better be left out. perhaps it'd fit if we decide to do the second part of the paper as suggested 'current discussions from the mailing list'. but i'll look through and give it a go tonight. maybe i can come up with something better. else dump it i say. other comments on the text. "exploring the art of coding in live performance" etc. and alex: "How can live coding help break performer-audience separation?" i agree with julian that there's an unfortunate bias towards the live-in- front-of-an-audience performance situation in the text. maybe we should try to find another word? maybe 'performance' here should be read more like 'the performance _art_ of live coding'. what i'm trying to say is that the art direction/genre called 'performance art' or 'Aktionskunst' might be something to refer to and perhaps soften this audience separation issue a bit. judging from the performances i've experienced there is generally a much looser aim, purpose, finished work or genius to put on a pedestal. often the audience is provoked to take part in the piece but at least not sit still and watch/listen. it's most often ongoing processes that you can witness for a little while - stay or leave, come and go. many are not even meant to be followed, or are impossible to follow, from beginning to end. a little addition (introduction - in the middle) to try to catch some of the above...
"So live coding allows the exploration of abstract algorithm spaces as an intellectual improvisation + and by revealing, provoking and challenging the audience with the bare bone syntax, hopefully make them follow along or even take part in the expedition." see it as a suggestion. i don't have the english chops to do it i'm afraid. besides you may all disagree :-) detail: can't we give a precise year for the hub happening? annoying to have to look that up in other papers. _f #| fredrikolofsson.com klippav.org |# Date: Fri, 28 May 2004 18:27:58 +0000 From: f Subject: Re: [livecode] q for a a: >to learn SuperCollider and wrote a Perl library for >sending OSC commands to it, with timestamps for nice precision. So I'm >just using it to render the sound i've looked at doing livecoding from a shell (python maybe) which seems more practical than sc's text editor (simple things like arrow-up to edit and redo your last command = heavens!) but then i'm not keen on implementing the whole timing, threads and tasks thing myself. also had ideas about building my own text interpreter in sc (based on julian's DocumentArea?) but it seems like the Document class is too limited for that atm. perhaps irrelevant and only personal hookups but i'm having trouble with the way supercolliders [mac] text editor works. i think it promotes select, copy&paste code plus a lot of unnecessary mousework in between. what do you say nick, julian, adc? me think this is bad as you tend to just reuse snippets of code and not changing it - at least not in drastic ways. also there's a lack of feedback when you evaluate something. sc just posts some useless info in a status window by default. jitlib is a little bit better often giving you an overview of proxies at hand but still long way from satisfactory. i've noticed a great difference in outcome if i force myself to do a jitlib session without copy&paste of code. often worse sounding :-( but that's only a question of practice i hope. how do you deal with this alex? do you have a library of code to copy&paste from? and do you have some visual representation of processes running? or are they more or less always in the form of ascii graphics? think we haven't had a proper discussion yet about this basic tool we all share... the code editor or command line. _f Date: Fri, 28 May 2004 22:07:21 +0100 From: Nick Collins detail: can't we give a precise year for the hub happening? annoying to > have to look that up in other papers. years unknown (Hub is itself 1985 and later I believe). general description of forth live coding in hub concerts by tim perkis. outstanding mails to both bischoff and perkis to resolve this exact issue, sent last week. Still waiting. I think they're afraid that Ron predates them (1985) and won't admit it... Date: Fri, 28 May 2004 22:15:08 +0100 From: Nick Collins I think the term 'performance' is not needed in a definition. > * > Live coding is the activity of writing (parts of) a program while it runs. > It thus deeply connects the algorithmic causality with the resulting > perceptions and by deconstructing the idea of the temporal dichotomy of > tool and product it allows to bring into play code as an artistic process. > * > well, just a try. Very nice, I will try to revise the intro to take this into account this weekend; probably leading with your definition. Sorry if overly performance angle; good to be getting this feedback! > > I've added a description to the readme paper, check it out whether you > like it. -- > I like it, and I liked the film. 
Date: Sat, 29 May 2004 09:23:13 +0100 From: Nick Collins in our attempts, well that is good, but I guess we have more precedents > than the ones already mentioned. If I had more refs I would have put them in the paper; I suspect there must be some (non computer based) live algorithm redecision in a few live text pieces/happenings of the 60s. Don't know any specific example, and it may be particularly hard to find any case where this was a continuous act of overt recomposition, or made public as it was decided. There are probably tenuous artistic precedents as always, which you can tease to make them look like the first mutterings... Mozart can be credited with changing the musical dice game rules at one riotous Masonic pissup... Date: Sat, 29 May 2004 11:04:56 +0200 From: Julian Rohrhuber Mozart can be credited with changing the musical dice game rules >at one riotous Masonic pissup... Maybe even go back to Greek harpists who tuned while playing.. -- . Date: Sat, 29 May 2004 10:38:58 +0100 From: Nick Collins PS Fredrik, Dave, Julian, do you want to add in some of your recent >comments to your sections? > >Julian, in particular this is very useful: ... ok, I've added that. thanks for your changes! -- . Subject: Re: [livecode] q for a From: alex i've looked at doing livecoding from a shell (python maybe) which seems more > practical than sc's text editor (simple things like arrow-up to edit and > redo your last command = heavens!) but then i'm not keen on implementing the > whole timing, threads and tasks thing myself. Maybe we should design a general use OSC server that looks after the timing at some point, and allows switching between languages and livecode environments. Unless something like that already exists? > how do you deal with this, alex? do you have a library of code to copy&paste > from? No... I just start with a routine called: sub bang { } ... which gets called every 'tick'. To trigger a synth noise I do something like $self->play({num => 140 + $tune->[$self->{bangs} % @$tune], formfreq => 9, bwfreq => 7, ts => 3 + rand(30) } ); to trigger a sample I do $self->trigger({sample => '/slub/samples/drum/drum5.wav', pan => 0.4, ts => 10, crackle => 100 + rand(50) }); Well, most often I start with an existing sourcefile, change it a lot and save it as something else. > and do you have some visual representation of processes running? or > are they more or less always in the form of ascii graphics? With the live coding I just have the editor and that's it. The only feedback is by the process editing its sourcecode... Changing values directly in the source and adding comments to indicate the state of the process. However after my first live coding gig I'm doubting whether this is really interesting for people to look at unless they know Perl. There was not much reaction when I did live coding, but when I ran my ascii art dancing man script, some people whooped. But then there's no reason why an ascii man or woman can't dance all over some sourcecode... Somehow I think more movement has to be exposed anyway. Static source doesn't seem to be enough on its own. alex 
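For anyone who wants to try the shape Alex describes, a bang() routine called once per tick with its body rewritten while it runs, here is a minimal, self-contained Perl sketch. It is not feedback.pl: the scheduler, the send_osc stub and all the numbers are assumptions made up for illustration.

    #!/usr/bin/perl
    # Minimal sketch of a "bang every tick" live coding loop (not feedback.pl).
    # send_osc() is a hypothetical stand-in for whatever actually talks to the synth.
    use strict;
    use warnings;
    use Time::HiRes qw(sleep time);

    my $bpm   = 120;
    my $tick  = 60 / $bpm / 4;     # sixteenth-note ticks
    my $bangs = 0;

    sub send_osc {                 # stub: print the event instead of sending OSC
        my %msg = @_;
        print join(' ', map { "$_=$msg{$_}" } sort keys %msg), "\n";
    }

    sub bang {                     # the part you keep rewriting during the set
        my ($n) = @_;
        send_osc(sample => 'drum5.wav', pan => 0.4) if $n % 4 == 0;
        send_osc(num => 140 + 12 * ($n % 3), ts => 3) if $n % 8 == 6;
    }

    my $next = time();
    while (1) {                    # crude scheduler: call bang() once per tick
        bang($bangs++);
        $next += $tick;
        my $wait = $next - time();
        sleep($wait) if $wait > 0;
    }

In feedback.pl the running process can also rewrite the source file that defines bang(); the sketch leaves that self-modification, and the OSC timestamping Alex mentions, out.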
Date: Mon, 31 May 2004 09:00:14 +0100 From: Nick Collins > I'm going to replace much of 'my' section too, by the way. > > alex > I guess our deadline is this evening- Alex, you want me to compile a finished RTF tonight? Is there an e-mail address to send it to? I imagine a flurry of last minute manoeuvres... I hope most things in the paper now get people's approval. I guess I'd leave the manifesto signatories implicit now since almost everyone is an author by this stage. We're probably over 25000 characters due to recent additions but I hope this won't be judged too strictly. Subject: Re: [livecode] finishing paper From: alex I guess our deadline is this evening- Alex, you want me to compile a > finished RTF tonight? That'd be great. I'll try to make my changes some time before 6pm. > Is there an e-mail address to send it to? olga goriunova alex Subject: Re: [livecode] finishing paper From: alex wrote: > Here is my replacement "feedback.pl" section. I hope it reads ok, I'm a > bit close to it at the moment. It's crying out for an illustration, but > I haven't got the resources or the time at the moment... > > feedback.pl > > A painter, Paul Klee, moves to make a mark on a canvas. Immediately > after this initial moment the first counter motion occurs; that of > receptivity, as the painter sees what he has painted. Klee therefore > controls whether what he has produced so far is good. > > This is the artistic process described by Klee in his excellent > Pedagogical Sketchbook [Klee, 1968]. The same process occurs when an > artist edits a computer program. The artist types an algorithm into > the computer, the artist considers it in its place and moves to make > the next mark in response. > > However, live programming allows us to play with this relationship > somewhat. Let me describe how my live programming environment, > 'feedback.pl', works. > > The marks that the artist is making in feedback.pl don't just affect > the artist, but also affect the running code. While the artist > considers the possibilities of a fresh algorithm that they have just > typed, feedback.pl is already executing it. > > What's more, the algorithm running in feedback.pl can edit its source > code. I should explain carefully... While the artist types in a > computer program, feedback.pl is running it; that running process may > edit its own source code. Those self-modifications take immediate > effect, changing the running program. And so it goes on. > > The running code may also do other things. I use feedback.pl as an > environment to make music. So I write live code in feedback.pl that > makes live music. > > And so we have at least three complementary feedback loops. One loop > is that of the artist and their work in progress - expressions enter > the program, the artist observes the effect on the sourcecode and > reacts with further expressions. 
A second loop is that between the > sourcecode and the running process - the process reads the sourcecode > and carries out instructions which modify the sourcecode, then reads > in its modifications and follows those. > > The third, outer feedback loop appears when the program creates > external output. The program might generate some sounds that the > artist interprets as music, reacts to by changing the sourcecode, > immediately changes the process which in turn alters the generated > sound, which the artist reacts to once more. > > When I'm performing before a live audience, then there is a fourth > feedback loop, encompassing the audience who are interpreting the > music in their own individual ways, making their own artistic > expressions by dancing to it, which affects the choices I make, in > turn changing the music. > > If the programmer is to share the continuous artistic process of a > painter, their performative movements must be reactive, they > must be live. Code must be interpreted by the computer as soon as it > can be written, and the results reacted to as soon as they can be > experienced. The feedback loops must be complete. > > Date: Mon, 31 May 2004 21:35:39 +0100 From: Nick Collins This is just a temporary secret location, please don't link from the TOPLAP site. I assume README put it up somewhere we can officially link to. Also I can make sure the swiki version matches the RTF (there are a few corrections I did) once someone tells me why I can't edit any further stuff into the swiki page- character limit? Was fun to work on this, thanks for all your contributions. See you at README? (I've booked my flight anyway) N Date: Mon, 31 May 2004 23:48:00 +0100 From: Nick Collins A little report on livecoding lsystem descriptions of melodies. > >I'm not sure this is manifesto compliant or even counts as code, but I've >recently got a simple livecoding thing going on in some of my music software. >You basically enter the axiom and rules of an lsystem which are then expanded >as the melody is playing. > >There is a simple score language the lsystem produces where: >o = note on >+ = up one note (the exact frequency change is configured elsewhere) >- = down one note >! = reset to root note >. = rest one beat > >and numeric tokens are used as rule replacements for the lsystem. > >so: > >axiom = 1 >rule '1' = o+1-1 > >expands to: > >generation 1: o+o+1-1-o+1-1 >generation 2: o+o+o+1-1-o+1-1-o+o+1-1-o+1-1 >generation 3: o+o+o+o+1-1-o+1-1-o+o+1-1-o+1-1-o+o+o+1-1-o+1-1-o+o+1-1-o+1-1 > >well, you get the idea (just basic lsystem search/replace) :) > >the results are actually quite nice, and playing it live feels pretty much >like programming to me. simple rules expand into quite complex melodies >quickly. I've used this lsystem score idea before, but in a genetic >programming environment where you had a choice of lsystem rules to breed from >- but doing it by hand is more satisfying (if a little slower) > >cheers, > >dave -- . Date: Thu, 3 Jun 2004 10:45:14 +0000 From: f Subject: Re: [livecode] live coding lsystems >A little report on livecoding lsystem descriptions of melodies. funny, i read... http://www.generativeart.com/papersga2003/a26.htm the other day in search of early live coding references. a composer friend of mine, who has also been into computer music related research for a long time, told me to check out Peter Beyls' OSCAR system. apparently he does some kind of on-the-fly programming. 
but i have limited internet access these days so i haven't yet surfed for more info than what's in this paper. _f From: "Dave Griffiths" >A little report on livecoding lsystem descriptions of melodies. > > funny, i read... > http://www.generativeart.com/papersga2003/a26.htm They have cellular automata in there as well, I've been playing with them too - live coding CA's in my system consist of flipping rule bits on and off though. There is a lot of genetic searching going on in that system (is that OSCAR?). >From the screenshots it looks like you can search quickly for acceptable results, and then tweak the system by hand if need be. That paper looks damn interesting, I'll have to read it through properly when I get time. As an aside, something I've been thinking about recently is that most livecoding seems to consist of quite high level scripts (or in these cases - rule systems). Something that may also be interesting would be lower level code, assembly and microcode. In evolving code systems like tierra (http://www.his.atr.jp/~ray/tierra/) we have unbreakable machine code - every instruction is valid IIRC. Could be fun to livecode a system you can't get wrong :) Related to this, sort of, is the current fashion for 8bit computers - livecoding zx spectrum basic for visuals :] anyone done this? cheers, dave Subject: Re: [livecode] live coding lsystems From: alex There is a simple score language the lsystem produces where: > o = note on > + = up one note (the exact frequency change is configured elsewhere) > - = down one note > ! = reset to root note > . = rest one beat Nice! My most successful attempt at making a tune is similar to this. I just use a two dimensional structure of +'s and -'s, and step through structure recursively. I'm not sure what an lsystem is, but I think my naive approach is similar but a lot simpler. The first bit of this recording from pescara is using live coding: http://nosignal.slab.org/placard/mp3s/yaxu.mp3 I lost my nerve after the first track and reverted to slub club classic Perl scripts. I've been on internet holiday at home, our line has been down. It's fixed now - will catch up with things later, like your mail Amy. alex From: "Dave Griffiths" On Tue, 2004-06-01 at 18:21, Dave Griffiths wrote: > > There is a simple score language the lsystem produces where: > > o = note on > > + = up one note (the exact frequency change is configured elsewhere) > > - = down one note > > ! = reset to root note > > . = rest one beat > > Nice! > > My most successful attempt at making a tune is similar to this. I just > use a two dimensional structure of +'s and -'s, and step through > structure recursively. Sounds cool, but I don't understand fully... Most of my attempts at creating melody (which work) have just consisted of sampling low frequency sine waves for pitch and note on/off information and tweaking them to cycle patterns, but this is getting off topic... > I'm not sure what an lsystem is, but I think my naive approach is similar but a lot simpler. An lsystem is really just a set of search replace rules you iteratively apply to a string. Most of the work is in how you interpret the symbols. (good description here: http://en.wikipedia.org/wiki/Lindenmayer_system) Branching is also important, but in this system it doesn't make use of that for the melody. 
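As a concrete illustration of the iterated search/replace Dave describes, here is a tiny Perl sketch using the axiom and rule from his lsystem report further up the thread; the script around them is an assumption, not his software.

    #!/usr/bin/perl
    # Sketch: expand an lsystem score string by iterated search/replace.
    # Score symbols: o = note on, + = up, - = down, ! = reset, . = rest;
    # digits are rule tokens.
    use strict;
    use warnings;

    my $axiom = '1';
    my %rules = ('1' => 'o+1-1');   # the rule from Dave's report

    sub expand {
        my ($string, $passes) = @_;
        for (1 .. $passes) {
            # rewrite every symbol that has a rule, in one left-to-right pass
            $string =~ s{(.)}{ exists $rules{$1} ? $rules{$1} : $1 }ge;
        }
        return $string;
    }

    print expand($axiom, $_), "\n" for 1 .. 3;
    # o+1-1
    # o+o+1-1-o+1-1                  (Dave's "generation 1")
    # o+o+o+1-1-o+1-1-o+o+1-1-o+1-1  (Dave's "generation 2")

As Dave says, most of the work is in how the expanded string then gets interpreted as notes.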
> The first bit of this recording from pescara is using live coding: > http://nosignal.slab.org/placard/mp3s/yaxu.mp3 Cool - I think I like the livecoded bit the best :) I meant to post some lsystem music earlier but forgot: http://www.archive.org/download/a-long-walk-take1/a-long-walk-take1.ogg http://www.archive.org/download/a-long-walk-take2/a-long-walk-take2.ogg Not really strictly livecoded as I got the lsystems set up first and tweaked them, I'll try recording some written from scratch soon... cheers, dave Subject: Re: [livecode] q for a From: alex Sounds cool, but I don't understand fully... Sorry, I didn't explain at all, partly because I can't remember how it works. This is the two dimensional structure: + + - + - + - - That gets rendered into something like this. - - - - + - - - + + - - + - - +- - - + + + + - - + + + + + By this recursive subroutine:

{
    my $count;

    sub test {
        my ($self, $value, $level) = @_;

        # a call with no $level starts a fresh pass over the structure
        if (not $level) {
            $level = $count = 0;
            @score = ();
        }
        return if @$structure <= $level;

        # walk one row of the +/- structure, nudging the pitch value
        my $substruct = $structure->[$level];
        foreach my $dir (@$substruct) {
            ++$count;
            if ($dir eq '+') {
                ++$value;
            } elsif ($dir eq '-') {
                --$value;
            }
            push @score, $value;                     # record the current pitch
            $self->test($value, $level + 1, $count); # recurse into the next row
        }
    }
}

I think there's probably a bug but it sounds ok, but then I am cheating by using a rather boring scale. > An lsystem is really just a set of search replace rules you iteratively apply > to a string. Most of the work is in how you interpret the symbols. (good > description here: http://en.wikipedia.org/wiki/Lindenmayer_system) > Branching is also important, but in this system it doesn't make use of that > for the melody. Aha, thanks for that. > Cool - I think I like the livecoded bit the best :) I meant to post some > lsystem music earlier but forgot: > http://www.archive.org/download/a-long-walk-take1/a-long-walk-take1.ogg > http://www.archive.org/download/a-long-walk-take2/a-long-walk-take2.ogg Both are lovely! A lot more successful than my attempts at melody I think. Goodnight, alex Date: Mon, 07 Jun 2004 23:54:16 +0100 From: Nick Collins NC> Why don't you just use Fruity Loops to make music? Also, the musical output NC> of your work could be done better as a record, preferably by people other NC> than you who know what they're doing as musicians, rather than being NC> programmers, you pretenders. Yes, the output is definitely not using NC> anything that isn't available in commercial software. NC> NC> I've noticed that the gestural content of this typing is not like the NC> haptic joy I get from my musical instrument. Therefore it must be worse as NC> music. And your fetishism reminds me of early music practitioners who NC> insist on period instruments. (by the way, in secret I'm an electronic NC> music composer, ie, this is socratic irony, but I'd like a big round of NC> applause as if I'm a pure unsullied musician of the romantic era). NC> Date: Fri, 2 Jul 2004 19:32:40 -0700 (PDT) From: Amy Alexander .... just wondering if it's supposed to do that, or is it just doing strange things on my system? -@ Date: Sat, 03 Jul 2004 10:09:56 +0100 From: Nick Collins wrote: > hey, just looking at the RTF version of the paper (i know, i'm a little > slow, but i was traveling for awhile, so i have a partial excuse :-) ). > > when i open it in MS Word, the names of the images show up where the > images should be, i.e. .... just wondering if it's supposed > to do that, or is it just doing strange things on my system? 
> > -@ > > > Date: Sat, 03 Jul 2004 10:12:06 +0100 From: Nick Collins it's supposed to do that, those are just preferred locations, the image NC> files are separate for easy reformatting by README (I think they'll be NC> putting it back into HTML?). NC> i'm not sure it's going back into HTML; the main gist is to print it in a book... does README know they're supposed to print the images and not the URLs? just asking as in particular, the URLs for mine are more or less temporary - i'd prefer to upload my images into the wikispace somehow for the wiki, and really would prefer not to have the URLs appear in a book... NC> Anyway, the swiki page HTML version should match the .rtf now if you want NC> to see it with images in place. NC> ok, thanks! i will check it out! -@ Date: Sat, 03 Jul 2004 18:56:43 +0100 From: Nick Collins refers to the enclosed images, but I see there is a danger of misconstruing as web links: just looked back at the rtf we put in. Thanks N Date: Sun, 04 Jul 2004 10:44:29 +0100 From: Nick Collins wrote: > oops, am i getting the right file? i've got the URL as > > http://www.cus.cam.ac.uk/~nc272/papers/pdfs/toplap.tar.gz > > but it still seems to have URLs instead of embedded images... or is there > a different URL? > > anyway, no biggee... i've made a copy with the images i can upload > someplace if it helps... am preparing my biannual "what has amy had her thumbs in lately?" file for my university, and would like to include a > copy of the paper, but i can just print from the one i've made... just > figured for read_me and other reference would be good to have the > rtf/images thing sorted out anyway... let me know if you want the RTF > i've made... > > cu, > -@ > > From: Hi all, > > Very busy but a quick note to introduce Tom Nullpointer to the list, > welcome Tom. It seems Tom's been thinking about/practicing live coding > independently of TOPLAP. I'm a great admirer of nullpointer's work with > PD, maybe he can help us with questions about MAX-style live coding. > > He's going to be at Aarhus for some live coding, what you going to be > doing Tom? 
> > alex > > > > Subject: Re: [livecode] brighton mock From: alex if it's of any use, one of the more constructive responses seems to be the > Photography Defense, which provides historical perspective on similar > debates over technology and art. i.e. "Did you know that when artists > first began to work with photography, painters rejected it as mechanical > and antithetical to art? It took awhile for people to understand and > appreciate the artistic possibilities of photography, and how they are > different than those of painting...." ... it gets head nods from the > computer artists, at least. :-) Perhaps it is useful to say "Here is something that was once rejected and is now accepted" to open minds a little, but it isn't a strong argument - it doesn't follow that programming will be accepted later just because it is rejected now. Actually I think the original question was a valid one and deserves a direct answer, even though it was raised in an over-indulgent manner. I played the guitar for several years but never got far enough to compose something of my own. So with a measure of honesty I could say that I program music because I failed to play the guitar. However, I understand that Nick can play real instruments very well but is still compelled to write musical software, so there must be something in it. So why do we feel the need to distance ourselves from the experience of plucking each individual sound out of a musical instrument? The answer for me is that I find it more satisfying to work in this way. Any distance from the sound is an illusion. We may be a level of abstraction away from the sound but that doesn't necessarily distance us from it. We still have full control over the timbre, but we are working on the composition at the same time. In fact we are able to think of timbre and composition as different resolutions of the same thing, which of course they are. alex -- alex slab laboratories Subject: [livecode] coding from scratch From: alex slab laboratories Date: Thu, 8 Jul 2004 02:08:22 -0400 Subject: Re: [livecode] coding from scratch From: Ge Wang So what I'm aiming for now is to be able to start with a blank text > editor and write code from scratch during the performance. Has anyone > tried this approach yet? Perry and I did this in both our Princeton performance and NIME performance in Japan (will post report). In the Princeton performance (our first on-the-fly performance), we started with blank editors (I in full-screen ms visual studio and perry in pico) and would type out complete (but short, 2-10 line) ChucK programs which we then saved to disk and ran concurrently from the command line in a cygwin shell (perry and I each had a single virtual machine to run all programs/shreds). The audience saw the whole process. We had no backups, which caused the audience to watch us type in silence for 2 minutes because my first program just wouldn't work (probably due to a parser bug I later found). The video that was shown was taken from this performance. At NIME, we were on os x - both in TextEditor with 50+ pt font. We started with blank programs, but this time we also had simple programs to start from (such as drum loops written in ChucK) and edited both as parameters and structure. 
Pictures of my desktop and our score can be found here: http://soundlab.cs.princeton.edu/listen/on-the-fly/nime2004.html An important reflection from both shows was that it actually REALLY helps to have typed in and thought about some of the code during practice, more so in the way of a traditional instrument than a piece of software. This is true especially if you are starting in tabula rasa mode. The audience can see how fluidly you negotiate the code - and they can get a sense of your thought process through your pauses and "punctuations". It totally gives both the performer and audience some strange notion and measure of virtuosity. > I suppose it lends a certain structure to the performance, starting off > with one very simple element, developing it a little, adding a little > bit of structure until it becomes something like a melody, then > starting > up another editor, introducing another simple element, developing that > into a simple rhythm, then coding in some interactions between the two > processes perhaps, and going from there. We did nearly exactly what you described. Our NIME performance was more advanced than our first - the language was more complete, we had more unit generators. We also really learned a lot from our first attempt, so we felt more mentally stable (ha) this time. ChucK programs can start/replace/stop other ChucK programs with sample precision, so higher level programs were written to manage lower-level programs with precise timing. Perry's drum loops morphed over time as he changed code and swapped in new programs. The processes interacted with each other using timing directives. > Just because code exists on disk after a performance, it doesn't > necessarily mean that it should be run again... It could be thought of > as a kind of fossilised improvisation. Not even a record of a > performance, but a frozen impression of the performance at the moment > of > its death. We are working on ways to play back a ChucK performance with precision. But, Alex's idea of the frozen impression is much more poetic... Ge! Date: Thu, 8 Jul 2004 02:29:27 -0400 Subject: [livecode] ChucK released + Audicle link From: Ge Wang > I've found live programming during a performance a problem. In recent > experiments I've been preparing some scripts in advance, running them > and then finding myself tweaking variables rather than working the code > in any significant way. > > So while I have found live coding to be an inspiring and productive way > of writing music at home and in my studio, the way I'm performing with > these pre-prepared scripts seems a step back from how I was working > before. I've thrown away the user interfaces to reveal the code below, > but am then just editing numbers that are scattered around the screen. > > So what I'm aiming for now is to be able to start with a blank text > editor and write code from scratch during the performance. Has anyone > tried this approach yet? I seem to remember you saying something about > pd wars Tom, where the artists compete against each other, starting with > an empty patch and seeing how quickly they can make something good. > > I think it could work well. Many times I've spent hours on a problem, > come up with an ugly solution, then re-written a better version later in > a fraction of the time. There are some things, such as chat systems or > website backends that I've made many times over with the same ideas, and > yet they've ended up behaving quite differently for whatever reasons, > each creation satisfying in its own right. 
> > I suppose it lends a certain structure to the performance, starting off > with one very simple element, developing it a little, adding a little > bit of structure until it becomes something like a melody, then starting > up another editor, introducing another simple element, developing that > into a simple rhythm, then coding in some interactions between the two > processes perhaps, and going from there. > > Just because code exists on disk after a performance, it doesn't > necessarily mean that it should be run again... It could be thought of > as a kind of fossilised improvisation. Not even a record of a > performance, but a frozen impression of the performance at the moment of > its death. > > So I'm collaborating on a set with a drummer at a headphone festival in > london next week (http://state51.org/placard/). Even though the set > will only be twenty minutes it seems that I'm going to have to try this > out. > > I'd better sleep now anyway, apologies for my late-night ramblings! No > conclusions here, just stray thoughts... > > alex > > -- > alex > slab laboratories > > > > Subject: Re: [livecode] brighton mock From: Dave Griffiths Why don't you just use Fruity Loops to make music? Also, the musical output > of your work could be done better as a record, preferably by people other > than you who know what they're doing as musicians, rather than being > programmers, you pretenders. Yes, the output is definitely not using > anything that isn't available in commercial software. No, the output can be better than is available in commercial software - the key word is "live". Livecoded music in a pure sense would be live music in the way that a fruity loops setup would not be. I may be well out of line in saying this, but a lot of electronic music seems to read "live" as "I'll mix prewritten/recorded sections and tweak some effects". Live code should develop (no pun intended ;) ) as the audience/performer hears it (for the first time). I think it's a return to good old-fashioned bashing and strumming of (real) objects a la jazz/improv music. 
Livecoding should allow just that bit more flexibility to do this with software. > I've noticed that the gestural content of this typing is not like the > haptic joy I get from my musical instrument. Therefore it must be worse as > music. And your fetishism reminds me of early music practitioners who > insist on period instruments. (by the way, in secret I'm an electronic > music composer, ie, this is socratic irony, but I'd like a big round of > applause as if I'm a pure unsullied musician of the romantic era). I kind of agree, the "aesthetic of watching someone programming" argument seems a little weak to me too ;) Must find a way of programming via midi... dave Subject: Re: [livecode] introducing nullpointer From: Dave Griffiths Hi, > > I think you probably mostly know me anyhow... > I did qqq, bitmapsequencer etc. > I'm currently co-running rand()% www.r4nd.org > which i strongly urge you all to contribute to... any chance of a linux server? I have some code - but it's linux/alsa/jack. > anyhow, I'm quite into live coding, but as an aspect of generative systems, > in other words most live coding is evolutionary in its practice, > trying various combos/subroutes etc and pruning/growing chosen directions. > Generative systems are often a major aid to such coding approaches, > allowing the experimentation to occur independently within evolving/changing > subroutes etc. I'm interested in this approach too - genetic programming could be seen as a very rapid way to livecode, the ability to get in there and modify code manually which is then present in the genome, or mutate code once you had handwritten it live - interchangeably - would be very powerful. I'm playing with a very simple form of this for my placard performance, but as it's my first time playing any form of music live, it err, could be interesting... ;) dave From: Ge Wang I'm quite intrigued by this aspect of multiple systems on multiple > machines.. > I know alex runs a lot of perl apps concurrently and I've taken to > multitasking audio apps on one machine, What software(s) were you using for your performance? It sounds intriguing. Do you program notes or recording? ChucK is designed to be concurrent. 
You can add and modify each process (shred in ChucK) during runtime, and because writing timing is part of the program flow, all shreds are automatically synchronized to each other by time - this means new shreds can be written on-the-fly, and can easily discover and share the audio, precise to the sample. I am sorry that I am trumpeting ChucK so much - it's that this type of multitasking is what ChucK is built for. > I guess SC server sort of works in a similar way, but have people > worked > much with this sort of distributed > composition/processing? It doesn't exist yet, but the audicle (another current, giant software disaster zone of ours) is designed to be an environment to collaboratively code audio. http://audicle.cs.princeton.edu/ Best, Ge! > > > Tom > http://www.nullpointer.co.uk > http://www.r4nd.org Date: Thu, 15 Jul 2004 20:21:08 +0200 From: Julian Rohrhuber I guess SC server sort of works in a similar way, but have people worked >much with this sort of distributed >composition/processing? There are a couple of projects that have worked over the network. SC2 had an OSC layer that was used by quite a few people. For example Ron Kuivila did a workshop (ICMC Berlin 2000) with networked sound composition (Rhizome Café, an extension of David Tudor's Rainforest). (There was actually a short breakout of live coding after I was fed up with gui) Some other projects are mentioned here: http://swiki.hfbk-hamburg.de:8888/MusicTechnology/194 http://swiki.hfbk-hamburg.de:8888/MusicTechnology/161 I had written a shared control space for sc2 we sometimes used in seminars. Alberto de Campo and I have a workshop/performance series called Warteraum (like waiting space or waiting room) which is about networked live coding. It uses two paradigms, one is algorithmic granular synthesis, the other is using a shared version of jitlib that allows continuous sound processes to be shared. SC3 is always a server-client structure, so the scaling towards a multi-client multi-server architecture is fairly straightforward. I've written a broadcast server that helps to distribute messages to various users. 
We also use a chat to transmit code, which is a simple thing but has proved very effective. Out of these seminars also comes a performance group, 'powerbooks unplugged'. -- . Subject: Re: [livecode] multiple sources From: alex hmmm, actually, what i was trying to do with the photography example was > not compare rejection, but compare mechanical forms of artmaking. Ah, yes that is a good point. Sorry, I'm so used to photography being accepted as a medium that I forget that it is mechanised, and was blind to that aspect of the comparison. > as to your question alex about, "why do we use computers" - i can - or > could - play several instruments and got far enough with some to be a > performance major in college - but i was always frustrated because my > craft was ok but not excellent, despite lots of practice - my mind knew > what it was trying to express, but my fingers didn't move quickly or > deftly enough to express it. (i'm also lousy at most sports). I was lousy at sports at school too, at least I was always picked last for football/rugby, although in his essays Paul Graham tells us that this is because we didn't have the time for popularity contests that are part of all school activities. I took beginners' courses in the guitar and the trumpet but although I never got to be good, I don't think it was just the craft aspects that let me down, but I just got frustrated with not being able to make anything new. I'd play someone else's tune or copy someone else's rhythm but I just couldn't find a way of making my own tune or rhythm. I think programming has helped me really forget about the mechanics of reciting and forced me to try to create something new. I don't think I'm particularly successful but it has got me further than the guitar did at least. alex Subject: Re: [livecode] re: coding from scratch From: alex I've been doing some from-scratch music performance with Squeak ( > http://squeak.org ). This sounds really good, it would be really good to see/hear this one day. Maybe we should organise some kind of TOPLAP conference. > At the minimalist extreme, I've been working on making the virtual > machine and object memory absolutely as small as possible but still able > to start and extend itself. Wow... How fast can you build up a piece of music in such a minimalist environment? > Your point about frozen performances also struck a chord with me... in > addition to being changeable during performance, Squeak can make perfect > snapshots of a running object memory, for resumption later (e.g., on a > different machine, with a different host operating system). Handy, although I suppose one problem is that it can't save what you were thinking about at that moment. Maybe this is less of a problem for other people though, my memory is terrible. It says on your website that you're synaesthetic. You're probably quite bored of explaining this all the time but I have to ask, which of your senses does this affect? Do you see different colours/taste different foods when you look at different algorithms? :) How does this affect your programming? alex Date: Sun, 18 Jul 2004 17:34:40 +0100 From: Nick Collins hope placard went well, Dave, Alex. I had fun, a slight panic with a dodgy mixer, but after that I got into it. The whole thing was improvised, live coded from scratch (still not sure if my lsystems rule writing really counts as code). I found that in the heat of the performance I actually had a lot more time to do things than I had imagined. 
I failed, however, to get my screen projected - too much else to worry about I guess. Next time. A mate of mine recorded the gig, am I allowed to upload it Alex? The two Alexes' performance of drum & perl was a highlight of the day, slowly building up beats and the odd blast of acid melody as drummer-Alex freeformed on top. Lots of terminal window action - bit difficult to see the code itself due to the projector (and the white sheet that wouldn't stay stuck to the wall :] ) You could kind of manually mix the drumming + electronics by adjusting your headphones, I was a bit close to the kit to take full advantage of this though... > Is Dave Griffiths joining us on live video with fluxus? Not coming > to Aarhus? I dunno much about it (and would have to get time off work, which might be tricky then) - really got the live performance bug now though so.... > There are haptics in programming- I can enjoy typing. But feel of typing > doesn't lead to musical themes in the same way themes fit keys or the > breath. In musical improvisation actions are often so automated out of > conscious control that you get a rush of pleasure in the flow of your own > mind/body. Live algorithm programming is overly cerebral? How do we > compromise between innovation and smoothness of execution? I find the keyboard/mouse combination pretty useless for expression personally, although I am stuck with it. I also think they make it difficult to express process to an audience. I'm really interested in moving code away from text (or boxes and wires) and into more immediate forms. I have some embryonic ideas of a system where you would set up chain reactions with big primary coloured lego blocks (a bit like a build your own mr driller level): http://digilander.libero.it/calimerosegg/gallery/gbadvance/mrdrill2.gif Would such a thing still be code? What does "code" mean? A turing complete language? Perhaps I need more sleep :) dave Subject: Re: [livecode] re: coding from scratch From: alex I had fun, a slight panic with a dodgy mixer, but after that I got into > it. Sorry about that - the problem wasn't with the mixer in the end but with my dodgy soldering further along the chain... With all the problems that had caused, the projection completely escaped my mind. > The whole thing was improvised, live coded from scratch (still not > sure if my lsystems rule writing really counts as code). I found that in > the heat of the performance I actually had a lot more time to do things > than I had imagined. I was really pleased with my live coding, it was much more fun programming from scratch than using the "one i'd made earlier" approach. The results were quite simplistic, mainly building rhythms out of modulos and sinewaves... But as a first (and second) from-scratch live programming performance, it did feel like a whole new world opening up. It was a lot of fun performing with French Alex, we'd been practising together for a while until I made the sudden decision to stop using pre-prepared code. The following (and final) practice session was far better than the previous ones. I was able to react to what he was doing in a far more satisfying way than before. We did four run-throughs in that session, each one being really quite different. French Alex is actually more of a pro bass player I believe, it was the first time he'd played drums in front of a live audience. He was sampling and processing his drum hits with max/msp, and as a side effect we stayed in perfect sync (when we wanted to). 
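For the curious, here is a rough idea of what "building rhythms out of modulos and sinewaves" can look like, written as a bang-style routine in the vein of the earlier Perl snippets; the event names and numbers are invented for illustration, not taken from Alex's set.

    #!/usr/bin/perl
    # Sketch only: one bar = 16 ticks; modulo picks the grid, a slow sine
    # decides how busy the hi-hats get.  All names and values are made up.
    use strict;
    use warnings;

    sub bang {
        my ($n) = @_;
        my @hits;
        push @hits, 'kick'  if $n % 8 == 0;                  # downbeats
        push @hits, 'snare' if $n % 8 == 4;                  # backbeats
        my $density = (sin($n * 2 * 3.14159 / 64) + 1) / 2;  # 0..1, four-bar swell
        push @hits, 'hat'   if $n % 2 == 0 && rand() < $density;
        return @hits;
    }

    # quick check without any audio: print the first two bars
    for my $tick (0 .. 31) {
        printf "%2d: %s\n", $tick, join(' ', bang($tick));
    }

Dropped into a tick loop like the one sketched earlier, the sine term slowly opens and closes the hi-hat pattern over a few bars.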
> I failed, however, to get my screen projected - too much else to worry > about I guess. Next time. Sorry for my part in this - had had a lot of problems with missing channels in the first few performances which we traced to my dodgy soldering which had deteriorated since last year! > A mate of mine recorded the gig, am I allowed to upload it Alex? Of course - they're your recording rights! To be honest I took a recording too, I have the entire festival (apart from some bits lost through a couple of power outages) as uncompressed wavs. Sadly I was running about too much to listen to your set properly, although I listened to it today. Ace! A few of the others at state51 paused from their work to listen, raise their eyebrows and mumble approval too. I listened to your set too Tom, it rocked out, and I enjoyed that infinite loop you mentioned... There's still a couple of weeks left of the placard festival, maybe we should make a live coding placard session before its over? Linked via the Internet, with toplap-ratified video streams of our desktops. More info at http://placard7.ath.cx/ > I dunno much about it (and would have to get time off work, which might > be tricky then) - really got the live performance bug now though so.... The runme/dorkbot citycamp is 25/26/27 august in Aarhus, Denmark. It'll be a nice three days of workshops, presentations and performances. The two days before, the 23rd and 24th, will see the readme software art conference, more of an academic conference with presentation of papers including the TOPLAP one. They are like two parts of the same thing with a lot of crosstalk, but I guess a lot (most?) people will not attend both. The free accommodation is officially full up but there may be a chance to fit you in - let me know asap if you want me to ask on your behalf. Likewise I think that the travel funds are used up but there's no harm asking. More info at http://readme.runme.org/ > I find the keyboard/mouse combination pretty useless for expression > personally, although I am stuck with it. I also think they make it > difficult to express process to an audience. I'm really interested in > moving code away from text (or boxes and wires) and into more immediate > forms. On the other hand I think text has some advantages that other forms might not. Text can be startlingly evocative, and have a rhythm and structure that relates well with music. It can be as precise or vague as you like, and the different extremes may be traversed with the introduction or omission of a single word. But yes, as far as writing text and in particular code is concerned, it is rather slow. I'm a really slow writer. The Perl language is a lot faster than many computer languages to use, but even so, typing an algorithm to make a rhythm can be a lot slower than hitting it out a drum. However by writing that text you're invoking actions that are much faster than a drummer. One way of looking at it is that writing code is an extremely fast form of expression, it just has a lot of latency! > I have some embryonic ideas of a system where you would set up chain > reactions with big primary coloured lego blocks (a bit like a build your > own mr driller level): > http://digilander.libero.it/calimerosegg/gallery/gbadvance/mrdrill2.gif > > Would such a thing still be code? What does "code" mean? A turing > complete language? I read somewhere recently (probably wikipedia) that turing complete computers are impossible to build because they require infinite memory. 
In any case, turing completeness seems an arbitrary constraint. I think our earlier discussions led us to realise that it is probably not possible to strictly define the scope of toplap code. So we have to content ourselves with defining live coding as working closely with the composition of the music, rather than being almost completely abstracted away from it by consumer software. There were a lot of ableton live performances at the london placard. Because the performances were back to back and only 20 minutes long, we had four stages, so up to three acts could be setting up while the fourth person plays. At one point all four stages had laptops running ableton live! See ya, alex Date: Tue, 20 Jul 2004 08:17:23 +0100 From: Nick Collins live! grrr well, not entirely serious. But I finally got to see a copy of Delaney's new book on laptop music (PC publishing)- talk about Ableton Live centric! Not even Max/MSP gets a proper mention, and as for PD, SC, custom software, live coding, alternative controllers beyond the obvious MIDI slider stuff... Dave, lego blocks sound fine for computing, tangible interfaces are a very current research topic. See also Ge's NIME04 ref to the Barcelona group's reacTable project. Oh, I e-mailed EMF on the off chance about their laptops festival, awaiting details (probably we have to organise our own sub event somewhere, we shall see), this is from Joel Chadabe: Hi Nick, This sounds great! We'd be delighted to include it. We'll be in touch again in August with details. Best, Joel >Date: Sun, 18 Jul 2004 17:14:45 +0100 >From: Nick Collins >Sender: nc272@hiddenuk >To: laptops@hiddenorg >Subject: TOPLAP > >Hi, > I'm contacting you on behalf of TOPLAP, the Transnational Organisation for the Promotion of Live Algorithm Programming (see toplap.org) in connection with your laptop festival of next year. I was wondering if there might be a place for a live coding event as part of the festival, to which we could bring together performers from around the globe (such as VJ Ubergeek, Ge Wang (ChucK), klipp av, yaxo paxo and Julian Rohrhuber/Alberto de Campo). A prototype meeting of this type is occurring at the README software arts festival in Aarhus this summer, but we're thinking ahead to future chances to promote live programming. > >I heard about the EMF's festival from a contact, but found no details on the web site, so I hope you don't mind this approach, and could possibly tell me more. > >Best wishes, >Nick Collins 
Subject: Re: [livecode] re: coding from scratch From: Dave Griffiths There's still a couple of weeks left of the placard festival, maybe we > should make a live coding placard session before its over? Linked via > the Internet, with toplap-ratified video streams of our desktops. More > info at http://placard7.ath.cx/ not sure I have the technology to do this, but I'm up for more > > I dunno much about it (and would have to get time off work, which might > > be tricky then) - really got the live performance bug now though so.... > > The runme/dorkbot citycamp is 25/26/27 august in Aarhus, Denmark. It'll > be a nice three days of workshops, presentations and performances. ok, definitely gonna try to make it to this, free accommodation would definitely help if it's at all possible... > > I find the keyboard/mouse combination pretty useless for expression > > personally, although I am stuck with it. I also think they make it > > difficult to express process to an audience. I'm really interested in > > moving code away from text (or boxes and wires) and into more immediate > > forms. > > On the other hand I think text has some advantages that other forms > might not. Text can be startlingly evocative, and have a rhythm and > structure that relates well with music. It can be as precise or vague > as you like, and the different extremes may be traversed with the > introduction or omission of a single word. yeah, I change my opinion a lot on this subject - I like the idea of trying to look at programming in different ways, but when it comes down to it I won't let go of my text editor... (endured a spate of UML flirtation from management a few years ago at work, and got all precious about text) > But yes, as far as writing text and in particular code is concerned, it > is rather slow. I'm a really slow writer. 
The Perl language is a lot > faster than many computer languages to use, but even so, typing an > algorithm to make a rhythm can be a lot slower than hitting it out a > drum. However by writing that text you're invoking actions that are > much faster than a drummer. > > One way of looking at it is that writing code is an extremely fast form > of expression, it just has a lot of latency! I think at the very least we need to come up with some good languages for live programming - maybe even design one for the purpose. > > I have some embryonic ideas of a system where you would set up chain > > reactions with big primary coloured lego blocks (a bit like a build your > > own mr driller level): > > http://digilander.libero.it/calimerosegg/gallery/gbadvance/mrdrill2.gif > > > > Would such a thing still be code? What does "code" mean? A turing > > complete language? > > I read somewhere recently (probably wikipedia) that turing complete > computers are impossible to build because they require infinite memory. > In any case, turing completeness seems an arbitrary constraint. it's the idea that if you can use a machine to duplicate the properties of a turing machine (which is a universal computer) even given infinite memory and time - it can be used to create any other machine - which is nice, and important somehow. Cellular automata can be made into turing machines (glider guns can be made into logic gates etc...) Most programming languages obviously, but not languages like HTML, and probably not my melodic lsystems. > I think our earlier discussions led us to realise that it is probably > not possible to strictly define the scope of toplap code. So we have to > content ourselves with defining live coding as working closely with the > composition of the music, rather than being almost completely abstracted > away from it by consumer software. I'm happy with this definition, as it's closer to the point than going off down the computer science route. I think there is a continuum of interface between the knobs and sliders of ableton live and the code we are using - the fact we seem to need to do this highlights a shortcoming of accepted ideas in GUI land maybe. more waffle, cheers, dave Subject: Re: [livecode] re: coding from scratch From: Dave Griffiths At one point all four stages had laptops running ableton > > live! > > grrr > > well, not entirely serious. But I finally got to see a copy of Delaney's > new book on laptop music (PC publishing)- talk about Ableton Live centric! well, it probably does cover 90% of live laptoppery these days - and you can't fault them on design principles, throwing away features until everything fits on the screen :) > Not even Max/MSP gets a proper mention, and as for PD, SC, custom software, > live coding, alternative controllers beyond the obvious MIDI slider stuff... > > Dave, lego blocks sound fine for computing, tangible interfaces are a very > current research topic. See also Ge's NIME04 ref to the Barcelona group's > reacTable project. I love stuff like this - I wonder if they have connections with the guys at MIT: http://web.media.mit.edu/~jpatten/audiopad/ dave Subject: Re: [livecode] coding from scratch From: Dave Griffiths Yeah, I had fun at placard (shame I couldnt stay longer), > I didnt start from scratch though (boo, hiss etc), > But decided instead to do some on the fly recoding of some semi-complete > patches. I liked the development of your set, you had a good pace (I got a bit lost and mine ended up a tad samey) was it all PD - synthesis too? 
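To make the "fast expression with a lot of latency" point above concrete, here is a minimal Perl sketch of the kind of rhythm algorithm being talked about - an illustration only, with print statements standing in for real triggers to a synth or sampler:

  use strict;
  use warnings;
  use Time::HiRes qw(sleep);

  my $bpm  = 140;
  my $step = 60 / $bpm / 2;                  # eighth-note duration in seconds
  my @bar  = qw(kick hat snare hat kick kick snare hat);

  while (1) {
      for my $hit (@bar) {
          print "$hit\n";                    # stand-in for triggering a sample or synth
          sleep $step;
      }
      push @bar, splice(@bar, 0, 1);         # rotate the bar so the pattern slowly shifts
  }

A handful of keystrokes changing @bar or $bpm while something like this runs is the whole performance gesture - slow to type, instant in effect once typed.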
Should have grabbed you for a chat, but it was all quite busy by then (and competition for sockets was fierce ;)) > 3. text interface. although above i was being keen on different form of gui > and representation i think that text can be a very powerful > method of interacion and display. Theres something that seems more permanent > in a text phrase than an icon. > and text interfaces can also represent the history of a performance in a way > icon/gui based work cant.. > revisions are easier to see and theres a closer allusion to poetry and the > elegance/efficiency implied I think there is some ground between text and gui that isn't often covered, flow graphs are close, but I think there may be more ways of doing it. I'd like a complete programming language that is visual (shapes, colours, no text) and clear enough for an audience to see a clear correlation between what they are seeing in the "code" and what is happening. I'm probably asking for too much... dave From: Ge Wang I think at the very least we need to come up with some good languages > for live programming - maybe even design one for the purpose. I agree. ChucK currently allows code to be added/changed on the fly, you can do this from the command line, or from within the ChucK program, which can write new programs on-the-fly. Perry and I are working on making this happen at an even finer level (per line/per instruction). ChucK is also totally open to new ideas - especially from people like all of us live coders. Also, I am totally down with designing other languages. Perhaps an updated version of Forth - the ultimate write-only language! That is really what we want (sometimes). write-only. If others are interested, let's look into this more! Environments are also crucial. we have an audicle site: http://audicle.cs.princeton.edu/ Thoughts? Best, Ge! From: > On Jul 21, 2004, at 11:58 AM, Dave Griffiths wrote: > > > I think at the very least we need to come up with some good languages > > for live programming - maybe even design one for the purpose. > > I agree. ChucK currently allows code to be added/changed on the fly, > you can do this from the command line, or from within the ChucK program, > which can write new programs on-the-fly. Perry and I are working on > making this happen at an even finer level (per line/per instruction). > ChucK is also totally open to new ideas - especially from people like > all of us live coders. > > Also, I am totally down with designing other languages. Perhaps an > updated version of Forth - the ultimate write-only language! That is > really what we want (sometimes). write-only. If others are interested, > let's look into this more! > > Environments are also crucial. we have an audicle site: > > http://audicle.cs.princeton.edu/ > > Thoughts? > > Best, > Ge! > > > From: Ge Wang wrote: > I've just been lookinjg at the docs for chuck, seems like a good idea. > Seems a bit like sc server? At a high level, some stuff (like the listener/client setup) is like SC Server. the language and programming model is quite different, with the timing model and the precise support for concurrency (both really work together) ChucK, in some sense, is more like a poor-man's Max, where you can connect ugen's quickly. Where it differs from Max/SC and many other languages is that the control for the ugen's can be completely separated from the patching itself, and can be exerted at arbitrary time granularities. The concurrency is very useful here to organize everything according to the needs of the program. 
> I'm interested in the system but I'm also interested in the > environment, We totally agree on this. both the language(s) and environment(s) are crucial pieces to effective live coding. > In chuck I can see it would be beneficial to have the ability to run > multiple windows holding code fragments that can be cut/paste > and 'shredded' etc. allowing a multitasking fluidity to the > environment. Definitely. The audicle will strive to do this, but the exact semantics is still up in the air. Currently, the audicle is designed to have a flow graph of windows denoting parent-child shred relationships and timing. But this is still not well-defined. Hopefully we can all work on making this aspect fluid, like playing a good video game or even better, a musical instrument. Good thoughts Tom, let's keep going on this. And what do others think about this? > It would be nice to allow users to define chuck files as 3rd > party generators etc > (as in max/pd as both compiled code and as language built > subroutines), and > have these be easily accessible in the performance environment. Yeah! I think because ChucK has a well-defined set of byte-code instructions and sample-synchronous timing, we can do a compiled "plug-in" (an internal, ha) natively in ChucK. This feature doesn't exist yet. Currently, it IS possible to import ugen's and generic libraries from c++ (not documented yet). > This may well be a case of adding extra shells to the > system but it could radically affect the usability/performance flow etc An interesting question with this is how to organize the predefined stuff as the performer and also make it available to the audience, to perhaps give them a hint and a preview of things to come. I think we all may have experienced difficulty conveying some ideas/algorithms even when the screen is projected. This is really an interesting HCI-esque research problem - how to effectively convey the process visually to the audience? > mmmmmmmmm i think i've just realised that you have a project 'audicle' > that > is > looking at this sort of stuff hmmmmm... ah well i've written this so > I'm > still gonna send it ;) I am very glad you did. In addition to new ideas, it shows that we have been thinking in similar directions for the things to try/research next! ChucK is very young (currently implementing arrays, ha), and it really isn't based on any one language. we are free to move it in new and better directions (certainly from its current rather fledgling state). Come and join the discussion list - currently 70+ souls are subscribed. general list: https://lists.cs.princeton.edu/mailman/listinfo/chuck developer: https://lists.cs.princeton.edu/mailman/listinfo/chuck-dev Best, Ge! From: "Dave Griffiths" A recording of a collaboration between French Alex on drums processed > with his MAX patches and me (English Alex) live coding from scratch with > Perl at the london placard headphone festival: > http://yaxu.org/table-placard-20040717.ogg > > The setting: > http://state51.org/placard/pictures/~placard2004-0/ > > This is us: > http://state51org.static3.state51.co.uk/63/84/3010054_j6CZ/original.jpeg > > alex Subject: Re: [livecode] placard table recording From: alex should we add toplap wiki pages for past events, recordings and so on? > maybe also live coding apps/frameworks languages, that sort of thing? Sure why not? The deadline for submissions to the transmediale award is coming up, 15th September: http://www.transmediale.de/05/pdf/tm05_call.pdf Shall we enter the TOPLAP manifesto?
We should put forward a proposal for the club transmediale thing at least. alex Date: Mon, 09 Aug 2004 09:23:04 +0100 From: Nick Collins wrote: > On Wed, 2004-08-04 at 14:26, Dave Griffiths wrote: >> should we add toplap wiki pages for past events, recordings and so on? >> maybe also live coding apps/frameworks languages, that sort of thing? > > Sure why not? > > The deadline for submissions to the transmediale award is coming up, > 15th September: > > http://www.transmediale.de/05/pdf/tm05_call.pdf > > Shall we enter the TOPLAP manifesto? We should put forward a proposal > for the club transmediale thing at least. > > alex > > Subject: Re: [livecode] placard table recording From: alex This would be fun- certainly a good chance to reconnect with Julian, > Alberto again. (Ok to involve you J?) > > Perhaps also a London event for December or so? That'd be good - I'd like to host it here but it can be quite cold in the factory then. Otherwise we could do something at the ICA perhaps. > We can discuss in Aarhus; even do the submission from there. Yes, lets draft a submission there and run it past those who can't make it via the mailing list. alex Subject: [livecode] ultrasound, huddersfield From: alex I'd go. And gladly concede any funding towards getting Julian over too. I'm happy to pool funds too. > Looks like the sort of thing you could propose an evening for. Yep, I'll propose some TOPLAP presentations and performances. > Again, we can discuss/write proposals next week. I'm happy to type some > bumpf. They want some brief info for early marketing - I'll send them a little bit of text and some images. alex From: On Mon, 2004-08-16 at 15:01, Nick Collins wrote: > > I'd go. And gladly concede any funding towards getting Julian over too. > > I'm happy to pool funds too. > > > Looks like the sort of thing you could propose an evening for. > > Yep, I'll propose some TOPLAP presentations and performances. > > > Again, we can discuss/write proposals next week. I'm happy to type some > > bumpf. > > They want some brief info for early marketing - I'll send them a little > bit of text and some images. > > alex > > > > Date: Tue, 17 Aug 2004 11:07:51 +0100 Subject: Re: [livecode] ultrasound blurb From: Alex Some computer languages allow changing a running process on the fly by > rewriting the code that defines it. Applied to computer music this means > that one can write sound compositions while they are already playing. The relationship between computer languages and "computer music" is for you natural and when your are talking about it, I hear a guitarist or a piano player talking about its instrument. But the link is far from being obvious to people generally and moreover for traditional musicians who are still very much going towards computers for their abilities to sequence and edit music and not so much for composing. For this you may have to go through a rough explanation of the entity of electronic music or how a computer "understand" music. A good introduction to this has been written by Miller Puckette, as an online book: http://crca.ucsd.edu/~msp/techniques.htm Sure, for a short proposal, it may difficult to condense this in one or two phrases but I think they are quite essential. WOOOW this is very interesting guys, I keep on listening.... Alex (the French one). From: "N. Collins" Why not. Do we have any god live coder shots? 
We have screenshots, not > so much of venue+coder+projected backdrop of code under > construction+audience (difficult to get in one shot perhaps)- digital > cameras must be on stand-by next week I think... > > --On Wednesday, August 18, 2004 9:15 am +0100 alex > wrote: > >> >> Thanks for this Nick, they're also asking for images - shall I just >> send >> the logo? >> >> alex >> > > > > From: "N. Collins" So I improvised the TOPLAP presentation here in Aarhus and learnt > the subject of this message. > > In particular, start with some (basic) templates for the work you > do. (ie SC, don't write all SynthDefs from scratch) hmm, I guess there is a danger to being too puritan with these things. > I did an entirely blank slate demo- after ten mins, had a starting > point, but had bored audience during the setup, though I might have > proved some point or other. It's a good point though, I've set up a livecoding template script for fluxus that just renders a cube in a little render function, but is a good starting point for building on (are we still doing a livecoding jam btw?) - and I was feeling a little dirty and unpure - but otherwise it takes a few minutes to write really boring code. > cheers, see some of you this week... see you later, tomorrow probably... erm, I'm the one with messy dreads btw :) dave Date: Tue, 24 Aug 2004 14:10:17 +0200 From: Julian Rohrhuber On 24 Aug 2004 11:08:28 +0100, N. Collins wrote >> So I improvised the TOPLAP presentation here in Aarhus and learnt >> the subject of this message. >> >> In particular, start with some (basic) templates for the work you >> do. (ie SC, don't write all SynthDefs from scratch) I mean really interesting it gets when you write the C code from scratch, say a new ugen ;) have fun.. >hmm, I guess there is a danger to being too puritan with these things. > >> I did an entirely blank slate demo- after ten mins, had a starting >> point, but had bored audience during the setup, though I might have >> proved some point or other. > >It's a good point though, I've set up a livecoding template script for fluxus >that just renders a cube in a little render function, but is a good starting >point for building on (are we still doing a livecoding jam btw?) - and I was >feeling a little dirty and unpure - but otherwise it takes a few minutes to >write really boring code. > >> cheers, see some of you this week... > >see you later, tomorrow probably... erm, I'm the one with messy dreads btw :) > >dave -- . Date: Tue, 24 Aug 2004 19:35:12 +0200 From: Julian Rohrhuber more responsibility - makes code a little more complex (uh.) - introduces some cases of ambiguity: -> using event streams, which do not have a rate information, it is not guaranteed to be correct -> retrieving the index from outside, say for using synth messages, index needs to be updated ________ well if you like, try it, (you'll have to replace ProxySpace.sc by the attached new version) and tell me what you think. best wishes, Julian From: "N. Collins" We'd also appreciate a general tidying up of the TOPLAP site. We have > video footage of last night to add, and could tidy some of the > profiles etc. > > Thoughts, disclaimers, excitement? I'd be happy to have a bash at smartening up the site visually.
A CSS for the wiki would be a good start and the logo should be incorporated into the page layout instead of just being pasted onto the front page. Or is it in TOPLAP's interest to retain the 1996 HTML chic? Subject: Re: [livecode] transmediale submission From: alex I'd be happy to have a bash at smartening up the site visually. A CSS > for the wiki would be a good start and the logo should be incorporated > into the page layout instead of just being pasted onto the front page. I'm definitely for this, you should have sudo access so feel free to fiddle. Only problem is that wiki.slab.org/wiki uses the same installation, hopefully it's easy to change them separately. alex Subject: Re: [livecode] transmediale submission From: alex I'm definitely for this, you should have sudo access so feel free to > fiddle. Only problem is that wiki.slab.org/wiki uses the same > installation, hopefully it's easy to change them separately. It's all embedded in the cgi script which is installed from a debian package so will get upgraded with the rest of the system, so can't really be edited as it is. You can specify a css file in the config file though which is in /etc/usemod-wiki/config alex Subject: [livecode] radio + more wiki From: alex We could also use some sections describing > each of the systems that we use This would be good, and could include excerpts from existing toplap related papers. > and perhaps even a tutorial of sorts. There's an interesting idea. What level would it operate on? I suppose that because there are several different systems it wouldn't be possible to write a TOPLAP tutorial that says in detail how someone can work with live code. Indeed, it would probably be against the fine principles of TOPLAP to be so prescriptive about how things are done. However it might work on a higher level, for example a tutorial for how to make your own live coding environment... Running through the issues that we have highlighted, for example, how a load / save function should work, how much should be predefined, the role of graphical maps and programming, and so on. So really a tutorial that asks questions rather than providing easy answers. alex Date: Sun, 29 Aug 2004 12:48:45 +0100 From: Nick Collins While promoting the placard presentation french Alex and I were > forced into appearing on Resonance FM by a force of great evil. > However it turned out to be a lot of fun. We ended up playing one > of Dave's live coded lsystem tracks. > > Anyway they were interested in featuring TOPLAP in a future program. > I don't think their listening base is particularly huge but it > might be an interesting challenge to try to get the idea of TOPLAP > across without any visual clues. > > Would anyone be interested in that? I'd be interested. Maybe get someone to sing the code over the music... :) I think resonance's web stream is quite popular, even if the actual radio broadcast is quite small. dave Date: Sun, 29 Aug 2004 18:51:34 +0100 From: Nick Collins wrote: > On Sun, 29 Aug 2004 05:03:13 +0100, Dave Griffiths wrote >> While promoting the placard presentation french Alex and I were >> forced into appearing on Resonance FM by a force of great evil. >> However it turned out to be a lot of fun. We ended up playing one >> of Dave's live coded lsystem tracks. >> >> Anyway they were interested in featuring TOPLAP in a future program. >> I don't think their listening base is particularly huge but it >> might be an interesting challenge to try to get the idea of TOPLAP >> across without any visual clues. 
>> >> Would anyone be interested in that? > > I'd be interested. > > Maybe get someone to sing the code over the music... :) > > I think resonance's web stream is quite popular, even if the actual radio > broadcast is quite small. > > dave > Date: Sun, 29 Aug 2004 20:31:36 +0200 Subject: Re: [livecode] big toplap performance movie and a small image From: Fredrik Olofsson Fredrik didn't publicise but he has put a massive Quicktime movie of > Thursday's TOPLAP performance up on a wiki: er, no i haven't. was someone else's clip up there (toplap1.mov). mine was too big i think -see below. > TOPLAP live jam rocked! yeah, that was great fun. i have an almost complete movie of the happening. only a couple of minutes missing here and there when resting arm / doing camera hand-over (you'll notice the cuts). compressed the movie is 260mb (42min, mpeg4, 360x288). sadly i put quite some effort to get it onto the dorkbot wiki but failed. instead there are now small clips...
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-01.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-02.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-03.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-04.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-05.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-06.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-07.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-08.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-09.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-10.mov
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-11.mov
and a few stills...
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-01.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-02.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-04.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-06.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-07.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-10.jpg
http://projects.dorkbot.org/rd04/wiki/MediaFiles?action=AttachFile&do=get&target=livecoding-read_me04-26.jpg
(non-permanent links) these are just quick semi-random grabs and i hope to get the whole thing online so we all together can agree on good selections for promo material. also i'm keeping the dv original if this compression doesn't work out or if someone wants high-quality media.
line-up left-to-right: dave - fluxus - visuals alex - feedback.pl - sound (beginning&end) nick - supercollider - sound (beginning&end) fredrik - supercollider/jitlib - sound (second half) tom - pd - sound (not end) joe gilmore - max - sound amy - the thingee - visuals slightly annoyed that i can't finish this and concentrate on other work. anyway, thanks all for a very good week. great to meet up in person as always. _f #| fredrikolofsson.com klippav.org musicalfieldsforever.com |# Date: Sun, 29 Aug 2004 20:34:50 +0100 (BST) Subject: Re: [livecode] big toplap performance movie and a small image From: "alex" It's from 21:30 to 22:30 on fortnightly Wednesdays - would you be able > to make that Nick? er, whatever that 'fortnightly' means i'll be in london wednesday 15th. _f #| fredrikolofsson.com klippav.org musicalfieldsforever.com |# Date: Mon, 30 Aug 2004 08:09:38 +0100 From: Nick Collins wrote: >> It's from 21:30 to 22:30 on fortnightly Wednesdays - would you be able >> to make that Nick? > > er, whatever that 'fortnightly' means i'll be in london wednesday 15th. > _f > > #| > fredrikolofsson.com klippav.org musicalfieldsforever.com > |# > > Date: Mon, 30 Aug 2004 08:23:42 +0100 From: Nick Collins osc has quite a good system for timing I think. > It uses the NTP time server, to create time stamps according to the > osc time format (64 bit, first 32bit is seconds since 1970, second 32 > is divisions of a second). > You then simply schedule the packets ahead in time, so the maximum > lag time is accounted for. This way you get very exact timing. just got osc up and running (via Steve Harris's liblo), it seems pretty nice. I see that you can timestamp event bundles to happen at a particular time. this is a slightly different matter to syncing multiple livecoding sessions on different machines though. as far as I can see it (and I might be being thick here, not sure what apps like sc and pd do) we'd need a master clock - either specifying times for the next tick - or broadcasting occasional sync times every x ticks to keep everyone in time. and this is assuming we're all using the same real time clock... confused... dave Subject: Re: [livecode] toplap-perf-postmortem From: Dave Griffiths Hi All, > > Was nice to see you all in Aarhus and run with the livecoding massive.. > > However I was wondering if we should try some sort of informal postmortem on > the gig. good idea > I felt that one aspect of live coding should be to de-mystify the process > and > reveal the act of coding to the audience so that they can gain a better > understanding > of the processes involved and have acloser relationship with the realtime > development of the > preformance.. Demystification is the aim, but I don't think it's realistic at this point for people to understand the languages we are using (I don't understand any of the sc code). I think simply showing it is enough for people to see that things are happening and doing so live. It's a start. People I spoke to said there was a tangable feeling of improvisation and/or panic :) > I felt that in aarhus the 1 screen didnt really allow for this > as it didnt allow the audience to > follow a single visual source to understand the coding process and relate it > to the audio evoilution etc. Yep, one screen leads to total chaos (kudos to Amy for dealing with the mixing though) - in the ideal setting we'd have a more controlled view of the coding. 
Getting everyone to use a black background is essential if the screens are overlayed - the pd window blew everything else away :) Maybe a split screen aproach would be better - with current eq levels of that performer, so you could see who was contributing what at a given time. Whatever, we need something pretty complex in the video mixing department with lots of people playing. > In connection with this i was a bit confused by daves imagery, i wasnt quite > sure how it related to the > evolving performance (were you taking a feed from someone?) I had audio out of the mixer. > and i was keen > to see some of the code that > was actually generating the visuals... The little green window floating around is the fluxus script editor. The font's a bit small though. Will be fixed. > I guess that it being a club > environment it was inevitable that > the performance would be geared in that direction, but i wonder how a less > 'party' setting would effect the > experience/peformance...perhaps people maight be put more 'on the spot' > which could be interesting... big spotlights on the current performers? :) > p.s. i reckon maybe we should comment our livecode as we go along! ;) A few observations from my POV, firstly a confession, I did load some scripts up - really as new starting points to continue from. I think I'll write some livecode template scripts for different types of visuals. This might be a bit different from doing sound, not sure. I reloaded the starting script a couple of times to take things in a different direction. I had a great time too though - much more fun than I had imagined (even with gl lockups), I am a total beginner at this performance malarky compared to you guys though :) cheers, dave Subject: Re: [livecode] big toplap performance movie and a small image From: alex Thankyou for all the effort in preparing these Fredrik! > > There is some great stuff here. Yep thanks a million, the full mov is now up here: http://static.state51.co.uk/alex/toplap/livecoding-read_me04.mov I'll need to check that it can stay there before we publicise that URL though. alex Date: Mon, 30 Aug 2004 15:00:25 +0200 Subject: Re: [livecode] big toplap performance movie and a small image From: Fredrik Olofsson http://static.state51.co.uk/alex/toplap/livecoding-read_me04.mov thanks for hosting. best part imo is definitely around 33 minutes when dave crashes ;-) now i just wonder _who_ put that microphone stand right in front of the screen. annoying. _f #| fredrikolofsson.com klippav.org musicalfieldsforever.com |# Date: Mon, 30 Aug 2004 17:56:32 +0200 From: Julian Rohrhuber Hi all, > >Just boring technical questions, how do you propose keeping timing >locked between performers, is this a job for OSC, or ye olde midi, or >something else (a big drum?) :) > >Also a question for Alex - how do you specify timings using feedback.pl >and OSC? are timestamps in seconds or ticks? > >I'm just rewriting my sequencing app and I just need some pointers >really... osc has quite a good system for timing I think. It uses the NTP time server, to create time stamps according to the osc time format (64 bit, first 32bit is seconds since 1970, second 32 is divisions of a second). You then simply schedule the packets ahead in time, so the maximum lag time is accounted for. This way you get very exact timing. -- . Subject: Re: [livecode] timing is everything From: alex osc has quite a good system for timing I think. 
> It uses the NTP time server, to create time stamps according to the > osc time format (64 bit, first 32bit is seconds since 1970, second 32 > is divisions of a second). > You then simply schedule the packets ahead in time, so the maximum > lag time is accounted for. This way you get very exact timing. Yep but OSC doesn't really have a system for timing beyond being able to specify that timestamp. I mean, OSC doesn't directly interact with NTP beyond borrowing its timestamp format, does it? http://www.cnmat.berkeley.edu/OpenSoundControl/OSC-spec.html NTP is great for synchronising clocks before and during a live jam, but the various systems also need to start at the same time and go at the same speed. If one joins in while others are running then it needs to be able to work out when to start, and when the speed is to change all the systems need to know about it. It would be great to work out a TOPLAP standard for time synch, a simple one that is easy to implement. Does anyone feel like working something out together? I don't mind taking my system apart and describing it in detail if it would help, but bear in mind that while it works in practice without drifting, it is fairly naive. The time server is here: http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/tm-0.1.pl and the client side stuff is here, in the spread_event_loop() routine: http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/feedback-0.1.pl I'm using the 'spread' protocol and libraries to send timing information around out of programmer's laziness, it would be better to use OSC for this. When it comes to sending messages to SuperCollider, I'm using timestamped OSC bundles, with a little bit of buffering as Julian describes. Or does an OSC based timing protocol already exist without me knowing about it? How does SuperCollider deal with this? Klippav, how do you keep SuperCollider and Max in synch? alex From: "Tom Betts" On Mon, 2004-08-30 at 16:56, Julian Rohrhuber wrote: > > osc has quite a good system for timing I think. > > It uses the NTP time server, to create time stamps according to the > > osc time format (64 bit, first 32bit is seconds since 1970, second 32 > > is divisions of a second). > > You then simply schedule the packets ahead in time, so the maximum > > lag time is accounted for. This way you get very exact timing. > > Yep but OSC doesn't really have a system for timing beyond being able to > specify that timestamp. I mean, OSC doesn't directly interact with NTP > beyond borrowing its timestamp format, does it? > > http://www.cnmat.berkeley.edu/OpenSoundControl/OSC-spec.html > > NTP is great for synchronising clocks before and during a live jam, but > the various systems also need to start at the same time and go at the > same speed. If one joins in while others are running then it needs to > be able to work out when to start, and when the speed is to change all > the systems need to know about it. > > It would be great to work out a TOPLAP standard for time synch, a simple > one that is easy to implement. Does anyone feel like working something > out together? I don't mind taking my system apart and describing it in > detail if it would help, but bear in mind that while it works in > practice without drifting, it is fairly naive. 
> > The time server is here: > http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/tm-0.1.pl > > and the client side stuff is here, in the spread_event_loop() routine: > http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/feedback-0.1.pl > > I'm using the 'spread' protocol and libraries to send timing information > around out of programmer's laziness, it would be better to use OSC for > this. When it comes to sending messages to SuperCollider, I'm using > timestamped OSC bundles, with a little bit of buffering as Julian > describes. > > Or does an OSC based timing protocol already exist without me knowing > about it? How does SuperCollider deal with this? Klippav, how do you > keep SuperCollider and Max in synch? > > alex > > > > Subject: Re: [livecode] toplap-perf-postmortem From: alex On Mon, 2004-08-30 at 16:56, Julian Rohrhuber wrote: >> osc has quite a good system for timing I think. >> It uses the NTP time server, to create time stamps according to the >> osc time format (64 bit, first 32bit is seconds since 1970, second 32 >> is divisions of a second). >> You then simply schedule the packets ahead in time, so the maximum >> lag time is accounted for. This way you get very exact timing. > >Yep but OSC doesn't really have a system for timing beyond being able to >specify that timestamp. I mean, OSC doesn't directly interact with NTP >beyond borrowing its timestamp format, does it? > >http://www.cnmat.berkeley.edu/OpenSoundControl/OSC-spec.html > >NTP is great for synchronising clocks before and during a live jam, but >the various systems also need to start at the same time and go at the >same speed. If one joins in while others are running then it needs to >be able to work out when to start, and when the speed is to change all >the systems need to know about it. yes, this is the funny thing - time stays relative and interwined with space, whatever we try.. On the osc list there was expressed the need for a synching method for OSC, so that on a request anyone can join in. I think that would be the best standard, because once you have synchronized clocks, the rest is easy. > >It would be great to work out a TOPLAP standard for time synch, a simple >one that is easy to implement. Does anyone feel like working something >out together? I don't mind taking my system apart and describing it in >detail if it would help, but bear in mind that while it works in >practice without drifting, it is fairly naive. we could agree on an osc message that contains a tempo information, maybe even, as a start, a simple clock tick without a timestamp, as LANs are quite fast anyway. >The time server is here: >http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/tm-0.1.pl > >and the client side stuff is here, in the spread_event_loop() routine: >http://cpan.org/authors/id/Y/YA/YAXU/perl-music-article/examples/feedback-0.1.pl > >I'm using the 'spread' protocol and libraries to send timing information >around out of programmer's laziness, it would be better to use OSC for >this. When it comes to sending messages to SuperCollider, I'm using >timestamped OSC bundles, with a little bit of buffering as Julian >describes. > >Or does an OSC based timing protocol already exist without me knowing >about it? How does SuperCollider deal with this? Klippav, how do you >keep SuperCollider and Max in synch? > >alex -- . Subject: Re: [livecode] timing is everything From: alex yes, this is the funny thing - time stays relative and interwined > with space, whatever we try.. 
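For reference, the timetag format under discussion is NTP's 64-bit fixed-point layout: the top 32 bits count whole seconds (from 1 January 1900, i.e. 2,208,988,800 seconds before the Unix epoch) and the bottom 32 bits are a binary fraction of a second. A minimal Perl sketch of building such a timetag and scheduling a bundle slightly ahead of now to absorb network jitter - an illustration of the format only, not the feedback.pl code:

  use strict;
  use warnings;
  use Time::HiRes qw(time);

  # NTP (and OSC timetags) count seconds from 1900-01-01; Unix time counts from 1970-01-01.
  use constant NTP_OFFSET => 2208988800;

  sub osc_timetag {
      my ($unix_time) = @_;                                       # floating-point Unix time
      my $secs = int($unix_time) + NTP_OFFSET;                    # whole seconds since 1900
      my $frac = int( ($unix_time - int($unix_time)) * 2**32 );   # fractional seconds
      return pack 'NN', $secs, $frac;                             # 8 bytes, big-endian
  }

  my $latency = 0.2;                                    # send 200ms ahead of time
  my $timetag = osc_timetag(time() + $latency);         # goes at the head of an OSC bundle

Agreeing on a shared scheduling latency like this gets everyone's renderers firing together, but it still leaves the tempo and phase question open - each client also has to know where beat one is.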
Heh > On the osc list there was expressed the need for a synching method > for OSC, so that on a request anyone can join in. Did they get anywhere? > we could agree on an osc message that contains a tempo information, > maybe even, as a start, a simple clock tick without a timestamp, as > LANs are quite fast anyway. I had a lot of problems trying to get that to work with more than one sound renderer. I think each process needs its own timer. Hmm, well maybe I'll try to come up with a working prototype that uses OSC. alex Date: Mon, 30 Aug 2004 23:31:52 +0200 From: Julian Rohrhuber On Mon, 2004-08-30 at 20:26, Julian Rohrhuber wrote: >> yes, this is the funny thing - time stays relative and interwined >> with space, whatever we try.. > >Heh > >> On the osc list there was expressed the need for a synching method >> for OSC, so that on a request anyone can join in. > >Did they get anywhere? no, but maybe there would be some resonances if we brought it up again. Maybe you should all join the osc list anyway? osc_dev-request@hiddenedu?subject=subscribe >> we could agree on an osc message that contains a tempo information, >> maybe even, as a start, a simple clock tick without a timestamp, as >> LANs are quite fast anyway. > >I had a lot of problems trying to get that to work with more than one >sound renderer. I think each process needs its own timer. Hmm, well >maybe I'll try to come up with a working prototype that uses OSC. maybe that works in sc because the server thread runs at a very high priority. But I haven't read the sources carefully yet. -- . Subject: Re: [livecode] timing is everything From: alex no, but maybe there would be some resonances if we brought it up again. > Maybe you should all join the osc list anyway? Yep, ok I subscribed. Remembering the open source mantras though I'll try to come up with some working code before bringing the subject up. alex Date: Tue, 31 Aug 2004 09:45:05 +0200 From: Julian Rohrhuber On Mon, 2004-08-30 at 22:31, Julian Rohrhuber wrote: >> no, but maybe there would be some resonances if we brought it up again. >> Maybe you should all join the osc list anyway? > >Yep, ok I subscribed. Remembering the open source mantras though I'll >try to come up with some working code before bringing the subject up. I also found the thread where it was discussed: http://www.create.ucsb.edu/pipermail/osc_dev/2003-May/subject.html -- . Subject: [livecode] timing + netmage From: alex I also found the thread where it was discussed: > > http://www.create.ucsb.edu/pipermail/osc_dev/2003-May/subject.html Wow, that looks a great + involved thread, I will read it carefully, thanks Julian. Another festival, this one in Bologna, Italy in January - should we apply? Deadline is in October. http://www.netmage.it/ alex From: Ge Wang looks good- with 1000 euro funds plus flights we could certainly > involve Julian and even Ge if they'll do transatlantic... I would love to be a part of this! Especially if a lot of us toplappers are going to be there. Let me know how I can help get us there. Best, Ge! > > --On Tuesday, August 31, 2004 12:34 pm +0100 alex > wrote: > >> On Tue, 2004-08-31 at 08:45, Julian Rohrhuber wrote: >>> I also found the thread where it was discussed: >>> >>> http://www.create.ucsb.edu/pipermail/osc_dev/2003-May/subject.html >> >> Wow, that looks a great + involved thread, I will read it carefully, >> thanks Julian. >> >> Another festival, this one in Bologna, Italy in January - should we >> apply? Deadline is in October. 
http://www.netmage.it/ >> >> alex >> From: Ge Wang > and perhaps even a tutorial of sorts. > > There's an interesting idea. What level would it operate on? It could give a jumpstart tutorial of several systems. To give people both an idea of how to get rolling, and that he/she can use more than one system. Basic features upon which to build live code experiments. Learning (the systems, not a particular style necessarily) by example. > I suppose > that because there are several different systems it wouldn't be > possible > to write a TOPLAP tutorial that says in detail how someone can work > with > live code. Indeed, it would probably be against the fine principles of > TOPLAP to be so prescriptive about how things are done. Good point. > However it > might work on a higher level, for example a tutorial for how to make > your own live coding environment... Running through the issues that we > have highlighted, for example, how a load / save function should work, > how much should be predefined, the role of graphical maps and > programming, and so on. So really a tutorial that asks questions > rather > than providing easy answers. It should also inspire and pique the desire to live code. Best, Ge! Date: Wed, 1 Sep 2004 21:20:13 +0200 From: Julian Rohrhuber looks good- with 1000 euro funds plus flights we could certainly >involve Julian and even Ge if they'll do transatlantic... > >--On Tuesday, August 31, 2004 12:34 pm +0100 alex wrote: > >> On Tue, 2004-08-31 at 08:45, Julian Rohrhuber wrote: >>> I also found the thread where it was discussed: >>> >>> http://www.create.ucsb.edu/pipermail/osc_dev/2003-May/subject.html >> >> Wow, that looks a great + involved thread, I will read it carefully, >> thanks Julian. >> >> Another festival, this one in Bologna, Italy in January - should we >> apply? Deadline is in October. http://www.netmage.it/ >> >> alex >> >> >> -- . From: Adrian Ward There's an extremely well informed discussion about live coding on > slashdot right now! Extremely well informed? Did I type the wrong URL or something? "I usually go to nightclubs to dance and meet women, but I guess this is the Slashdot Way." "geek gets into a night club with real live women and what does he do? He programs perl. jeez" "The guy is a DJ... instead of spinning CD's or albums, he's triggering sound files via Perl. He needs to lay off the ecstasy." "You can take the geek out of the nightclub, but you can't take the nightclub out of the geek." Subject: Re: [livecode] live coding and slashdot From: alex On 2 Sep 2004, at 3:16 pm, alex wrote: > > There's an extremely well informed discussion about live coding on > > slashdot right now! > Extremely well informed? Did I type the wrong URL or something? Heh, well you get the impression that some of them read the title of the story before posting at least. "This couldn't be more wrong. Code done properly is never spontaneous. It is planned, it is architected, it is designed." alex Date: Thu, 2 Sep 2004 08:23:42 -0700 (PDT) From: Amy Alexander On Thu, 2004-09-02 at 15:46, Adrian Ward wrote: a> > On 2 Sep 2004, at 3:16 pm, alex wrote: a> > > There's an extremely well informed discussion about live coding on a> > > slashdot right now! a> > Extremely well informed? Did I type the wrong URL or something? a> a> Heh, well you get the impression that some of them read the title of the a> story before posting at least. a> a> "This couldn't be more wrong. Code done properly is never spontaneous. a> It is planned, it is architected, it is designed." 
a> a> alex a> a> a> Date: Thu, 02 Sep 2004 17:15:51 +0100 From: Nick Collins wrote: > ahhh slashdot :) > we have to put some of those quotes on the wiki... > nice one alex! > > On Thu, 2 Sep 2004 15:46:32 +0100, Adrian Ward wrote >> On 2 Sep 2004, at 3:16 pm, alex wrote: >> >> > There's an extremely well informed discussion about live coding on >> > slashdot right now! >> >> Extremely well informed? Did I type the wrong URL or something? >> >> "I usually go to nightclubs to dance and meet women, but I guess >> this is the Slashdot Way." >> >> "geek gets into a night club with real live women and what does he >> do? He programs perl. jeez" >> >> "The guy is a DJ... instead of spinning CD's or albums, he's >> triggering sound files via Perl. He needs to lay off the ecstasy." >> >> "You can take the geek out of the nightclub, but you can't take the >> nightclub out of the geek." > > Date: Thu, 02 Sep 2004 19:30:57 +0100 From: Nick Collins corpseflower I like the idea of grade exams in livecoding. I started writing a proposed scheme for how the ABRSM could assess livecoding skills, but I need some help finishing it. http://www.toplap.org/?LivecodingGrades Date: Fri, 03 Sep 2004 09:53:37 +0100 From: Nick Collins wrote: > > On 2 Sep 2004, at 7:30 pm, Nick Collins wrote: > >> corpseflower > > I like the idea of grade exams in livecoding. I started writing a > proposed scheme for how the ABRSM could assess livecoding skills, but I > need some help finishing it. > > http://www.toplap.org/?LivecodingGrades > > Subject: Re: [livecode] new site layout looks good- added a small paper From: alex I like the idea of grade exams in livecoding. I started writing a > proposed scheme for how the ABRSM could assess livecoding skills, but I > need some help finishing it. > > http://www.toplap.org/?LivecodingGrades Fantastic! I wouldn't like to step in the beautiful prose in my current state of mind, but some ideas: 1/ something about being able to encode a given algorithm without any syntax errors or bugs, or perhaps to be fairer, no more than (say) three goes at compiling and running it. 2/ the student being able to execute a given piece of psuedo code in their head 3/ a later level could involve 2/, but with three levels of recursion 4/ on top of the 'getting 100 people to dance' point in grade 5, a much later grade could require "getting 100 electro-accoustic musicians to dance without the use of intoxicants" 5/ Show ability to dance convincingly to a given algorithm without it being sonified alex Subject: Re: [livecode] live coding and slashdot From: alex wow, congratulations alex! despite the, um, mixed range of > comments, it's got a whole lot of new people thinking about livecoding. > it's a confusing topic for people new to the idea, so initial confusion is > to be expected. Yep, nice to drop some toplap ideas and watch the ripples! At some point I'll go through all the comments and find the ones that have are actually good and insightful - there was at least one post that appeared recently that suggested some more early prior-art. > i'm going to go proudly posting the link to the dorkfest and readme lists. Thanks! > some of the slashdotters are asking for a sample mp3 as proof of the > pudding. is there one we can post somewhere without bringing down the > house on bandwidth? (or maybe we could post a copy of the short quicktime > from the dorkfest site so they could see screens too?) I quite like the idea of the text standing alone, letting people just imagine what it might sound like. 
When we're ready for the big toplap push I guess we'll have some sound and video recordings on the site... Those who will have their interest piqued will then be able to see and hear the recordings within the fuller context of TOPLAP. alex Subject: Re: [livecode] wiki formatting From: alex Is there a page with example formatting commands in the toplap wiki somewhere? > I have too many different wiki format commands in my head already :) Here you go: http://www.usemod.com/cgi-bin/wiki.pl?TextFormattingRules alex Date: Fri, 03 Sep 2004 18:05:04 +0100 From: Nick Collins I've added a place to archive performances. Please correct, add any >links etc. Anything else important people feel should be mentioned? >It's a bit difficult because there is the choice between being >selective and all inclusive... > >http://www.toplap.org/?HistoricalPerformances > >Julian, I recall you did a JITLib performance in NY in 2001...could >you check the dates. You also might want to replace this with any >earlier jitlib premiere (Berlin?). I've added the ones I remember. Only the first incidents I don't really remember, I'm pretty sure that they were on radio, but well, ... >cheers >N > >PS Not sure whether to take me out or not- I wanted to mention >Fabrice though, since we used to practise live code jamming in Sc in >2002. I expect Julian and Alberto were practising well before that >though. -- . Subject: [livecode] live coding of graphics From: Dave Griffiths I've found the experiences of dancing and programming to have > a great deal in common. Yeah me too, I don't always know what I'm doing but I have a great deal of fun doing it. A more generally interesting thing is that within two minutes of the article appearing, someone had written a fairly long comment about the Perl programming language. It seems to me as though this was an automated response by a bot. A trollbot, perhaps grabbing comments from other stories based on keywords. It was very successful anyway, getting a score of 5, set to "interesting," and a huge number of responses, despite being very little to do with the subject of the article... In fact it seemed to be in response to a story about mobile phones. But after reading through all the comments, there's only one that is particularly interesting to us. It's worth putting on the timeline, shame it's an anonymous post. I love the final sentence. I wrote code on stage in xlisp and a midi control layer I wrote myself back 'around 87. This was on an amiga 1000. I was actually wearing a guitar when I called medium level functions to generate sound, which I then played over. (The project was called masked men and purveyed a mix of both experimental and pop/dance oriented music) I thought at the time I might have been the only person to have written code/lisp code, but I posted about it on venerable online system "the well" when I joined in 1990 and somebody said they'd written lisp on stage in the 70s. Maybe with a guitar strapped on. heh. The music had to work around garbage collection, which I forced at the end of phrases. 
http://developers.slashdot.org/comments.pl?sid=120335&cid=10139642 Subject: [livecode] choon on eu-gene From: alex Date: Sat, 4 Sep 2004 00:57:32 +0200 From: n++k > [+] http://lambda-the-ultimate.org/ [++] http://www.stephensykes.com/choon/choon.html -- 'Welcome to lens flare hell' To unsubscribe from eu-gene visit http://www.generative.net/mailman/listinfo/eu-gene Date: Wed, 08 Sep 2004 20:04:31 +0100 From: Nick Collins But do we have a whomever@hiddenorg e-mail to send this from? livecode@hiddenorg would reach us or i can create something like nick@hiddenorg to forward to you if you wish. Cheers! alex Subject: Re: [livecode] transmediale submission From: alex ok, I sent the submission as from livecode@hidden > > My mail program is stupid and wants to know an incoming mail server of > course. Is livecode an alias to your e-mail account like you've done nick > to alias to mine here? livecode@hiddenorg is this mailing list :) So there won't be an incoming mail server for that alias. The replies will go to all of us. I tried to get the mailing list to default to livecode@hiddenorg instead of livecode@hiddenorg the other day but the mailing list software doesn't want me to change it... And yes, you can be whoever you like with smtp! That's why we should all sign our messages with pgp. alex --- PGP SIGNATURE v8.23 glguhlughlughlguhlguhlguhlughlghughlughlughlughlughghluglhuglhughlgh ghluglhuglhuglhuglhughluglhugh glhu ghlug hlguhghlu ghlgu hglhug hlguh. Date: Thu, 09 Sep 2004 21:47:26 +0100 From: Nick Collins wrote: > On Thu, 2004-09-09 at 09:04, Nick Collins wrote: >> how do I send from one of those? > > There should be a way of getting your email program to send with a from > address of whatever you like (smtp doesn't have authentication). If not > I can send it, let me know. > > alex > Date: Thu, 09 Sep 2004 22:41:13 +0100 From: Nick Collins : > Dear Madam/Sir, > We, TOPLAP, enclose an entirely digital entry for the > transmediale award and club transmediale, considering the medium of > paper a little old fashioned. > We hope this is fine with you and will not disqualify us from > participation. > With thanks for your upcoming judging efforts, > TOPLAP > > > application form details: > institution: TOPLAP > address- virtual, contact via livecode@hiddenorg or the website. > title of work: TOPLAP authors: TOPLAP > > three keywords: interactive performative instrument > > URL: www.toplap.org > > Country: at least 5 Year: 2004 > > accompanied by proposal/explanation below, plus the site itself. > > We agree to the conditions of entry and wish to enter the work for > selection in the award as well as for club transmediale. > > > > http://www.toplap.org/?TransmedialeSubmission > > Some computer languages allow a programmer to change a running process > on the fly by rewriting the code that defines it. If the output of that > code is a process being revealed to human senses, an interactive > performance is possible.
Applied to live computer music this means > sound compositions can be heard as they form, and musical algorithms > adjusted while already playing. > > TOPLAP is the Temporary Organisation for the Promotion of Live > Algorithm Programming, set up to explore the application of live > programming to composition and performance. TOPLAP advocates treating > algorithms as live artistic material, allowing them to be written and > manipulated while they create music or video, perhaps in front of an > audience. > > The interests of the TOPLAP membership encompass live coding > performances with output in both audio and visual modalities, the > possibilities of immediate feedback given by prototyping compositions > in interpreted programming languages and new languages for on-the-fly > programming. This international organisation is gathering membership > and profile. Recent performances and demonstrations have taken place in > Japan, USA, Sweden, Germany, England and Denmark. > > Further information on TOPLAP artists, events and theory is available > at: > > http://www.toplap.org > > The site contains links to archived performances, papers on live coding > practise and history as identified by the group. > > For the transmediale awards TOPLAP submits itself. In particular, we > would like to highlight the LUBECK04 Manifesto for Live Coding > Performance, as linked: > > http://toplap.org/?ManifestoDraft > > For instance, criticism of laptop performers is rampant (by an old > guard?) but it is quite possible to fight such rejection on a better > surface: projection of laptop screens to reveal the whole instrument. > For audio especially, it is vital that we confront the issue of a level > playing field between laptop musicians and conventional acoustic > instrumentalists with such full disclosure. Even whilst the immediacy > of gesture is still open to debate, the game is at least honest and > education of audiences paramount. (Those who worry about trivialisation > of such pedagogy are ignoring the possibility of closing your eyes > whilst listening to music...). The information can be discrete, not > overbearing: but definitely available. > > The organisation has identified many avenues of exploration for text > based interaction which we are seeking to address. Novel engagement > with adaptable level control of as great a flexibility as code affords > is a possibility too rich to ignore. > > For club transmediale TOPLAP proposes a session of performances from > live coders, with both audio and visual outputs. The audience will see > and hear the created musical and visual material, and also the > projected screens of the participants so that the processes are > entirely open and the craft apparent. The coded improvisations show a > novel blend of intellectual and emotional content that challenges > existing notions of both traditional haptic instrumental and DJ like > laptop performance. > > Attendance of particular TOPLAP members would be negotiated, but the > organisation is happy to also talk about its work in an appropriate > forum. > > Particular TOPLAP artists are linked from the site, which the judges > are encouraged to explore to see the breadth of activity and enthusiasm > in this new field. > > --------------------------------------------------- transmediale.05 - 4 - 8 feb 2005 international media art festival berlin --------------------------------------------------- transmediale - Klosterstr.68-70 - 10179 Berlin tel. +49 (0)30.24749-761 fax. 
+49 (0)30.24749-814 info@hiddende - http://www.transmediale.de --------------------------------------------------- Member of the European Coordination of Film Festivals Subject: Re: [livecode] Hello (& chaos) From: alex Maybe of interest - a little 'fractal' midi music player I did a while > back (in Java): > http://dannyayers.com/2000/Frac.zip > > it chooses the notes from a preset list according to the logistic > formula, one of the simplest to behave chaotically: > > x[i+1] = k*x[i]*(1-x[i]) Can you modify the formula while it is playing? alex Date: Fri, 10 Sep 2004 10:36:22 +0100 From: Rob Myers wrote: >Not as it stands, nope - Java probably isn't a good language for that >kind of change, though changing the value of the feedback constant >would have a marked (and fairly unpredictable) effect. There's a number of ways of doing it. Java has a number of interpreters for other languages available, so the changeable code could be written in (say) JavaScript. But that kinda breaks the model. Writing bytecodes by hand is a no-go. :-) So the best way would be to use Java's dynamic class loading and reflection functionality: 1. Have a text window with the running algorithm code in. 2. When the user changes the algorithm, get the text and wrap it in some boilerplate (a class with a single well-known method). 3. Compile it either by having linked the program to JavaC or by calling Runtime.exec("javac name-of-file-you-saved-the-text-in.java"), and check for errors. 4. Load the compiled class, get the method, and swap it for the one currently being called. Sounds insane but it's doable. Modern Lisp systems compile code typed at the toplevel before running it, so this has some precedent (although the Lisp systems have the compiler built in and optimised for this). I'm not suggesting you should actually do this, though. :-) - Rob. From: Fredrik Olofsson your entry form is ok, but we definetly need a hardcopy of the entry > formular > (we need the signature and some other things). > Please send us the formular if possible by fax +49 30 24 749 814), we > will put > it together with the rest of your submission! > > Best, > > Magdalena Rothweiler > assistant festivalmanagement Subject: Re: [livecode] Hello (& chaos) From: alex Not as it stands, nope - Java probably isn't a good language for that > kind of change, though changing the value of the feedback constant > would have a marked (and fairly unpredictable) effect. > What language(s) have you been using? I've been mostly using Perl, live coding is not a huge problem as it's an interpreted language. I interpret code changes into a separate area, and if that parses ok, I interpret it again, over the running code. The new code continues running with all variables intact. But I'm wondering if you got the wrong impression from the Perl.com article... While it mentions both generative music and TOPLAP, TOPLAP isn't an organisation to proliferate/promote autonomous generative music. Quite the opposite, we're here to promote live manipulation of processes. Generative music is about sowing seeds, comparable with genetic modification; altering DNA, putting the DNA in eggs, and watching it grow to see/hear the effects. In contrast, TOPLAP is more about splicing different live animals together, piecing them together from scratch in the womb, modifying their DNA while they're still growing, then experimenting with different ways of slaughtering them to get the best audio/visual/aromatic effects. 
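To make Rob's recompile-and-swap recipe above concrete (and to answer alex's "can you modify the formula while it is playing?"), here is a rough sketch in present-day Java, none of it from the thread: Danny's logistic formula lives in a tiny generated class, javac is shelled out to as Rob suggests, and the freshly loaded step method is swapped in under a running note loop. The class names (LiveFormula, Formula), the preset pitch list and the use of javax.sound.midi are invented for illustration, and error handling is minimal.

---
import java.io.*;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import javax.sound.midi.*;

public class LiveFormula {
    static final int[] PITCHES = {60, 62, 64, 67, 69, 72, 74, 76}; // preset note list
    static volatile Method step; // the swappable formula: public static double step(double x)

    public static void main(String[] args) throws Exception {
        swap("return 3.8 * x * (1.0 - x);");          // start with the logistic map
        Synthesizer synth = MidiSystem.getSynthesizer();
        synth.open();
        MidiChannel chan = synth.getChannels()[0];

        // stands in for Rob's text window (step 1): read replacement method bodies from stdin
        new Thread(() -> {
            BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
            try { for (String line; (line = in.readLine()) != null; ) swap(line); }
            catch (Exception e) { e.printStackTrace(); }
        }).start();

        double x = 0.5;
        while (true) {                                // the running process keeps its state (x)
            x = (Double) step.invoke(null, x);        // call whichever formula is current
            x = Math.max(0.0, Math.min(0.999, x));    // keep the value usable as an index
            int pitch = PITCHES[(int) (x * PITCHES.length)];
            chan.noteOn(pitch, 90);
            Thread.sleep(250);
            chan.noteOff(pitch);
        }
    }

    // Rob's steps 2-4: wrap the text in boilerplate, compile it, load it, swap it in
    static void swap(String body) throws Exception {
        File dir = new File(System.getProperty("java.io.tmpdir"));
        File src = new File(dir, "Formula.java");
        try (PrintWriter w = new PrintWriter(src)) {
            w.println("public class Formula { public static double step(double x) { " + body + " } }");
        }
        Process javac = Runtime.getRuntime().exec(new String[]{"javac", src.getAbsolutePath()});
        if (javac.waitFor() != 0) { System.err.println("compile failed, keeping the old formula"); return; }
        URLClassLoader loader = new URLClassLoader(new URL[]{dir.toURI().toURL()}, null);
        step = loader.loadClass("Formula").getMethod("step", double.class);
    }
}
---

Typing a new body such as "return 3.99 * x * (1.0 - x);" at the console changes the melody without restarting it, which is roughly what alex describes doing with Perl, with all variables kept intact.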
Note that the animal-splicing above is a metaphor; I would not condone any such activities on real life-forms (computer processes are not known to experience pain, and don't have faces). The TOPLAP papers might be enlightening: http://toplap.org/?ToplapPapers So while your fractal composition is interesting, I don't think TOPLAP will ratify it until you can modify the algorithm while it is being played. Ade wrote something about what TOPLAP should expect from its students, which you might find helpful: http://toplap.org/?LivecodingGrades Cheers, alex From: "Dave Griffiths" On Fri, 2004-09-10 at 10:22, Danny Ayers wrote: > > Not as it stands, nope - Java probably isn't a good language for that > > kind of change, though changing the value of the feedback constant > > would have a marked (and fairly unpredictable) effect. > > What language(s) have you been using? > > I've been mostly using Perl, live coding is not a huge problem as > it's an interpreted language. I interpret code changes into a > separate area, and if that parses ok, I interpret it again, over the > running code. The new code continues running with all variables intact. I think interpreted languages are the natural choice for live programming. Despite all the jokes on the subject, compiling stuff live would not be fun! :) Something I'd like to research is changing code on the machine level live. There may even be precedents for this. I'm thinking of a system where you have a virtual machine with a very simple instruction set (as in evolving machine code systems like tierra). You'd be able to grab processes, and move the program counter around and hack their instructions *while they are running* - no write/interpret cycle at all. For the ultra livecoding geek - it may be possible to get your head round a real system without a virtual machine in simpler 8bit computers, or microprocessors that allow code to be modified. dave Date: Fri, 10 Sep 2004 14:06:45 +0100 From: Rob Myers wrote: >I think interpreted languages are the natural choice for live programming. >Despite all the jokes on the subject, compiling stuff live would not be fun! :) Here we go: http://www.javaworld.com/javatips/jw-javatip131.html I do agree that dynamic languages are better for this, though. >Something I'd like to research is changing code on the machine level live. >There may even be precedents for this. I'm thinking of a system where you have >a virtual machine with a very simple instruction set (as in evolving machine >code systems like tierra). You'd be able to grab processes, and move the >program counter around and hack their instructions *while they are running* - >no write/interpret cycle at all. Cool. There's plenty of VMs around, varying from simulators for everything from the Apollo guidance computer (!) through PDP-8s to Z80s and PPC chips, to modern runtimes like Parrot and the CLR. I hear Spectrum emulators are a favourite for washing machines, but I digress. :-) A VM designed for live music hacking could have midi or sound ops, score operations, etc. as part of the instruction set. The registers, cache and memory could all be structures to support this. You could indeed hack that live at the machine code level, or have a higher-level language that interprets or compiles to it. Prolog, Lisp, Java, Perl, Interactive Fiction and various other games all have their own VMs or even hardware machine implementations. Having a VM encapsulates the conceptual model of the language or the task, adding expressive power or aiding optimisation. 
This is a very important and popular technique. >For the ultra livecoding geek - it may be possible to get your head round a >real system without a virtual machine in simpler 8bit computers, or >microprocessors that allow code to be modified. Use a keyboard to generate bytecodes, playing the instructions live. A postmodern version of the old flip-switches and blinking lights seen on old computers (especially in films). :-) - Rob. Subject: [livecode] faq From: alex > Something I'd like to research is changing code on the machine level >> live. Perry and I are doing this with ChucK. http://chuck.cs.princeton.edu/ ChucK runs on a VM with special virtual instructions (many for dealing with time), and is type-checked and compiled down to these instructions. In this respect, it is like Java. However, it also supports truly precise audio timing and also the ability to swap sub-processes (shreds) in and out of the VM. It has complete control over scheduling (shreduling) as well as the instructions. We are working on something now which can replace sections of virtual ChucK instructions at runtime. This is a VM designed for precise audio programming, on-the-fly. Check it out and fire it up! Best, Ge! From: "Dave Griffiths" >no write/interpret cycle at all. > > Cool. > > There's plenty of VMs around, varying from simulators for everything > from the Apollo guidance computer (!) through PDP-8s to Z80s and PPC > chips, to modrn runtimes like Parrot and the CLR. I hear Spectrum > emulators are a favourite for washing machines, but I digress. :-) Yeah, thats the sort of thing I had in mind. Most speccy emulators I've seen are just huge switch statements with 256 cases, it would be cool to start with something like that - or even more esoteric like a guidence system! > A VM designed for live music hacking could have midi or sound ops, > score operations, etc. as part of the instruction set. The registers, > cache and memory could all be structures to support this. You could > indeed hack that live at the machine code level, or have a higher- > level language that interprets or compiles to it. no high level languages - I'm actually thinking of a visual version of asm (so each instruction has an icon that you can stack up) or something... so your microcode statements form visuals which create the music. I haven't thought it through properly yet as you can probably tell. > Prolog, Lisp, Java, Perl, Interactive Fiction and various other > games all have their own VMs or even hardware machine > implementations. Having a VM encapsulates the conceptual model of > the language or the task, adding expressive power or aiding > optimisation. This is a very important and popular technique. > > >For the ultra livecoding geek - it may be possible to get your head round a > >real system without a virtual machine in simpler 8bit computers, or > >microprocessors that allow code to be modified. > > Use a keyboard to generate bytecodes, playing the instructions live. > A postmodern version of the old flip-switches and blinking lights > seen on old computers (especially in films). :-) This sort of thing starts to approach circuit bending in it's ethics, which I consider a good thing - software bending? dave From: "N. Collins" it's OK, just printing application, filling in and faxing now, no prob, > don't worry. From: "Dave Griffiths" >> Something I'd like to research is changing code on the machine level > >> live. > > Perry and I are doing this with ChucK. 
> > http://chuck.cs.princeton.edu/ > > ChucK runs on a VM with special virtual instructions (many for > dealing with time), and is type-checked and compiled down to these > instructions. If you're programming a high level language that is compiled into bytecode, thats not what I'm talking about - that still seperates the actual running processes from the coder (with a write/compile cycle). I'm thinking of a system where you edit the machine code as it's running. In most sensible situations this would only ever result in crashes :) Just friday afternoon waffle... dave From: "Tom Betts" On Fri, 10 Sep 2004 11:20:28 +0100, alex wrote > > On Fri, 2004-09-10 at 10:22, Danny Ayers wrote: > > > Not as it stands, nope - Java probably isn't a good language for that > > > kind of change, though changing the value of the feedback constant > > > would have a marked (and fairly unpredictable) effect. > > > What language(s) have you been using? > > > > I've been mostly using Perl, live coding is not a huge problem as > > it's an interpreted language. I interpret code changes into a > > separate area, and if that parses ok, I interpret it again, over the > > running code. The new code continues running with all variables intact. > > I think interpreted languages are the natural choice for live programming. > Despite all the jokes on the subject, compiling stuff live would not be fun! :) > > Something I'd like to research is changing code on the machine level live. > There may even be precedents for this. I'm thinking of a system where you have > a virtual machine with a very simple instruction set (as in evolving machine > code systems like tierra). You'd be able to grab processes, and move the > program counter around and hack their instructions *while they are running* - > no write/interpret cycle at all. > > For the ultra livecoding geek - it may be possible to get your head round a > real system without a virtual machine in simpler 8bit computers, or > microprocessors that allow code to be modified. > > dave > > > From: "Tom Betts" If you're programming a high level language that is compiled into bytecode, > thats not what I'm talking about - that still seperates the actual running > processes from the coder (with a write/compile cycle). I'm thinking of a > system where you edit the machine code as it's running. In most sensible > situations this would only ever result in crashes :) While working on my system I did allow you to edit while executing, but you are right, it just lead to lots of crashes and other wierd effects.. for instance.. i am typing 10 Mov 20 but while i am typing it gets executed at 10 Mov 2 oops ! we get a valid execution but an unintentional process.. this started to get really silly with commands like 10 Mov 100 if executed at 10 Mov 10 and so on... So now it only commits the line when you press return.. otherwise you find yourself racing to type before the execution point comes around again!! fun though! Tom --------------------------------------------- http://www.nullpointer.co.uk http://www.r4nd.org http://www.q-q-q.net ----- Original Message ----- From: "Dave Griffiths" On Fri, 10 Sep 2004 09:54:06 -0400, Ge Wang wrote > > >> Something I'd like to research is changing code on the machine level > > >> live. > > > > Perry and I are doing this with ChucK. > > > > http://chuck.cs.princeton.edu/ > > > > ChucK runs on a VM with special virtual instructions (many for > > dealing with time), and is type-checked and compiled down to these > > instructions. 
> > > Just friday afternoon waffle... > > dave > > > From: Ge Wang On Fri, 10 Sep 2004 09:54:06 -0400, Ge Wang wrote >>>> Something I'd like to research is changing code on the machine level >>>> live. >> >> Perry and I are doing this with ChucK. >> >> http://chuck.cs.princeton.edu/ >> >> ChucK runs on a VM with special virtual instructions (many for >> dealing with time), and is type-checked and compiled down to these >> instructions. > > If you're programming a high level language that is compiled into > bytecode, > thats not what I'm talking about - that still seperates the actual > running > processes from the coder (with a write/compile cycle). The idea is that you can program ChucK virtual instructions directly. In the "Turing Machine" sense, the virtual instructions are identical to native machine code, but have the added benefit of being run in a controllable environment (VM). Check out: http://audicle.cs.princeton.edu/ In this sense, we can use any VM, however ChucK instructions are made for sound synthesis. There are low level operations (add,shift,or,mem etc) as well as higher level instructions for scheduling. There are probably other systems that supports this directly. But the benefits of the VM is that it makes on-the-fly "assembly" programming feasible and maybe even fun... =) What do you think, Dave and others? Best, Ge! From: "Dave Griffiths" Hi All,, > > Hey dave, > > I'm writing a system based on this kind of idea.. > basically it resembles a corewar style memory space with > instructions of ASM style > (but a little more verbose). Each mem line can hold 3 values which > are read as a command with 2 variables.. it includes pointer based > stuff and the ability to write/copy from one memory location to > another.. You can implement corewar style 'battling' code fragments > quite easily.. woahhh - that's what I'm talking about! :] > Its all written in C++ with an OPengl interface and a > text editor for the built in language. I'm still wrtiting in more > command features and have to finish the save/load functions but its > working out very welll.. If you'd like I can post a demo when its in > a decent state (probably not till xmas) I showed alex a bit of it > briefly in aarhus, but really it needs a lot more work done > (especially drag/drop library routines etc) I'd definately be interested in seeing this. So whats the OpenGL display - the state of the registers and memory? Or do you have other instructions for drawing to pixel data? The thing I always found annoying with core wars, tierra and avida was that it's really difficult to visualise whats going on. I guess it's difficult to visualise a 1 dimensional memory space in a nice way. I can imagine that you could get a lot out of the visualisation of the internals of the VM though. > I'm really into this sort of low level modular code writing and the > resulting sonification of such algos.. > Its a sort of VM which has a sort of interpret phase but very fast.. > (Basically you can edit lines and then commit them) > It only operates on a sample playback/manipulation basis at the > moment, but i'm thinking of putting some synthesis stuff into it.. > But then I might just add an OSC/UDP library of commands and be able > to use SC or PD etc for the real DSP stuff. Theres some great > potential inthese sort of systems though, even if it can get a > little crazy with lots of nasty pointer stuff.! 
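For illustration only, here is a toy of the kind of machine Dave and Tom are describing, in plain Java rather than anyone's actual system: a flat memory of (opcode, a, b) triples, a background thread that walks the program counter round it, and a poke() that commits a whole line at once (Tom's press-return rule) and can be called while the thing is running. The opcode set and the printing of a register in place of a real sound op are invented for the sketch.

---
public class ToyVM implements Runnable {
    public static final int NOP = 0, MOV = 1, ADD = 2, JMP = 3, OUT = 4;
    private final int[] mem = new int[64 * 3];   // 64 lines of (op, a, b), all NOP to start
    private final int[] reg = new int[16];
    private volatile int pc = 0;

    // commit one whole line at once, even while the run loop is executing
    public synchronized void poke(int line, int op, int a, int b) {
        mem[line * 3] = op; mem[line * 3 + 1] = a; mem[line * 3 + 2] = b;
    }

    public void run() {
        while (true) {
            int op, a, b;
            synchronized (this) { op = mem[pc * 3]; a = mem[pc * 3 + 1]; b = mem[pc * 3 + 2]; }
            switch (op) {
                case MOV: reg[a] = b; break;
                case ADD: reg[a] += b; break;
                case JMP: pc = a - 1; break;                       // -1 because we advance below
                case OUT: System.out.println("reg[" + a + "] = " + reg[a]); break; // stand-in for a sound op
                default:  break;                                   // NOP or junk: do nothing
            }
            pc = (pc + 1) % (mem.length / 3);
            try { Thread.sleep(100); } catch (InterruptedException e) { return; }
        }
    }

    public static void main(String[] args) throws Exception {
        ToyVM vm = new ToyVM();
        vm.poke(0, MOV, 0, 10);   // reg[0] = 10
        vm.poke(1, ADD, 0, 1);    // reg[0] += 1
        vm.poke(2, OUT, 0, 0);    // show it
        vm.poke(3, JMP, 1, 0);    // loop back to line 1
        new Thread(vm).start();
        Thread.sleep(1000);
        vm.poke(1, ADD, 0, 7);    // change the increment of the loop while it is running
    }
}
---

Because poke() writes a whole line in one go, you avoid the half-typed "10 Mov 2" problem Tom describes, though the change still takes effect wherever the program counter happens to be at that moment.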
the nasty pointer stuff is where the potential is :) dave From: "Dave Griffiths" On Sep 10, 2004, at 10:17 AM, Dave Griffiths wrote: > > > On Fri, 10 Sep 2004 09:54:06 -0400, Ge Wang wrote > >>>> Something I'd like to research is changing code on the machine level > >>>> live. > >> > >> Perry and I are doing this with ChucK. > >> > >> http://chuck.cs.princeton.edu/ > >> > >> ChucK runs on a VM with special virtual instructions (many for > >> dealing with time), and is type-checked and compiled down to these > >> instructions. > > > > If you're programming a high level language that is compiled into > > bytecode, > > thats not what I'm talking about - that still seperates the actual > > running > > processes from the coder (with a write/compile cycle). > > The idea is that you can program ChucK virtual instructions > directly. In the "Turing Machine" sense, the virtual instructions > are identical to native machine code, but have the added benefit of > being run in a controllable environment (VM). ah, all the documentation I'd read looked like a high level scripting language - I haven't seen the machine instructions. so the question is - how much would it cost to convert these sorts of systems into hardware? :) cheers, dave Date: Sun, 12 Sep 2004 19:18:57 +0200 From: Julian Rohrhuber Here's a netmage draft. They'd need to know who's available up front >(roughly, I guess we can renegotiate later if accepted). I thought I >might put in klipp av separately so left me + Fredrik off the list. >Plus its unfair for me to try to be at every event... Are >Perry/Alberto available? Julian, is the thought of being on stage >horrendous and you'd prefer an off stage table to work during the >set at your own pace undisturbed? well, I find it better to make the audience sit on the stage. -- . From: Ge Wang > The idea is that you can program ChucK virtual instructions >> directly. In the "Turing Machine" sense, the virtual instructions >> are identical to native machine code, but have the added benefit of >> being run in a controllable environment (VM). > > ah, all the documentation I'd read looked like a high level scripting > language > - I haven't seen the machine instructions. We haven't documented them yet. =) But here they are: All ChucK programs are type-checked and emitted into byte-code consisting of the following, which are really no different than native machine instructions. At the end, there is an example of a ChucK program and the corresponding ChucK VM program. The VM uses two stacks for each concurrent shred: one for memory and one as registers. Also note the operations dealing with time. One the reasons we implemented the VM with such an IS is so we may later optimize and also replace sections of code at runtime. Though the interface needs inventing, on-the-fly byte-coding is very possible in the current VM. 
arithmetic --- Chuck_Instr_Add_int Chuck_Instr_Inc_int Chuck_Instr_Dec_int Chuck_Instr_Complement_int Chuck_Instr_Mod_int Chuck_Instr_Minus_int Chuck_Instr_Times_int Chuck_Instr_Divide_int Chuck_Instr_Add_uint Chuck_Instr_Inc_uint Chuck_Instr_Dec_uint Chuck_Instr_Complement_uint Chuck_Instr_Mod_uint Chuck_Instr_Minus_uint Chuck_Instr_Times_uint Chuck_Instr_Divide_uint Chuck_Instr_Add_single Chuck_Instr_Minus_single Chuck_Instr_Times_single Chuck_Instr_Divide_single Chuck_Instr_Mod_single Chuck_Instr_Add_double Chuck_Instr_Minus_double Chuck_Instr_Times_double Chuck_Instr_Divide_double Chuck_Instr_Mod_double Chuck_Instr_Add_dur Chuck_Instr_Add_dur_time control: --- Chuck_Instr_Branch_Lt_int Chuck_Instr_Branch_Gt_int Chuck_Instr_Branch_Le_int Chuck_Instr_Branch_Ge_int Chuck_Instr_Branch_Eq_int Chuck_Instr_Branch_Neq_int Chuck_Instr_Branch_Lt_uint Chuck_Instr_Branch_Gt_uint Chuck_Instr_Branch_Le_uint Chuck_Instr_Branch_Ge_uint Chuck_Instr_Branch_Eq_uint Chuck_Instr_Branch_Neq_uint Chuck_Instr_Branch_Lt_single Chuck_Instr_Branch_Gt_single Chuck_Instr_Branch_Le_single Chuck_Instr_Branch_Ge_single Chuck_Instr_Branch_Eq_single Chuck_Instr_Branch_Neq_single Chuck_Instr_Branch_Lt_double Chuck_Instr_Branch_Gt_double Chuck_Instr_Branch_Le_double Chuck_Instr_Branch_Ge_double Chuck_Instr_Branch_Eq_double Chuck_Instr_Branch_Neq_double compare: --- Chuck_Instr_Lt_int Chuck_Instr_Gt_int Chuck_Instr_Le_int Chuck_Instr_Ge_int Chuck_Instr_Eq_int Chuck_Instr_Neq_int Chuck_Instr_Not_int Chuck_Instr_Negate_int Chuck_Instr_Negate_uint Chuck_Instr_Negate_single Chuck_Instr_Negate_double Chuck_Instr_Lt_uint Chuck_Instr_Gt_uint Chuck_Instr_Le_uint Chuck_Instr_Ge_uint Chuck_Instr_Eq_uint Chuck_Instr_Neq_uint Chuck_Instr_Lt_single Chuck_Instr_Gt_single Chuck_Instr_Le_single Chuck_Instr_Ge_single Chuck_Instr_Eq_single Chuck_Instr_Neq_single Chuck_Instr_Lt_double Chuck_Instr_Gt_double Chuck_Instr_Le_double Chuck_Instr_Ge_double Chuck_Instr_Eq_double Chuck_Instr_Neq_double bitwise | logical --- Chuck_Instr_Binary_And Chuck_Instr_Binary_Or Chuck_Instr_Binary_Xor Chuck_Instr_Binary_Shift_Right Chuck_Instr_Binary_Shift_Left Chuck_Instr_And Chuck_Instr_Or Chuck_Instr_Goto register/memory stack manipulation --- Chuck_Instr_Reg_Pop_Word Chuck_Instr_Reg_Pop_Word2 Chuck_Instr_Reg_Pop_Mem Chuck_Instr_Reg_Pop_Mem2 Chuck_Instr_Reg_Push_Imm Chuck_Instr_Reg_Push_Imm2 Chuck_Instr_Reg_Push_Now Chuck_Instr_Reg_Push_Mem Chuck_Instr_Reg_Push_Mem2 Chuck_Instr_Reg_Push_Deref Chuck_Instr_Reg_Push_Deref2 Chuck_Instr_Mem_Push_Imm Chuck_Instr_Mem_Push_Imm2 Chuck_Instr_Mem_Pop_Word Chuck_Instr_Mem_Pop_Word2 special --- Chuck_Instr_Nop Chuck_Instr_EOC higher level --- Chuck_Instr_Chuck_Assign Chuck_Instr_Chuck_Assign2 Chuck_Instr_Chuck_Assign_Deref Chuck_Instr_Chuck_Assign_Deref2 Chuck_Instr_Chuck_Assign_Object Chuck_Instr_Chuck_Assign_Object_Deref Chuck_Instr_Chuck_Assign_Object2 Chuck_Instr_Chuck_Release_Object func call --- Chuck_Instr_Func_Call Chuck_Instr_Func_Call2 Chuck_Instr_Func_Call3 Chuck_Instr_Func_Call0 Chuck_Instr_Func_Return Chuck_Instr_Spork more higher level shreduling/io --- Chuck_Instr_Print_Console Chuck_Instr_Print_Console2 Chuck_Instr_Time_Advance Chuck_Instr_Midi_Out Chuck_Instr_Midi_Out_Go Chuck_Instr_Midi_In Chuck_Instr_Midi_In_Go Chuck_Instr_ADC Chuck_Instr_DAC Chuck_Instr_Bunghole Chuck_Instr_UGen_Link Chuck_Instr_UGen_UnLink Chuck_Instr_UGen_Alloc Chuck_Instr_UGen_DeAlloc Chuck_Instr_UGen_Ctrl Chuck_Instr_UGen_CGet Chuck_Instr_UGen_Ctrl2 Chuck_Instr_UGen_CGet2 Chuck_Instr_UGen_PMsg Chuck_Instr_UGen_Ctrl_Op 
Chuck_Instr_UGen_CGet_Op Chuck_Instr_UGen_CGet_Gain Chuck_Instr_UGen_Ctrl_Gain Chuck_Instr_UGen_CGet_Last Chuck_Instr_DLL_Load Chuck_Instr_DLL_Unload casting --- Chuck_Instr_Cast_single2int Chuck_Instr_Cast_int2single Chuck_Instr_Cast_double2int Chuck_Instr_Cast_int2double Chuck_Instr_Cast_single2double Chuck_Instr_Cast_double2single example: --- // connect sine oscillator to d/a convertor sinosc s => dac; // infinite time-loop while(true) { // randomly set a frequency std.rand2f( 30.0, 1000.0 ) => s.freq; // advance time 100::ms => now; } this becomes: [chuck]: dumping src/shred 'osc.ck'... ... '0' 24Chuck_Instr_Mem_Push_Imm( 28 ) '1' 24Chuck_Instr_Reg_Push_Imm( 24 ) '2' 24Chuck_Instr_Reg_Push_Imm( 5350240 ) '3' 22Chuck_Instr_UGen_Alloc( ) '4' 32Chuck_Instr_Chuck_Assign_Object2( ) '5' 15Chuck_Instr_DAC( ) '6' 21Chuck_Instr_UGen_Link( ) '7' 24Chuck_Instr_Reg_Pop_Word( ) '8' 24Chuck_Instr_Reg_Push_Imm( 1 ) '9' 24Chuck_Instr_Reg_Push_Imm( 0 ) '10' 26Chuck_Instr_Branch_Eq_uint( 31 ) '11' 25Chuck_Instr_Reg_Push_Imm2( 30.000000 ) '12' 25Chuck_Instr_Reg_Push_Imm2( 1000.000000 ) '13' 24Chuck_Instr_Reg_Push_Imm( 16 ) '14' 24Chuck_Instr_Reg_Push_Imm( 375076 ) '15' 24Chuck_Instr_Reg_Push_Imm( 28 ) '16' 22Chuck_Instr_Func_Call3( ) '17' 24Chuck_Instr_Reg_Push_Mem( 24 ) '18' 24Chuck_Instr_Reg_Push_Imm( 171168 ) '19' 24Chuck_Instr_Reg_Push_Imm( 171380 ) '20' 22Chuck_Instr_UGen_Ctrl2( ) '21' 25Chuck_Instr_Reg_Pop_Word2( ) '22' 24Chuck_Instr_Reg_Push_Imm( 100 ) '23' 27Chuck_Instr_Cast_int2double( ) '24' 25Chuck_Instr_Reg_Push_Imm2( 44.100000 ) '25' 24Chuck_Instr_Times_double( ) '26' 24Chuck_Instr_Reg_Push_Now( ) '27' 24Chuck_Instr_Add_dur_time( ) '28' 24Chuck_Instr_Time_Advance( ) '29' 25Chuck_Instr_Reg_Pop_Word2( ) '30' 16Chuck_Instr_Goto( 8 ) '31' 15Chuck_Instr_EOC( ) Best, Ge! From: "Dave Griffiths" yeah, I'm up for it. >=20 > On Tue, 14 Sep 2004 12:56:18 +0100, Nick Collins wrote > > I could if needed. Dave, Ade? Anyone else passing near London or any=20 > > web link we can setup? > >=20 > > --On Monday, September 13, 2004 3:49 pm +0100 alex wr= ote: > >=20 > > > On Mon, 2004-09-13 at 15:38, dosensos wrote: > > >> hi alex, thanks for your email, we might be able to do something o= n > > >> our programme for the 29th of september. can you send us more info= on > > >> the toplap project ? > > > > > > I replied with a link to the papers on toplap.org, who could make t= he > > > 29th? > > > > > > alex > > > > > > > > > Date: Wed, 15 Sep 2004 09:02:41 +0100 From: Nick Collins wrote: > ...what is the current thinking on what we are doing? > > if there is jam in the offing I'll try and get my new software into some > working state for it, I just added scala support, so I can livecode in > popular scales such as "W=FCrschmidt's 31-tone system with alternative > tritone" > > On Tue, 14 Sep 2004 13:10:51 +0100, Dave Griffiths wrote >> yeah, I'm up for it. >> >> On Tue, 14 Sep 2004 12:56:18 +0100, Nick Collins wrote >> > I could if needed. Dave, Ade? Anyone else passing near London or any >> > web link we can setup? >> > >> > --On Monday, September 13, 2004 3:49 pm +0100 alex >> > wrote: >> > >> > > On Mon, 2004-09-13 at 15:38, dosensos wrote: >> > >> hi alex, thanks for your email, we might be able to do something on >> > >> our programme for the 29th of september. can you send us more info >> > >> on the toplap project ? >> > > >> > > I replied with a link to the papers on toplap.org, who could make = the >> > > 29th? 
>> > > >> > > alex >> > > >> > > >> > > > > From: "Dave Griffiths" it sounds like a jam to me. > > But can you alter the mathematics of the scale system live? That's > what I'll be doing if you're just going to use scala as a MIDI database... no, but sort of yes - I've tried to write a description of this several times and failed, so I'll resort to bad ascii art.

note information from melody
        |
        |   interval set by user
        \/
--------------------
| scale quantiser  | <---- uses scala descriptions
|        |         |
|        |         |   this box is optional
|        \/        |
| scale filtering  |
--------------------
        |
        |
        \/
out to synth/sampler etc

the scale filtering allows you to exclude certain notes from the octave. It should be apparent at this point that I know sod all about music theory, and that filtering notes is a complete hack, but there you go :) the cool thing is that I've dumped midi, so I can specify frequencies with note events to my sampler and synth via osc. on another note (and I'm aware this is getting a bit long and technical) I've finally also got the sequencer and sampler apps to pass timestamped events through osc in ntp format, and the sampler stacks them up to act on in time. so if/when we figure out a toplap synchronisation standard, I might be able to use it somehow :) perhaps I should just use ChucK or supercollider... dave Date: Wed, 15 Sep 2004 17:25:24 +0200 From: Julian Rohrhuber On Wed, 15 Sep 2004 09:02:41 +0100, Nick Collins wrote >> it sounds like a jam to me. >> >> But can you alter the mathematics of the scale system live? That's >> what I'll be doing if you're just going to use scala as a MIDI database... > >no, but sort of yes - I've tried to write a description of this several times >and failed, so I'll resort to bad ascii art.
>
>note information from melody
>        |
>        |   interval set by user
>        \/
>--------------------
>| scale quantiser  | <---- uses scala descriptions
>|        |         |
>|        |         |   this box is optional
>|        \/        |
>| scale filtering  |
>--------------------
>        |
>        |
>        \/
>out to synth/sampler etc
>
>the scale filtering allows you to exclude certain notes from the octave. It >should be apparent at this point that I know sod all about music theory, and >that filtering notes is a complete hack, but there you go :) that seems quite ok. I guess you can use an array of floating point numbers as scale, so you are quite free. All the rest can actually happen a level higher, so the filter should maybe be there. >the cool thing is that I've dumped midi, so I can specify frequencies with >note events to my sampler and synth via osc. > >on another note (and I'm aware this is getting a bit long and technical) I've >finally also got the sequencer and sampler apps to pass timestamped events >through osc in ntp format, and the sampler stacks them up to act on in time. > >so if/when we figure out a toplap synchronisation standard, I might be >able to use it somehow :) > >perhaps I should just use ChucK or supercollider... it is great that we have a habitat with several systems. I'm looking forward to seeing you using it (apart from on video..) -- . From: "marcus" create and enter a room called "livecoding" You're in the livecoding room. the livecoding room> define an "oon" as a bassdrum pitch on the 10th channel, at full volume Okay. the livecoding room> define a "cha" as a hihat pitch on the 10th channel, at full volume Okay. the livecoding room> define an "ooncha" as an oon followed by a cha Okay. the livecoding room> create an ooncha Okay, added an ooncha to the livecoding room. 
the ooncha> set your duration to a quarter Okay, my duration is a quarter. the ooncha> set your tempo to 120 bpm Okay, my tempo is 120 bpm. the ooncha> play repeatedly Okay, I'm playing repeatedly. the ooncha> stop Okay, I stopped. *** Thoughts? I'd also be delighted to receive suggestions in the form of potential transcripts. thanks, take care, -C -- Craig Latta improvisational musical informaticist craig@hiddenorg [|] Proceed for Truth! Subject: Re: [livecode] idea: musical interactive fiction From: Dave Griffiths hey there-- > > Yesterday I had a rather strange idea. I had been working on a novel > approach to "interactive fiction" (see http://netjam.org/cloak , which I > have working). I developed a rather useful parser for english. I started > thinking about other kinds of conversations I could have with it, and > realized that conversations about musical structure and performance > might be interesting. Now I think it would make a nice vehicle for > musical livecoding: > > - The exchanges are all in natural language, making them relatively easy > for an audience to follow, and perhaps even poetic. > > - The conversation can take place with high-contrast text on a black > background, making projection (and mixing with other projections) > straightforward. > > - Multiple people can interact in it at once, over the net (perhaps even > including audience members and/or remote parties). > > - It can fulfill various mundane show tasks (e.g., have a "room" which, > when entered, plays the pre-show background music). > > Here's a sample transcript, which I also have working: > > *** > > You are immersed in silence. > > silence> create and enter a room called "livecoding" > > You're in the livecoding room. > > the livecoding room> define an "oon" as a bassdrum pitch on the 10th > channel, at full volume > > Okay. > > the livecoding room> define a "cha" as a hihat pitch on the 10th > channel, at full volume > > Okay. > > the livecoding room> define an "ooncha" as an oon followed by a cha > > Okay. > > the livecoding room> create an ooncha > > Okay, added an ooncha to the livecoding room. > > the ooncha> set your duration to a quarter > > Okay, my duration is a quarter. > > the ooncha> set your tempo to 120 bpm > > Okay, my tempo is 120 bpm. > > the ooncha> play repeatedly > > Okay, I'm playing repeatedly. > > the ooncha> stop > > Okay, I stopped. > > *** > > Thoughts? I'd also be delighted to receive suggestions in the form of > potential transcripts. > > > thanks, take care, > > -C > > -- > Craig Latta > improvisational musical informaticist > craig@hiddenorg > www.netjam.org > [|] Proceed for Truth! > From: Rob Myers i like this idea a lot, you'd have to be a demon typer, but it would be > fun. I once saw someone "VJ" by playing text adventure games on a c64 > :) > > On Thu, 2004-09-23 at 08:35, Craig Latta wrote: >> hey there-- >> >> Yesterday I had a rather strange idea. I had been working on a novel >> approach to "interactive fiction" (see http://netjam.org/cloak , >> which I >> have working). I developed a rather useful parser for english. I >> started >> thinking about other kinds of conversations I could have with it, and >> realized that conversations about musical structure and performance >> might be interesting. Now I think it would make a nice vehicle for >> musical livecoding: >> >> - The exchanges are all in natural language, making them relatively >> easy >> for an audience to follow, and perhaps even poetic. 
>> >> - The conversation can take place with high-contrast text on a black >> background, making projection (and mixing with other projections) >> straightforward. >> >> - Multiple people can interact in it at once, over the net (perhaps >> even >> including audience members and/or remote parties). >> >> - It can fulfill various mundane show tasks (e.g., have a "room" >> which, >> when entered, plays the pre-show background music). >> >> Here's a sample transcript, which I also have working: >> >> *** >> >> You are immersed in silence. >> >> silence> create and enter a room called "livecoding" >> >> You're in the livecoding room. >> >> the livecoding room> define an "oon" as a bassdrum pitch on the 10th >> channel, at full volume >> >> Okay. >> >> the livecoding room> define a "cha" as a hihat pitch on the 10th >> channel, at full volume >> >> Okay. >> >> the livecoding room> define an "ooncha" as an oon followed by a cha >> >> Okay. >> >> the livecoding room> create an ooncha >> >> Okay, added an ooncha to the livecoding room. >> >> the ooncha> set your duration to a quarter >> >> Okay, my duration is a quarter. >> >> the ooncha> set your tempo to 120 bpm >> >> Okay, my tempo is 120 bpm. >> >> the ooncha> play repeatedly >> >> Okay, I'm playing repeatedly. >> >> the ooncha> stop >> >> Okay, I stopped. >> >> *** >> >> Thoughts? I'd also be delighted to receive suggestions in the form of >> potential transcripts. >> >> >> thanks, take care, >> >> -C >> >> -- >> Craig Latta >> improvisational musical informaticist >> craig@hiddenorg >> www.netjam.org >> [|] Proceed for Truth! >> > > > -- "Smash global capitalism! Spend less money!" Date: Thu, 23 Sep 2004 16:30:39 -0700 (PDT) From: Amy Alexander i like this idea a lot, you'd have to be a demon typer, but it would be DG> fun. I once saw someone "VJ" by playing text adventure games on a c64 :) DG> DG> On Thu, 2004-09-23 at 08:35, Craig Latta wrote: DG> > hey there-- DG> > DG> > Yesterday I had a rather strange idea. I had been working on a novel DG> > approach to "interactive fiction" (see http://netjam.org/cloak , which I DG> > have working). I developed a rather useful parser for english. I started DG> > thinking about other kinds of conversations I could have with it, and DG> > realized that conversations about musical structure and performance DG> > might be interesting. Now I think it would make a nice vehicle for DG> > musical livecoding: DG> > DG> > - The exchanges are all in natural language, making them relatively easy DG> > for an audience to follow, and perhaps even poetic. DG> > DG> > - The conversation can take place with high-contrast text on a black DG> > background, making projection (and mixing with other projections) DG> > straightforward. DG> > DG> > - Multiple people can interact in it at once, over the net (perhaps even DG> > including audience members and/or remote parties). DG> > DG> > - It can fulfill various mundane show tasks (e.g., have a "room" which, DG> > when entered, plays the pre-show background music). DG> > DG> > Here's a sample transcript, which I also have working: DG> > DG> > *** DG> > DG> > You are immersed in silence. DG> > DG> > silence> create and enter a room called "livecoding" DG> > DG> > You're in the livecoding room. DG> > DG> > the livecoding room> define an "oon" as a bassdrum pitch on the 10th DG> > channel, at full volume DG> > DG> > Okay. DG> > DG> > the livecoding room> define a "cha" as a hihat pitch on the 10th DG> > channel, at full volume DG> > DG> > Okay. 
DG> > DG> > the livecoding room> define an "ooncha" as an oon followed by a cha DG> > DG> > Okay. DG> > DG> > the livecoding room> create an ooncha DG> > DG> > Okay, added an ooncha to the livecoding room. DG> > DG> > the ooncha> set your duration to a quarter DG> > DG> > Okay, my duration is a quarter. DG> > DG> > the ooncha> set your tempo to 120 bpm DG> > DG> > Okay, my tempo is 120 bpm. DG> > DG> > the ooncha> play repeatedly DG> > DG> > Okay, I'm playing repeatedly. DG> > DG> > the ooncha> stop DG> > DG> > Okay, I stopped. DG> > DG> > *** DG> > DG> > Thoughts? I'd also be delighted to receive suggestions in the form of DG> > potential transcripts. DG> > DG> > DG> > thanks, take care, DG> > DG> > -C DG> > DG> > -- DG> > Craig Latta DG> > improvisational musical informaticist DG> > craig@hiddenorg DG> > www.netjam.org DG> > [|] Proceed for Truth! DG> > DG> DG> From: Adrian Ward the livecoding room> define an "ooncha" as an oon followed by a cha This is great. Amy's right - it does demystify programming in a live context. There's something vaguely HyperTalkish about it, though. And whilst that's generally a positive thing for accessibility, there's something rather silly about statements like if there is a character "?" in theQuestion then put "!" into character (the length of theQuestion) - 1 of theQuestion end if Natural language coding is great for beginners but it gets messy very quickly when - as a programmer - you need more advanced possibilities. More awkwardly, avoiding such verbosity is also dangerous. Look at the mess Lingo is in with it's half-baked dot syntax. The language becomes harder to understand in either scenario. It's be good to see how, perhaps, sticking to strictly music-related functionality will avoid these sorts of problems - do you have any further examples/ideas? Cheers, -- Adrian Ward, Signwave UK http://www.signwave.co.uk/ - +44(0)7711 623498 2nd Floor North, Rutland House, 42-46 New Road, London, E1 2AX, UK. Date: Fri, 24 Sep 2004 11:21:51 -0700 From: Craig Latta There's something vaguely HyperTalkish about it, though. And whilst > that's generally a positive thing for accessibility, there's > something rather silly about statements like > > if there is a character "?" in theQuestion then > put "!" into character (the length of theQuestion) - 1 of > theQuestion > end if The parser allows more straightforward statements; the above would probably be "if the expression ends with a question mark, end the expression with an exclamation point". Everything is completely grammatical, and there's no need for compoundWords, (gratuitous spaces), or delimited clauses . But I know what you mean. :) > Natural language coding is great for beginners but it gets messy very > quickly when - as a programmer - you need more advanced possibilities. Yeah, I think the trickiest thing for me in this context will be trying to do everything as a series of one-liners. It seems doable, though, especially given the ability to "look" at what's been done so far, so as not to lose track of things. > do you have any further examples/ideas? Well, when I was thinking about using a special room for "preshow" music, I started thinking about manipulating prerecorded material, and expressions like "slow the pitch to 50%", "play backwards", "enqueue 'A Hard Day's Night'", and so on. Also, the idea of "taking" musical elements and carrying them from one room to another is appealing. And I plan to use heavily the ability to move between different scopes of a musical structure, to tweak things. 
E.g.: *** the score> speak to the third note of the thirty-first measure Hi, a note here. a note> your pitch is A4 Okay. *** (While the score is playing, of course.) What I plan to do now is just run the thing, try to keep it interesting, and record the transcripts for people to review (which hopefully will lead to more suggestions). Actually telling some sort of story over the course of a show (improvised, ideally) would be a lot of fun. thanks again, -C -- Craig Latta improvisational musical informaticist craig@hiddenorg [|] Proceed for Truth! Date: Sat, 25 Sep 2004 10:55:50 +0100 From: Nick Collins Some interloper keeps attacking the toplap home page and sticking some link NC> to NC> NC> http://SeaCloud9.org This web site is dedicated to the latest technological NC> advancements, open source code, and cutting edge multimedia art. NC> NC> on. They replace the page with an old revision. I assume this isn't one of NC> us on the list? NC> NC> I just restored it. NC> NC> This is the second time- anything we can do? or do we submit to the fun of NC> constant revisionism that is wiki land? I'm especially waiting for the NC> TransmedialeSubmission to get hacked just as the jury visits... NC> Date: Tue, 28 Sep 2004 16:01:14 +0000 From: f Subject: [livecode] pyper alex, did you see this? http://www.stanford.edu/~andyszy/pyper/ seems like your perl-article spawned off. mailed the guy to complain a little (installer didn't work. grab the sourcecode zipfile if you want to try it. it has a binary you can run on os x) and also invited him here. sorry i can't join tomorrow - not in the uk. /f0 #| fredrikolofsson.com klippav.org musicalfieldsforever.com |# Date: Tue, 28 Sep 2004 16:43:12 +0000 From: f Subject: Re: [livecode] pyper > > sorry i can't join tomorrow - not in the uk. > > we could do a LINK UP with WEBCAMS sure. or, someone (nick) can run a supercollider server on his/her machine and let others join in for a play from remote. i have some nasty feedbackpatches to share with the radio audience and this way i don't have to hear them myself ;-) might be a hassle with routers+firewalls tho... :f0 Subject: [livecode] [Fwd: ULTRASOUND 2004 / PROGRAMME / medialounge // #69 From: alex Ge: > > >What is our stance on combining non-livecode > > >software with state-approved systems during performance? > > Interesting question! woah there - we're in danger of that "you should livecode the OS" insanity again :] > > Maybe we should put out a licence (lcl) that disallows linking to any > > non live coding system.. > > Heh, I agree. It's easy, and perhaps useful to make idealistic > statements like "If software isn't written live, or interacted with, > then it is not a performance." While I these statements are useful for > discussion and self-reflection, I don't think they should bind anyone. I dearly hope you're all joking :) > To be honest I am sceptical about whether reading and understanding the > code that a livecoder is making could really enhance the musical > experience. The interactions between musical processes are more > interesting than the musical processes themselves (the whole being > greater than the sum) but you can only read one thing at a time [1]. So > perhaps it's better not to read one subroutine at a time, and to instead > listen to all of them at once. 
Live "stepping through a debugger" might be more descriptive - I'm not entirely kidding - I think program flow should be made obvious to the audience too, this is why I like the idea of machine code instructions, or even simpler building blocks where the action is more apparent: http://www.logiblocs.com/logi-orange.htm :) I guess I don't see live coding as being about typing or languages, but essentially about demonstrating a beautiful process that also sounds/looks{/smells?} good as a side effect. Maybe I'm a crazy heretic. Hope you're ok though, alex! dave From: "Dave Griffiths" Dear All, > > Apologize for the cross posts for those also on ChucK list. > > We have a video recording of Perry and my on-the-fly coding > performance in Princeton last week. (audio captured on > poor stereo mic onboard consumer camera). Two projections, > two laptops. > > http://on-the-fly.cs.princeton.edu/listen/ (third > performance down) Good stuff - I see you two are a bit more practised at this than some of us :) > The graphics visualization software you see on the right is a > standalone app not yet integrated into ChucK. It visualizes > the mic input + waterfall plot of STFT in real-time. We just released! > It is festive to pop that up during a projected performance: > > http://soundlab.cs.princeton.edu/software/sndpeek/ Is it livecodable too? cheers, dave From: Ge Wang > The graphics visualization software you see on the right is a >> standalone app not yet integrated into ChucK. It visualizes >> the mic input + waterfall plot of STFT in real-time. We just >> released! >> It is festive to pop that up during a projected performance: >> >> http://soundlab.cs.princeton.edu/software/sndpeek/ > > Is it livecodable too? Not yet. For now, you can tweak the parameters that control how the spectrum is rendered and even change viewing angle. Once we get this into ChucK (via GLucK or the Audicle), then we can do a lot more with it. For now, it's just fun to put up for people to look at along with us typing (sometimes incompetently). What is our stance on combining non-livecode software with state-approved systems during performance? To me, it seems that it is OK as long as it augments and doesn't take the place of the on-the-fly experience. Having something like sndpeek is a good thing in this case. What do you think, people? Best, Ge! Date: Thu, 21 Oct 2004 21:54:59 +0200 From: Julian Rohrhuber incompetently). What is our stance on combining non-livecode >software with state-approved systems during performance? >To me, it seems that it is OK as long as it augments and doesn't >take the place of the on-the-fly experience. Having something >like sndpeek is a good thing in this case. What do you think, >people? Maybe we should put out a licence (lcl) that disallows linking to any non live coding system.. I don't see much point in trying to keep performace as pure as possible. The question is more what the intentions of presenting the text on stage are. I could imagine that someone working on a poem on stage while a tape plays 'Les sons et les parfums tournent dans l'air du soir' by Debussy. The main thing in my view is a certain perspective on language that programming can change when it is done interactively. -- . Subject: Re: [livecode] ChucK performance video + sndpeek From: alex >What is our stance on combining non-livecode > >software with state-approved systems during performance? Interesting question! > Maybe we should put out a licence (lcl) that disallows linking to any > non live coding system.. 
Heh, I agree. It's easy, and perhaps useful to make idealistic statements like "If software isn't written live, or interacted with, then it is not a performance." While I these statements are useful for discussion and self-reflection, I don't think they should bind anyone. > I don't see much point in trying to keep performace as pure as > possible. The question is more what the intentions of presenting the > text on stage are. [...] The main thing in my view is a certain > perspective on language that programming can change when it is done > interactively. For me presenting the text is allowing the listener to see some of the movement that makes the music. If they can see the text being typed and chopped around then they can hopefully feel able to open themselves to the music as something that is alive in some sense. Last night I heard someone play music from an ipod that they said they made using astrophysics. Normally I wouldn't take it seriously but in this case I trusted the musician, because I knew him. I didn't make the connection between the music and the twin pulsar systems that he was sampling from using a radio telescope, but I was able to take it on trust and listen to the music as very nice, carefully made music. However in general I can't trust blank face behind a hidden laptop screen. To be honest I am sceptical about whether reading and understanding the code that a livecoder is making could really enhance the musical experience. The interactions between musical processes are more interesting than the musical processes themselves (the whole being greater than the sum) but you can only read one thing at a time [1]. So perhaps it's better not to read one subroutine at a time, and to instead listen to all of them at once. Anyway, this all somehow becomes irrelevant if people are dancing, the focus then hopefully being away from the performer. Excuse me if I am coming across a bit vague, I'm coming off a general anaesthetic! alex [1] From "The raft" by Jose Saramago (in translation): "Writing is extremely difficult, it is an enormous responsibility, you need only think of the exhausting work involved in setting out events in chronological order, first this one, then that, or, if considered more convenient to achieve the right effect, today's event placed before yesterday's episode, and other no less risky acrobatics, the past treated as if it were new, the present as a continuous process without any present or ending, but, however hard writers might try, there is one feat they cannot achieve, that is to put into writing, in the same tense, two events which have occurred simultaneously. Some believe the difficulty can be solved by dividing the page into two columns, side by side, but this strategy is ingenuous, because the one was written first and the other afterwards, without forgetting that the reader will have to read this one first and then the other one, or vice versa, the people who come off best are the opera singers, each with his or her own part to sing, three, four, five, six between tenors, basses, sopranos and baritones, all singing different words, for example, the cynic mocking, the ingenue pleading, the gallant lover slow in coming to her aid, what interests the opera-goer is the music, but the reader is not like this, he wants everything explained, syllable by syllable and one after the other, as they are shown here." Perhaps this doesn't apply to a computer, which can have or emulate multiple processes. But does it apply to a human reading code or not? 
From: Ge Wang To be honest I am sceptical about whether reading and understanding > the code that a livecoder is making could really enhance the musical > experience. If that is true, then why do we (toplap) exist? =) I understand your skepticism, but I like to believe and sincerely hope that reading and understanding the code CAN truly enhance the musical experience (you do too, I am sure) - at least in the same way of watching/appreciating performances of traditional musical instruments. How we can achieve this is part of our research challenge. It depends on the syntax/semantic of the language/tool. Maybe, say LisP/ML/(and SC, to a good extent) can be hard to grok on screen, but it doesn't have to be that way. One of the our primary goals in ChucK is to make expressive, READABLE audio programs. And indeed, the way timing works in ChucK allows readers to follow timing the exact way as following program flow: -- // connect impulse (train) generator to dac impulse i => dac; // infinite time loop while( true ) { // set next sample to 1.0 1.0 => i.next; // advance time by 80 samples 80::samp => now; } --- You can follow the control structures and know exactly what and WHEN things are happening (generating a impulse train with period of 80 samples). Pair this with concurrency, and it allows you to follow parallel flows and you can reason about timing in a synchronous way across the entire system. Of course, this alone is probably not enough and there are other things that we can do to get the understanding across - like visualization the coding process and how code is running in the system at run-time http://audicle.cs.princeton.edu/ One of our goals is to make code a live instrument by conveying the intent of the musician to the audience, and to give the audience an opportunity to appreciate the process of realizing that intent. It is this lack of perceivable intent that really make most computer music utterly meaningless in live performance. We don't know if the performer succeeds in his/her gestures at any level because we don't know what those gestures are. The ability to understand and appreciate code is central because code is our gesture. I believe we must try to get this part right. Okay, I stop typing now. Best, Ge! Date: Fri, 22 Oct 2004 10:40:27 +0200 From: Julian Rohrhuber Ge: >> >What is our stance on combining non-livecode >> >software with state-approved systems during performance? > >Interesting question! > >> Maybe we should put out a licence (lcl) that disallows linking to any >> non live coding system.. > >Heh, I agree. It's easy, and perhaps useful to make idealistic >statements like "If software isn't written live, or interacted with, >then it is not a performance." While I these statements are useful for >discussion and self-reflection, I don't think they should bind anyone. yes, I would agree that these kind of statements do not lead very far. What is a 'performance' depends to a large degree on the context - and if something is set up as performance it is likely to be considered one. >To be honest I am sceptical about whether reading and understanding the >code that a livecoder is making could really enhance the musical >experience. The interactions between musical processes are more >interesting than the musical processes themselves (the whole being >greater than the sum) but you can only read one thing at a time [1]. So >perhaps it's better not to read one subroutine at a time, and to instead >listen to all of them at once. 
The question is when you hear a change in music that seems to be somehow externally caused you will try to relate it to something. Dependent on the change and the expectation you might relate it to a slider movement, a new preset loaded into the program, or a change in a text that stands for the sound. Reading the text might, and this is what I would hope for if just in time coding should be worthwhile, change the way I percieve sound phenomena and their interactions as well as make me aware of the active qualities of written text. > >[1] From "The raft" by Jose Saramago (in translation): > >"Writing is extremely difficult, it is an enormous responsibility, you >need only think of the exhausting work involved in setting out events in >chronological order, first this one, then that, or, if considered more >convenient to achieve the right effect, today's event placed before >yesterday's episode, and other no less risky acrobatics, the past >treated as if it were new, the present as a continuous process without >any present or ending, but, however hard writers might try, there is one >feat they cannot achieve, that is to put into writing, in the same >tense, two events which have occurred simultaneously. Some believe the >difficulty can be solved by dividing the page into two columns, side by >side, but this strategy is ingenuous, because the one was written first >and the other afterwards, without forgetting that the reader will have >to read this one first and then the other one, or vice versa, the people >who come off best are the opera singers, each with his or her own part >to sing, three, four, five, six between tenors, basses, sopranos and >baritones, all singing different words, for example, the cynic mocking, >the ingenue pleading, the gallant lover slow in coming to her aid, what >interests the opera-goer is the music, but the reader is not like this, >he wants everything explained, syllable by syllable and one after the >other, as they are shown here." > >Perhaps this doesn't apply to a computer, which can have or emulate >multiple processes. But does it apply to a human reading code or not? I think reading is not a process that really resembles a line from A to B. Specially when reading code (or magazines) it resembles much a strolling about, having a look at this, then at this - with the knowledge of each forming the context for the next. It turns out to be quite a wild montage of various bits of understanding. Even if reading a novel usually means going from one word to the next just as from one page to the next, this does not mean that the understanding builds up this way - an extreme case is maybe modern literature, but also many philosophic texts are more like a sculpture that wants to be looked at in various moments of the day, in different light situations. Live coding seems to me like an interaction with a kinetic sculpture, which can be only understood really by changing it. If code is an instrument, then playing it means changing it, if it is an environment, then understanding is means constructing experiments in it. In a novel I can make deductions - I can predict and retrodict events and reasons for behaviour. I would find it interesting to see the search for such experiments and causalities becoming a stronger part of perceiving algorithmic music. -- . Subject: Re: [livecode] ChucK performance video + sndpeek From: alex woah there - we're in danger of that "you should livecode the OS" > insanity again :] Heh. 
I don't think it's worthwhile to do that, although it's polite to credit whoever programmed the OS.

> I dearly hope you're all joking :)

Yes I think we were being sarcastic :)

> > [me being sceptical about people reading code during a performance]
> Live "stepping through a debugger" might be more descriptive - I'm
> not entirely kidding - I think program flow should be made obvious
> to the audience too

I agree!

> I guess I don't see live coding as being about typing or languages,
> but essentially about demonstrating a beautiful process that also
> sounds/looks{/smells?} good as a side effect.

I think "side effect" is an understatement. The sound/visual/odour is the way we experience the process with our bodies, so it's quite important as a medium. I think it's important to experience processes with our whole bodies, not just our minds...

> Maybe I'm a crazy heretic.

I agree again. :)

> Hope you're ok though, alex!

Yep am fine thanks!

alex

From: Ge Wang

>> I understand your skepticism, but I like to believe and sincerely
>> hope that reading and understanding the code CAN truly enhance the
>> musical experience (you do too, I am sure) - at least in the same way
>> as watching/appreciating performances of traditional musical
>> instruments.
>
> I don't think that's a fair comparison - you can't read or understand a
> traditional musical instrument.

You can understand a lot of things about a traditional musical instrument being played, most relevant to our comparison:

- you see/hear how the gestures of the performers translate to sound (there are exceptions - such as pipe organs)
- you often have a sense when the instrument sounds bad, or is poorly played
- if you are an experienced player of the instrument, you are more capable of appreciating virtuosic playing

What kind of understanding are you referring to?

>> Pair this with concurrency and you can follow parallel
>> flows
>
> I don't think it's possible for my brain to do that!

If you can read code through control flow, then it's possible to understand at least one flow, if the language/system lends itself to that. Parallel flow is harder, but if you can parse a band into its component parallel members, then maybe it's possible and interesting to do so with code.

>> One of our goals is to make code a live instrument by conveying
>> the intent of the musician to the audience, and to give the audience
>> an opportunity to appreciate the process of realizing that intent.
>
> The code does not relay the musician's intent any more or less than the
> music does. With a little difficulty, I might be able to read and
> understand what your code does, but I can only guess the intention
> behind writing it that way.

Perhaps the better word is "gesture", though intention is crucial too, but more on the level of "what sound/music gesture am I trying to make".

>> It is this lack of perceivable intent that really makes most computer
>> music utterly meaningless in live performance.
>
> Where is the perceivable intent in someone playing the trumpet?

As far as performance is concerned, it's the difference between hearing Miles Davis live - and hearing Miles Davis live but with Miles playing from inside an opaque box onstage. I want to see the performer interact with the instrument. The computer is unique because it is both the instrument and a performer. To see this interaction, seeing the code helps.
Furthermore, there should be the opportunity for the code to be understood (for experienced members of the audience) in order to enhance (intellectually, musically, gesturally) the appreciation.

>> We don't know if the performer succeeds in his/her gestures at any
>> level because we don't know what those gestures are.
>> The ability to understand and appreciate code is central because
>> code is our gesture. I believe we must try to get this part right.
>
> Ah, "gesture" is a better word than "intent" - I think I understand and
> sympathise with your position better now. However, if code is your
> gesture, then aren't you only making sounds to help people understand
> the source code behind it?

Because the end goal is to make meaningful and virtuosic sound/music, not to sonify code (though that could be a neat recursive bit of live coding - on-the-fly programming of a sonification of the live code itself).

Ge!

Date: Fri, 22 Oct 2004 13:22:25 -0700 (PDT)
From: Amy Alexander

a> > I understand your skepticism, but I like to believe and sincerely
a> > hope that reading and understanding the code CAN truly enhance the
a> > musical experience (you do too, I am sure) - at least in the same way
a> > as watching/appreciating performances of traditional musical
a> > instruments.
a>
a> I don't think that's a fair comparison - you can't read or understand a
a> traditional musical instrument.
a>

i think this is a really interesting comment, because there's a couple of ways to look at this. from one perspective i agree, but i usually find myself saying just the opposite when explaining my interest in livecoding performance to someone who says, "but it's of no use to someone who can't understand the code!"

to me what's interesting about watching someone perform a traditional (mechanically-operated) instrument is that even if you don't play that instrument or know anything about it, you still understand how it makes music. you can see/hear the causality between the performer's motions and the music you hear, and that's what makes it exciting. performances in which things get more kinetic are usually the most exciting, and that is played out in both visuals and sound. on the other hand, a person who plays that instrument will have a more sophisticated level of appreciation for the performance. but, in some cases, the knowledgeable audience member will be so far inside the process that they won't be able to enjoy the performance viscerally at all.

what i personally find most interesting about livecoding performance usually is not trying to understand the code that's being typed. what i find most interesting is the kinetic relationship between what i see on the screen and what i hear. i like slub a lot, e.g., because even though it may not be considered pure livecoding, there's a lot of action on the screen, and a lot of clear relationships between the onscreen activity and the sound produced - so i get a sense of the *motion* of the processes. with livecoding that involves longer bits of code being developed onscreen, more of the "innards" of the process are shown, but paradoxically, i get less of a sense of that process as a piece of music in motion. so the tricky thing for me with livecoding becomes balancing the presentation of internal kineticism (the motion within the process that motivated the performer to livecode in the first place) with external kineticism (a cause-effect relationship between the visual and the aural aspects of the performance.)
this of course assumes that we're trying to work within the model of how mechanical instruments are performed and the audience's experience with that model. it's also possible to try to move livecoding away from that model, just as early films eventually made the move away from documentation of stage plays toward a medium-specific model using montage, camera angles, time compression, etc.... and how software art tries to move away from the display-centric model of video and media art. in that case, the challenge becomes figuring out how the audience will come to relate to the new model - whether there first needs to be a gradual transition and development of increased literacy in the medium, or whether it can happen right away.

my approach with the thingee has been decidedly/admittedly non-purist, especially since i don't much trust myself to type long bits of code correctly onstage, or even figure out where my typos are quickly in a live situation. so i've just gone for the overall effect - combining livecoding with quickly-executed parameterized commands, and even some foot-stomps through FIFOSY (FIFOSY Is Foot-Operated Software Ya-know)... plus some actual livecode that acts on whatever text is clicked, so that the onscreen action of clicking can cause something immediate to happen, while still being part of a livecode process (though the process itself may not really be obvious enough live.)

btw, i actually did write an actual livecode FIFOSY implementation into the thingee, wherein you can assign an arbitrary snippet of livecode to a square on the dancepad, and then when you hit that square with your foot again, it will spit the code back onto the screen, which thus causes it to be executed. but it was too rough to try to do at the aarhus show, and i'm not sure i'll ever use it because it's pretty difficult to get people to understand any causality with a maneuver like that... oh well, foot in, foot out...

-@

Date: Sun, 24 Oct 2004 16:22:26 +0100 (BST)
From: Marcel Gonzalez Corso

> >> Pair this with concurrency and you can follow parallel
> >> flows
> >
> > I don't think it's possible for my brain to do that!
>
> If you can read code through control flow, then it's possible to
> understand at least one flow, if the language/system lends itself to
> that. Parallel flow is harder, but if you can parse a band into its
> component parallel members, then maybe it's possible and interesting
> to do so with code.

What about coloring the code line which is being executed differently? Like many debuggers do... And do that for all the threads (or scripts) that are playing? Arrange them on the screen(s). The viewer/listener could try to relate one sound pattern to one of the threads? To be readable i guess it would be necessary to sleep(some time) on every code line.

bluerg!

marcel
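Marcel's suggestion - highlight the line each thread is currently executing, like a debugger, and dwell on each line long enough for it to be read - can be roughed out even without editor support by having each shred announce its steps as it takes them. The following is only an illustrative sketch of that idea, in the same early-ChucK style as the examples above; the voice names, frequencies and step durations are invented for the illustration (and later ChucK versions capitalize class names, e.g. SinOsc).

--
// each shred announces the "line" it is about to execute,
// then holds time there so the viewer can read along
fun void voice( string name, float freq, dur step )
{
    sinosc s => dac;
    0.2 => s.gain;

    while( true )
    {
        <<< name, ": set freq to", freq >>>;
        freq => s.freq;
        step => now;

        <<< name, ": set freq to", 2.0 * freq >>>;
        2.0 * freq => s.freq;
        step => now;
    }
}

// two threads, slow enough to be readable
spork ~ voice( "voice A", 220.0, 800::ms );
spork ~ voice( "voice B", 330.0, 500::ms );

// keep the parent shred alive
while( true ) { 1::second => now; }
--

Slowed down like this, each printed step can be related to the sound pattern of the shred that produced it, which is roughly the viewer/listener experiment Marcel describes.

Date: Sun, 24 Oct 2004 11:32:22 -0700 (PDT)
From: Amy Alexander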