[livecode] PIGS beta performance video

From: Amy Alexander <amy_at_plagiarist.org>
Date: Tue, 6 Oct 2015 15:42:30 -0700

Howdy all,

I've been working with some collaborators for the past coupla years on a
visual performance system which, while not specifically livecoding, is
geared toward performative liveness in a different way - so I thought I'd
share it here. The idea is to facilitate improvisational, immediate, and
impulsive visual performance. (Ultimately I'd like to increase the
"impulsive" aspect, as one of my interests is in thwarting conventions of
software that assume a docile performer or user.) I'd like to use it in
various contexts, but right now I'm focusing on screen-based stage
performance in which the visualist improvises as a member of a musical duo
or ensemble. In response to the eternal questions about visuals
subordinated to audio (or vice versa), synch vs. no synch, etc.: I
generally approach visuals here as a "part" - e.g., in a string quartet
the cello has a part that differs from the other parts and at various
moments may or may not be synchronized, contrapuntal, etc.

It took some time to get the system all working, and it's still a bit
rough. Plus I'm still a very clumsy beginner at this instrument we're
building as we go along. But we did our debut/beta performance in
September. The system is called PIGS (Percussive Image Gestural System),
and the piece we did for the beta performance is called Rocket's Red Glare
(Things Exploding on YouTube). It's an improvisation between myself on
visuals (PIGS), and Curt Miller on clarinet and computer audio:

https://vimeo.com/139920521

Basically, the setup for the PIGS visuals is that I can draw forms in four
channels (layers) of video - in this show, I used three iPads and a Leap
Motion. The forms are filled with videos that are processed in various ways
(that part is actually pretty extensive, but I'll skip over it for now).
For slow, "legato" passages you might continuously draw forms, but at
other times you may want to play faster, more aggressively, more staccato
or more percussive. So here's where the drums come in: Each drawn gesture
can be replayed by striking the corresponding drum, but it will change
according to each stroke. So for example if you play fast and loud, the
gesture might replay faster and larger, and it may also have some mutations
based on ambient sound, etc. (For a loose metaphor, think of regular
percussion - if you hit a cymbal repeatedly it'll have roughly the same
timbre, but the velocity, duration, and possibly pitch may change
depending on how and where you hit it, what stick or mallet you use,
whether you choke it, etc.)
I use musical percussion as a performance model since it makes sense to me
(a former percussionist) as an approach to performative visuals, and I do
use sound as one of several factors that influence the gestural forms -
but in general I avoid literal sound/image translations.

Besides the pads and drums, as you'll see in the video, I've got various
controls on the laptop, MIDI controllers, and another iPad. (I'm always
trying to reduce the amount of twiddling around, but so far, some of it is
necessary.) Curt and I each have foot-based MIDI controls for some things,
so we can leave our hands free for performance as much as possible.
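
To give a very rough idea of the drum-to-gesture mapping, here's a tiny
Python sketch. It's not the actual Max/Jitter patch - the names, scaling
constants, and the ambient-sound input are all made up for illustration -
but it shows the general shape: harder hits replay the gesture faster and
larger, and ambient sound adds a little mutation.

import random
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    dt: float  # seconds since the previous point in the original drawing

def replay(gesture, velocity, ambient_level, rng=random):
    """Yield a transformed copy of a recorded gesture for one drum hit.

    velocity      -- MIDI-style strike velocity, 0..127
    ambient_level -- ambient sound level, 0.0..1.0 (hypothetical input)
    """
    v = velocity / 127.0
    speed = 0.5 + 1.5 * v          # harder hit -> faster replay
    scale = 0.5 + 1.0 * v          # harder hit -> larger form
    jitter = ambient_level * 0.05  # louder room -> more mutation
    for p in gesture:
        yield Point(
            p.x * scale + rng.uniform(-jitter, jitter),
            p.y * scale + rng.uniform(-jitter, jitter),
            p.dt / speed,
        )

# e.g. replay a three-point gesture after a hard hit in a noisy room:
gesture = [Point(0.1, 0.1, 0.0), Point(0.3, 0.5, 0.02), Point(0.6, 0.4, 0.02)]
for p in replay(gesture, velocity=110, ambient_level=0.7):
    print(p)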

Curt's audio system uses live clarinet and samples from the videos, which
he loops using the software patch he wrote. (Curt's audio patch is written
in PD; the visual patch is written in Max/Jitter.) He also sometimes uses a
vintage cassette recorder in performance as a looping/playback device.

More info on PIGS is at http://amy-alexander.com/live-performance/pigs.html

Would be great to hear what people think!

Re:gards,
-Amy
