Re: [livecode] livecode systems architecture questions

From: alex <alex_at_slab.org>
Date: Thu, 16 Nov 2006 00:15:44 +0000

On Wed, 2006-11-15 at 09:40 -0800, Dylan McNamee wrote:
> Is this an acceptable forum for discussing architectural options for
> implementing livecoding systems?

Yes

> In particular, I'm thinking of building one of my own, and I'm trying
> to figure out what my options are, in particular regarding managing
> time.
>
> Is the Unix "interval timer" good enough? What makes "good enough" --
> is the main issue jitter? What's a good way of measuring jitter with
> reasonably high resolution?

Others will have better technical advice than me, but in my experience,
if you have a few things playing at the same time, timer inaccuracy
becomes unimportant (as long as all the things are inaccurate in
different ways). Slightly fumbled timing adds character anyway, or so I
tell myself.
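
For what it's worth, a rough way to get a feel for timer jitter is to
schedule a fixed period and log how late each wakeup actually arrives.
A minimal sketch in C, assuming a POSIX clock_gettime with
CLOCK_MONOTONIC (on OS X you would probably reach for
mach_absolute_time instead):

/* Rough jitter probe: request a fixed sleep interval and record how
 * far each wakeup drifts past the ideal schedule. */
#include <stdio.h>
#include <time.h>

#define INTERVAL_NS 10000000L   /* 10 ms target period */
#define ITERATIONS  1000

static long ns_between(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1000000000L + (b.tv_nsec - a.tv_nsec);
}

int main(void)
{
    struct timespec prev, now;
    struct timespec req = { 0, INTERVAL_NS };
    long worst = 0;

    clock_gettime(CLOCK_MONOTONIC, &prev);
    for (int i = 0; i < ITERATIONS; i++) {
        nanosleep(&req, NULL);
        clock_gettime(CLOCK_MONOTONIC, &now);
        long jitter = ns_between(prev, now) - INTERVAL_NS;
        if (jitter > worst)
            worst = jitter;
        prev = now;
    }
    printf("worst-case lateness over %d ticks: %ld us\n",
           ITERATIONS, worst / 1000);
    return 0;
}

If the worst case stays well under your shortest musical subdivision,
the plain interval timer is probably good enough.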

What is more important in my view is syncing with other people, and
MIDI aside, as far as I know there is no operating-system- and
application-neutral standard way of doing that. I'd like to hear about
it if there is one, and I have long wanted to collaborate on
formalising something OSC-based. Has no one done that yet?
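
Just to make that concrete, the sort of thing I imagine is each system
broadcasting its bar/beat position and tempo over OSC. A minimal sketch
using liblo; the /sync/beat address, its fields and the port number are
made up for illustration, not any agreed standard:

/* Broadcast a bar/beat position so other machines can fall in step. */
#include <lo/lo.h>

int main(void)
{
    /* hypothetical peer (or broadcast) address and port */
    lo_address peer = lo_address_new("192.168.0.255", "57130");

    int   bar   = 4;
    int   beat  = 2;
    float tempo = 135.0f;   /* BPM */

    /* "iif" = two ints and a float */
    lo_send(peer, "/sync/beat", "iif", bar, beat, tempo);

    lo_address_free(peer);
    return 0;
}

Agreeing on the message layout, and on how to handle clock drift
between machines, is the part that would actually need formalising.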

> Since I'm going to be doing this on MacOS X, what about using NSTimer
> or the RunLoopTimer? Alternatively, should I consider building my
> livecoding system as a kernel module and using the real-time
> facilities provided to device drivers?

Livecoding inside the kernel, wow!

Also timing-related: has anyone built these kinds of performance rules
into their systems?
  http://www.speech.kth.se/music/performance/performance_rules.html
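
I've wondered what wiring one of those in would look like in practice.
As a toy illustration only, loosely in the spirit of a
duration-contrast-type rule rather than the actual KTH formulation,
something like this could nudge note lengths apart before scheduling
them:

/* Toy "duration contrast"-style rule: exaggerate the difference
 * between short and long notes by a small factor k. */
#include <stdio.h>

/* note durations in milliseconds */
static void duration_contrast(double *dur, int n, double k)
{
    double mean = 0.0;
    for (int i = 0; i < n; i++)
        mean += dur[i];
    mean /= n;

    /* push each duration away from the phrase mean by a fraction k */
    for (int i = 0; i < n; i++)
        dur[i] += k * (dur[i] - mean);
}

int main(void)
{
    double phrase[] = { 250, 250, 500, 250, 1000 };
    duration_contrast(phrase, 5, 0.15);
    for (int i = 0; i < 5; i++)
        printf("%.0f ms\n", phrase[i]);
    return 0;
}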


alex
Received on Thu Nov 16 2006 - 00:17:56 GMT