> On Wed, 2006-11-15 at 09:40 -0800, Dylan McNamee wrote:
>> Is this an acceptable forum for discussing architectural options for
>> implementing livecoding systems?
>
> Yes
>
>> In particular, I'm thinking of building one of my own, and I'm trying
>> to figure out what my options are, in particular regarding managing
>> time.
>>
>> Is the Unix "interval timer" good enough? What makes "good enough" --
>> is the main issue jitter? What's a good way of measuring jitter with
>> reasonably high resolution?
I don't know if it's a common approach (I made a lot of this stuff up as I
went along), but I find it works well to run the interpreter you are
livecoding a second or so ahead of the audio backend. It helps keep things
stable and means the interpreter doesn't have to run in a realtime thread.
The latency doesn't really matter if you're programming. This has changed
with my current setup, which graphically displays thread positions, but
nobody seems to notice yet.
I use POSIX gettimeofday() for non-realtime timing, and sample counting
from the audio card on the realtime thread - this is probably the Wrong
Way, but it's the only way of doing it I could think of, and it seems to
work for me.
> What is more important in my view is syncing with other people, and
> without talking about MIDI, as far as I know there is no operating
> system/application neutral standard way of doing that. I'd like to hear
> about it if there was, and have long wanted to collaborate on
> formalising something OSC based. Has no one done that yet?
Perhaps we should just publish the highly complex "slub syncing message
format"? :)
cheers,
dave
Received on Thu Nov 16 2006 - 13:14:28 GMT