[livecode] livecoding on mars / a data streaming protocol

From: Adam Smith <adam_at_adamsmith.as>
Date: Wed, 29 Dec 2010 04:17:19 -0800

Hi All,

I wanted to share a livecoding-relevant line of thought I've been on for a
while. It ropes together space exploration, artificial intelligence,
livecoding, and language/protocol design -- so I hope it's got something for
everybody.

Ok, the story starts with me posing a pointed question on Quora:
http://www.quora.com/What-are-some-realistic-projects-that-might-require-a-local-programmer-on-Mars
(I bet you can guess the question from the url.) Being interested in visiting
Mars myself, I wondered what reasonable professional excuse I (in a coding
capacity, live or not) might have for going. The gist of my answer (to my own
question) is that, to overcome Earth-sourced competition, you'd have to
deliver relevant software within 30 minutes, easing the biggest permanent
barrier between Earth and Mars: communication delay. A job for livecoders,
right? If you read the original Quora post, you can see my little story about
livecoding data uplink policies for a rover exploring the icy poles.

A few months later, inspired by reading some Vinge perhaps, I got to
thinking about this long-distance communication problem again. Fancy codecs,
swarmed transceiver arrays, anywhere power generation -- if we had all that,
we still couldn't beat the EM propagation delay. My conclusion:
"Content-adaptively reducing the number of round trips will be the most
important problem in communication as we explore space."
(http://twitter.com/rndmcnlly/status/12747633357488129).

Today, I'm finally putting these two threads of thought together. Over longer
and longer distances (and higher and higher rates of technological progress
and infrastructure build-out), serious expansions to the data you are trying
to transmit are going to occur within the space (time) of a single round trip
in communications. In this situation, it is natural to think of livecoding
(in a sense) the protocol used for communication, knowing that both the
transmitting and receiving ends are intelligent and capable of
live-modification of their computational systems. The result (at least of an
hour or so of thinking) is my "adaptive upgrading stream protocol" described
here:
https://docs.google.com/document/pub?id=1hICQH4widVmoV6XT3b6MZpLyZz_aR3ThP7ITCYbeIug
(which I think you would like to read).
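
To make "upgrading the stream from inside the stream" concrete, here is a toy
Python illustration of my own (not the design in the linked doc): frames
either carry payload or redefine, on the fly, the decoder that will interpret
later frames.

# Purely illustrative sketch: a frame either carries data or carries code
# that replaces the receiver's live decoder, so the stream can be upgraded
# mid-flight without waiting for a round trip.
DATA, UPGRADE = 0, 1

def receive(frames):
    """Replay a frame sequence; UPGRADE frames replace the live decoder."""
    def decode(payload):              # initial, trivial decoder
        return payload
    events = []
    for kind, body in frames:
        if kind == UPGRADE:
            scope = {}
            exec(body, scope)         # body is code defining a new `decode`
            decode = scope["decode"]
        else:
            events.append(decode(body))
    return events

frames = [
    (DATA, "raw reading 1"),
    (UPGRADE, "def decode(payload): return payload.upper()"),
    (DATA, "raw reading 2"),
]
print(receive(frames))   # ['raw reading 1', 'RAW READING 2']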

The gist? Imagine your interaction with the repl/shell of your favorite
livecoding tool. Now imagine sending the log of all of the logic you redefine
and all of the immediate snippets you execute over this hugely-delayed link
through space. If the receiver on the other end plays back the same sequence
into a local tool, the same interesting event stream is produced (a stream of
musical events, say, bubbling with complexity, despite never having
transmitted anything particularly dense or complex at any one time). This
isn't quite how it would work, but it's the right mental picture.
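
Here is a minimal sketch of that mental picture, assuming a toy "tool" that is
nothing more than a Python namespace plus an event list. Only the log of
snippets crosses the delayed link; the (much larger) event stream is
regenerated on the far side.

def make_tool():
    events = []
    env = {"emit": events.append}   # snippets call emit(...) to produce events
    return env, events

def run(log, env):
    for snippet in log:
        exec(snippet, env)

# What the livecoder actually typed, in order -- this is all that is sent.
log = [
    "def note(n): emit(('note', n))",
    "for i in range(4): note(60 + i)",
    "def note(n): emit(('note', n, 'staccato'))",   # live redefinition
    "for i in range(4): note(72 - i)",
]

sender_env, sender_events = make_tool()
run(log, sender_env)

receiver_env, receiver_events = make_tool()
run(log, receiver_env)            # replayed minutes later, after the delay

assert receiver_events == sender_events
print(len(log), "snippets reproduced", len(sender_events), "events")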

When inherent intervals (such as communication delay over long distances) or
rates (such as the bandwidth available to untethered deep-sea
probes/submarines) approach the natural rates of human programming effort
(reactivity within ~30 seconds, average effective code change rate ~3
chars/sec in some programming language), livecoding becomes a very natural
concept around which to build communication systems. Well, this assumes you
have some human livecoders around. Lacking them (on Mars, say), we can still
think about programming the rover to spot regularities and adapt to changes
in data sources, using the same protocols.
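
For scale, some back-of-the-envelope numbers, using the ~3 chars/sec figure
above and the familiar 3-22 minute one-way light time between Earth and Mars:

# How much livecoded change fits inside one Earth-Mars round trip?
CHARS_PER_SEC = 3                       # rough human code-change rate

for one_way_min in (3, 22):             # closest / farthest approach
    round_trip_sec = 2 * one_way_min * 60
    chars = CHARS_PER_SEC * round_trip_sec
    print(f"{one_way_min:>2} min one-way: "
          f"~{chars:,} chars of change per round trip")

# At the far end that's on the order of a small module's worth of edits
# before a single reply can arrive -- hence the value of adapting the
# stream without waiting.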

I know my thoughts here aren't too well organized, but I think it's an
important new direction of thought that we toplappers are uniquely prepared
to consider.

Adam
Received on Wed Dec 29 2010 - 12:18:26 GMT
