Re: [livecode] non-linguistic programming

From: douglas edric stanley <destanley_at_mac.com>
Date: Thu, 3 Jan 2008 15:49:01 +0100

Hi Alex.

>> Of note: I call them text-based languages because
>> I think (as I describe in the paper) that they
>> have very little to do with language, and have
>> everything to do with the way in which ASCII is
>> represented within the machine.
>
>I do not fully understand your argument, I think I'm missing
>something... A novel is a list of characters in the same way that a
>text based program is.

Ok, there are several responses to this. I can't get to them all here.

For me a novel is not just a list of characters, just as a picture is something far more than a mere string of pixels in an array. The internal mechanics of a novel's representation inside a computer are merely an exploit, a hack, that computer science has found, although a handy one at that. But just because a novel can be represented inside a computer through an array, and a picture can be represented inside a computer through an array, this does not mean that novels and pictures are nothing more than strung-together tokens. I think this is quite obvious. They might be so in a computer, but that is merely a problem of representation, and worse, a mere representation within the computer architecture.

So if you'll follow me on this train of thought, there is no real fundamental difference between programming via characters inside a char * array and programming via pixels inside an int * array. We only have to look at the Game of Life, or SimTunes, or some of Dave Griffiths' examples previously listed here, to see that you can very well program with tokens of all sorts. I created a video-mosaic programming language a few years back to illustrate something along these lines. There are many other examples. I think, like others here, that games are probably the wave of the future on the question of programming and its representation.
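
To make that concrete, here is a toy sketch of my own -- not taken from any of the projects above, and the "opcodes" and colour codes are invented purely for illustration -- showing that the "walk the array and react to each token" loop is the same whether the tokens are characters or pixel values:

#include <iostream>

// A char-based toy interpreter: '+' increments, '-' decrements, '.' prints.
void run_text(const char* program) {
    int value = 0;
    for (const char* p = program; *p != '\0'; ++p) {
        if (*p == '+') ++value;
        else if (*p == '-') --value;
        else if (*p == '.') std::cout << value << ' ';
    }
    std::cout << '\n';
}

// The same interpreter driven by "pixels": an array of colour codes, where
// 0xFF0000 (red) increments, 0x0000FF (blue) decrements, 0xFFFFFF (white) prints.
void run_pixels(const int* pixels, int count) {
    int value = 0;
    for (int i = 0; i < count; ++i) {
        if (pixels[i] == 0xFF0000) ++value;
        else if (pixels[i] == 0x0000FF) --value;
        else if (pixels[i] == 0xFFFFFF) std::cout << value << ' ';
    }
    std::cout << '\n';
}

int main() {
    run_text("++.+.-.");                                  // prints: 2 3 2
    const int image[] = {0xFF0000, 0xFF0000, 0xFFFFFF,
                         0xFF0000, 0xFFFFFF, 0x0000FF, 0xFFFFFF};
    run_pixels(image, 7);                                 // prints: 2 3 2
    return 0;
}

The only thing that changes between the two functions is the type of the token; the structure of the "program" is identical.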

To get back to my original observation: the
special thing about character-based programming
is that the trick is almost readable. I say
almost, because most people can't figure out what
the fuck is going on. For example, I'm writing a
lot of C++ right now, and in that language you
can write a lot of fascinating obfuscations like
this:

for(const char*i="Hello"; *i!='\0'; ++i) std::cout << *i << std::endl;

I'm sorry to offend all the robots out there, but
that's totally unreadable unless you understand
the underlying architecture of the computer. You
have to understand memory addresses and pointers
and loops and streams, etc. This is not writing; it has nothing to do with literature. This is more like knitting. So my real question is: if we are already knitting, why not just knit?
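
Just to spell out how much that single line presupposes, here is the same loop expanded into a complete program, with comments naming the machine-level knowledge a reader is assumed to carry (the expansion and the comments are mine, added for illustration):

#include <iostream>

int main() {
    // "Hello" is a NUL-terminated array of bytes sitting somewhere in memory;
    // i starts out holding the address of its first character.
    for (const char* i = "Hello"; *i != '\0'; ++i) {
        // *i dereferences that address to fetch the character it points to;
        // ++i steps the address forward by one byte;
        // the loop stops when it reaches the terminating '\0'.
        std::cout << *i << std::endl;   // stream each character on its own line
    }
    return 0;
}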

Which returns me to the question of beauty. I'm
sorry, but code is ugly. Knitting is beautiful,
coding is ugly. Sure, I absolutely love the above
C++ code, I find it -- gasp -- kind of lovely, at
least fascinating. But it's still very geeky and
bastardly, i.e. ugly in the etymological sense:
it's dreadful. I'm assuming that most of us here
can read/write several computer languages, and
therefore can see the structure, the machines,
behind this code. So suddenly it becomes
something else. But not because of the glyph. The glyph is only a token; I want to say "un-signe-quelconque" (an any-sign-whatever) in homage to Deleuze's "instants-quelconques" (any-instants-whatever).

It can be rendered fascinating, lovely,
beautiful, whatever -- but you have to transform
it into a performance, re-render it into an
aesthetic form, etc. -- and play with the fact
that a computer glyph changes the use-value of
the more historical glyph. I suppose that's why I
love Livecoding as a sort of artistic manifesto.
The art-hack of livecoding is a very novel
response to the computer science hack of char *
arrays. It uses the constructs of art-performance
to redefine code itself, by revealing something
in the code that talks to artists, as if it were
always there in the code to begin with. It also
speaks to the history of glyphs. That's pretty
damn impressive in my book, but it still doesn't
hide the fact that text-based coding is merely a
hack and could be replaced by all sorts of other
forms of representation.

All that said, I love to code and I love to look
at it. I think it's quite amazing. I'm also
fascinated with this historical hack of associating the physical keyboard array with the char * array and with the programming stack. But, viewed from the perspective of the history of human representations, I cannot for the life of me see how people can keep coding like this.

There is nothing moral in these arguments. I'm
not saying code is ugly in the sense that code is
"bad". I'm just saying that it's historically
curious, and in my opinion will transform into
something more historically profound. Hence
knitting, music, mosaics, who-knows...

> I don't see why this linearity leads to
>ugliness.

Well, I don't know if that's what I was trying to suggest. I just think that, historically, text-based programming facilitated modularity and abstraction to an amazing degree, thanks to all of the modularity and abstraction already inherent in text systems. Text is also linear, as was the Turing Machine -- hence the alliance between the two -- but both of them are far more than that, too. So it's a mere alliance of convenience, like all alliances. But their architectures cannot be reduced to mere linearity. Computation does not need text to survive. Let us not confuse the two.

And so it seems quite revealing to me that the
text-computation pact would truly begin to show
its cracks at precisely the moment when we are
forced to work with multiple processors, threads,
etc. in order to evolve. The processor speed wall
has been good at revealing this problem.

I think it's also significant that many artists
use Max/PureData and all their cousins (vvvv,
EyesWeb, etc) without any trouble and do some
pretty amazing things with them. This leads us
back to the original Google talk.

I think it's an important question to ask, even if we never get past it: was historically tying computation to the linear text-string-of-perls the best way to represent it?

>Are pure functional programs immune from your criticism?

I don't know. I'll have to think about that.

> For example
>Haskell programs are not sequences of instructions (leaving aside
>monadic trickery) and so functions may be executed in any order,
>concurrently across any number of processors.

Yes, that's a very cool concept.

> They are however
>represented in ASCII, with two dimensional structure (linefeeds and
>whitespace have meaning).

Haskell is still pretty unreadable, therefore a
bit of a kludge. But at least it looks unreadable
from the get-go and doesn't pretend otherwise.
But you have to be a computer scientist to read
it, whereas I'm going in the other direction.
Anyone can learn to do puzzles and to knit, so I
think that's a far richer direction to explore.

I always wonder about Lisp. It seems to me to be a largely overlooked language.
-- 
/*
// Douglas Edric Stanley
<douglas_at_abstractmachine.net>
// Artist
http://www.abstractmachine.net
// Professor of Digital Arts, École supérieure d'art d'Aix-en-Provence
http://www.ecole-art-aix.fr/
// Researcher, Laboratoire Esthétique de l'interactivité, Université de Paris 8
http://www.ciren.org
*/
Received on Thu Jan 03 2008 - 14:51:24 GMT
