Thursday, January 19, 2006


Note: This article uses "computational art/design" as a compound term that also includes "generative" approaches.

Code lies at the very heart of computational design, a discipline becoming increasingly popular, as proven by the mushrooming number of blogs, books, conferences and workshops - all using "code" as their core concept and pivotal sales hook. Yet there's apparently little intelligent discussion taking place about Code (with a capital C) that goes beyond the art-theoretical/cultural mindset and touches more on its raw (dare I say "technical") side: discussion about its manifold structures, its expressiveness, its metaphors in generative systems...

Code is language. It can be used and articulated in infinite ways. Poets use language in different, often far more subtle and sophisticated ways than our average modes of conversation. Their mastery of and/or unique approach to language is what makes them artists, even before theorists can utter the words "political" or "historical" as their main point of interest. I'd much rather subscribe to something like Tolstoy's naturalistic description of art.

So personally, and especially with regard to computational art, I find myself repeatedly in direct conflict with the often-voiced opinion that literacy in the digital medium is unattainable or even undesirable.

Processing... a real phenomenon. Heralded as the new "it" tool for computational artists, it doesn't actually embrace or promote any state-of-the-art software designs (i.e. code structures). True, Processing has been developed primarily as a teaching tool and has always had a beginner audience in mind, yet I've been thinking for quite some time that it merely delays the learning curve and lures in an increasing number of users (or shall I say "ongoing generative artists"?) with its easy-to-learn (and easy-to-teach!) syntax for quick, mainly visual, bang for the buck. There's no arguing about its potential as a digital sketching tool and its suitability for short workshops.

Being focused on small code sketches/experiments and used by various respected artists, the tool generated a huge amount of interest fairly quickly. In retrospect (well, for me after almost 3 years) I also think it encouraged a slightly superficial view of computational design by quickly gaining cult status among people who had never been exposed to programming before. I think it's dangerous, and a sign of crisis, if every recycled L-system, neural network, Wolfram automaton or webcam tracking experiment is automatically considered art (by its author), simply because it's been "(Re)Built with Processing"... Of course this is in no way an attack on the tool or its intent, but it is my growing issue with the surrounding community ethos. We have blogs proclaiming that data equals nature and that math is the language of nature, yet there doesn't seem to be any deep understanding of the importance of clean code design and intelligent data structures, or even community interest in researching and experimenting with those further, artistically.

boolean isWrong = (isExperimental != hasGoodDesign);

In fact, from conversations with various fellow Processing users and lecturers, I gather most are not aware of the total absence of decent software design in the majority of the work produced with the tool so far. Due to the simplicity of its syntax, authoring environment and reference examples, the coding style it implicitly encourages sits somewhere between procedural C programming (minus the pointer mess) and barely scratching the surface of object-oriented design.
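To make that contrast concrete, here is a minimal sketch (in plain Java rather than a full Processing sketch, and with hypothetical names) of the two styles: parallel arrays with the logic inlined into the loop, versus a small class that keeps state and behaviour together.

```java
// Style A: what many beginner sketches end up with -- parallel arrays,
// behaviour inlined into the draw loop (procedural C, minus the pointers).
class ProceduralStyle {
    float[] x = new float[100];
    float[] vx = new float[100];

    void step() {
        for (int i = 0; i < x.length; i++) {
            x[i] += vx[i]; // state and behaviour live apart
        }
    }
}

// Style B: a minimal object-oriented design -- state and behaviour
// together, which scales better once particles gain trails, forces,
// lifespans and so on.
class Particle {
    float x, vx;

    Particle(float x, float vx) {
        this.x = x;
        this.vx = vx;
    }

    void step() {
        x += vx;
    }
}

public class DesignSketch {
    public static void main(String[] args) {
        Particle p = new Particle(0f, 1.5f);
        p.step();
        System.out.println(p.x); // prints "1.5"
    }
}
```

Both do the same work; the point is that only one of them gives you a vocabulary that grows with the idea.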

In terms of the pure expressiveness of ideas, concepts and thought processes as code, Processing is inferior to straight Java or to dynamically typed languages like JavaScript or Ruby. Its ease of use has been gained by sacrificing scalability. Processing is built on convenience methods, and indirectly it shows: the Processing community at large has started to grow into one of consumers of previously written code.

On the other hand, I believe artists (ongoing or not) working with "computational strategies" (can we please quit the marketing speak?) must, or at least should, be aware of and work on intelligent software designs in order to advance the(ir) discipline. Form follows function.

In response to that, I also believe it might hurt Processing as a platform in the future if experienced users find themselves forced to break out and leave the tool behind. To pre-empt this, I think the community at large should pay more attention to, and spend time on, extending the current library base. Above all, library authors should respect the tremendous amount of work put in by Ben+Casey so far and likewise embrace the open source mentality of their core tool. The licenses are many (as well as much misunderstood - choose wisely!). That way existing library functionality can be extended further without having to reinvent the wheel (yet again!)...

An extensive library base will help the tool's longevity, even if users slowly outgrow Processing's initial proposition and continue to use only the libraries themselves.

Open source is for doers. Happy, belated 2006! Glad I'm still alive...