Friday, March 05, 2004

loading Targa images

if you ever wanted to load images with an embedded alpha channel, TGA (Targa) is a feasible format as long as the image is not too big or too complex. this is because the bitmap is stored uncompressed and tends to be very large in filesize, so for sketches intended to run online, the only compression applied to it will be the deflate compression of the .jar file the image resides in. in my tests i got results between 25-50% of the original filesize. below is some code for loading such an image. like the default loadImage() function, loadTarga() also returns a standard BImage object.

void setup() {
  BImage img = loadTarga("test.tga");
  // resize screen to image dimensions
  size(img.width, img.height);
  image(img, 0, 0);
}

BImage loadTarga(String file) {
  // load image file as byte array
  byte[] buffer = loadBytes(file);

  // check if it's an uncompressed truecolour TGA with 8 bits/colour channel
  if (buffer[2] == 2 && buffer[17] == 8) {
    // get image dimensions (stored as little-endian 16 bit values)
    int w = (b2i(buffer[13]) << 8) + b2i(buffer[12]);
    int h = (b2i(buffer[15]) << 8) + b2i(buffer[14]);
    // check if image has alpha (32 bits/pixel instead of 24)
    boolean hasAlpha = (buffer[16] == 32);

    // setup new image object
    BImage img = new BImage(w, h);
    img.format = (hasAlpha ? RGBA : RGB);

    // targas are written upside down, so we need to parse them in reverse
    int index = (h - 1) * w;
    // actual bitmap data starts at byte 18
    int offset = 18;

    // read out line by line
    for (int y = h - 1; y >= 0; y--) {
      for (int x = 0; x < w; x++) {
        // merge colour components first (targa stores them in BGR order,
        // which matches the low 24 bits of processing's ARGB pixel format)
        img.pixels[index + x] = b2i(buffer[offset++]) |
                                b2i(buffer[offset++]) << 8 |
                                b2i(buffer[offset++]) << 16;
        // then set alpha based on data or revert to 100% opacity
        // (if there's only a 24bit image)
        if (hasAlpha) img.pixels[index + x] |= b2i(buffer[offset++]) << 24;
        else img.pixels[index + x] |= 0xff000000;
      }
      // next scanline
      index -= w;
    }
    return img;
  }
  println("loadTarga(): wrong image format");
  return null;
}

// byte to (unsigned) integer conversion
int b2i(byte b) {
  return (b < 0 ? 256 + b : b);
}
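for anyone curious how the header decoding works outside of processing, here's a minimal standalone java sketch of the same idea: the signed-byte-to-unsigned conversion plus the little-endian 16 bit dimension read. the class name and the example byte values are made up for illustration, not part of the original code.

```java
// standalone demo of the TGA header decoding used above;
// the example bytes are invented for illustration only
public class TgaHeaderDemo {

  // byte to unsigned integer conversion, same logic as b2i() above
  static int b2i(byte b) {
    return (b < 0 ? 256 + b : b);
  }

  public static void main(String[] args) {
    // pretend these are bytes 12..15 of a TGA header:
    // width = 640 (0x0280), height = 480 (0x01e0), little-endian
    byte[] header = { (byte) 0x80, 0x02, (byte) 0xe0, 0x01 };
    int w = (b2i(header[1]) << 8) + b2i(header[0]);
    int h = (b2i(header[3]) << 8) + b2i(header[2]);
    System.out.println(w + "x" + h); // prints "640x480"
  }
}
```

the b2i() trick is needed because java bytes are signed: 0x80 reads as -128, so values above 127 have to be shifted back into the 0..255 range before combining them.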

Thursday, March 04, 2004

hair dynamics

robert hodgin posted a great little sketch about fluid dynamics a few days ago. at its core is a simple particle system & great use of the (perlin) noise() function. based on each particle's position, a noise value is computed and interpreted as a rotational angle. particles are then drawn as lines in the direction of their current angle.
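the core trick can be sketched in a few lines of standalone java. note this is my illustration, not robert's code: processing's noise() isn't available outside a sketch, so fakeNoise() below is a crude stand-in (not smooth like perlin noise, but any function returning a value in [0,1] shows the mapping).

```java
// illustration of "noise value interpreted as rotational angle";
// fakeNoise() is a stand-in for processing's perlin noise()
public class NoiseAngleDemo {

  // crude stand-in: returns a value in [0,1], unlike real perlin
  // noise it is not smooth, but it's enough to show the principle
  static double fakeNoise(double x, double y) {
    return (Math.sin(x * 12.9898 + y * 78.233) + 1) * 0.5;
  }

  public static void main(String[] args) {
    double px = 40, py = 25; // particle position
    double len = 10;         // length of the drawn line

    // map the noise value at the particle's position to a full rotation
    double angle = fakeNoise(px * 0.01, py * 0.01) * Math.PI * 2;

    // endpoint of the line drawn in the particle's current direction
    double x2 = px + Math.cos(angle) * len;
    double y2 = py + Math.sin(angle) * len;
    System.out.printf("line(%.1f, %.1f, %.1f, %.1f)%n", px, py, x2, y2);
  }
}
```

in the actual sketch this happens every frame for every particle, so neighbouring particles (which see similar noise values) end up pointing in similar directions, giving the fluid-like flow.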

after playing around with the code for a while, i suddenly realised how easily it could be adapted to simulate hair. in principle, all that was needed was to increase the length of the particle lines and add some lighting calculations for the gloss. each particle now receives a slightly randomized base colour when it is created, which is then modulated by the line's current angle in relation to the light direction. the resulting play and movement of the highlights and dark areas makes it hard to believe there's no 3d involved at all and that the image is "only" made of 60000 shaded lines.
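one plausible way to do that modulation looks like the standalone java sketch below. the constants (light angle, ambient minimum, base colour) and helper names are my assumptions for illustration, not the sketch's actual code: the brightness factor is just the cosine of the angle between the line and the light direction, clamped so fully shadowed lines keep a little ambient colour.

```java
// illustration of modulating a per-particle base colour by the
// line's angle relative to a light direction; all values invented
public class HairShadingDemo {

  // diffuse-style factor: brightest when the line is aligned with
  // the light, clamped to a minimum so dark areas aren't pure black
  static double shade(double lineAngle, double lightAngle) {
    double d = Math.cos(lineAngle - lightAngle);
    return Math.max(0.15, d); // 0.15 = ambient minimum (assumed)
  }

  public static void main(String[] args) {
    double lightAngle = Math.PI / 4;          // light from upper right
    int baseR = 200, baseG = 150, baseB = 60; // randomized per particle

    // a line aligned with the light is fully lit...
    double lit = shade(Math.PI / 4, lightAngle);
    // ...one at right angles falls back to the ambient minimum
    double dark = shade(Math.PI / 4 + Math.PI / 2, lightAngle);

    System.out.printf("lit:  rgb(%d,%d,%d)%n",
        (int) (baseR * lit), (int) (baseG * lit), (int) (baseB * lit));
    System.out.printf("dark: rgb(%d,%d,%d)%n",
        (int) (baseR * dark), (int) (baseG * dark), (int) (baseB * dark));
  }
}
```

because the angle comes from the noise field and changes every frame, the bright/dark regions glide across the "hair" and read as moving specular highlights.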

remixed sketch, video versions and source are here: /p5/remixed/hair/

processing vs. eclipse in japanese

takashi meakawa has kindly offered to translate my little tutorial about integrating the processing libraries into the eclipse environment. two thirds are already done and can be found here.