I left it a few days before writing up our final day in the project space, to let the dust settle (and the talcum powder, which is sort of an inside joke: one of the previous artists, Shelby Hanna, recreated a dust storm with a lot of talcum powder and fans, and I really wish I'd seen it in person). We tried to tie the loose ends together, presenting the video works alongside the pictures that had been created in their construction, and also worked on a final piece that Sharon drew on the wall while I projected data over it. We took a couple of runs at this one, and though it was almost an afterthought (Peter had playfully told us to draw on the walls), it was probably the best still image to come out of the process. Working with the eraser on the charcoal and with the scrape marks of previous attempts, Sharon was able to create an image that was both representative of the data I had projected and at the same time transcendent of it, bringing a lot of herself to it. This was what I had hoped we'd get out of the project at the beginning.
In the afternoon we opened for a couple of hours and got some good feedback, which gave us a lot to think about in terms of where we might take things next. The final piece, which used both lines and negative space, is something I'd like to work back into my data visualisation; quite how we might animate it is an open question.
Today we finished the stop motion work and did a few more experiments with the contraption. We weren't really happy with them, but they were useful insofar as they showed us what we preferred about the earlier ones; sometimes you have to take a wrong turn to realise you were already on the right track. We're having an opening to the public tomorrow, three till six (ish), so if you're about, pop in.
Today we attacked the perspective problem by slowly rotating the visualisation as we advanced through it. It gave some interesting results, but because of time constraints we didn't get very far, so we're hoping to blitz it tomorrow.
Today we got stuck into a longer stop motion animation, finding an attractive-looking loop in the original data set and drawing it out. Sharon looped through a couple of cycles of one part of the accelerometer log before moving on to another interesting-looking shape in the set; we'll come back to this and finish it tomorrow.
While the batteries recharged on the DSLR we did two more experiments with the camera-mounted charcoal contraption, this time drawing over the projection. These were quite extreme.
A couple of things came up in conversation about where to take things next: finding a way of getting perspective back into both my visualisation and the drawing, doing a stop motion with both the projection and the drawing in shot, and expanding the contraption with more bamboo. Watching Sharon work with the charcoal, I'd also like to try to work some kind of charcoal-dust particle effect into my code.
As part of this year's rehearsal rooms project, visual artist Sharon Kelly and I have installed ourselves in PS2's project space for a week to work more fully on our ongoing collaboration exploring drawing, running, data visualisation and the common ground between visual and digital arts.
Today was split between setting up and starting the process of making physical animations based on Sharon's interpretation of my visualisation of her data. We got one nice stop motion animation done, followed by some more experimentation with 'the contraption' (a webcam mounted on a piece of charcoal), followed by a stop-motioned erasing of the experiment.
How 'the contraption' and the videos it creates fit in with the wider theme of the two of us imitating and drawing from each other's work has yet to be decided. We're talking about fitting the wii-mote to it and perhaps doing something with that data, as if the whole process were repeating and folding in on itself.
For the technically curious, I've been doing the stop motion using Sofortbild to tether Sharon's D200 to my MacBook, plus a custom Jitter patch that watches the capture folder for fresh JPEGs and automatically adds them to a jit.matrixset animation. This lets us play the animation back with the source data visualisation next to it, so we can quickly see how they compare.
Here are some pictures of how we’ve been using the space so far.
For the last year, on and off, I've been collaborating with the visual artist Sharon Kelly. Sharon has a keen interest in running and she draws on this in her art. The collaboration fell out of a chance conversation we had about the possibility of fitting her up with some accelerometers while she was running and doing something visual with the data generated afterwards.
The process we settled on was me giving Sharon a wii-mote and a netbook. When Sharon went out for a run she would set the netbook up, put it in her backpack and carry the wii-mote like a baton in her hand. The netbook ran GlovePIE, which took the wii-mote's accelerometer data in via Bluetooth and passed it to a Processing sketch via OSC. The sketch just timestamped and recorded the raw data generated by Sharon's hands while running. Here is what part of the raw data looks like in Excel.
Periodic data plotted
I took this raw data and wrote a second Processing sketch that attempted to animate the data in a style that complemented Sharon's pencil drawing. It takes the accelerometer data, scales it and then animates it in 3D; as the pen width doesn't vary with depth, it gives the impression of a 2D drawing. Here's a video of it running.
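The flattening trick is easy to miss: depth still moves the points around through the perspective division, but the stroke width ignores it. A minimal Python sketch of that projection (the focal length and names are my own assumptions, not the sketch's actual values):

```python
def project(x, y, z, focal=300.0):
    """Perspective-project a 3D point onto the screen plane.
    Points further away (larger z) are pulled towards the centre,
    but the caller draws every segment at the same pen width,
    which is what makes the 3D path read as a flat 2D drawing."""
    scale = focal / (focal + z)
    return x * scale, y * scale
```

So the motion carries the depth cue while the line itself stays stubbornly two-dimensional.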
I showed this sketch running on Sharon’s iMac at her February show in the Crescent Arts Centre.
Sharon spent some time in the gallery with the sketch running and, as well as finding it quite hypnotic, became interested in using it as a source of inspiration for sketching out the forms she saw in it. Exploring this example of pareidolia became the focus of the next stage of the collaboration. We arrived at the idea of combing the data for visually 'interesting' sections which could then be used as inspiration for more abstract works.
This led me to rewrite the early Processing sketch in MaxMSP/Jitter and create a standalone visualiser I could hand over to her, so she could move through the recorded data, explore it and experiment with different scalings and projections. I tried to give the visualiser a stop motion style, pencil-drawn effect inspired by Sharon's work. Here is an example of her accelerometer data being visualised by it.
The next stage of the project, which will take place in PS2 next week as part of the rehearsal rooms project, will involve projecting these data visualisations and making stop motion videos with Sharon, inspired by the shapes and motion inherent in the original data. I met Sharon last week in her studio to work on the set up; here are some early photos.
Projected visualisation and response sketch
Stop motion set up
Web cam taped to charcoal and bamboo contraption.
The last photo is of a contraption that came out of working together in the same room for the first time: a webcam taped to a piece of charcoal, which led to some interesting animations.
Charcoal cam animation one
Charcoal cam animation two
Charcoal cam animation three
Charcoal cam animation four
Charcoal cam animation five
What I've found interesting about the process is the theme of imitation: my visualisation attempting to imitate her work and, in the next stage, her drawings imitating mine, each working iteratively towards some middle ground between us. It's been really great working together, an open-ended exploration with lots of back and forth. I'll post materials from next week's gallery time as they arise.
After I met Peter, who runs Belfast's PS2 gallery, about showing some of my PhD pieces there, he asked me to do 'something' with their ping pong table for Belfast Culture Night. Naturally I thought to myself 'I'll piezo-mic it and do bonk detection' and use the players' actions to drive some interactive sound-and-light mood-altering music machine. This boiled down to 'I'll stick an Arduino in it' (this seems to be a pattern in my projects) and drive some pretty LEDs that react to the ball hitting the table, with a Max patch controlling the whole shebang as well as putting out some pleasant reactive sounds.
All this led to me getting messy with some contact/piezo/transducer mics, the first time I'd used them, though I've seen them in numerous interactive projects as they're pretty handy for simple bonk detection. I spent a week fooling about with them gaffa-taped to the underside of a plastic garden table, which was all I had to hand at the time. The outcome of this was a ground-breaking equation governing the relative loudness of ping pong balls on plastic as a function of distance, which proved rather less useful when I moved the contact mics onto Peter's table.
Prototype garden table; by that point I'd given up and filled the table with synthesisers instead.
After I'd got reliable bonk data coming into Max I started on the lights. I had a limited budget, which I decided to blow almost entirely on ultrabright RGB LEDs sourced quite reasonably from Rapid. I used a TLC5940 16-channel PWM current sink to control five groups of three RGB LEDs in series; I picked the TLC5940 because of Alex Leone's well-written Arduino library for the chip. The LEDs were powered from a spare DC multi-adaptor supply that I had lying around; 12V sufficed (I find it best to avoid electrocuting the public wherever possible).
A word to the wise: if you get your own TLC5940 and you're not careful to set the dot correction brightness low (and you have no way of knowing what it gets scrambled to when you power up), you can easily sink enough current to trigger the built-in thermal protection, which turns the chip off until it cools down. I found a cheap heatsink that was wider than the chip and stuck it on the back using the same thermally conductive double-sided sticky tape I later used to stick the LEDs to their heatsinks.
The only Arduino code I had to write was a simple serial library to convert commands from Max into commands for the TLC. I always like to write these kinds of things from scratch because I enjoy the challenge, and they rarely take very long. I generally base them on my understanding of MIDI (i.e. a command space for values above a certain number and a data space for values below it, with extra packets and bitshifts to send larger values if necessary).
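That MIDI-style convention is easy to sketch: command bytes keep their top bit set, data bytes stay below 128, and anything bigger gets split into 7-bit chunks. Here's a hypothetical encoder/decoder pair in Python to show the shape of it (the command byte and 14-bit value width are my assumptions, not the actual protocol):

```python
def encode(command, value):
    """Pack a command byte (top bit set) followed by a value split
    into two 7-bit data bytes, MIDI style."""
    assert command >= 0x80, "command bytes live above 0x7F"
    assert 0 <= value < (1 << 14), "two data bytes carry 14 bits"
    return bytes([command, (value >> 7) & 0x7F, value & 0x7F])

def decode(packet):
    """Recover (command, value) from a three-byte packet."""
    command, hi, lo = packet
    return command, (hi << 7) | lo
```

Because no data byte can ever have its top bit set, the receiver can resynchronise on the next command byte even if a packet gets mangled, which is the property that makes this scheme so robust over serial.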
At that stage I had an array of bright lights whose intensity and colour I could control, and a method for getting data in. As Culture Night loomed I needed a way of sticking them safely to the underside of a ping pong table; luckily I'd been put in touch with the excellent guys at Farset Labs, who lent me the use of their glue gun and some old aluminium strips. I spent a very happy and very late night using some thermally conductive double-sided sticky tape (it was from Maplin and designed for sticking heatsinks on GPUs) to glue the LEDs to the bars and wire the whole thing up.
A couple of days before Culture Night I set up the table in PS2 with sensors and lights for some serious play testing and mapping design. In the end I tacked the aluminium strips to the underside of the table and used the ubiquitous gaffa tape to hold the cables in place. This is a video I took when I was getting the lights set up for the first time.
I'd initially had all sorts of ideas about how to use the sensor data to control Ableton Live, and even created a drum machine that kept tempo with the tapping of the ball back and forth. In the end I went for a more literal approach and decided to use the acoustic signal from the piezo mics for more than just bonk detection: to actually generate the sound itself. I achieved this by feeding it into banks of tuned resonators and a custom Reaktor patch I made years ago that does interesting things with interpolated delay lines. Each sensor fed its own effects chain, and I also mixed in some live signal from a microphone hung above the table to pick up the natural acoustic sound of the ball and the audience; this was just fed through some EQ and delays. I think it ended up sounding like a mix between Basic Channel and Autechre, which is no bad thing in my opinion. This is a recording of a game that I made.
The mapping worked using the location system I'd established earlier. The position of the ball strike along the length of the table controlled the chords the resonator was programmed to play, as well as the colour of the LEDs, so that the table had a red end and a blue end with the spectrum in between. Where the ball landed across the width of the table affected the panning of Live's master output. I also used the peak amplitude of Live's output to control the intensity of the lights as they flashed and faded after each ball strike; this was a really nice effect that tied the sound and light together. Here's a video of the final installation on Culture Night.
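The red-to-blue colour mapping itself is just a hue sweep. Here's how I'd sketch it in Python (my reconstruction for illustration; the real mapping lived in the Max patch):

```python
import colorsys

def strike_colour(pos):
    """Map a normalised strike position along the table (0.0 = red end,
    1.0 = blue end) to an 8-bit RGB triple, sweeping the hue through
    the spectrum in between."""
    hue = pos * (240.0 / 360.0)  # 0 degrees (red) through to 240 degrees (blue)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return round(r * 255), round(g * 255), round(b * 255)
```

A strike at mid-table lands in the green part of the sweep, which is what gave the table its rainbow gradient from end to end.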
Looking at the piece technically, the whole thing ran as a combination of Live hosting all the Reaktor VSTs and processing the acoustic signal, while Max used the same signal to do bonk detection and control Live and the Arduino. If I'd had more time I'd have tried to squeeze the whole thing into a single M4L patch, but to be honest I find communication between instances of M4L patches to be pretty unpredictable timing-wise, so it might have to stay as two separate applications with OSC and MIDI doing the communicating.
Putting it in its artistic context, there's a whole host of interesting ping pong projects. Some of my favourites are Kings of Ping, Ping Tron and the spookily contemporary Noisy Table.
Pretty pleased with this one, as it feels like an improvement both in terms of tune and mixdown. It's based on the skit I posted here: Hangover acid medley. This is the first track I did on my Atari using Notator; normally I'm a Cubase 3.1 kind of guy, but I thought I'd see how the other half live (or should that be lived, given development stopped in '93). Notator's really good for track layout and development; the only problem is the piano roll note editor doesn't display long notes very well, which makes programming long chordal stuff a bit tricky at times. I can see why people flicked between both.
Made this this afternoon in my home studio, which has been transplanted into PS2's project space for the week as part of 'Sounds like home'. It was fun knob-twiddling as people walked past outside, and I got some interesting reactions off gear heads. It's my first track with my dreamy, droney new TG-33. I'll come back to it and polish it later…