After meeting Peter, who runs Belfast’s PS2 gallery, about showing some of my PhD pieces there, he asked me to do ‘something’ with their ping pong table for Belfast Culture Night. Naturally I thought to myself ‘I’ll piezo mic it and do bonk detection’, using the players’ actions to drive an interactive, mood-altering sound and light machine. This boiled down to ‘I’ll stick an Arduino in it’ (this seems to be a pattern in my projects) and drive some pretty LEDs that react to the ball hitting the table, with a Max patch controlling the whole shebang as well as putting out some pleasant reactive sounds.
All this led to me getting messy with some contact/piezo/transducer mics, the first time I’d used them, though I’ve seen them in numerous interactive projects as they’re pretty handy for simple bonk detection. I spent a week fooling about with them gaffa-taped to the underside of a plastic garden table, which was all I had to hand at the time. The outcome of this was a ground-breaking equation governing the relative loudness of ping pong balls on plastic as a function of distance, which proved rather less useful when I moved the contact mics onto Peter’s table.
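I haven’t reproduced the Max patch here, but the bonk detection itself amounts to simple peak thresholding with a refractory window. A minimal C++ sketch of the idea (the function name, threshold, and window length are mine, not from the installation):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Threshold-based bonk detector: a hit is a sample whose absolute
// amplitude exceeds the threshold, followed by a refractory window
// during which further peaks are ignored (a bounce rings for a while,
// and we only want one event per strike).
std::vector<size_t> detectBonks(const std::vector<float>& samples,
                                float threshold, size_t refractory) {
    std::vector<size_t> hits;
    size_t i = 0;
    while (i < samples.size()) {
        if (std::fabs(samples[i]) > threshold) {
            hits.push_back(i);
            i += refractory;  // skip the ring-down of this bounce
        } else {
            ++i;
        }
    }
    return hits;
}
```

The refractory window is what stops one physical bounce registering as several hits; tuning it (and the threshold) against the real table is exactly the kind of thing my garden-table calibration failed to transfer.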
After I’d got reliable bonk data coming into Max I started on the lights. I had a limited budget, which I decided to blow almost entirely on ultrabright RGB LEDs sourced quite reasonably from Rapid. I used a TLC5940, a 16-channel PWM current sink, to control 5 groups of 3 RGB LEDs in series; I picked the TLC5940 because of Alex Leone’s well-written Arduino library for the chip. The LEDs were powered from a spare DC multi-adaptor supply I had lying around, and 12V sufficed (I find it best to avoid electrocuting the public wherever possible).
A word to the wise: if you get your own TLC5940 and you’re not careful to set the dot correction brightness low (and you have no way of knowing what it gets scrambled to when you power up), you can easily sink enough current to trigger the built-in thermal protection, which turns off the chip until it cools down. I found a cheap heatsink wider than the chip, which I stuck on the back of it using the same thermally conductive double-sided sticky tape I later used to stick the LEDs to their heatsinks.
The only Arduino code I had to write was a simple serial library to convert commands from Max into commands for the TLC. I always like to write these kinds of things from scratch because I enjoy the challenge and they never normally take very long. I generally base them on my understanding of MIDI (i.e. a command space for byte values above a certain number and a data space for values below it, using extra packets and bitshifts to send larger values if necessary).
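The actual protocol isn’t reproduced here, but MIDI-style framing along those lines might look like the following sketch (the command byte value and packet layout are hypothetical, chosen to carry the TLC5940’s 12-bit PWM values in two 7-bit data bytes):

```cpp
#include <cstdint>
#include <vector>

// MIDI-style framing: bytes with the top bit set (>= 0x80) are commands,
// bytes below 0x80 are data. A 12-bit PWM value won't fit in one data
// byte, so it is bitshifted across two 7-bit data bytes.
const uint8_t CMD_SET_CHANNEL = 0x80;  // hypothetical command byte

std::vector<uint8_t> encodeSet(uint8_t channel, uint16_t value) {
    return {
        static_cast<uint8_t>(CMD_SET_CHANNEL | (channel & 0x0F)),  // command + channel (0-15)
        static_cast<uint8_t>((value >> 7) & 0x7F),                 // high 5 bits of the value
        static_cast<uint8_t>(value & 0x7F),                        // low 7 bits of the value
    };
}

uint16_t decodeValue(uint8_t hi, uint8_t lo) {
    return (static_cast<uint16_t>(hi) << 7) | lo;
}
```

Because commands and data occupy disjoint byte ranges, the Arduino side can resynchronise after a dropped byte simply by waiting for the next byte with the top bit set, which is the main attraction of the MIDI scheme.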
At that stage I had an array of bright lights I could control the intensity and colour of, and a method for getting data in. As Culture Night loomed I needed a way of sticking them safely to the underside of a ping pong table. Luckily I’d been put in touch with the excellent guys at Farset Labs, who lent me the use of their glue gun and some old aluminium strips. I spent a very happy and very late night using some heat-conductive double-sided sticky tape (it was from Maplin and designed for sticking heatsinks to GPUs) to fix the LEDs to the bars and wiring the whole thing up.
A couple of days before Culture Night I set up the table in PS2 with sensors and lights for some serious play testing and mapping design. In the end I tacked the aluminium strips to the underside of the table and used the ubiquitous gaffa tape to hold the cables in place. This is a video I took when I was getting the lights set up for the first time.
I’d initially had all sorts of ideas about how to use the sensor data to control Ableton Live, and even created a drum machine that kept tempo with the tapping of the ball back and forth. In the end I went for a more literal approach: rather than using the acoustic signal from the piezo mics for bonk detection alone, I used it to generate the sound itself. I achieved this by feeding the signal into banks of tuned resonators and a custom Reaktor patch I made years ago that does interesting things with interpolated delay lines. Each of the sensors fed its own effects chain, and I also mixed in some live signal from a microphone I hung above the table to pick up the natural acoustic sound of the ball and the audience; this was just fed through some EQ and delays. I think it ended up sounding like a mix between Basic Channel and Autechre, which is no bad thing in my opinion. This is a recording of a game that I made.
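I can’t reproduce the Reaktor patch, but a ‘tuned resonator’ in this sense is typically a two-pole filter that rings at a chosen frequency when excited by a transient like a ball strike. A minimal sketch (coefficients and parameter names are mine, a generic DSP textbook form rather than anything from the installation):

```cpp
#include <cmath>

// Two-pole resonator: rings at freqHz when excited by an impulse,
// with decay time controlled by the pole radius r (0 < r < 1;
// closer to 1 means a longer ring).
struct Resonator {
    double b1, b2;          // feedback coefficients
    double y1 = 0, y2 = 0;  // previous two outputs
    Resonator(double freqHz, double sampleRate, double r) {
        const double PI = 3.14159265358979323846;
        double w = 2.0 * PI * freqHz / sampleRate;
        b1 = 2.0 * r * std::cos(w);
        b2 = -r * r;
    }
    double process(double x) {
        double y = x + b1 * y1 + b2 * y2;
        y2 = y1;
        y1 = y;
        return y;
    }
};
```

A bank of these, tuned to the notes of a chord and all fed the same piezo signal, gives the pitched, bell-like response to each strike that made the literal approach work.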
The mapping worked using the location system I’d established earlier: the position of the ball strike along the length of the table controlled the chords the resonators were programmed to play and the colour of the LEDs, such that the table had a red end and a blue end with the spectrum in between. Where the ball landed across the width of the table affected the panning of Live’s master output. I also used the peak amplitude of Live’s output to control the intensity of the lights as they flashed and faded after each ball strike; this was a really nice effect that tied the sound and light together. Here’s a video of the final installation on Culture Night.
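As a sketch of the colour side of that mapping (the exact interpolation and scaling here are illustrative, not lifted from the patch): strike position interpolates between the red and blue ends, and the audio peak amplitude scales the overall intensity so the lights fade with the sound.

```cpp
#include <algorithm>
#include <cstdint>

struct Rgb { uint16_t r, g, b; };  // 12-bit TLC5940 PWM values (0..4095)

// position: 0.0 = red end of the table, 1.0 = blue end.
// amplitude: current audio peak (0.0..1.0), scales the whole colour
// so the lights flash on a strike and fade as the sound decays.
Rgb strikeColour(double position, double amplitude) {
    position = std::clamp(position, 0.0, 1.0);
    amplitude = std::clamp(amplitude, 0.0, 1.0);
    Rgb c;
    c.r = static_cast<uint16_t>((1.0 - position) * 4095 * amplitude);
    c.g = 0;
    c.b = static_cast<uint16_t>(position * 4095 * amplitude);
    return c;
}
```

Mid-table strikes land in the purple middle of this spectrum; a fuller version would sweep through green as well rather than a straight red–blue crossfade.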
Looking at the piece technically, the whole thing ran as a combination of two applications: Live hosted all the Reaktor VSTs and processed the acoustic signal, while Max used the same signal to do bonk detection and to control both Live and the Arduino. If I’d had more time I’d have tried to squeeze the whole thing into a single M4L patch, but to be honest I find communication between instances of M4L patches pretty unpredictable timing-wise, so it might have to stay as two separate applications with OSC and MIDI doing the communicating.