Last week got lost entirely to laser cutters, making a case for my sensors, and porting the code for my breathing-measurement trick to the Kinect.
On Wednesday I visited FizzPop, the Birmingham makerspace, to monopolise their laser cutter for a bit, and by chance bumped into two guys also building home-brew air quality sensor networks. They were both using laser particle counters slightly more expensive than mine. I got some really good tips on getting the best from my sensor: place it in a light-locked box with rounded corners so that particulates flowing through don't get dumped in the corners on their way in.
Apparently preserving laminar air flow is important in getting the best from these cheap sensors. In the end I was convinced enough by the argument to ditch my original box idea, modelled on the cases I'd seen on the indiaairquality blog, and try something along those lines instead.
The rest of the week I had my head stuck in openFrameworks with the Kinect. I've got something that can reliably tell when someone's breathing when they're close to the Kinect's camera, and I'm currently fine-tuning my algorithm so it works better at a distance.
Away from BOM Lab, at home I finished (mostly – I still want to edit it down) another piece of music based around a field recording. It's not as ambient as I envisage the material for the sound installation, but these things often take on a life of their own. It was inspired by the birdsong in the field next to my house but ended up sounding like some Clangers jamming with rubber bands and a cat. I'm thinking more ambient, less time; I need to turn off the drum machine…
A third of the way in! By Friday I'd built a portable air quality sensor that runs off a lithium-ion battery and logs to an SD card, but there's no sign of a nice laser-cut case yet – I've had to make do with a lunchbox with a few holes drilled in the lid for the time being. This week I'll hopefully get to calibrate it with the help of environmental scientists from the University of Birmingham. Full credit to the lad (he doesn't leave a name on his site) who wrote up how to adapt the sensor I'm using for greater sensitivity.
When I wasn't soldering and desoldering things I coded an openFrameworks program that does a pretty good job of identifying when someone takes a breath from webcam footage alone. It works well if people stand still but is totally useless if they're moving; since I've designed my interaction scheme around this, I'm confident it'll still be useful. I'm spending this week setting it up to work with a Kinect, which has thus far been a total nightmare.
I also did some more field recording and tried to go full ambient in the studio over the weekend. Unfortunately I picked up my drum machine and 303 and ended up making really weird microtonal xenharmonic acid instead, so I'll have to go back to the drawing board there…
Last week I got back on it after being distracted by other things. I came up with a plan: record a load of sounds along with air quality readings from the places where those sounds were recorded, then group the clean and dirty sounds together and make music out of them. I also came up with an idea for presenting the work in an installation context that links these sounds and ideas back to the body and breathing. My observations from the first sensor I built were that particulates in the air peak with human activity and then drop off exponentially, which tallies with what I've read: basically, we kick up dust when we walk about indoors. So I've decided to build something that responds to the people in the space, who will kick up loads of particles from the floor as they walk in. For a while now I've wanted to build an interactive installation that rewards people for standing very still, so I'm going to make it play one kind of music when the dust settles and draw from the other set of sounds when people move around.
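I haven't written any of this yet, but the control logic I have in mind is simple enough to sketch: model the dust as a level that gets kicked up by movement and settles exponentially, then use it to crossfade between the two sound banks. Names and constants below are placeholders:

```cpp
#include <algorithm>
#include <cmath>

// Dust level: kicked up by movement, settling exponentially while still,
// mirroring the peak-and-decay curve I saw in the sensor readings.
struct DustModel {
    double level = 0.0;        // arbitrary units, 0..1
    double halfLifeSec = 20.0; // how quickly the dust "settles" (placeholder)

    void update(bool movementDetected, double dtSec) {
        if (movementDetected) level = 1.0;           // kick the dust up
        level *= std::pow(0.5, dtSec / halfLifeSec); // exponential settling
    }

    // 0 = fully settled (play the "clean" sounds),
    // 1 = fully stirred (draw from the "dirty" sounds).
    double crossfade() const { return std::clamp(level, 0.0, 1.0); }
};
```

The half-life would get tuned by ear in the space; the point is just that stillness is rewarded gradually rather than with a hard switch.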
I ordered a brand new knock-off Arduino, an SD card logger, another sensor and a portable microphone, and found somewhere to get some acrylic laser cut to make a nice box. The idea is to build a handheld sensor box I can take out to outdoor places with my field recorder, and a second box I can leave in indoor places to surreptitiously record ambient sounds and air quality stats. I did look at the cost of just buying a sensor, but building my own worked out significantly cheaper and more fun. While I'm waiting for the bits to arrive I've been studying ways to passively monitor people's breathing rate using Eulerian Video Magnification, and got something working in openFrameworks.
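The temporal filtering at the heart of Eulerian Video Magnification can be approximated in real time as the difference of two one-pole low-pass filters, which passes only the band between the two cutoffs – for breathing, somewhere around 0.1–0.5 Hz. Here's a toy version of that filter; the coefficients and cutoffs are illustrative, and this leaves out the spatial pyramid the full method uses:

```cpp
#include <cmath>

// Temporal band-pass built from two one-pole low-passes: the high-cut
// low-pass minus the low-cut low-pass keeps only the band between them.
struct TemporalBandPass {
    double yLow = 0.0, yHigh = 0.0;
    double aLow, aHigh;

    TemporalBandPass(double lowCutHz, double highCutHz, double fps) {
        const double twoPi = 6.283185307179586;
        // One-pole smoothing coefficient for a given cutoff frequency.
        aLow  = 1.0 - std::exp(-twoPi * lowCutHz  / fps);
        aHigh = 1.0 - std::exp(-twoPi * highCutHz / fps);
    }

    // Feed one sample per frame (e.g. the mean brightness of a region),
    // get back just the slow oscillation in the breathing band.
    double process(double x) {
        yLow  += aLow  * (x - yLow);
        yHigh += aHigh * (x - yHigh);
        return yHigh - yLow;
    }
};
```

Run each pixel (or each region's average) through one of these per frame, amplify the output, and add it back in: that's the magnification.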
Despite not having my box built yet I decided to go out and make some field recordings and take some pictures anyway. I got back on the music front and boshed this out. I made the whooshy bass line and bits of percussion for this track out of the sound of my daily train commute to Birmingham – the air humming by the window and some bloke coughing. Quite ambient… could be even more ambient… you have to limber up for full ambience though.
Residency supported by the Arts Council of Northern Ireland.
This week was mostly eaten up by revising a piece of work I had hoped to hand over before I left for Birmingham, though I did have a meeting with an air quality scientist from the University of Birmingham, and I'm hopeful the scene has been set for a meaningful art-science collaboration.
The piece I spent the week re-working was from Diagramming the Archive and used PRONI's archive of signatures of the Ulster Covenant. The curator's brief involved layering the images to give a sense of the mass of inscriptions that were collected. Initially I tried to inject some movement and perspective by selecting images at random and layering them in space. I had wanted to do this from four simultaneous camera angles using different projection methods, but when I came to port my code to openFrameworks on the Raspberry Pi 3 the necessary grunt wasn't there.
In the end I settled on a single camera angle and selected images at random, allowing them to fade in, move across the frame and fade out again.
But the effect was all a bit slideshow, so I went back and constructed a number of other studies. They all involved layering in different ways and looking at the density of inscriptions: one took the dense parts and used them as the starting point for geometric drawing algorithms, while the others just averaged large sets of images to give a feel of the archive as a whole, potentially as the starting point for something else.
At one point I started drawing images from the archive on my computer having forgotten to clear the memory of the buffer I was writing to, and got some really nice glitch effects based on the leftover imagery my graphics card had previously been drawing.
Given that the archive images I had been supplied with already contained a number of glitches, presumably artefacts of the scanning and compression processes, I decided to base the aesthetic of the work around that. Sitting next to Antonio Roberts at BOM Lab probably helped inspire me in this direction.
Unfortunately the background buffer would not glitch on the Pi in the same way as on OS X, so I ended up creating the backdrop images on my Mac and using them as a backdrop for the layered texts. This made me realise that I now had a free hand to import whatever images I wanted into the backdrop, not just things that had previously been on my Mac's screen (the first glitches were just copies of the Xcode window from before my app started working), so I should think about what imagery would make the most sense in the work. To me the glitch aesthetic immediately situated the work in the present, so I started to think about how James Craig might go about galvanising support for his cause if the situation played out again today. I created mockups of the Covenant text on modern petition websites and then glitched those to use as the backdrops. To further this conflation of real past events with how they might be approached in the present, I animated lots of mouse pointers scanning across the text as if they were signing it.
The curator had stipulated that my piece had to link up wirelessly with Ed & George's drawing machine, which had been commissioned to create imagery based on the same set of images, and we had struggled for a long time to find a way of linking them that was obvious to an audience but not inserted purely for its own sake. The recurring problem was that their machine moves very slowly to create its artwork while my screen-based work could move at a much faster pace. We also quite liked that contrast between them, so slowing mine down to their speed seemed wrong. I had decided to make my pointers, symbolising the people signing the petition, beachball whenever the work selected new images to layer, as a way of further playing with the glitch theme and injecting some humour into the work. We thought it would be fun if, while my piece beachballed, theirs simultaneously paused too, as if the whole installation had been brought to a halt under the weight of loading new imagery from the archive.
You can see the work at The Irish Architectural Archive, 45 Merrion Square, Dublin (June 1st – 30th) and at The Linen Hall Library, 17 Donegall Square, Belfast, (September 5th – 30th).
Next week I will be mostly working on recording ambient sounds and air quality data.