Rutt-etra videosynth nearly ready

Since my last post I’ve been working on a few things, details of which I hope to post soon, but I’ve spent most of my time on my Rutt-etra-inspired Jitter/OpenGL video synth. This, along with a database-driven YouTube video harvester I wrote in Python, will form a major part of my next piece, which I’m hoping to submit to the 2009 conference. I’m aware that building a software version of the famous 70s video synth is not an entirely original idea, and I have to admit to being inspired by vade’s work in the same area.

As well as the version of the software developed for my own use, I’m working on standalone and Max patch versions for distribution to interested third parties. I’m pretty pleased with the vsynth: it displays an input video on a set of parallel lines, each of which can be deformed, point by point along the line, with the luminosity of the image controlling the height of each point. On top of this it does some pretty OpenGL feedback, and a nice feature I just added allows mixing the feedback with the source footage for some pretty psychedelic results.

Although it’s a bit of a CPU hog, it runs fine on my old 1.8 GHz single-core laptop, as I went to great pains to move as much of the processing onto the GPU as possible. I’m hoping to post a first general release of the patch by the end of next week, so for now you’ll have to make do with some screenshots.
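For anyone curious about the core idea, the Rutt/Etra mapping is very simple: each scanline of the video becomes a polyline whose points are lifted vertically by pixel brightness. Here’s a minimal sketch in plain Python of that mapping (the function name and the `scale` parameter are my own illustration, not taken from the actual patch, which does this on the GPU):

```python
def displace_scanlines(frame, scale=1.0):
    """Map a greyscale frame (rows of 0..255 luminance values) to
    displaced scanline points: each pixel (x, y) is lifted by its
    brightness, so bright areas rise further than dark ones."""
    lines = []
    for y, row in enumerate(frame):
        # one polyline per video scanline; subtract because screen
        # coordinates grow downwards, so "lifted" means smaller y
        line = [(x, y - lum * scale / 255.0) for x, lum in enumerate(row)]
        lines.append(line)
    return lines

# A tiny 2x2 "frame": a full-white pixel is displaced by the whole
# scale, a black pixel not at all
frame = [[0, 255], [128, 0]]
lines = displace_scanlines(frame, scale=1.0)
```

In the real patch the same idea runs per-vertex on the GPU, which is what keeps it usable on older hardware.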

Original Dr Who title sequence Rutt stylee

Rutt-etra psychedelic colour disco

2 Responses to “Rutt-etra videosynth nearly ready”

  1. vade

    Looks great! Nice to see another Rutt Etra out :) How are you making the RE work? I had one working in Jitter for a while; it worked pretty well. I used vertex attributes on jit.gl.mesh and a shader program, and it seemed to be decently fast. I’m happy to dig it up for you if it might be helpful :)

  2. crx

    Cheers vade. ;-)

    I’m using the luminosity of the incoming video to displace the vertices of a jit.gl.mesh, which is just textured with the original video. This gets rendered off-screen and mixed with the previous frame using an accumulator slab, the output of which can then be mixed with the incoming video for psychedelic weirdness. Did your shader program do vertex mangling? Wouldn’t mind a sneaky peek. I’m just stripping the non-relevant guff from mine (it has an 8-bit text menu overlay for video selection for the TV gallery installation), then I’ll put it out for general consumption.
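    The feedback stage described above boils down to two linear mixes per frame: blend the fresh render with the previous accumulated frame, then blend that result back with the source video. A rough per-pixel sketch, assuming simple linear interpolation (the function names and mix weights are mine; the actual accumulator slab may weight things differently):

    ```python
    def mix(a, b, amount):
        """Linear blend of two pixel values: 0.0 -> all a, 1.0 -> all b."""
        return a * (1.0 - amount) + b * amount

    def feedback_frame(render, prev_accum, source, feedback=0.7, source_mix=0.3):
        """One step of the accumulator loop: fold the previous frame
        into the new render (trails), then mix the source video back
        in. Pixels are flat lists of floats for simplicity."""
        accum = [mix(r, p, feedback) for r, p in zip(render, prev_accum)]
        out = [mix(a, s, source_mix) for a, s in zip(accum, source)]
        return accum, out
    ```

    Raising `feedback` lengthens the trails, while `source_mix` controls how much of the clean input survives in the final image.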
