Bio: Toby*Spark (D-Fuse)

Toby Harris is a London-based artist, designer and engineer. A self-proclaimed digital media maverick, Toby is part of audio-visual collective and VJ pioneers D-Fuse. You'll also find him developing solo projects as *spark, helping out legendary outfit The Light Surgeons, and working for innovative production companies such as Imagination -- where pretty much every event industry award was scooped in the year of the Ford VJ project -- and Yeast Culture -- where he's run workshops with young people around the world.

Today, Toby’s views on audio-visual culture and live performance go beyond just impressive graphics:

Narrative and meaning became increasingly important in my live ‘video fused motion graphics’, and I found a clear direction in fusing cinema, VJ tools and performance, which nowadays has the moniker ‘live cinema’. There are a lot of interesting things in that territory, ranging from individuals re-channelling the oral storyteller tradition to whole industries in a post-broadcast world.

Get past the aesthetic possibility and interactivity is what matters here. It's not just about your act of creation, but about an exchange with the audience. If you could have just pressed play on a video and nobody would know the difference, you've missed the opportunity. Whether your interest falls within exploring narrative or the liveness of live events, you'll quickly find you have to get past the limits of the tools we have at present. Quartz Composer is a big part of my arsenal for doing that, shaping new tools for me and new experiences for my audiences. Stepping back from any planning, patching or performing, I think it's earlier days for all this stuff than most people imagine -- human-to-human interaction is so nuanced, and production so often at odds with improvised possibility, that I feel like a barbarian with a sci-fi dream.

Give us some examples of how you've used Quartz Composer 'for yourself and your audiences'?

My live cinema work and Quartz Composer know-how allowed me to answer "Yes - it's possible, and I can do it for you" to a really amazing commercial brief: perform amongst the audience, capturing their views to make them part of the mix, there and then. Working at the limits of what graphics cards could do, I was able to combine pre-made film with an array of Quartz Composer designs that animated all sorts of user-generated content, all composited and performed through a VJ application called VDMX running on a large touch-screen. For more: Ford VJ

As an artist, I really wanted to explore just what kind of storytelling could come from the audience themselves. This would mean not presenting *to* an audience, but creating an interactive environment *for* an audience. Collaborating with Novak, that artist itch turned into designer-and-engineer drive, and I used Quartz Composer to create a panorama around the room that was a storytelling canvas, with text messages, live illustration and all sorts going on. It was quite the patch. For more: KineTXT

Using Quartz Composer as a way to build graphical worlds, I've even written custom Mac OS X applications to package up and run those worlds for people on tour -- the integration of QC on a Mac is quite amazing, and it's quite the trick to be able to turn your VJ setup from the first gig into a black box that gets plugged in by the road crew. This approach has also really worked for running screens at meetings of all sizes, where beyond simple titling the event can be fed back to itself with visualisations, Twitter back-channels and more. I've got something I'm calling *spark screen-runner, and half of it is the bespoke graphical world built each time in QC.

Tell us about D-Fuse. The touring project 'Particle' looks stunning, and we know there's a lot of QC in there!

Particle is an audio-visual trip that deconstructs D-Fuse's film project 'Endless Cities' live in front of an audience. It's a dream brief: take the hard drives full of material for the film and transform them into an abstracted, sensory experience that can only be performed live. The result is an ever-developing audio-visual set, developed between Mike, Matthias and myself, that has very particular staging requirements -- we are playing with the form of light through the venue as much as representational imagery on screens.

An avid fan of creating his own customised hardware and software setups, Toby this month releases his newly designed video processor and controller, the *Spark D-Fuser.

The *Spark D-Fuser allowed us to use high-resolution computer imagery and crossfade between laptops: the twin-laptop setup was enabled by the DVI crossfader and our solid-state drives. The ability to tag-team the performance really transforms things, allowing the breathing space to check pace and prepare for the next section. And within each laptop, being able to seamlessly scale from cinematic playback to ultra-noodle is so empowering as a visualist.

As part of his live tour project Particle, Toby used VDMX with various Quartz Composer plugins in order to give him more creative control and freedom over the performance.

I customised my VJ tool of choice, VDMX, to allow panels dedicated to doing creative things with 4x3 and 8x3 sources within the 12x3 canvas. This was achieved with a combination of Quartz Composer sources I made, fronted by an experimental VDMX feature that lets you build your own interface plugins, and backed by a set of Quartz Composer plug-ins I'm working on that scale to any canvas rather than working per-pixel at the source resolution.

What this means in practice for Mike and me is that rather than 'pre-baking' the visuals, we're pushing as much rendering as possible into the live performance, controlling all sorts of animation parameters with Matthias's audio feed and our in-the-moment feeling. These animation parameters are of course inputs to a number of Quartz Composer patches I've developed, which sit inside our VJ tool of choice, VDMX, and allow us to transform the raw source clips into motion graphics. Perhaps the main trick of Particle is how we can take small chunks of footage -- much smaller than the overall output -- and process these to form a composition across the multiple projections we have in the theatre space. I have written a suite of QC plug-ins called 'Tokyo Render' for the show that give us all sorts of abilities in that vein, but you don't need these to do a load of interesting things with the stock QC patches: while I had to learn OpenGL to get the show running on the hardware we had at the time, put an 'Image Texturing Properties' patch before a 'Sprite' and you're well on your way!
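To make the 'small chunks, wide canvas' idea concrete -- and this is only an illustrative sketch, not Toby's actual Tokyo Render plug-ins -- here is a minimal NumPy example of repeating one panel's worth of pixels across a much wider output, the way a 4x3 source might be tiled across a 12x3 multi-projector surface. All names and dimensions here are hypothetical.

```python
import numpy as np

def tile_across_canvas(source: np.ndarray, panels: int) -> np.ndarray:
    """Repeat a small source frame horizontally to fill a wide canvas.

    source: H x W x 3 image array (one projector panel's worth of pixels).
    panels: how many panels wide the output canvas is.
    """
    # Concatenate along the width axis, so a W-wide frame becomes panels*W wide.
    return np.concatenate([source] * panels, axis=1)

# A tiny 3x4 'frame' tiled across three panels becomes a 3x12 canvas.
frame = np.zeros((3, 4, 3), dtype=np.uint8)
canvas = tile_across_canvas(frame, 3)
print(canvas.shape)  # (3, 12, 3)
```

In a real show this repetition would happen per-frame on the GPU, with each chunk also transformed (scaled, mirrored, recoloured) before compositing, but the core move is the same: source footage far smaller than the final canvas.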

I’m currently focussed on software development as the key to realising my artistic desires as well as facilitating stimulating commercial work. I am expert in Quartz Composer, the node-based realtime graphics programming environment built into Mac OS X, and increasingly experienced in application development leveraging Quartz Composer along with QuickTime, OpenGL, and the richness of OS X’s Cocoa frameworks. I’m also expert at Mac OS X system administration, which goes hand in hand with delivering software as a product rather than a download.

Toby gives many talks and workshops on Quartz Composer but found himself attending one at a recent Live Performers Meeting in Rome.

It felt funny there being a Quartz Composer workshop that I wasn't giving, with copy pumped-up beyond anything even I could have written (world exclusive! creator of the most widely viewed online video tutorial series!). But it's definitely a sign of progress in this world, so props to Graham Robinson of iLoveQC and LPM. The class had a good time and the word was spread: hopefully more interesting work will happen because of it.