New Project: Rubato
Rubato is a platform that lets performers control synchronised visual content on their audience’s devices, driven live from a foot pedal or MIDI controller, so that the content is sympathetic to the ebbs and flows (the rubato) of a performance. The project is part of Sound & Music’s Audience Development Incubator.
Photo CC Paul Coles.
Many see phones at gigs as a modern scourge of the live music experience: people checking them mid-song, or filming the entire show. We wondered what it would be like to make those devices part of the show instead, creating an intimate, heads-down experience.
For the past few months, I’ve been working on a project called Songs For Spoken Words (which will be announced properly soon). For performances of this work, I wanted a way to project the words onto a screen. That would involve a projector, and possibly a click track to sync the playing to the visuals. Something about that setup felt wrong: it was as though we (myself and a string quartet) would be supporting the visuals, rather than the visuals supporting us.
So I emailed Tom Armitage – a lovely chap known for his incredible abilities as a Creative Technologist – about making something for the audience to read on their phones, synced with the music.
Tom had wanted to work on something like this for a while, and we had conversations about Dan Deacon’s performances (where the audience’s phones become a synchronised light show) and Amon Tobin’s ISAM experience (using projection mapping to melt your mind) as influences. These are both things that made us go “that is amazing, but wouldn’t it be cool if [something else could happen]”.
So we pitched to be part of the Audience Development Incubator at Sound & Music – and we’re in.
We start work next week, and will debut Rubato in February at the Songs For Spoken Words launch performances in London.
Amon Tobin re-calibrating your eyes and ears:
Dan Deacon’s app in action: