So far I’ve talked about the music, story and photos. Let’s package those together (as they will be in the final installation) and call them ‘Media’.
The physical music-box will need to control the Media. Therefore, I needed to develop an interface between the Physical Installation and the Media. I would also need to have the Media in a format that would be controllable via the interface.
First off, let’s talk interface.
This and the string quartet aspect of Night Sun are the key areas in which I want to develop as an artist. I’ve wanted to combine Arduino and music for a long long time.
For this, I went to DoES Liverpool, a Co-working space and Maker Community, home of the lord of The Internet of Things: Adrian McEwan. He and I spent a day making an Arduino do the three things I wanted it to do: respond to Touch, Turning, and Timeout.
In order for the public’s use of the toy music box to affect the projection/music, Adrian McEwan and I got an Arduino to act as a USB keyboard, sending key-presses to the laptop according to the three human interactions.
Story: When a user touches the music box, something will happen.
For this to work, we connected the music-box to the Arduino with a piece of wire. The Arduino sends a small electrical charge down the wire and uses it to measure the wire’s capacitance. Using the Serial Monitor in the Arduino editor, we could see that when no-one was touching the music-box, the readings were between 0 and 150, jumping to over 3000 when touched.
In this video, we test the Capacitive Sensor script. If the Arduino measures the capacitance of the metal music-box going above 3000, then it sends a keypress (space bar) to the computer.
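The decision logic is small enough to sketch. Here’s a minimal version in JavaScript (the Media side of the project is a web page; on the Arduino itself this runs as C++). The 3000 threshold comes from our Serial Monitor readings; the function names are mine, not the actual sketch:

```javascript
// Decide whether the music box is being touched, based on a raw
// capacitance reading. Untouched readings sat between 0 and 150;
// a touch pushed them over 3000, so any threshold in between
// gives a clean separation.
const TOUCH_THRESHOLD = 3000;

function isTouched(reading) {
  return reading > TOUCH_THRESHOLD;
}

// Only send a keypress when the touch state *changes*, so holding
// the box doesn't spam the computer with space bars.
function keypressFor(previousReading, currentReading) {
  const was = isTouched(previousReading);
  const now = isTouched(currentReading);
  if (was === now) return null; // no change, no keypress
  return ' ';                   // space bar on touch or release
}
```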
We had Spotify open anyway, listening to Kettel (electronic music for tinkering with electronics). So we tried out the music box as a ‘dead man’s handle’ – if you touch it, the track plays, if you don’t, it pauses.
Story: When a user turns the handle of the music box, the Media will play.
In this video, we test the Rotary Encoder. If you turn the knob, the Arduino sends a keypress (space bar) to the computer. If you stop turning, it sends another keypress of ‘space bar’ to pause.
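“Turning” here really means “a pulse from the encoder arrived recently”. A sketch of that idea in JavaScript (the time window is a guess of mine, not the value from Adrian’s sketch):

```javascript
// The encoder produces a pulse each time the knob moves a step.
// Treat the knob as 'turning' if a pulse arrived within the last
// few hundred milliseconds; when the window passes with no pulse,
// the knob has stopped.
const TURN_WINDOW_MS = 300; // assumed value, for illustration

function isTurning(lastPulseMs, nowMs) {
  return nowMs - lastPulseMs < TURN_WINDOW_MS;
}
```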
The rotary encoder would eventually be connected to the music-box via a cog. In the final version, turning would be the only interaction to send a keypress of ‘space bar’. The ‘touch’ interaction would send different keypresses to move the projection through its different states:
- No-one touching: Dark Shop Window (projecting a black screen)
- Touching (but not turning): Light Shop Window (projecting a white screen)
- Touching & turning: Play Media
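The mapping above from the two sensors to the three projection states boils down to one tiny function. A sketch in JavaScript (the state names and function name are mine; the real logic lives in the Arduino sketch and the pages it drives):

```javascript
// Map the two sensor readings to the three projection states:
// 'dark' and 'light' are static screens, 'play' runs the Media.
function projectionState(touching, turning) {
  if (!touching) return 'dark'; // no-one touching: dark shop window
  if (!turning) return 'light'; // touching but not turning: light shop window
  return 'play';                // touching & turning: play Media
}
```

Note that turning without touching is impossible in practice — you can’t turn the handle without touching the box — so that case falls through to ‘dark’ harmlessly.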
Story: When a user stops touching the music box, the Media will pause and, eventually, reset (ready for the next user to begin at the start).
In this video, we test the reset function. I wanted the Arduino to reset the installation (go back to the start of the music/movie) if a user hadn’t touched the music box for a set amount of time. Keep your eye on the TextEdit window. When Adrian touches the music box, the Arduino sends a keypress of ‘T’. When he touches it again within five seconds, nothing happens. When he doesn’t touch it for over five seconds, the Arduino sends a keypress of ‘N’.
These keypresses of ‘T’ and ‘N’ were just so that we could see it working. The final version would send numbered keypresses, bound to hyperlinks that loaded the different states making up the Media.
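The reset test itself is just a timestamp comparison. Sketched in JavaScript (five seconds as in the video; the names are mine):

```javascript
// Five seconds without a touch triggers a reset keypress.
const RESET_AFTER_MS = 5000;

// Given the time of the last touch and the current time (both in
// milliseconds), decide whether to send the reset keypress.
function shouldReset(lastTouchMs, nowMs) {
  return nowMs - lastTouchMs > RESET_AFTER_MS;
}
```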
I needed a way for different keypresses to alter the ‘state’ of the Media, between three screens:
- Blank (static)
- White (static)
- Playing Media
The first idea was to create a movie file containing the three states. The Arduino’s keypresses would then take the user to different places on the timeline of the Media:
This seemed pretty tidy, but something about it didn’t sit right. I think this was mainly down to me not being confident with rendering video files to strange screen configurations (depending on the areas the installation would be projected onto), and wanting a way to make last minute changes without having to render again and again.
Orchestrated Text only used differently styled text, but I was sure that if you could do that, then you could change the `background-image` of a div as well, or its `z-index` – allowing me to move between Dan’s photographs as well as the paragraphs of Emma’s story. With this method, I could also apply CSS transitions (which you wouldn’t get jumping around an AVI file in VLC).
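As a sketch of what I mean, here’s the kind of logic that brings one photo to the front. It returns plain style objects rather than touching the DOM, and the photo filenames are placeholders, not Dan’s actual files:

```javascript
// Work out the style for each photo div so that photo n sits on
// top. Setting these on the elements (or swapping a CSS class)
// plus a CSS transition gives a crossfade you wouldn't get
// seeking around an AVI in VLC.
function stylesForPhoto(n, total) {
  const styles = [];
  for (let i = 0; i < total; i++) {
    styles.push({
      backgroundImage: `url(photo-${i + 1}.jpg)`, // placeholder filenames
      zIndex: i === n ? 1 : 0,                    // current photo on top
    });
  }
  return styles;
}
```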
Above are my notes on Emma’s story. I arranged the words into a configuration that would work on a tall window (like an extremely oversized iPhone).
Each line was then given a number. This number applies to the HTML, where all the lines are laid out. They each have a default FadeIn at a point in time.
Then I apply Timings to each line by reading out the story while listening to the music, noting the time in the audio file when I naturally begin reading a new line. These Timings are added as a list in the .js file:
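The timings file is essentially a list of (time, line) pairs. A cut-down sketch of the shape (the times below are invented for illustration; the real ones came from reading along with the music):

```javascript
// Each entry says when (in seconds into the audio) a numbered
// line of the story should fade in. Invented example times:
const timings = [
  { time: 4.0,  line: 1 },
  { time: 9.5,  line: 2 },
  { time: 15.2, line: 3 },
];

// Given the audio's currentTime, return the numbers of every line
// that should be visible by now.
function visibleLines(currentTime) {
  return timings.filter(t => t.time <= currentTime).map(t => t.line);
}
```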
There’s not much else to show here. Lots of time spent refreshing Chrome and making adjustments to the code. Instead, let’s jump to when all eight photos and paragraphs had been put into the timeline of the audio.
I then created an HTML file that was just a black screen, and another that was just white. So I had a mini-site of three pages:
1.html – Black screen
2.html – White screen
nightsun.html – music, photos and words (on autoplay).
- While on 1.html, `KeyPress('2')` takes the user to 2.html when they touch the music box
- While on 2.html, `KeyPress('1')` resets to black (if they let go), while `KeyPress('space')` takes them to the Media (autoplay)
- While on nightsun.html, `KeyPress('space')` pauses the audio when the user stops turning, and starts it again if they begin turning again soon after
- While on nightsun.html, `KeyPress('1')` takes the user back to black if they haven’t touched the music box in five seconds
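Put together, those bindings form a small routing table. A sketch in JavaScript (the page names are from the mini-site above; the single-function shape is my simplification — in reality each page carried its own bindings, and ‘space’ on nightsun.html toggles the audio rather than navigating):

```javascript
// Given the current page and a keypress from the Arduino, return
// the page to navigate to, or null to stay put.
function nextPage(currentPage, key) {
  const routes = {
    '1.html':        { '2': '2.html' },                       // touch: light window
    '2.html':        { '1': '1.html', ' ': 'nightsun.html' }, // let go / start turning
    'nightsun.html': { '1': '1.html' },                       // five-second timeout
  };
  const forPage = routes[currentPage] || {};
  return forPage[key] || null;
}
```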
Phew. Here’s a photo of my desk after all that was worked out, coded up, and tested with the music box.