Dan Deacon's Concert App: Developer Talks Inspiration, Ideation, Creation

With the release of his newest album, titled simply "America," electronic musician Dan Deacon has debuted a free smartphone app, available for Android and iOS, that will, among other things, greatly impact his live shows.

Deacon's new app, like many other artist apps, links to his website, social properties and tour dates; but what's most impressive are the light and musical functions that incorporate users into the live show. Rather than treating the app/concert combination as a vapid promotional or revenue-generating scheme--as many artists have--Deacon's intention is to transform you and your phone into an integral part of the performance. The light changes and morphs in time with the music, becoming the world's smartest (and most expensive) glow stick and appropriately complementing Deacon's eccentric style of glitchy-yet-orchestral electronic music. Likewise, the app's musical instrument can be programmed to play in harmony with whatever song is being performed.


"I thought more and more about how we are living in a time where much of the concert going audience has smart phones, that are basically computers, that can be used as both sound making machines and lighting devices," Dan Deacon wrote on his website where he released the app. "The thought of an audience being the light and the sound source for a show became one of my main goals. The detailed spatial sound environments that could be created: lights all in unison coming from endless and constantly changing direction as the audience moved and change yet the lights all change in unison; using the LEDs as strobe lights, etc. A whole world began to emerge."

Dan Deacon app screen shots as featured on the Apple iTunes App store.

To bring this concept to life, Dan approached a few of his friends, who worked together to make it happen. Keith Lea, the head programmer behind the application, spoke with Billboard.biz to shine a light on the team that pulled this off and to explain how he managed to make the phones' visuals sync up without using Wi-Fi or a data connection.

Billboard.biz: How do you know Dan? How did he approach you to make the application?

Keith Lea: Dan and I met around 2004 through mutual friends. I was attending Rensselaer Polytechnic Institute but constantly visiting Purchase College, where Dan and most of the members of Wham City [a collaborative arts collective] studied.

The app idea started brewing on the 2011 Wham City Comedy Tour bus, when Dan brought up his idea with me and Alan Resnick. He asked if it would be possible to make audience members' smartphones flash colors in sync with each other and with the music. I whipped up a little Web-based prototype right there on the bus, and it didn't work at all, as the phones' clocks were all way off from each other. Upon realizing this, I was immediately intrigued by the technical challenge presented by this idea.
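To see why the bus prototype fell apart: if every phone simply schedules "flash red at wall-clock time T," the flashes land as far apart as the phones' clocks disagree, which can easily be whole seconds. The sketch below shows the classic remedy, an NTP-style offset estimate against a single reference clock. It is purely illustrative and not the code the team shipped; as Lea explains later, the finished app sidesteps the network entirely and syncs to an audio calibration tone instead.

```cpp
#include <cstdio>

// Illustrative only: the classic NTP-style clock-offset estimate.
// t0 = phone send time, t1 = reference receive time,
// t2 = reference reply time, t3 = phone receive time (all in ms).
// The estimate assumes the round-trip delay is roughly symmetric.
double estimateOffsetMs(double t0, double t1, double t2, double t3) {
    return ((t1 - t0) + (t2 - t3)) / 2.0;
}

int main() {
    // Hypothetical exchange: this phone's clock runs ~2.4 s ahead of the
    // reference, with ~50 ms of one-way network delay in each direction.
    const double offset = estimateOffsetMs(10000.0, 7650.0, 7655.0, 10105.0);
    std::printf("reference clock minus phone clock: %.0f ms\n", offset);
    // Any scheduled flash time would then be corrected by this offset.
    return 0;
}
```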

What was your role in creating the app, and how did that relate to the roles of everybody else on the team?
I wrote all of the code (C++, Objective-C, and Java) for the Android and iPhone apps. While scrambling to learn as much as I could about [app development], I tried out various algorithms and protocols for communicating from the stage to the phone. Once I settled on a protocol that was robust and efficient enough for our specific, unusual requirements, I wrote the desktop application that produces the calibration tones, and the algorithm that interprets the calibration tone on the phones themselves. We then sent it off to our dear friend Dina Kelberman, who did all the graphic design for the app.
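Lea doesn't spell out the protocol, but the basic idea -- a calibration tone played over the PA that each phone decodes through its microphone -- can be sketched. The C++ below is a hypothetical illustration, not the shipped code: it assumes a simple two-tone scheme at made-up frequencies (kToneAHz, kToneBHz) and uses the Goertzel algorithm to decide which tone dominates each microphone buffer; stringing those decisions together would give every phone a shared timing signal with no network involved.

```cpp
#include <cmath>
#include <vector>

constexpr double kSampleRateHz = 44100.0;
constexpr double kToneAHz = 17500.0;  // assumed "bit 0" tone (illustrative)
constexpr double kToneBHz = 18500.0;  // assumed "bit 1" tone (illustrative)

// Goertzel algorithm: power of a single target frequency in a sample block.
double goertzelPower(const std::vector<float>& block, double targetHz) {
    const double pi = 3.14159265358979323846;
    const double coeff = 2.0 * std::cos(2.0 * pi * targetHz / kSampleRateHz);
    double s1 = 0.0, s2 = 0.0;
    for (float x : block) {
        const double s = x + coeff * s1 - s2;
        s2 = s1;
        s1 = s;
    }
    return s1 * s1 + s2 * s2 - coeff * s1 * s2;  // squared magnitude
}

// Decide which (if either) calibration tone dominates this microphone buffer.
// Returns 0, 1, or -1 for "no tone"; a fuller decoder would accumulate these
// bits into a timestamp or song/section index shared by the whole audience.
int detectCalibrationBit(const std::vector<float>& block, double threshold) {
    const double a = goertzelPower(block, kToneAHz);
    const double b = goertzelPower(block, kToneBHz);
    if (a < threshold && b < threshold) return -1;
    return (b > a) ? 1 : 0;
}
```

The appeal of something like Goertzel for this job is that it measures only the one or two frequencies of interest, which is far cheaper than running a full FFT on every buffer of a 2012-era phone.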

The roles of the other members were as follows: Dan was full of ideas every week, as were the rest of us, and it was my job to determine what was technically possible and guide the group in that way. The group as a whole -- myself, Dan, Alan and Robert O'Brien -- developed these ideas and decided what the final product should look like. This design & refinement process was largely guided (or at least reined in) by digital artist Alan Resnick, and of course Dan himself.

What was the inspiration for some of the specific decisions made about the overall user experience?
Our guiding principle was transforming the cell phone into an agent for a collective aesthetic experience. Collective was the keyword. A few times we were tempted to show imagery, video, or patterns on the phone's screen during Dan's songs, instead of just the solid color light show. And it was tempting to add little games or interactive features that people could use to really "participate" in the show. But we quickly realized that the whole point was to take people out of their phone bubble, to use their phones but not in a self-absorbed way. When we imagined a crowd full of people staring at their phones instead of dancing or being present at the show we collectively wanted to puke.

Right now it appears the app is geared to promote the "America" album and is skinned as such, but it's called "Dan Deacon." How might this application adapt and iterate to suit the needs of different Dan Deacon projects?
We have lots of exciting ideas and are currently workshopping and prototyping. We also plan to make parts of the app open source, so people will be able to utilize the underlying technology to do whatever they want. I'm looking forward to seeing what kinds of great ideas people come up with. Someone has already approached me about using the code to turn a cellphone into an assistive hearing device for live shows, for the hard of hearing. Pretty cool!