Moment Factory Provide Virtual Dancers And Live Tweeting At Usher's London Show [Q&A]
With the decline of album and single sales, musicians are increasingly looking to live shows as a place to make the big bucks and to engage with their fans directly. But, much like the rest of the music industry, the live show is in a state of flux. Visuals are, and have been for some time, a big part of the live experience, but as incredible as some visual components can be, interactive elements powered by technology are also becoming increasingly common.
Of course, a live show has always been about interaction with the audience, but with technology people can interact differently, taking the experiences we’re familiar with on the web and translating them for the stage, even involving the audience in the on-stage antics, albeit in a virtual way. Along with this digital interaction is the trend of live webcasting. But rather than just watching the artists perform their set, some musicians are choosing to augment the experience by pairing themselves with innovative ideas or well-known directors. British Sea Power broadcast their gig at the Roundhouse last year through the filter of a hacked Kinect, and earlier this year Jack White’s performance at New York’s Webster Hall, which was streamed online, was directed by Gary Oldman.
In the same series of shows as Jack White’s, R&B artist Usher took to the stage in London at the Hammersmith Apollo on Monday this week. Along with being broadcast as a live webcast, directed by Hamish Hamilton, who was behind Madonna’s Super Bowl Halftime Show, the show also featured interactive elements brought to virtual life by multimedia collective Moment Factory. For the Usher show they incorporated a dancing avatar crew created by fans and real-time Twitter interaction, with tweets animated, remixed, and posted on stage as part of the visuals.
We emailed a few questions off to Stephane Raymond, one of the producers at Moment Factory, to find out how it all worked and how important he thinks this kind of interaction is at a live show.
The Creators Project: What was the idea behind having people create their own avatars?
Stephane Raymond: American Express and Digitas [who are behind the series of live shows] wanted to find a way to bridge the digital/online and the immersive/in situ worlds. The Amex Unstaged events are well-known. By creating dancing avatars for this one-off in London, the goal was to enhance the experience of the audience, who could actually see themselves dancing with Usher during the show.
How did you go about accomplishing the task of having virtual avatars dance on stage?
People create their avatars on the web, then we choose the widest variety of avatars and program them into a digital fresco, mixing them to the rhythm of the song (“Scream”) and to the general artistic direction.
What were some of the challenges of taking on a project like this?
The first challenge was time. We had just two and a half weeks to create interactive content for four songs. The other challenge was related to the three other songs (“Yeah”, “There Goes My Baby”, and “Without You”), where we played with tweets from the public, live during the songs. A web agency in London filtered the tweets, and then our team chose from them so we could mix them and play with the letters. We animated hundreds of tweets during these songs, gave them movement and color and special effects. This was very exciting!
Were there any added problems/worries because it’s a live online broadcast too?
No, because wherever the public is, we have to deliver! And Hamish Hamilton (who we’ve collaborated with on the Super Bowl Halftime Show with Madonna this year) was in charge of directing the broadcast, so there were no worries.
Fans’ tweets become visuals for the show
How did the on-stage real-time tweeting setup work?
People had to tweet from the live stream page, allowing Amex to reuse the content on stage. As stated, an agency in London was moderating the content. Then our team of on-site interactive artists received the tweets, aligned them, and then opened the stream when it was time, prioritizing the messages which were pertinent to the show and the songs. They also tried to showcase a diverse range of tweets. Then, live, the team animated the letters and created special effects (color, movements, explosions of letters, etc.).
How important are these interactive elements at concerts now, do you think?
We believe this type of interactive show is becoming a trend and we’re only just seeing the beginning of it now. There are so many things that can be done to create a bi-directional relationship between the public and the artists they love. The goal is to connect people together, to give them a reason to go out of their houses in order to experience something new and something in which they can take an active part. We want to encourage people to collaborate and to be part of the event, by changing the content and the visual elements live during the event, for example. Using technology, we hope we can connect people more and let them express themselves.
Tweets as visuals
How do you see this type of audience interaction evolving in the future?
Again, there are so many things that can be done. A lot of artists have great ideas. But we don’t want to give away too much… What we can say though, is that we’d like to invite the public to use their smartphones and tablets more during events and shows, allowing them to be a greater part of it.