Beatsurfing: The Music Production Interface Of The Future Comes Early

In the 90s, when music production software began its ascent to ubiquity, computer-screen producers wondered what it would be like to use more than just a mouse to control their arrangements. Sure, access was at an all-time high, but control was severely lacking. Each parameter was controlled individually, and being restricted to a single point of contact with your entire onscreen setup had beat makers pining for the day when the advent of the touchscreen would bless them with multi-touch relief.

That day is here, but the interface looks nothing like what anyone anticipated. With such potential at our fingertips, why would we want to use the traditional rack/sequencer configuration? Why emulate the hardware we’ve been using for decades? Why emulate reality at all?

Enter Beatsurfing. By streamlining various types of controls and allowing you to manipulate multiple parameters with a single movement, Beatsurfing attempts to make music-making a truly intuitive process, ignoring the past methods that we find ourselves mired in so often.

Rather than tapping pads or twisting knobs, Beatsurfing relies on the swipe, a motion the iPad responds to remarkably well. You arrange your sound components based on how you swipe, turning your compositions into a pinball board of sonic activity.

The app came about through a collaboration between Vlek Records and Herrmutt Lobby, a Belgian production duo known for adventurous controllerism who first broke out with their experiments using video game controllers as MIDI controllers.

We spoke with the whole team to find out how Beatsurfing works and why they made it work the way it does.

Beatsurfing is currently in beta testing. If you’d like to test it out for yourself, register here.

The Creators Project: Why is an interface like this necessary for today’s producers? What do you hope to achieve with Beatsurfing?
Beatsurfing: We can’t really say it’s necessary, but we believe that the “real time” and spontaneity aspects of the app can radically change the way you think about producing music. By “real time” we mean that the music can become an extension of physical movement—the core concept of Beatsurfing is really about movement: you play it by sliding your finger(s) across the screen.

We are very excited to discover the ways in which users will take advantage of the app. We’re sure it will be a great source of inspiration for our own way of thinking about music. This will also give us ideas on the direction we need to take the app in the future.

Can you describe what’s happening in this video? Which shapes represent which types of sound? Which colors? Is there a pattern to this?
First, we think it’s important to highlight that the application is designed to produce a real-time MIDI flow. The MIDI output is an instantaneous result of the confrontation of three things:

1) The architecture of the scene (geometry)
2) The rules you fix between objects (systemic, cybernetic)
3) The way you surf along the screen (time, space, interpretation)

As with most iPad music apps, you can still play by hitting objects on the screen, but here it’s more appropriate to slide over them with your fingers. That’s actually why we call it Beatsurfing—you can easily play a complete piece of music guided by your feelings and intuition, and reinterpret it in many ways.

The iPad is not accurate as a percussion instrument. If you want to play by hitting any type of interface, you should really be hitting pads or drums. But developing a piece of music by surfing your fingers on the screen and putting your sounds and controllers anywhere on the screen at the same time as you build your track is something really fresh. When objects can interact with each other, it becomes an organic environment. Surfing along the screen to generate music with great freedom is the main idea here.

The video (top) shows a particular use of Beatsurfing, in which basic interactions are set between objects. Herrmutt Lobby used the app to create modular synthesis on a simple beat—kick, snare, and voice (vocals by MC NON of the Shadow Huntaz, a long-time Herrmutt Lobby collaborator).

Figure 1: Beatsurfing iPad screenshot

A & B are SNARES
C & D are KICKS that turn into BASS when held.
E moves the playback head along the first voice sample WAV (Fig. 2)
F & G change the size of the loop around the playback head (Fig. 2)
H changes the PITCH on C & D (when used as BASS)
I is not used in this video, but it sets all faders to the center position.
J is a fader that controls the playback head reading another voice sample WAV, launched by K.
K launches the voice sample WAV controlled by J.

Figure 2: Ableton Live screenshot

There are Behaviors involved in this scene (see Fig. 3). The Behaviors are a very important part of Beatsurfing. They allow the user to link objects together to alter their states when they’re played in a certain sequence. For instance, here we can see how the E fader’s value in the middle is set to maximum when A & B (SNARES) are collided (“GOTO 127”), and to a minimum when colliding with C & D (KICKS, “GOTO 0”). F is a control to set the fader back to a medium position (“GOTO 63”).

Figure 3: Behaviors
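The collision-driven “GOTO” logic described above can be sketched in a few lines of Python. The class and method names here are hypothetical, not Beatsurfing’s actual internals:

```python
class Fader:
    """Minimal stand-in for Beatsurfing's Fader object."""
    def __init__(self, value=63):
        self.value = value  # MIDI CC range 0-127

class GotoBehavior:
    """When its source object is collided, jump the target Fader to a fixed value."""
    def __init__(self, target, goto_value):
        self.target = target
        self.goto_value = goto_value

    def on_collide(self):
        self.target.value = self.goto_value

# Wiring from the scene above: snares push the E fader to 127, kicks to 0.
e_fader = Fader()
snare_goto = GotoBehavior(e_fader, 127)  # A & B (SNARES): "GOTO 127"
kick_goto = GotoBehavior(e_fader, 0)     # C & D (KICKS): "GOTO 0"

snare_goto.on_collide()  # a snare is hit: e_fader jumps to 127
```

The point of the pattern is that the fader’s state is driven by which objects the finger hits, not by touching the fader itself.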

How does the Beatsurfing app interact with an existing DAW?
It’s been developed for music production and performance. You can control anything that is MIDI-enabled—digital and analog devices, synths, VJ software, lighting systems, you name it.

Can you describe the characteristics of the Line, the Circle, the Polygon, and the Fader?
The app features four different “operators” that we call Line, Circle, Polygon, and Fader. Each of these operators has different characteristics, which allows you to build your own MIDI instruments from the ground up by combining them in creative ways. The distribution of the objects on the scene and the trajectory of your fingers then reintroduce variety and a human feeling to your performance.

Because the idea here is to slide your finger on the surface, we can start to think about interesting ways to place the operators on your scene, such as overlapping them. This concept is very interesting because you can add many layers. You may have, for example, a few Line objects that trigger drums and a Fader underneath them that drives a filter: by sliding your finger from Line to Line to play a rhythm, you are also colliding with the Fader underneath. That can create interesting variations that would be harder to achieve otherwise.
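To make the layering idea concrete, here is a sketch of how one finger position can collide with several overlapping objects at once. The rectangular hit areas and names are illustrative assumptions, not how Beatsurfing actually does hit-testing:

```python
class Zone:
    """Hypothetical rectangular hit area for an operator on the scene."""
    def __init__(self, name, x0, y0, x1, y1):
        self.name = name
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def colliding(x, y, zones):
    # Overlapping objects all respond to the same finger position,
    # so one swipe can play a Line and move a Fader underneath it.
    return [z.name for z in zones if z.contains(x, y)]

scene = [
    Zone("line_drum", 0, 0, 10, 2),   # a thin Line across the top
    Zone("fader_filter", 0, 0, 10, 10),  # a Fader covering the whole area
]
```

A swipe through (5, 1) would hit both objects; a swipe through (5, 5) only the Fader.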

All four objects are very simple operators if you consider them separately. It is also important to highlight that each operator can be assigned “Behaviors.”

The Line triggers a single MIDI note defined as the “root note.” Each time your finger collides with a Line, the MIDI note message is sent. There is also optional directional detection: when enabled, the Line detects which direction you pass through it and triggers the root note if you slide your finger from right to left, or the next note relative to the root note if you slide from left to right.
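A minimal sketch of the Line’s directional behavior, under the assumption that the “next note” is the root note plus one semitone (class and method names are illustrative, not the app’s API):

```python
class Line:
    """Stand-in for the Line operator: one root note, optional direction detection."""
    def __init__(self, root_note, directional=False):
        self.root_note = root_note
        self.directional = directional

    def collide(self, direction):
        """Return the MIDI note triggered by a finger crossing the Line.

        `direction` is "right_to_left" or "left_to_right". Without directional
        detection the root note always fires; with it, a left-to-right crossing
        fires the next note relative to the root (assumed to be root + 1 here).
        """
        if self.directional and direction == "left_to_right":
            return self.root_note + 1
        return self.root_note
```

So a single Line can yield two different notes depending on the swipe direction.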

The Polygon is very similar to the Line: each segment of the polygon has the same characteristics as the Line operator.

The Circle is more or less a sequencer. You can define from 1 to 16 steps, and each step triggers a MIDI note. Each time you collide with a Circle, the current step advances and triggers the next MIDI note. Other objects in the scene can be configured to reset the circle or set it to a particular step.
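The Circle’s step logic can be sketched like this (hypothetical names; whether the step advances before or after the note fires is an assumption here — this version fires the current step, then advances):

```python
class Circle:
    """Stand-in for the Circle operator: a 1-to-16-step note sequencer."""
    def __init__(self, notes):
        assert 1 <= len(notes) <= 16, "a Circle has 1 to 16 steps"
        self.notes = notes
        self.step = 0

    def collide(self):
        """Fire the current step's note, then advance (wrapping around)."""
        note = self.notes[self.step]
        self.step = (self.step + 1) % len(self.notes)
        return note

    def set_step(self, step):
        """Other objects on the scene can reset or jump to a particular step."""
        self.step = step % len(self.notes)
```

Repeatedly surfing across the same Circle thus walks through its step sequence.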

The Fader sends MIDI Control Change or Pitch Wheel messages.

You can define:
- a minimum and maximum value, from 0 to 127.
- the “attack,” which is the time it takes to go from the current position to the new position.
- the “return value,” which lets you define the value the Fader returns to as soon as your finger releases it.
- the “release,” which is the time it takes to go from the latest position to the return value position. Combining the attack, release, and return value parameters lets you, for example, create a pitch wheel control.
- the “definition,” which represents the number of steps shown on the Fader. With a Fader assigned from 0 to 127 and a definition of 3, pressing the first step triggers the value 0, the second step triggers 64, and the third triggers 127. If an attack or a release is defined, the Fader tweens smoothly from value to value.

Then we have the notion of Behaviors, where each type of object has its own defined set.

For example, one of the Circle’s Behaviors is called “Clock Direction.” It can be activated by any other object on the scene and be set to “Flip-Flop,” so each time this behavior is triggered the Circle it is associated with switches its playback direction from clockwise to counterclockwise, and vice-versa. See the explanation of the ‘Granular Synthesis’ scene (Fig. 3) for another example of behaviors.
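The “Flip-Flop” clock-direction Behavior reduces to toggling one bit of state. A sketch with hypothetical names:

```python
class CircleDirection:
    """Minimal stand-in for a Circle's playback-direction state."""
    def __init__(self):
        self.clockwise = True  # default playback direction

class FlipFlopBehavior:
    """Each trigger flips the associated Circle's playback direction."""
    def __init__(self, circle):
        self.circle = circle

    def trigger(self):
        # Clockwise becomes counterclockwise and vice versa.
        self.circle.clockwise = not self.circle.clockwise
```

Any other object on the scene can call `trigger()` on collision, so hitting one pad can reverse another object’s sequence mid-performance.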

Is Beatsurfing built for a certain style of music?
Not at all, we think Beatsurfing is a tool that can complement any setup. We hope to see musicians using Beatsurfing in any style of music from jazz to rock to techno. In fact, even though it’s built with music production in mind, we believe that Beatsurfing will also be used for things other than music, maybe for controlling lights or visuals.

How’s beta testing going? What are some of the improvements being made before launch?
We are currently very busy preparing the launch of the private beta release in about a week. The subscriptions are still open here. All features are implemented by now, so our main focus will be making the app as stable as possible while gathering as much feedback as possible from our beta testers. If interesting enhancements are highlighted, we will try to squeeze them in for the 1.0 release. We can’t provide a precise release date, but we are committed to having it in the App Store by the end of spring.

We have many ideas and many possible directions. We currently have a roadmap for a few dot releases with features that we could not deliver in time for the first release. We embrace the principles of lean startup, driving a culture of fast innovation and pleasing our users. Our next step, after the release, will be to watch how people actually use Beatsurfing and collaborate with them to elaborate a roadmap.

How did the concept for Beatsurfing originate?
Herrmutt Lobby has been developing and using modded yet powerful hardware controllers (that were already using human movement in a creative way) for years, along with programming acclaimed Live-MAX/MSP patches and bits of other software. But until recently, none of the available technologies enabled them to use software to create controllers from the ground up, with total freedom of movement and acceptable latency and response. That changed with the arrival of the iPad and other touchscreen tablets, which opened totally new ground for thinking about how you could create your controller as if you were drawing a map and letting your fingers explore it.

As a result, Beatsurfing combines many technologies developed by Herrmutt Lobby over the years, plus plenty of entirely new ones inspired by the tablet’s touchscreen capabilities.