Visual Pixation with Rick Jacobs
Our quest for excellence in the visual space has now brought us to Rick Jacobs of RJB Visuals.

Rick is currently touring with Nicky Romero as the man behind the operation and visual design of his entire show, and the work he does is epic on a massive scale.

What do we love about him?
He makes some great, heavily detailed content which is then displayed perfectly in sync with what Nicky is doing on stage. I, personally, love the magnitude & depth with which he portrays infinity, space and the inexplicable wonders of it.


We reached out to Rick to talk to us, and shed some light on the great work he is doing.
What is touring with Nicky like? When did this great journey begin & how would you say you have grown with it?
It started 4 years ago; my first show with Nicky was Ultra 2013, on the Main Stage. I was so nervous, with everybody at home watching: my friends, my family. Before that I had VJ'd at clubs, always with just one output. So, for Ultra, I brought 2 laptops to handle multiple outputs - being the newbie I was back then ;)
Nicky and the team were impressed with that first show and invited me to tour with them. I chose to finish school first, because there were just 3 months left. I graduated as a game designer and developer, and missed my graduation ceremony as I went straight to Vegas to tour with Nicky.

When I finished the tour I started RJB Visuals and teamed up with my brother Bob, who was studying game art. Our teamwork was put to the test immediately: we needed to conceptualize and create a 5-minute intro visual in 3 days!
Nowadays, we plan 1 month for an intro. This has become kind of our signature.
Here are links to some intros: Novell & Omnia
It’s been a really awesome journey so far; Nicky and the team trust Bob and me with the whole visualization of the show. When I started, they more or less had just the Guy Fawkes mask, so I had the freedom to design and develop a whole new visual style for his shows, which was really great!
Here is a sneak peek into the latest intro for Nicky:


You and the LD do a great job live. How much of a role does SMPTE play in this & how much is freestyle?
The first 2 years that I toured with Nicky, we didn’t have an LD. After that, Koen van Elderen joined the team and I couldn’t have been happier! The guy is great, he programs really fast, and we come up with new things while we are doing the show. We just understand each other immediately.
The whole show is freestyle; we never use SMPTE. It keeps us focused. Also, I don’t link all visuals to songs. One day a song has these visuals, the next day you’ll see something different - it depends on what colors Koen and I yell at each other.

For all lyrics I use cue points, so as soon as I hear Nicky mixing in a track with vocals, I’ll ready up the visual and start cueing it.
From on-point strobes, to perfect transitions, to super color changes - there’s gotta be a lot of concentration, communication & practice involved between you and Koen.
Like I said, Koen and I are just really on the same page. We make up new stuff during the show and remember it for the next show. We normally don’t receive a playlist or a lot of info on his set, so we often get some nice surprises and have to come up with something along the way.
It usually goes something like this: “If you take the TISKS, I’ll take the BOEMS.” “Sure thing.” Or whenever there are really powerful accents in a song, we look at each other and ask “do you want to take these or shall I take them?” Haha!
It’s fun to change stuff around now and then.


Also, at each outro of a song we turn to each other and one of us will say the next color, and we change it at the same time when the song changes over. Or, if it’s a familiar song with its own visuals, we both already know what to do, or I make hand gestures of the visual that is coming up next so he will know the color. Sometimes, I will be stone-faced, visualizing a sea with my hands, and he will know which visual is coming up.
What are your favorite effects & features on Resolume, that you would encourage VJs to incorporate into their show?
Mostly my effects are small flashy clips linked to buttons on my MIDI panel, but my knobs are linked to various screenshake/twitch/strobe effects. Mainly all sorts of small effects to highlight certain melodies or accentuate the bass.
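As an aside for anyone wiring up a similar panel: linking a knob to an effect boils down to scaling the controller's 7-bit value into the parameter's range. A minimal sketch - the strobe-rate range here is just an illustration, not Rick's actual mapping:

```python
def cc_to_param(cc_value, lo=0.0, hi=1.0):
    """Map a 7-bit MIDI CC value (0-127) linearly onto an effect parameter range."""
    if not 0 <= cc_value <= 127:
        raise ValueError("MIDI CC values are 7-bit (0-127)")
    return lo + (cc_value / 127.0) * (hi - lo)

# e.g. a knob sweeping a strobe rate from 0 to 20 Hz
print(cc_to_param(127, lo=0.0, hi=20.0))  # full turn -> 20.0
print(cc_to_param(64))                    # roughly mid-range
```

The same one-liner covers screenshake intensity, twitch amount, or any other knob-driven effect; only the `lo`/`hi` range changes.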
What brief/ thought process do you follow while designing content for the show. We see a whole range from nature to space to waterfalls to abstract.
We try to create contrast between drops and breaks by changing the color scheme, style and pace, while at the same time trying to keep the transitions as fluid as possible. Break visuals for Nicky Romero's show are often desaturated, black-and-white, realistic-looking visuals, while the drop visuals are full of flashing neon colors and abstract shapes loosely based on the realistic-styled visual. Putting these completely different styles together in one song works as a great crowd booster.

The risk of mixing these completely different styles one after the other is that it could lead to too harsh a transition. We're not big fans of fade-ins, so several visuals have an actual intro clip that will autopilot to the next clip, which is a loop. They're sort of a 'special fade-in' to a loop: starting off black and having the visual's scene unfold in a smooth transition.
Here are some Intro Clips:



Talk to us about your go to software & hardware (both for content creation & operation).
Most of our content is created in Cinema 4D with the Octane renderer. For all the intros we use Autodesk Maya; since we have a history in game design and development, we were pretty used to working in Maya or 3ds Max at school. It just has a few more options to get that exact look you want for the intro.
When we started creating short visual loops, we soon realized Cinema 4D is much more straightforward for creating visuals. For post we use After Effects. And, of course, for VJing: Resolume!
As for hardware, I’m using an MSI GT73VR 6RF Titan Pro and the Akai APC40MK2.
Tell us about your studio. What gear is an absolute must have, and what would you like to change/ upgrade?
My studio isn’t that great actually, haha; we have a small office in Limburg at my parents’ place. One of our employees is also from Limburg, so half of the week we’re working in Limburg and the other half in Utrecht.

We have a small setup in my apartment in Utrecht, my brother lives with me so it’s easy to work from home. In the near future we’re planning to get an office as we’re expanding and looking for new people to work with.
As for an upgrade, I really need more render power, haha - with this 4K visual content, rendering is a nightmare.


Any words of advice for budding visual artists out there?
Less is more! Don’t layer 6 completely different visuals on top of each other and mix them every beat. It can become chaos really easily. Also black is your friend, leave enough black space to make your visual really pop out.
Is there anything else you would like to talk about? We would love to hear it.
Our most recent development is that we’re starting a cooperation called Visual Lab with Hatim. Hatim was the reason I started VJ’ing for Nicky, and over the years we built a great bond, as he is Nicky Romero’s tour/production manager.
As probably all of us here know, talking and arranging gigs/assignments is the least fun part of our job, so it seems like a great idea to have someone do that for us. It also seems like the next big step for our company, and will lead to us hiring more talented VJs and content creators.
Also, recently we’ve been working on creating a more generic visual pack we would like to sell on Resolume.
It’s interesting creating visuals that are not for your own use, because normally we create pretty specific visuals for certain parts of the show. Now we need to forget about that and create visuals that can be used by anyone, in any situation. It’s good practice. I think we have come up with a pretty cool mix of modern-styled visuals and classic kaleidoscopic visuals for your enjoyment. :)


And, on a last note, we are working on a VR horror escape-room game in between all the visual work. Got to keep those university skills going! :D
If you’re interested we’ll post something about it on our social media in the future.

*Shudders* Oooh this gave us chills.
Thanks for doing this interview Rick. We all look forward to those visual packs from you, and wish you so much success with Visual Lab.
With skills like that, you’re miles ahead already :)
Check out some more of Rick’s work here
Credits:
Rick and Bob, RJB Visuals + Visual Lab
Follow them on: Instagram & their website
Resolume Blog
This blog is about Resolume, VJ-ing and the inspiring things the Resolume users make. Do you have something interesting to show the community? Send in your work!
Make Some Noisia
Dutch electronic powerhouse Noisia has been rocking the planet with their latest album ‘Outer Edges’.

Photo by Diana Gheorghiu
It was a wait. But one that was truly worth it. Essentially a concept album, they pushed the boundaries on this one by backing it up with a ‘concept tour’.
An audio-visual phenomenon with riveting content, perfect sync & melt-yo-face energy, the Outer Edges show is one that could not escape our dissection.
We visited Rampage, one of the biggest Drum & Bass gigs around the world & caught up with Roy Gerritsen (Boompje Studio) & Manuel Rodrigues (DeepRED.tv), on video and lighting duty respectively, to talk to us about the levels of excellence the Noisia crew has achieved, with this concept show.
Here is a look at Diplodocus, a favorite amongst bass heads:
Video by Diana Gheorghiu
Thanks for doing this guys! Much appreciated.
What exactly is a concept show and how is preparation for it different from other shows?
When Noisia approached us they explained they wanted to combine the release of their next album “Outer Edges” with a synchronized audio visual performance. It had been 6 years since Noisia released a full album so you can imagine it was a big thing.
Together, we came up with a plan to lay the foundation for upcoming shows. We wanted to focus on developing a workflow and pipeline to create one balanced and synchronized experience.
Normally, all the different elements within a show (audio, light, visual, performance) focus on their own area. There is one general theme or concept and everything then comes together in the end - during rehearsals.
We really wanted to create a show where we could focus on the total picture. Develop a workflow where we could keep refining the show and push the concept in all different elements in a quick and effective way, without overlooking the details.
What was the main goal you set out to achieve as you planned the Outer Edges show?
How long did it take to come together, from start to end?
We wanted to create a show where everything is 100% synchronized and highly adaptable, with one main control computer connecting to all elements within the show in a perfectly synchronized way. This setup gave us the ability to find a perfect balance and narrative between sound, performance, lights and visuals. Besides that, we wanted a modular and highly flexible show, making it easy and quick to adapt or add new content.
We started with the project in March 2016 and our premiere was at the Let It Roll festival in Prague (July 2016).
The show is designed in such a way that it is open-ended. We record every show, and because of the open infrastructure we are constantly refining it on all fronts.

What are the different gadgets and software you use to achieve that perfect sync between audio/video & lighting?
Roy: Back in the day, my graduation project at the HKU was a VJ mix tool where I used the concept of “cue-based” triggering. Instead of the widely used timecode synchronization, where you premix all the content (the lights and the video tracks), we send a MIDI trigger for every beat and sound effect. This saves a lot of time in the content creation process.
The edit and mixdown of the visuals are basically done live on stage instead of in After Effects. This means we don't have to render out 8-minute video clips and can focus on only a couple of key visual loops per track. (Every track consists of about 5 clips which get triggered directly from Ableton Live using a custom MIDI track.) Inside Ableton we group a couple of extra designated clips so they all get triggered at the same time.
For every audio clip we sequence separate MIDI clips for the video and lighting, which get played perfectly in sync with the audio. These MIDI tracks then get sent to the VJ laptop and Manuel's lighting desk.
We understand you trigger clips off Resolume from Ableton Live using the extremely handy Max for Live patches?
Yes, we sequence a separate MIDI track for each audio track. We divided up the audio track into 5 different elements (beats, snares, melody, fx etc.), which correspond with 5 video layers in Resolume.
When a note gets triggered, a Max for Live patch translates it to an OSC message and sends it off to the VJ laptop. The OSC messages get caught by a small tool we built in Derivative’s TouchDesigner. In its essence, this tool translates the incoming messages into OSC messages which Resolume understands - basically operating Resolume automatically with the triggers received from Ableton.
This way of triggering video clips was the result of an experiment by Martin Boverhof and Sander Haakman during a performance at an art festival in Germany, a couple of years ago. Only two variables are used - triggering video files and adjusting the opacity of a clip. We were amazed how powerful these two variables are.
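For VJs curious about the mechanics, here's a rough sketch of the kind of translation such a tool performs: an incoming MIDI note becomes an OSC clip-trigger message for Resolume. The note-to-clip layout and the `/layerN/clipM/connect` address pattern below are assumptions (Resolume 5-era addressing; newer versions use different paths), not the team's actual patch:

```python
import struct

def osc_message(address, value):
    """Encode a minimal OSC message carrying a single int32 argument."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

def midi_note_to_resolume(note, base_note=36, clips_per_layer=16):
    """Map an incoming MIDI note to a clip-trigger OSC packet.

    Hypothetical layout: notes from base_note upward fill layer 1
    left to right, then layer 2, and so on.
    """
    idx = note - base_note
    layer, clip = divmod(idx, clips_per_layer)
    return osc_message(f"/layer{layer + 1}/clip{clip + 1}/connect", 1)

packet = midi_note_to_resolume(37)  # note 37 -> second clip on layer 1
```

In practice the packet would be sent over UDP to Resolume's OSC input port; the point is just how little translation sits between a sequenced MIDI note and a clip firing.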



Regarding lighting, we understand the newer Chamsys boards have inbuilt support for MIDI/ timecode. What desk do you use?
Manuel: To drive the lighting in the Noisia Outer Edges show I use a ChamSys lighting desk. It is a very open environment. You can send MIDI, MIDI Show Control, OSC, timecode (LTC & MTC), UDP, serial data and of course DMX & Art-Net to the desk.
The support from ChamSys is extremely good and the software is 100% free. Compared to other lighting desk manufacturers, the hardware is much cheaper.
A lighting desk is still much more expensive than a MIDI controller.
They might look similar, as both have faders and buttons, but the difference is that a lighting desk has a brain.
You can store, recall and sequence states - something which is invaluable for a lighting designer, and which is now happening in video-land more and more.
I have been researching bridging the gap between Ableton Live and ChamSys for 8 years.
This research has led me to M2Q, an acronym for Music-to-Cue, which acts as a bridge between Ableton Live and ChamSys. M2Q is a hardware solution designed together with Lorenzo Fattori, an Italian lighting designer and engineer. M2Q listens to MIDI messages sent from Ableton Live and converts them to ChamSys remote control messages, providing cue triggering and playback intensity control.
M2Q is a reliable, easy and fast lighting sync solution. It enables non-linear lighting sync.
When using timecode, it is impossible to loop within a song, do the chorus one more time or alter the playback speed on the fly - one is basically limited to pressing play.
Because our lighting sync system is MIDI-based, the artist on stage has exactly the same freedom Ableton audio playback offers.
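Purely as an illustration of the idea (the output strings below are made up for the sketch; ChamSys' real remote-control syntax and M2Q's internals are not shown here), a MIDI-to-cue bridge boils down to a translation like this:

```python
def midi_to_cue(status, data1, data2):
    """Translate one raw MIDI message into an illustrative cue-control string.

    Note On -> trigger a cue on a playback (note number = cue, channel = playback)
    CC 7    -> set playback intensity (0-127 scaled to 0-100%)
    Returns None for messages the bridge ignores.
    """
    msg_type, channel = status & 0xF0, status & 0x0F
    if msg_type == 0x90 and data2 > 0:   # Note On with non-zero velocity
        return f"PLAYBACK {channel + 1} GO CUE {data1}"
    if msg_type == 0xB0 and data1 == 7:  # Control Change 7 (channel volume)
        return f"PLAYBACK {channel + 1} LEVEL {round(data2 / 127 * 100)}"
    return None
```

Because the trigger is per-note rather than a running clock, looping a chorus or jumping around the arrangement in Ableton simply re-fires the corresponding cues - which is exactly the freedom Manuel describes over timecode.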
Do you link it to Resolume?
ChamSys has a personality file (head file) for Resolume, and this enables driving Resolume as a media server from the lighting desk. I must confess that I have been considering switching to Resolume for some time now, as it is a very cost-effective and stable solution compared to other media server platforms.
Video by Diana Gheorghiu
Tell us about the trio’s super cool headgear. They change color, strobe, are dimmable. How?!
The LED suits are custom designed and built by Élodie Laurent; they are basically 3 generic LED par cans and have similar functionality.
They are connected to the lighting desk just as the rest of the lighting rig and are driven using the same system.
Fun fact: These are the only three lights we bring with us so the Outer Edges show is extremely tour-able.


The Noisia content is great in its quirkiness. Sometimes we see regular video clips, sometimes distorted human faces, sometimes exploding planets, mechanical animals - what’s the thought process behind the content you create? Is it track-specific?
The main concept behind this show is that every track has its own little world in this Outer Edges universe. Every track stands on its own and has a different focus on style and narrative.
Nik (one third of Noisia & Art director) compiled a team of 10 international motion graphic artists and together we took on the visualization of the initial 34 audio tracks. Cover artwork, videoclips and general themes from the audio tracks formed the base for most of the tracks.

Photo by Diana Gheorghiu

Photo by Diana Gheorghiu
The lighting & video sync is so on point, we can’t stop talking about it. It must have taken hours of studio time & programming?
That was the whole idea behind the setup.
Instead of synchronizing everything in the light and video tracks, we separated the synchronizing process from the design process, meaning that we sequence everything in Ableton and, on the content side, Resolume takes care of the rest. Updating a VJ clip is just a matter of dragging a new clip into Resolume.
This also resulted in Resolume being a big part of the design process (instead of only serving as a media server, as it normally would).
During the design process we run the Ableton set and see how clips get triggered; if we don't like something, we can easily replace the video clip with a new one, or adjust, for instance, the scaling inside Resolume.
Some tracks which included 3D-rendered images took a bit longer, but there is one track, “Diplodocus”, which took 30 minutes to make from start to finish. It was originally meant as a placeholder, but after seeing it synchronized we liked its simplicity and boldness and decided to keep it in the show.
Here is some more madness that went down:
Video by Diana Gheorghiu
Is it challenging to adapt your concept show into different, extremely diverse festival setups? How do you output the video to LED setups that are not standard?
We mostly work with our rider setup consisting of a big LED screen in the back and LED banner in front of the booth, but in case of bigger festivals we can easily adjust the mapping setup inside Resolume.
In the case of Rampage we had another challenge: coming up with a solution to operate with 7 full-HD outputs.

Photo by Diana Gheorghiu
Normally Nik controls everything from the stage and we have a direct video line to the LED processor. Since all the connections to the LED screens were located at the front of house, we used 2 laptops positioned there.
It was easy to adjust the Ableton Max for Live patch to send the triggers to two computers instead of one, and we wrote a small extra tool that sends all the MIDI controller data from the stage to the FOH (to make sure Nik was still able to operate everything from the stage).
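Fanning triggers out to more than one machine is conceptually simple: the same payload just gets sent to every destination. A minimal sketch over UDP - the addresses are hypothetical placeholders, and this is not the team's actual tool:

```python
import socket

# Hypothetical FOH laptops receiving the stage's trigger data
DESTINATIONS = [("192.168.1.10", 9000), ("192.168.1.11", 9000)]

def forward(payload, destinations=DESTINATIONS):
    """Send one MIDI/OSC payload to every destination over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in destinations:
            sock.sendto(payload, addr)
    finally:
        sock.close()
```

UDP suits this job because a lost trigger is preferable to a late one in a live show; each packet is fired and forgotten.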
Talk to us about some features of Resolume that you think are handy, and would advise people out there to explore.
Resolume was a big part of the design process in this show. Using it almost as a little After Effects, we stacked up effects until we reached our preferred end result. We triggered scalings, rotations, effects and opacity using the full OSC control Resolume offers. This makes it super easy to create spot-on synchronized shows with a minimal amount of pre-production.
This, in combination with the really powerful mapping options, makes it an ideal tool to build our shows on!
What a great interview, Roy & Manuel.
Thanks for giving us a behind-the-scenes understanding of what it takes to run this epic show, day after day.
Noisia has been ruling the Drum & Bass circuit, for a reason. Thumping, fresh & original music along with a remarkable show- what else do we want?
Here is one last video for a group rage :
Video by Diana Gheorghiu
Rinseout.
Credits:
Photo credits, Noisia setup: Roy Gerritsen
Adhiraj, Refractor for the on point video edits.
Photo by Diana Gheorghiu
It was a wait. But one that was truly worth it. Essentially a concept album, they pushed the boundaries on this one by backing it up with a ‘concept tour’.
An audio-visual phenomenon with rivetting content, perfect sync & melt-yo-face energy, the Outer Edges show is one that could not pass our dissection.
[fold][/fold]
We visited Rampage, one of the biggest Drum & Bass gigs around the world & caught up with Roy Gerritsen (Boompje Studio) & Manuel Rodrigues (DeepRED.tv), on video and lighting duty respectively, to talk to us about the levels of excellence the Noisia crew has achieved, with this concept show.
Here is a look at Diplodocus, a favorite amongst bass heads:
Video by Diana Gheorghiu
Thanks for doing this guys! Much appreciated.
What exactly is a concept show and how is preparation for it different from other shows?
When Noisia approached us they explained they wanted to combine the release of their next album “Outer Edges” with a synchronized audio visual performance. It had been 6 years since Noisia released a full album so you can imagine it was a big thing.
Together, we came up with a plan to lay the foundation for upcoming shows. We wanted to focus on developing a workflow and pipeline to create one balanced and synchronized experience.
Normally, all the different elements within a show (audio, light, visual, performance) focus on their own area. There is one general theme or concept and everything then comes together in the end - during rehearsals.
We really wanted to create a show where we could focus on the total picture. Develop a workflow where we could keep refining the show and push the concept in all different elements in a quick and effective way, without overlooking the details.
What was the main goal you set out to achieve as you planned the Outer Edges show?
How long did it take to come together, from start to end?
We wanted to create a show where everything is 100% synchronized and highly adaptable. Having one main control computer which connects to all elements within the show in a perfect synchronized way.This setup gave us the ability to find a perfect balance and narrative between sound, performance, lights and visuals. Besides that we wanted to have a modular and highly flexible show. Making it easy and quick to adapt or add new content.
We started with the project in March 2016 and our premiere was at the Let It Roll festival in Prague (July 2016).
The show is designed in such a way that it has an “open-end”. We record every show and because of the open infrastructure we are constantly refining it on all fronts.
What are the different gadgets and software you use to achieve that perfect sync between audio/video & lighting?
Roy:Back in the day, my graduation project at the HKU was a vj mix tool where I used the concept of “cue based” triggering. Instead of the widely used timecode synchronization where you premix all the content (the lights and the video tracks), we send a MIDI trigger of every beat and sound effect.This saves a lot of time in the content creation production process.
The edit and mixdown of the visuals are basically done live on stage instead of in After Effects. This means we don't have to render out 8-minute video clips and can focus on only a couple of key visual loops per track (every track consists of about 5 clips, which get triggered directly from Ableton Live using a custom MIDI track). Inside Ableton we group a couple of extra designated clips so they all get triggered at the same time.
For every audio clip we sequence separate midi clips for the video and lighting, which get played perfectly in sync with the audio. These midi tracks then get sent to the VJ laptop and Manuel's lighting desk.
We understand you trigger clips in Resolume from Ableton Live using the extremely handy Max for Live patches?
Yes, we sequence a separate midi track for each audio track. We divided up the audio track into 5 different elements (beats, snares, melody, fx, etc.), which correspond with 5 video layers in Resolume.
When a note gets triggered, a Max for Live patch translates it to an OSC message and sends it off to the VJ laptop. The OSC messages get caught by a small tool we built in Derivative’s TouchDesigner. In essence, this tool translates the incoming messages into OSC messages which Resolume understands - basically operating Resolume automatically with the triggers received from Ableton.
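As an illustration of that translation step, here is a minimal, stdlib-only Python sketch of the note-to-clip mapping such a tool might perform. The note layout and the `/layerN/clipM/connect` address style are assumptions made for the example (the actual OSC address list depends on your Resolume version), not the team's code.

```python
def note_to_resolume_address(note, base_note=60, clips_per_layer=5):
    """Map a MIDI note number to a Resolume clip-trigger OSC address.

    Hypothetical layout: notes are grouped in blocks of `clips_per_layer`,
    so base_note..base_note+4 trigger clips 1-5 on layer 1, the next five
    notes trigger layer 2, and so on.
    """
    index = note - base_note
    if index < 0:
        raise ValueError("note below the mapped range")
    layer = index // clips_per_layer + 1
    clip = index % clips_per_layer + 1
    return f"/layer{layer}/clip{clip}/connect"
```

In a real setup the resulting address would be sent to Resolume's OSC input port with an OSC library such as python-osc.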
This way of triggering video clips was the result of an experiment by Martin Boverhof and Sander Haakman during a performance at an art festival in Germany, a couple of years ago. Only two variables are being used - triggering video files and adjusting the opacity of a clip. We were amazed how powerful these two variables are.
Regarding lighting, we understand the newer Chamsys boards have inbuilt support for MIDI/ timecode. What desk do you use?
Manuel: To drive the lighting in the Noisia - Outer Edges show I use a ChamSys lighting desk. It is a very open environment: you can send MIDI, MIDI Show Control, OSC, timecode (LTC & MTC), UDP, serial data and of course DMX & Art-Net to the desk.
ChamSys support is extremely good and the software version is 100% free. Compared to other lighting desk manufacturers, the hardware is much cheaper.
A lighting desk is still much more expensive than a MIDI controller. It might look similar, as both have faders and buttons, but the difference is that a lighting desk has a brain: you can store, recall and sequence states, something which is invaluable for a lighting designer and is now happening in videoland more and more.
I have been researching how to bridge the gap between Ableton Live and ChamSys for 8 years.
This research led me to M2Q, an acronym for Music-to-Cue, which acts as a bridge between Ableton Live and ChamSys. M2Q is a hardware solution designed together with Lorenzo Fattori, an Italian lighting designer and engineer. It listens to MIDI messages sent from Ableton Live and converts them to ChamSys remote control messages, providing cue triggering and playback intensity control.
M2Q is a reliable, easy and fast lighting sync solution, and it enables non-linear lighting sync.
When using Timecode it is impossible to loop within a song, do the chorus one more time or alter the playback speed on the fly. One is basically limited to pressing play.
Because our lighting sync system is midi based the artist on stage has exactly the same freedom Ableton audio playback offers.
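As a toy illustration of that idea (not M2Q's actual wire protocol), a note-on can be mapped to a playback and cue number, with its velocity scaled to an intensity. The note layout below is invented for the example:

```python
def midi_to_cue(note, velocity, notes_per_playback=12, base_note=0):
    """Map a MIDI note-on to (playback, cue, intensity).

    Hypothetical layout: each block of `notes_per_playback` notes addresses
    one playback, and velocity 0-127 scales to an intensity of 0.0-1.0.
    """
    index = note - base_note
    playback = index // notes_per_playback + 1
    cue = index % notes_per_playback + 1
    intensity = round(velocity / 127.0, 3)
    return playback, cue, intensity
```

Because the mapping is stateless, the artist can jump around in the set and the lighting simply follows whichever note arrives next - the non-linear freedom described above.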
Do you link it to Resolume?
ChamSys has a personality file (head file) for Resolume, which enables driving Resolume as a media server from the lighting desk. I must confess I have been considering switching to Resolume for some time now, as it is a very cost-effective and stable solution compared to other media server platforms.
Video by Diana Gheorghiu
Tell us about the trio’s super cool headgear. They change color, strobe, are dimmable. How?!
The LED suits are custom designed and built by Élodie Laurent, and are basically 3 generic LED parcans with similar functionality.
They are connected to the lighting desk just as the rest of the lighting rig and are driven using the same system.
Fun fact: These are the only three lights we bring with us so the Outer Edges show is extremely tour-able.
The Noisia content is great in its quirkiness. Sometimes we see regular video clips, sometimes distorted human faces, sometimes exploding planets, mechanical animals - what’s the thought process behind the content you create? Is it track specific?
The main concept behind this show is that every track has its own little world in this Outer Edges universe. Every track stands on its own and has a different focus in style and narrative.
Nik (one third of Noisia & art director) compiled a team of 10 international motion graphic artists, and together we took on the visualization of the initial 34 audio tracks. Cover artwork, video clips and general themes from the audio tracks formed the base for most of the tracks.
Photo by Diana Gheorghiu
Photo by Diana Gheorghiu
The lighting & video sync is so on point, we can’t stop talking about it. It must have taken hours of studio time & programming?
That was the whole idea behind the setup.
Instead of synchronizing everything in the light and video tracks, we separated the synchronization process from the design process, meaning that we sequence everything in Ableton and, on the content side, Resolume takes care of the rest. Updating a VJ clip is just a matter of dragging a new clip into Resolume.
This also resulted in Resolume being a big part of the design process (instead of only serving as a media server, as it normally would).
During the design process we run the Ableton set and see how clips get triggered; if we don't like something we can easily replace the video clip with a new one, or adjust, for instance, the scaling inside Resolume.
Some tracks which included 3D rendered images took a bit longer, but there is one track, “Diplodocus”, which took 30 minutes to make from start to finish. It was originally meant as a placeholder, but after seeing it synchronized we liked the simplicity and boldness of it and decided to keep it in the show.
Here is some more madness that went down:
Video by Diana Gheorghiu
Is it challenging to adapt your concept show into different, extremely diverse festival setups? How do you output the video to LED setups that are not standard?
We mostly work with our rider setup, consisting of a big LED screen in the back and an LED banner in front of the booth, but in the case of bigger festivals we can easily adjust the mapping setup inside Resolume.
In the case of Rampage we had another challenge: coming up with a solution to operate with 7 full HD outputs.
Photo by Diana Gheorghiu
Normally Nik controls everything from the stage and we have a direct video line to the LED processor. Since all the connections to the LED screens were located at the front of house, we used 2 laptops positioned there.
It was easy to adjust the Ableton Max for Live patch to send the triggers to two computers instead of one, and we wrote a small extra tool that sends all the midi-controller data from the stage to the FOH (to make sure Nik was still able to operate everything from the stage).
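The fan-out itself is conceptually simple: the same trigger datagram just goes to more than one address. A hypothetical stdlib sketch (the real show used a Max for Live patch, not this helper):

```python
import socket

def fan_out(payload: bytes, targets):
    """Send the same UDP datagram to every (host, port) target.

    Illustrative only: one trigger message gets duplicated to several
    machines, e.g. two VJ laptops at the front of house.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for host, port in targets:
            sock.sendto(payload, (host, port))
```

Since UDP is fire-and-forget, adding a second (or seventh) receiver costs nothing on the sending side.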
Talk to us about some features of Resolume that you think are handy and would advise people out there to explore.
Resolume was a big part of the design process in this show. Using it almost as a small After Effects, we stacked up effects until we reached our preferred end result. We triggered scaling, rotation, effects and opacity using the full OSC control Resolume offers. This makes it super easy to create spot-on synchronized shows, with a minimal amount of pre-production.
This in combination with the really powerful mapping options makes it an ideal tool to build our shows on!
What a great interview, Roy & Manuel.
Thanks for giving us a behind-the-scenes understanding of what it takes to run this epic show, day after day.
Noisia has been ruling the Drum & Bass circuit for a reason: thumping, fresh & original music along with a remarkable show - what else do we want?
Here is one last video for a group rage :
Video by Diana Gheorghiu
Rinseout.
Credits:
Photo credits Noisia setup: Roy Gerritsen
Adhiraj, Refractor for the on point video edits.
Taking the World by Storm (Part 2)
Hello all you video junkies. This one's just for you.
It took a while to digest the awesomeness, but Part 2 of "Taking the world by Storm" is here.
So, quick recap?
Gig: Storm Festival, 2016, Shanghai
Epic stage:

Video: 400 square meters of Led, 7.9 mm pitch, 10 processors.
[fold][/fold]
Brandon Chaung, the local VJ on site, talked us through the whole process. And it is intense.
So, sit back..relax..a Storm is brewing.
What computers did you use for the show?
I used both PC and Mac.
I like PC because it is powerful and easy to upgrade, especially the graphics card (MXM 3.0b) and storage - which are both essential for running Resolume.
I replaced my optical drive with an SSD for new custom footage for the show (I have more than 40 decks in my composition).
I like Mac because of the great onboard audio quality. Also, it’s less of a hassle for audio playback and MIDI mapping when using Resolume with other applications at the same time.
I switched between two laptops with a Barco Encore switcher.
It required four HD outputs to cover all the LED panels we had.
Both the circle screen and the one behind the DJ booth are split into two outputs, which makes it important to have synchronized outputs.

The 4kTwo display controller provided by Flux studio did a great job.
It is a Chinese brand with similar, but fewer, features than the Datapath X4 that other VJs brought for the show.
Talk to us about Resolume, maybe you have heard of it? *grins*
Resolume Arena is my first choice of media server.
It runs great on both operating systems. I can have exactly the same experience while VJing, no matter if I’m on a PC or a Mac. I can switch between the two without thinking.
Another great tool that is worth mentioning is Chaser by Joris. Woo hoo!
I find it super useful when I use it for switching between different mapping settings and even masking.
Some VJs use Madmapper or mapio to switch mapping. I prefer doing this with Resolume.
Normally, I apply two Chaser FFGL plugins on each layer. One is for switching between different output mappings.
So, after some setup in Advanced Output and Chaser, during the show I can just pick the footage I desire, set the screen I want it to show on through Chaser (using steps) and boom, it’s on!
And I can mix different layers with different mappings without losing blend modes.
Also, what I see in the preview window can be very close to what I get on the actual screen.
A second Chaser plugin, applied in the FX chain (sometimes I don’t need it, or won't once it supports polygons other than triangles), is there to mask out unwanted parts of the layer that show up on screens where I don’t want them.
For me it’s better than applying crops and adjusting XY positions on different layers, because I can just make use of the slices in Advanced Output.
This technique is very useful for the circle screen in this show.
Can you give us details about how the LED was mapped?
I did the mapping by starting with numbers: counting the pixels, the modules, and the actual width and length.
Then it became like a high school math exam, or a puzzle to solve. The goal was to make the best use of every pixel of every HD output: find the most efficient combination of slices while also thinking about how to run the CAT5 cable through every module with as few cables as possible.


This is the front view of all the LEDs. Below the name of each slice is the number of modules, followed by the pixels in width and height. The circle screen is cut into 8 slices using two HD outputs. The main screen is spread across two HD outputs.
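A quick sanity check on this kind of puzzle is to compare the total slice area against the pixel budget of the outputs. A small sketch (illustrative only; a real layout also has to pack the rectangles without overlap):

```python
HD_W, HD_H = 1920, 1080  # one full-HD output

def outputs_needed(slices):
    """Rough lower bound on HD outputs needed, by pixel area alone.

    slices: list of (width, height) tuples in pixels.
    """
    total = sum(w * h for w, h in slices)
    per_output = HD_W * HD_H
    return -(-total // per_output)  # ceiling division
```

If the area bound already exceeds the outputs available, no amount of clever slicing will help; if it fits with room to spare, the remaining work is arranging the slices.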
Next, we come to the pixel maps for the four outputs.




I actually quite enjoy this process.
The Advanced Output of Resolume Arena is pretty handy when solving the puzzle. The fixed snapping across screens in Arena 5.1 saved me a lot of time.
Then it’s time to match the Advanced Output with the pixel map (thanks to the new feature for importing a .png into Advanced Output). After adding a few masks and adjusting it to fit the 4K output, the basic setup is pretty much done.
The Output side looks like this:

The Input side looks like this:

Now comes the most interesting part, Chaser.
I added another virtual screen at the bottom just for Chaser slices.
These slices are just for Chaser programming; they don’t really output anything.
From these slices you can see it’s all in the ratio of 1920 x 1080, except the center triangle used for custom footage. This also shows how I scale and position the footage (most of my footage is in HD).
This is one of the mappings, used when I want the HD footage to focus on just the circle screen - but notice it also covers the IMAG screen. That will be masked by the second Chaser plugin.
Then I create another sequence to pick the screen I want to preserve, so it functions like a mask. Here, I picked the circle. Note that both sequences have the same number of steps.
In this picture you can see the result in the preview window. I put another layer of lines in a different mapping, opacity at half, and used Difference as the blend mode - so you can see the blend mode still works like a charm.
Then I assign both Chaser plugins' steps to one fader or knob on my MIDI controller, so I can switch really fast.
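Driving both Chaser plugins from one control works because both sequences have the same number of steps, so a single 0-127 MIDI value can pick the matching step in each. A hypothetical sketch of that mapping:

```python
def fader_to_step(value, num_steps):
    """Map a 0-127 MIDI fader/knob value to a step index 0..num_steps-1.

    Both the mapping sequence and the mask sequence use the same
    num_steps, so one fader keeps them perfectly in sync.
    """
    if not 0 <= value <= 127:
        raise ValueError("MIDI values are 0-127")
    return min(value * num_steps // 128, num_steps - 1)
```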
This is how I arranged the mapping for this show.
Of course, I still use Chaser to create bumps like it was designed for.
In my mind, I feel there must be many other creative ways in Resolume to fulfill my imagination - about how my visuals should look, how I can respond to the music the moment I hear it, or, when the screens and cues get complicated, how to do it in a simple way.
I’m glad that, so far, Resolume has never let me down.
*Blushes* Thanks for your great words, Brandon!
Quick question - a lot of VJs have been complaining about overheating Macs. Was this a problem during the show?
Not on the 1st day because it was cloudy.
But on the 2nd day, right before the show, I found my MacBook lagging - it had been exposed to direct sunlight.
After a reboot and change of position, it came back to normal. Other than this, it was all good during this show.
I think it is not a problem only for the MacBook; my PC has it too - it just reacts differently.
The overheating can cause Resolume to crash on my Windows laptop.
So extra fans for both the PC and the MacBook have become a must-have for most of my outdoor events.
Finally, here is a list of equipment that was used during the show:
MSI GT72-2QE Laptop with-
CPU: Intel Core i7 4980HQ @ 2.8GHz
RAM: 32GB DDR3L
Graphics: NVIDIA GTX 980M GDDR5 8GB
Storage: MSI Super RAID 4x 128GB SSD, 512GB Samsung EVO SSD
AKAI APC40 MKII
Magewell HDMI USB3.0 capture device
4kTwo display controller x 2
Windows 10
Resolume Arena 5.1.1
Apple Macbook Pro Retina (Mid-2012)
CPU: 2.7Ghz Intel Core i7
RAM: 16GB 1600MHz DDR3
Graphics: NVIDIA GeForce GT 650M 1GB VRAM
Storage: Apple SSD SM512E
OS X 10.11.6 El Capitan
Resolume Arena 5.0.2
With this, we come to an end to this two-part extensive coverage of Storm Festival, Shanghai.
It feels great to see the new features we develop put to use, in multiple different ways. Sometimes, even in ways we didn't fathom while developing them :)
Thank you to 250K and the entire crew for doing such a great job at the festival and then educating us about it, in these interviews.
Until we see you again- go try Chaser like Brandon explained. Go on now, get moving.
250K- Taking the World by Storm
Massive rigs.
Immersive content.
Path-breaking stage productions.
What a great time to be alive!
We certainly think so, and our quest for “a big production to dissect” landed us in the eye of the Storm @ Shanghai.

[fold][/fold]
On design and production duty for Storm Festival 2016 were the super imaginative creative specialists, 250K.
They have been slowly and steadily taking over the world, one stage at a time.
After epic shows like The Flying Dutch, Ground Zero and Armin Van Buuren’s tours, Storm Festival 2016 was 250K's most recent conquest.
We got Dennis de Klein to take a break from basking in the glory of a great show (naww!) to talk us through the setup and tech specifics.
From starting out as a stage design intern with 250K to, 6 years later, Creative Project Manager - Dennis has come a long way.
For Storm Festival, he managed the project from start to end, working in close association with the designers and the Creative Director of 250K- Sander Reneman.
Dennis’ most important responsibility was to ensure the original design was brought to life in the best way possible.

He did a good job right?
So, the set was 60 meters wide x 20 meters deep x 36 meters high. Whew!
Productions of this scale need some detailed and on-point planning.
They probably worked on the design for months, right?
The development of the set-design, from initial idea to a final 3D drawing, took the 250k team one month.

Then, the 3D drawing was translated into technical drawings and detailed decor plans, to create the set design as efficiently and as close to the original design as possible.

The load-in lasted for around two weeks, and the load-out was finalized in about five days.
We, here at Resolume, love the stage concept! Can you talk us through what the stage depicts?
Storm Festival, a concept created by A2Live, tells the story of the Actaurians, travelers from outer space who have come to Earth to find like-minded people to live and collaborate with.
This year, the story focuses on ‘The Impact’, the first contact between Actaurians and humans.

The (3D) logo of Storm Festival is actually depicted as the mothership of the Actaurians in the artwork and trailers.
To reenact ‘The Impact’ within the set design, the mothership has landed in the set, making the connection with the Earth.
It is a representation of both worlds colliding into one merged structure, where the logo and organic shapes represent the Actaurians, and the solid stage platform expresses humanity.
How do you, as designers, incorporate a balance between set fabrication, LED and lighting?
As designers, we focus on finding the perfect mix between representing the brand identity of the promoter/event and the artists’ technical rider requirements.
Keeping this in mind, we are constantly looking to take the set design to the next level, to create something that is not out there yet and to challenge all disciplines.
For example, if we design a specific set of video panels, it needs to be positioned in a logical location, be functional for both décor visual content and artist visual content, and blend in with the look & feel of the set design.
It is also about contrast: video, lighting & decor should feel balanced when looking at the set design. We collaborate closely with lighting designers, video operators and the decor fabrication company.
The lighting inventory seems massive!
For this set design, we have collaborated closely with Daniel Richardson, the lighting designer and operator for the past years of Storm Festival.

For this set design, he created a lighting design that incorporates over 140 beams, 225 quad LED bars, 60 spots & 60 blinders, just to name a few.
Fun fact: all the lighting in the stage is hanging (except for the fixtures on the deck).
In terms of set fabrication, what material did you use to create the set and the mammoth logo in the center?
Structurally, the logo is created using truss; it is a geometrical shape that can be recreated with truss and corners.
The facades of the pyramid are supported by a custom welded steel frame, to which the wooden (grey) panels are connected.
The wooden panels are painted with two types of finish, to give it that look & feel and to add the sharp edge that accentuates the logo.
The inside of the pyramid is covered with semi-transparent white fabric, to transform it into a giant lightbox.
The pyramid’s truss and frame is held up to a large scaffolding wall, that is part of the whole set design.

Let’s talk video…
For the whole set design, we used around 400 square meters of 7.9 mm LED. The LED tiles were split across 10 processors, of which 4 controlled the central circle screen. Most of the lower LED screens are stacked on a deck; the top circle is supported by the back scaffolding.
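For a sense of scale, the pitch and area translate into a pixel count like this (rough arithmetic, not production figures):

```python
def led_wall_pixels(area_m2, pitch_mm):
    """Approximate total pixels of an LED wall from its area and pixel pitch."""
    px_per_m = 1000.0 / pitch_mm            # pixels along one meter of wall
    return area_m2 * px_per_m * px_per_m    # pixels per square meter * area

total = led_wall_pixels(400, 7.9)   # roughly 6.4 million pixels
hd_frame = 1920 * 1080              # about 2.07 million pixels per HD output
# So the wall holds a bit over three full-HD frames worth of pixels,
# spread across the 10 processors mentioned above.
```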

What challenges did you face producing this Stage in China and how did you overcome them?
There were two main challenges that had to be overcome:
First, a language barrier.
Most of the crew would only speak Chinese, so it was difficult to get a message across. It can be quite difficult to translate the technical terms I am used to in Dutch, to English and then to Chinese. For this, sketching, gestures and the 3D model we had created were of great use.
Second, the overall approach is different.
Not saying it is good or bad, just different from what we are used to from a production in Europe or the USA.
The level of customization of the set design, and of adding details on site instead of off site, was a lot higher.
There was a strong focus on bringing the detail of the original 3D render into reality. In addition, the materials used were different.
The basis is still scaffolding and trussing, but the measurements were different from what we’re used to.
It is not a difficulty, but it is something that has to be taken strongly into account when designing a set for a different market or region.

With this, we come to the end of Part 1 of our coverage of Storm Festival.
Thanks for talking to us Dennis. Kudos on a great show!
In Part 2, we will plunge into the video details of Storm festival 2016- so get your geek on.
Credits:
Gil Wadsworth and the whole team of A2Live for the opportunity to create 250k's first set design in China;
Daniel Richardson for his great lighting design, for operating lighting during the show, and for assisting as a translator from English to Chinese and back;
Atilla Meijs from Corrino for introducing 250K to A2Live
You can also visit 250K & Storm Festival
Photo Credits: Dennis de Klein & Storm Festival
Immersive content.
Path-breaking stage productions.
What a great time to be alive!
We certainly think so, and our quest for “a big production to dissect” landed us in the eye of the Storm @ Shanghai.
[fold][/fold]
On design and production duty for Storm festival 2016 were super imaginative creative specialists- 250K.
They have been slowly and steadily taking over the world, one stage at a time.
After epic shows like The Flying Dutch, Ground Zero and Armin Van Buuren’s tours, Storm Festival 2016 was 250K's most recent conquest.
We got Dennis de Klein to take a break from basking in the glory of a great show (naww!) to talk us through the setup and tech specifics.
From starting out as a stage design intern with 250K, cut to 6 years later: Creative Project Manager- Dennis has come a long way.
For Storm Festival, he managed the project from start to end, working in close association with the designers and the Creative Director of 250K- Sander Reneman.
Dennis’ most important responsibility was to ensure the original design is brought to life, in the best way possible.
He did a good job right?
So, the Set was 60 meters wide x 20 meters deep x 36 meters high. Whew!
Productions of this scale need some detailed and on pointe planning.
They probably worked on the design for months, right?
The development of the set-design, from initial idea to a final 3D drawing, took the 250k team one month.
Then, the 3D drawing was translated into technical drawings and detailed decor plans- to be able to create the set design as efficiently and as close to the original design as possible.
The load-in lasted for around two weeks, and the load-out was finalized in about five days.
We, here at Resolume, love the stage concept! Can you talk us through what the stage depicts?
Storm Festival, a concept created by A2Live, tells the story of the Actaurians, travelers from outer space who have come to Earth to find like-minded people to live and collaborate with.
This year, the story focuses on ‘The Impact’, the first contact between Actaurians and humans.
The (3D) logo of Storm Festival is actually depicted as the mothership of the Actaurians in the artwork and trailers.
To reenact ‘The Impact’ within the set design, the mothership has landed into the set-design making the connection with the Earth.
It is a representation of both worlds colliding into one merged structure, where the logo and organic shapes represent the Actaurians, and the solid stage platform expresses humanity.
How do you, as designers, incorporate a balance between set fabrication, LED and lighting?
As designers, we focus on finding the perfect mix between representing the brand identity of the promotor/event and the Artist’s technical rider requirements.
Keeping this in mind, we are constantly looking to take the set design to a next level, to create something that is not out there and to challenge all disciplines.
For example, if we design a specific set of video panels, it needs to be positioned in a logical location, needs to be functional for décor visual content and artist visual content and needs to blend in with the look & feel of the set design.
It is also about contrast, where video, lighting & decor feel balanced when looking at the set design. We closely collaborate with lighting designers, video operators and the decor fabrication company.
The lighting inventory seems massive!
For this set design, we collaborated closely with Daniel Richardson, who has been Storm Festival's lighting designer and operator for the past several years.
He created a lighting design that incorporates over
140 beams,
225 quad LED bars,
60 spots &
60 blinders, just to name a few.
Fun fact: all lighting on the stage is hanging (except for the fixtures on the deck).
In terms of set fabrication, what material did you use to create the set and the mammoth logo in the center?
Structurally, the logo is created using truss; it is a geometrical shape that can be recreated with truss sections and corners.
The facades of the pyramid are supported by a custom welded steel frame, to which the wooden (grey) panels are connected.
The wooden panels are painted with two types of finish, to give them the right look & feel and to add a sharp edge that accentuates the logo.
The inside of the pyramid is covered with semi-transparent white fabric, to transform it into a giant lightbox.
The pyramid's truss and frame are attached to a large scaffolding wall that is part of the whole set design.
Let's talk video.
For the whole set design, we have used around 400 square meters of 7.9 mm LED. The LED tiles were split into 10 processors, of which 4 controlled the central circle screen. Most of the lower LED screens are stacked on a deck, the top circle is supported by the back scaffolding.
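For a rough sense of scale, the pixel count implied by those numbers can be worked out from the panel pitch. This is a back-of-the-envelope sketch only; the real tile layout and processor split will differ:

```python
# Back-of-the-envelope pixel count for the Storm Festival LED surface.
# Assumes a uniform 7.9 mm pitch over the full 400 m^2 of LED -- the
# actual tile dimensions and processor assignments will differ.

PITCH_MM = 7.9      # distance between pixel centres
AREA_M2 = 400       # total LED surface from the interview
PROCESSORS = 10     # processors the tiles were split into

pixels_per_metre = 1000 / PITCH_MM        # ~126.6 px per metre
pixels_per_m2 = pixels_per_metre ** 2     # ~16,000 px per square metre
total_pixels = AREA_M2 * pixels_per_m2    # ~6.4 million pixels

print(f"{pixels_per_metre:.1f} px/m")
print(f"{total_pixels / 1e6:.1f} Mpx total")
print(f"{total_pixels / PROCESSORS / 1e6:.2f} Mpx per processor on average")
```

So a surface this size pushes roughly six and a half million pixels, which is why splitting it over ten processors makes sense.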
What challenges did you face producing this Stage in China and how did you overcome them?
There were two main challenges that had to be overcome:
First, a language barrier.
Most of the crew would only speak Chinese, so it was difficult to get a message across. It can be quite difficult to translate the technical terms I am used to in Dutch, first to English and then to Chinese. For this, sketching, gestures and the 3D model we had created were of great use.
Second, the overall approach is different.
Not saying it is good or bad, but different than what we are used to from a production in Europe or the USA.
The level of customization of the set design and adding of details on site, instead of off site, was a lot higher.
There was a strong focus on bringing the detail of the original 3D render into reality. In addition, the materials used were different.
The basis is still scaffolding and trussing, but the measurements were different from what we're used to.
It is not a difficulty, but it is something that has to be taken into account when designing a set for a different market or region.
With this, we come to the end of Part 1 of our coverage of Storm Festival.
Thanks for talking to us Dennis. Kudos on a great show!
In Part 2, we will plunge into the video details of Storm Festival 2016, so get your geek on.
Credits:
Gil Wadsworth and the whole team of A2Live, for the opportunity to create 250K's first set design in China;
Daniel Richardson, for his great lighting design, for operating lighting during the show, and for assisting as a translator between English and Chinese;
Atilla Meijs from Corrino, for introducing 250K to A2Live.
You can also visit 250K & Storm Festival
Photo Credits: Dennis de Klein & Storm Festival
Mad About Madeon
Madeon is a French electronic producer who uses gadgets and technology like they're an extension of his very being.
With an on-stage setup that baffles even the best in the business, this 22-year-old producer has gotten where he is through his focus on the audio-visual performance as a single unit.
His stage setup should be trademarked. It's a diamond with arrow-like shapes on either side.
All made of LED.

Geometric.
Symmetric.
Minimalist.
We, here at Resolume, couldn’t pass on the chance of understanding his rig and how he perfectly triggers his visuals to the music, live.
Thanks very much for speaking to us Hugo!
[fold][/fold]
First things first, and the answer many have been curious to know: can you explain your live setup to us? All the gadgets you use and their purpose?
The show is run on two laptops which are on stage with me.
One runs the audio side of things in Ableton and sends MIDI over Ethernet in real time to a second, dedicated video laptop running Resolume.
I have two Novation Launchpads to play musical parts and modify existing stems, one Novation Launch Control XL to handle some additional FX and general controls (including tempo), and a Korg SV-1 keyboard.
There is also a Xone K2 plugged into Resolume to control some video effects.

You do a great job of syncing your visuals to the music. Can you explain to us how you do this with Resolume?
All of the audio clips and parts in Ableton are grouped with matching MIDI clips that trigger videos and effects in Resolume.
All of the editing is done in real time; it's really useful, as it means I can edit the video show easily between shows by simply changing the MIDI score.
It also means that I can improvise, extend or shorten a section, with the knowledge that the video show will keep up.
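The pairing Madeon describes, where each audio clip is grouped with a MIDI clip that fires the matching visual, boils down to a note-to-clip lookup on the video side. Here is a minimal sketch of that idea; the note numbers and clip names are hypothetical, not Madeon's actual mapping:

```python
# Hypothetical note-to-clip mapping: in Resolume, each clip can be
# MIDI-mapped to a note, so a MIDI clip in Ableton that plays that
# note triggers the video in sync. All names and numbers are made up.

NOTE_TO_CLIP = {
    36: ("layer 1", "intro_logo"),
    37: ("layer 1", "drop_strobe"),
    48: ("layer 2", "chevron_sweep"),
}

def on_note_on(note, velocity):
    """Resolve an incoming MIDI note to the clip it should trigger."""
    if velocity == 0:      # note-on with velocity 0 acts as note-off
        return None
    return NOTE_TO_CLIP.get(note)

print(on_note_on(36, 100))   # ('layer 1', 'intro_logo')
print(on_note_on(99, 100))   # None: unmapped notes do nothing
```

Because the trigger lives in the MIDI score rather than in a fixed timeline, re-ordering or extending a song only means moving MIDI clips around, which is exactly the flexibility described above.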
We have noticed some LED strips being used in your setups. Do you control DMX fixtures with Resolume as well?
No, we haven't explored this yet but I'm looking forward to it! At the moment, all of the fixtures are triggered manually (no timecode, shoutout to Alex Cerio!).
We really like the pixel mapping of the buttons on your Launchpads. Tell us about this.
This is a simple MIDI score sent to the Launchpad to animate it. Novation kindly built custom Launchpads for me with unique firmware features enabling me to switch between this type of "animation" mode and a regular functioning mode seamlessly.
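Conceptually, "animating" a Launchpad this way means turning each frame of an 8x8 image into one MIDI note-on per pad. A sketch of that conversion follows; the pad-numbering formula is an assumption based on how some Launchpad models lay out their grid, so check your unit's programmer's reference:

```python
# Sketch of driving a Launchpad from a MIDI score: each animation
# frame is an 8x8 grid of colour values, flattened into one note-on
# message per pad. The pad layout (10*row + col + 11) is assumed,
# not confirmed for Madeon's custom units.

def frame_to_messages(frame):
    """frame: 8x8 nested list of colour/velocity values (0 = off)."""
    messages = []
    for row in range(8):
        for col in range(8):
            note = 10 * row + col + 11   # assumed pad-to-note layout
            messages.append((0x90, note, frame[row][col]))
    return messages

# Light the diagonal in one colour, everything else off.
frame = [[5 if r == c else 0 for c in range(8)] for r in range(8)]
msgs = frame_to_messages(frame)
print(len(msgs))   # 64 messages, one per pad
```

Streaming a sequence of such frames in time with the music is, in effect, the "MIDI score" Madeon mentions.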

The audio-visual connection is so important to you that sometimes the content looks like the Launchpad. It's gotta be intentional?
Absolutely! For the 2012 shows, there were sections of the show where the screen matched the Launchpad completely. There were also pixel-style grid animations that were completely real-time (with 64 layers in Resolume, one for each of the 64 pads), each pad corresponding to a different MIDI note. Very fun to program!

What thought process do you go through while creating visuals in your studio? What software do you use? How long does it take for you to prepare a render/ clip?
I work with a number of companies on making the content for the show but I make the content for about a third of the show.
I mostly use After Effects. I'm not very familiar with 3D software, so I make 3D animations in AE polygon by polygon, which is quite excruciating!
I like to keep making content on tour as new ideas occur to me, it's always a work in progress.

Give us a rundown of your studio equipment. What is an absolute must-have? What would you like to change/upgrade?
A great computer has to be the most indispensable gear.
Whenever I upgrade, my production style always seems to adapt to use more plugins until I reach the limit again; it's constant frustration!
A zero-latency, unlimited-resources dream computer would be the best imaginable upgrade.
Why did you pick Resolume over the other software available out there?
Resolume reminded me a lot of the audio software I was already familiar with.
It's intuitive and powerful, the effects are extremely usable and the latest updates in Arena 5 added mapping options that enabled my latest "diamond/chevron" LED setup.

With this, we come to the end of this interview.
Thanks so much for taking the time to do this, Hugo; we are all very grateful.
Our hunger for technology and the things you can do with it has been duly satiated. For now.
Time to go try all of this out now, eh? :)
You can check out Madeon's work here:
Photo Cred: Charles Edouard Dangelser
On Tour with Zedd: Gabe Damast
Working for Resolume, we're lucky enough to see some of the most amazing VJ talent in action. One such person is Gabe Damast, whose live show for Zedd blew me away. Gabe is a true VJ, and seldom do we see a show this tight and in sync with the music. Most amazing of all, it's pure VJ skill, no SMPTE or other tricks.
Take a look at the video for an idea of how Gabe rocks it, and then read on below for what he has to say about all this.
[fold][/fold]
How did you start VJ'ing?
My introduction to the world of VJing came through music. I grew up in the San Francisco Bay Area playing saxophone and piano in a couple of different jazz and funk bands, and as my love for electronic music developed I got into beat making, record producing, and sound engineering. I spent years learning basically every major production software and set up a small studio in my parents' basement, where I'd record myself and my musician friends goofing off; sometimes those sessions would turn into actual songs.
At the end of college, a friend of mine showed me Resolume, which was really the first time I was exposed to any visual performance software. I remember a lot of things clicked for me all at once: coming from a background using Ableton Live and FL Studio, Resolume felt like a very user-friendly video version of the DAWs I was familiar with. It wasn't long before I got ahold of a projector and started working on my first VJ sets in my tiny dark bedroom late at night. At first I would use found footage and VJ clips from Vimeo, but I eventually got into Cinema 4D and After Effects and started making my own video content, some of which is being used in the Zedd show currently!

Can you tell us a bit more about the Zedd tour? How does such a tour get organised when it comes to the stage design, the content, the operating of video, lights and laser? Who does what?
The True Colors tour, the latest arena tour we did with Zedd, all started more than two years ago with scribbles on a paper napkin. Many artists will hire a specific designer to conceptualize a stage production, but from the very beginning, the Zedd touring team has been extremely close-knit, and we always share roles and creative ideas freely. Zedd likes to be incredibly close to pretty much every aspect of his live show, so many of the crucial design decisions would happen in group discussions during a meal at an airport, or a van ride on the way to a music festival. Our lighting director Stevie Hernandez would create renderings of different ideas in Vectorworks pretty much in real time, which helped different ideas evolve and change.
Video content has always been the central focus of the Zedd show (and I'm NOT just saying that because I'm a VJ!!). For the True Colors Tour we wanted to give fans the most immersive experience possible, so the design we landed on was pretty much a giant 84-foot-wide LED wall, framed with all sorts of light fixtures, lasers, and special effects. We were able to use an LED wall that was fully 4K in width, a dream come true for any pixel pusher. It's been really exciting to watch the rapid development of LED technology in recent years. Bigger walls, higher resolutions; soon I'm sure we're going to be watching shows in retina quality! In the five months leading up to the start of the tour, we worked closely with Beeple (Mike Winkelman) to create the bulk of the new show visuals, rendered in stunning 4418x1080 resolution. Scott Pagano and I also contributed to the content push, which enabled me to curate an entirely new Zedd visual show from our previous tour.
Read more about Production Club's process here: http://www.productionclub.net/work/truecolors
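As a side note, the 84-foot width and 4418-pixel resolution quoted above imply a rough pixel pitch for the wall. A quick back-of-the-envelope check (real walls are built from fixed-pitch tiles, so this is only approximate):

```python
# Implied pixel pitch of the True Colors LED wall, derived from the
# figures in the interview: 84 feet wide, 4418 pixels across.
# Approximate only -- the actual tile pitch may differ slightly.

WIDTH_FT = 84
WIDTH_PX = 4418
FT_TO_MM = 304.8   # millimetres per foot

pitch_mm = WIDTH_FT * FT_TO_MM / WIDTH_PX
print(f"~{pitch_mm:.1f} mm pitch")   # ~5.8 mm
```

A pitch under 6 mm on a wall that wide is what makes the "fully 4K in width" claim plausible.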
The thing that stands out most to me is how video, laser and light play the accents in the music as a team, almost like a band. Is this something that you practice?
"Practicing" is always a tricky subject in the world of live production. The cost of renting enough gear to do a proper rehearsal is so high that it only really makes sense surrounding a tour where the costs are being spread over a few months. We were lucky to have two weeks of rehearsals before our tour rolled out, where we built the full size production in a sweaty, cavernous warehouse in Las Vegas, and Zedd, myself, Ken (our tour manager AND laser operator), and Stevie (lights) spent 12+ hours a day listening to music and creating unique looks for each song Zedd wanted to play during the tour. We brought in a mobile studio for Zedd to use, and each day would usually begin with us brainstorming visual ideas, and then taking breaks where me and Stevie could program the looks, and Zedd could work on musical edits and tweaks. It was hard to leave the rehearsal space at the end of the day because we were getting so much done!
It's all live right, no SMPTE? What would you say to people that are just starting out and are looking to get a tight sync like that?
No SMPTE! Every single video clip, strobe hit, and pyro shot is cued live. That's why our rehearsals took so long. I have a lot of respect for people who put together time-coded shows, and there are a lot of things you can do with that kind of sync that just aren't possible with live triggering, but for me, realtime performance is the only way I like to work. Music is what drives the visuals, and Zedd always DJs live, so there is a certain level of authenticity that is communicated by including some human error in the visual performance.
Whenever someone asks me how they should get into VJing, I always tell them to start by understanding music. You can definitely be a technical person and excel in the visual performance world, but in order to deliver an on-time show (with no timecode) you really have to learn music and rhythm. If you have good timing, and understand the basics of music theory, you can put on an amazing show even with the worst video content on the smallest screens.

What gear are you bringing with you? Is it easy to deal with airport customs?
For a normal fly-in show, I use a MacBook Pro Retina with three MIDI controllers: two Traktor Kontrol F1s and a Midi Fighter 3D. My whole kit fits nicely in a Pelican 1510 carry-on case, and if customs ever tries to hassle me I just say "it's for making computer music!!!" and they always leave me alone. Flying around with three laptops sometimes raises a few eyebrows, but I've never gotten seriously held up (yet! *knock on wood*)
How does Resolume fit into all this?
Resolume's simple layout makes it SUPER easy to organize our visual show. I always try to think about telling a story through our video content, and all of my Resolume compositions are arranged in a timeline that I navigate around depending on what songs are being played. Since everything is live, choosing a media server that allowed for quick re-organization was really important to me. Add in the first class customer service from the Resolume team, and it's a no brainer!

Where can we find you online?
You can find my work on the web at:
--- http://www.gabedamast.com ---
or other platforms like:
--- vimeo: https://vimeo.com/user5953855 ---
--- behance: https://www.behance.net/gabedamast ---
Touring Latin America - Viaje Tour Ricardo Arjona
The Viaje Tour by Ricardo Arjona has been called the most successful Latin tour of 2014-2015 by Pollstar and Billboard, with an attendance of more than 1.7 million people.
The man behind the visuals on this tour is Camilo Mejia, also known as VJ Bastard. Read more about what he has to say on the touring experience below.
[fold][/fold]
We have been touring for a year using Resolume with no issues. We've been to Argentina (8 cities, 25 shows, and we're going back for at least 5 more shows in November), Mexico (16 cities, 35 shows), Uruguay, Paraguay, Panama, Costa Rica (2 shows), Chile (7 shows), Puerto Rico (5 shows), the USA (13 cities, and we're going back next month to 8 cities), Ecuador (3 cities), Venezuela (5 cities), Guatemala (2 shows, 3 cities), El Salvador, Honduras (2 cities), Nicaragua, Colombia (5 cities), and we are waiting for the confirmation of the Europe tour.
I have been using Resolume since 2.4.1, and have a good 15 years of experience playing with video.
I was called for this tour in May of 2014 as visualist and video engineer. We rehearsed for a month in Mexico, during which we decided that the perfect system for our tour would be Resolume Arena.
First of all is the stability. I've played HUGE clips, from 2 GB to 70 or 80 GB, with no issues, so I know I will not have any problem with that. Because we don't run a backup signal, that's a serious point.
Second, we have a lot of stuff on our tour. Props (cars, trains, bikes, chairs…), backline, consoles, screens, and everything travels with us. As you may notice, the screens of the tour are huge, and they are the first thing that we prepare for the show. Build-up time is around 4 hours for the screens alone. With other systems it's easy for things to go missing, so portability is really important for us.
The setup for the show consists of 436 modules of 6 mm pitch LED screen. Resolume runs on a Mac Pro, 12-core 2.7 GHz, with 64 GB of DDR3 RAM, 1 TB of storage, and two AMD FirePro D700 GPUs with 6 GB of VRAM each.
The outputs are set up as one for the main screen, one for the “leeds” or totems at the sides, one for the backing behind the musicians (the moving door), and one for the tunnels. The full comp is 2115 x 1080 px, with no scalers, Folsoms or anything; I go straight to the processors over dual link and that's it.
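Splitting one wide composition across several physical outputs like this amounts to cropping named regions out of the comp, as in Resolume's output mapping. The 2115x1080 total comes from the interview, but the individual slice geometry below is entirely hypothetical, just to illustrate the idea:

```python
# Sketch of splitting one wide composition into per-screen regions.
# Only the 2115x1080 overall size is from the interview; each slice's
# position and width here is invented for illustration.

COMP_W, COMP_H = 2115, 1080

# name: (x, y, width, height) crop within the composition
slices = {
    "main screen": (0,    0, 1280, 1080),
    "totems":      (1280, 0,  275, 1080),
    "moving door": (1555, 0,  280, 1080),
    "tunnels":     (1835, 0,  280, 1080),
}

# sanity check: the slices tile the full composition width
assert sum(w for _, _, w, _ in slices.values()) == COMP_W
for name, (x, y, w, h) in slices.items():
    print(f"{name}: {w}x{h} at ({x},{y})")
```

Keeping every screen inside one composition is what lets a single clip, or a single effect, span the whole stage at once.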
I play the show with an Akai APC40 (older version) and some of the songs had SMPTE sync sent from Pro Tools. A Blackmagic capture card is used to capture an HD-SDI signal that is used in some cues to show the musicians and other live shots.
Read more about Camilo at his website: http://www.vjbastard.com or check out his work on Vimeo at https://vimeo.com/visualbastard
Dream On: Rocking It Out with Aerosmith
One of the great things about hosting Resolume workshops is you get to meet so many amazing people from all over the world.
One such amazing person is Danny Purdue.
After joining us for a session last year, he showed us the impressive work he was doing for Light Nightclub in Las Vegas. Soon after that, we got word he would be running Resolume for the visuals on the Aerosmith world tour. Live visuals are common in EDM and club scenes, but still a relatively new thing on rock shows, so of course we had to get the lowdown on this.
Here's the interview with Danny himself.[fold][/fold]

Who are you and how the heck did you land a job with Aerosmith? What other work have you done?
I’ve spent most of the last ten years touring and producing live video at concerts. I started out running camera and editing, then eventually got into directing and more of the overall show design. On the side I had an interest in VJing, and the two paths crossed when I directed the live video for Swedish House Mafia's “One Last Tour” in 2012. The camera shots needed to be stylized to complement the LED visuals, so I integrated Resolume with a broadcast switcher to mix effects with my line cut. Creatively it was really fun and being out there inspired me to pursue VJing more seriously.
When that tour was over I headed to Las Vegas for a residency at two clubs opening in Mandalay Bay. One was Daylight Beach Club, a straightforward VJ gig, and the other was Cirque du Soleil’s Light Nightclub, an ambitious project to combine the theatrics of Cirque with EDM and the nightclub environment. Light has a massive LED installation, wireless HD cameras, motion tracking equipment for generative visuals, and custom content built for the architecture of the room. It took a lot of talented people to bring all the pieces together and make Light successful.
In March I got a call about putting together a Resolume demo for the upcoming Aerosmith tour. It sounded like a cool opportunity, so I went over to LA and worked out a formula similar to the Swedish House rig with Arena running inline with broadcast video equipment. A few days later Steven Tyler came by, I demoed some looks, then we spent a couple hours trying out all kinds of effects using recordings from previous shows. He liked what he saw and asked me to join the tour.
Why was Resolume chosen over other media servers?
The choice to use Resolume came from Steven. I was pretty surprised he knew about it, and even more so when we first met that he had actually downloaded the demo, gone through footage on the website, and rattled off names of the animators he liked. The man does his homework. After seeing what VJs were doing with Resolume, Steven was excited to use the large palette of effects to create visual moments in Aerosmith's show.
We didn't have production rehearsals before the tour, so the immediate benefit of Resolume was rapidly developing ideas. Instead of a lengthy creative process with renderings and reviews, we knew a server running Arena could achieve whatever visual treatments we came up with on the road.
How is operating a rock show different from an EDM style event?
The main difference on this project was using visuals and effects to accent a show rather than drive it. What fans want to see at an Aerosmith concert isn’t graphics, it’s these rock icons playing their instruments and Steven Tyler’s wild stage presence. So it was a video-first approach where several elements had to be right for a visual moment to be effective.
After Steven and I developed a concept, I worked with our lighting designer Cosmo Wilson, video director Jeff Claire, and the camera crew to sort out the practical side of things like colors, spotlights, and filming angles. It was a much different environment than VJing at a rave where your content is the show and you’re more in control of the ambience.
How was your deck structured?
I ended up using a single deck mostly because it simplified my workflow with live inputs. Rather than having a lot of device source clips, I stuck with two and used layer routers to get signal wherever I needed it in the
composition. For one of the keying effects this routing allowed me to send an upper layer back down in the render order, which is a feature that’s hard to appreciate until you need it.
The deck was mostly effect clips and a small selection of content. Out of seven layers total, two were essentially fixed tools and the other five gave me plenty of room to stack up each look in a single column. One feature of Resolume I had rarely used for VJing but came to rely on for this project was Composition MIDI mapping. It saved a lot of time by not having to remap as I shuffled things around and tried different orders of effects.
What effects did you use to create the different looks for the songs?
Each look was a combination of multiple effects with the most significant parameters linked to dashboard controls. Here are a few of my favorites.
One of the first looks we created was for “Eat the Rich,” which started with an upbeat, tribal-esque breakdown of percussion and vocals. Steven walked downstage, faced the camera, and did some crazy dance moves with an effect we called "Shiva.” It was based on Delay RGB with some color strobing, and I had a couple knobs controlling distortion effects based on his moves that particular night.

The trick with this Edge Detection effect was keeping detail on Steven so it didn’t look gaudy, then I added the glare to give it a more elegant feel. This one really popped during “Dream On” when Steven stood on top of the piano with his hair and clothes blowing in the wind from the stage fans. The song opened with a clip of rolling clouds and those fit nicely with this look too.


The most challenging cue each show was called “Face Melt,” where a combination of keying effects made Steven's skin translucent to reveal twisting (Cosmic Strings!) graphics. Most of the time we used this at the beginning of "Sweet Emotion” under moody ultraviolet light, which is incredibly tough to key against. I had presets that got it close and dialed in the rest by hand to make sure content only played over him and didn't spill out over the rest of the image. This look was part of my original Resolume demo for Steven.


What were the technical setup and signal flow like?
As VJs we’re often confined to prosumer gear that fits in a carry-on case. Not here. My equipment and the main video system were provided by Screenworks NEP out of LA, giving me access to considerable resources at the beginning of the project. During the system build I was able to pull broadcast-grade hardware and converters off the shelf, test them out, and get exactly what I needed. Having a lab of sorts to experiment with integration was a real luxury. Once the spec was complete, our tech-saavy video engineer went through each piece of gear from camera to screen and shaved off every millisecond of latency possible.
My rig was located backstage with the rest of video world since I needed access to several HD sources and quick communication with the video crew. Resolume ran on a 2013 Mac Pro and captured two discrete video signals using a Blackmagic DeckLink Duo. The card took any combination of cameras and switcher feeds based on my selection with remote panels connected to the main system router. Resolume’s output passed through an Image Pro for genlock and HD-SDI conversion, then went back to the central router so we could place it upstream or downstream of anything else in the signal flow to the LED screen.
For control I used Livid Instruments’ CNTRL:R. It has both regular and “endless" knobs, small drum pads, plenty of RGB buttons, long-throw faders, and a layout that works well with how I operate. Everyone of course has their own cup of tea when it comes to MIDI controllers, but when Resolume is open I almost always have the CNTRL:R plugged in too.
The heart of the video system was a Ross Vision, a high end broadcast switcher with all kinds of mixing, keying, and automation abilities. We had one look driven by the Vision that was a grid of nine different 1080 HD sources with no drop in frame-rate or performance. For another song we had switcher buttons triggering sequences of comic book styled playback based on which band member and camera angle were being shown, then a layer of effected live video from Resolume was keyed into a panel to match the look. Top-notch hardware opens the door to some pretty imaginative workflows.


Where do you see Resolume fitting in to a crowded scene of media servers and VJ software?
What originally got me into Resolume is its simplicity and intuitiveness, which let me focus on being creative. This is particularly important when you’re working with a high profile artist whose time is very valuable. In a creative session you have to quickly translate ideas into a repeatable cues, so you need a fast and flexible workflow. There is always time to go back and get technical with optimizing effect stacks, layering, and MIDI mapping. What doesn’t work is a rock star tapping his foot waiting on you to configure something.
One of Resolume’s best advantages that seems to either be overlooked or taken for granted is that it’s cross-platform. It’s important to me that no matter what tools and hardware I want to use, I don’t have to worry about changing the main piece of software I use to operate a show. Especially with Syphon and now Spout, a lot just comes down to user preference and project needs.
Looking forward, I’m excited to see how Resolume tackles new trends like it did with projection mapping. Things like timeline-based control data and DMX output are readily available using third party apps, but the process could be simpler.
Resolume is still a new tool to the industry as a whole and has a lot of room to grow beyond the EDM scene. As more artists embrace interactive technologies, generative show elements, and live content operators, having a powerful creative hub that can adapt to different workflows is key. Before this project I wouldn’t have expected Aerosmith to be part of this conversation, and was pleasantly surprised that even rock legends are riding the new wave of visual art.
See more of Danny's work at http://programfeed.com/
One such amazing person is Danny Purdue.
After joining us for a session last year, he showed us the impressive work he was doing for Light Nightclub in Las Vegas. Soon after that, we got word he would be running Resolume for the visuals on the Aerosmith world tour. Live visuals are common in the EDM and club scenes, but still relatively new at rock shows, so of course we had to get the lowdown on this.
Here's the interview with Danny himself.[fold][/fold]
Who are you and how the heck did you land a job with Aerosmith? What other work have you done?
I’ve spent most of the last ten years touring and producing live video at concerts. I started out running camera and editing, then eventually got into directing and more of the overall show design. On the side I had an interest in VJing, and the two paths crossed when I directed the live video for Swedish House Mafia's “One Last Tour” in 2012. The camera shots needed to be stylized to complement the LED visuals, so I integrated Resolume with a broadcast switcher to mix effects with my line cut. Creatively it was really fun and being out there inspired me to pursue VJing more seriously.
When that tour was over I headed to Las Vegas for a residency at two clubs opening in Mandalay Bay. One was Daylight Beach Club, a straightforward VJ gig, and the other was Cirque du Soleil’s Light Nightclub, an ambitious project to combine the theatrics of Cirque with EDM and the nightclub environment. Light has a massive LED installation, wireless HD cameras, motion tracking equipment for generative visuals, and custom content built for the architecture of the room. It took a lot of talented people to bring all the pieces together and make Light successful.
In March I got a call about putting together a Resolume demo for the upcoming Aerosmith tour. It sounded like a cool opportunity, so I went over to LA and worked out a formula similar to the Swedish House rig with Arena running inline with broadcast video equipment. A few days later Steven Tyler came by, I demoed some looks, then we spent a couple hours trying out all kinds of effects using recordings from previous shows. He liked what he saw and asked me to join the tour.
Why was Resolume chosen over other media servers?
The choice to use Resolume came from Steven. I was pretty surprised he knew about it, and even more so when we first met that he had actually downloaded the demo, gone through footage on the website, and rattled off names of the animators he liked. The man does his homework. After seeing what VJs were doing with Resolume, Steven was excited to use the large palette of effects to create visual moments in Aerosmith's show.
We didn’t have production rehearsals before the tour, so the immediate benefit of Resolume was rapidly developing ideas. Instead of a lengthy creative process of renderings and reviews, we knew a server running Arena could achieve whatever visual treatments we came up with on the road.
How is operating a rock show different from an EDM style event?
The main difference on this project was using visuals and effects to accent a show rather than drive it. What fans want to see at an Aerosmith concert isn’t graphics, it’s these rock icons playing their instruments and Steven Tyler’s wild stage presence. So it was a video-first approach where several elements had to be right for a visual moment to be effective.
After Steven and I developed a concept, I worked with our lighting designer Cosmo Wilson, video director Jeff Claire, and the camera crew to sort out the practical side of things like colors, spotlights, and filming angles. It was a much different environment than VJing at a rave where your content is the show and you’re more in control of the ambience.
How was your deck structured?
I ended up using a single deck, mostly because it simplified my workflow with live inputs. Rather than having a lot of device source clips, I stuck with two and used layer routers to get signal wherever I needed it in the composition. For one of the keying effects, this routing allowed me to send an upper layer back down in the render order, which is a feature that’s hard to appreciate until you need it.
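Conceptually, a layer router like this amounts to a two-pass composite: resolve every layer's frame first (a router may point at a layer above itself), then mix bottom-to-top. A minimal Python sketch of that idea, not Resolume's actual implementation:

```python
def flatten(layers):
    """Composite a stack bottom-to-top. Each layer is either
    ("source", frame) or ("router", i), which re-uses layer i's
    frame lower in the render order. Frames are just strings here."""
    # Pass 1: resolve every layer's frame; routers may point upward,
    # so sources are resolved before any router is followed.
    frames = {}
    for i, (kind, payload) in enumerate(layers):
        if kind == "source":
            frames[i] = payload
    for i, (kind, payload) in enumerate(layers):
        if kind == "router":
            frames[i] = frames[payload]  # pull the routed layer's output
    # Pass 2: return frames in bottom-to-top render order.
    return [frames[i] for i in range(len(layers))]

stack = [
    ("router", 2),   # layer 0: the camera feed, routed back down the stack
    ("source", "fx"),
    ("source", "camera"),
]
print(flatten(stack))  # ['camera', 'fx', 'camera']
```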
The deck was mostly effect clips and a small selection of content. Out of seven layers total, two were essentially fixed tools and the other five gave me plenty of room to stack up each look in a single column. One feature of Resolume I had rarely used for VJing but came to rely on for this project was Composition MIDI mapping. It saved a lot of time by not having to remap as I shuffled things around and tried different orders of effects.
What effects did you use to create the different looks for the songs?
Each look was a combination of multiple effects with the most significant parameters linked to dashboard controls. Here are a few of my favorites.
One of the first looks we created was for “Eat the Rich,” which started with an upbeat, tribal-esque breakdown of percussion and vocals. Steven walked downstage, faced the camera, and did some crazy dance moves with an effect we called “Shiva.” It was based on Delay RGB with some color strobing, and I had a couple knobs controlling distortion effects based on his moves that particular night.
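Delay RGB-style trails can be approximated by pulling each color channel from a different point in a short frame history, so movement leaves colored ghosts behind the subject. A hedged numpy sketch of the general technique (the delay values and buffer depth here are made up, not the show's settings):

```python
import numpy as np
from collections import deque

class DelayRGB:
    """Toy per-channel frame delay: each color channel is sampled
    from a different point in a short history of frames."""
    def __init__(self, delays=(0, 3, 6), depth=8):
        self.delays = delays             # per-channel delay in frames (R, G, B)
        self.history = deque(maxlen=depth)

    def process(self, frame):
        """frame: HxWx3 uint8 array. Returns a frame whose channels
        come from progressively older frames in the buffer."""
        self.history.appendleft(frame)   # index 0 is always the newest frame
        out = np.empty_like(frame)
        for c, d in enumerate(self.delays):
            # Clamp the delay while the buffer is still filling up.
            src = self.history[min(d, len(self.history) - 1)]
            out[..., c] = src[..., c]
        return out
```

Feeding it a sequence of frames, the red channel tracks the live image while green and blue lag behind it, which is what produces the trailing effect.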
The trick with the Edge Detection effect was keeping detail on Steven so it didn’t look gaudy; then I added the glare to give it a more elegant feel. This one really popped during “Dream On” when Steven stood on top of the piano with his hair and clothes blowing in the wind from the stage fans. The song opened with a clip of rolling clouds, and those fit nicely with this look too.

The most challenging cue each show was called “Face Melt,” where a combination of keying effects made Steven's skin translucent to reveal twisting (Cosmic Strings!) graphics. Most of the time we used this at the beginning of "Sweet Emotion” under moody ultraviolet light, which is incredibly tough to key against. I had presets that got it close and dialed in the rest by hand to make sure content only played over him and didn't spill out over the rest of the image. This look was part of my original Resolume demo for Steven.
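At the core of a key like this is a mask derived from the camera image that decides where the graphics show through. A rough luma-key sketch of that idea (the real cue combined several Resolume keying effects plus hand-dialed presets; the thresholds and blend amount here are arbitrary placeholders):

```python
import numpy as np

def luma_key(camera, graphics, lo=0.35, hi=0.75, mix=0.8):
    """camera/graphics: HxWx3 float arrays in [0, 1].
    Pixels whose luma falls inside [lo, hi] blend toward the graphics,
    so content appears only 'on' the keyed subject."""
    # Rec. 709 luma weights for the brightness estimate.
    luma = camera @ np.array([0.2126, 0.7152, 0.0722])
    mask = ((luma >= lo) & (luma <= hi)).astype(float)[..., None]
    return camera * (1 - mask * mix) + graphics * (mask * mix)
```

Pixels outside the luma window pass through untouched, which is the "don't spill out over the rest of the image" part; under ultraviolet light that window gets very narrow, which is why it was so tough to dial in.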

What were the technical setup and signal flow like?
As VJs we’re often confined to prosumer gear that fits in a carry-on case. Not here. My equipment and the main video system were provided by Screenworks NEP out of LA, giving me access to considerable resources at the beginning of the project. During the system build I was able to pull broadcast-grade hardware and converters off the shelf, test them out, and get exactly what I needed. Having a lab of sorts to experiment with integration was a real luxury. Once the spec was complete, our tech-savvy video engineer went through each piece of gear from camera to screen and shaved off every millisecond of latency possible.
My rig was located backstage with the rest of video world since I needed access to several HD sources and quick communication with the video crew. Resolume ran on a 2013 Mac Pro and captured two discrete video signals using a Blackmagic DeckLink Duo. The card took any combination of cameras and switcher feeds based on my selection with remote panels connected to the main system router. Resolume’s output passed through an Image Pro for genlock and HD-SDI conversion, then went back to the central router so we could place it upstream or downstream of anything else in the signal flow to the LED screen.
For control I used Livid Instruments’ CNTRL:R. It has both regular and “endless” knobs, small drum pads, plenty of RGB buttons, long-throw faders, and a layout that works well with how I operate. Everyone has their own cup of tea when it comes to MIDI controllers, of course, but when Resolume is open I almost always have the CNTRL:R plugged in too.
The heart of the video system was a Ross Vision, a high end broadcast switcher with all kinds of mixing, keying, and automation abilities. We had one look driven by the Vision that was a grid of nine different 1080 HD sources with no drop in frame-rate or performance. For another song we had switcher buttons triggering sequences of comic book styled playback based on which band member and camera angle were being shown, then a layer of effected live video from Resolume was keyed into a panel to match the look. Top-notch hardware opens the door to some pretty imaginative workflows.

Where do you see Resolume fitting into a crowded scene of media servers and VJ software?
What originally got me into Resolume is its simplicity and intuitiveness, which let me focus on being creative. This is particularly important when you’re working with a high-profile artist whose time is very valuable. In a creative session you have to quickly translate ideas into repeatable cues, so you need a fast and flexible workflow. There is always time to go back and get technical with optimizing effect stacks, layering, and MIDI mapping. What doesn’t work is a rock star tapping his foot waiting on you to configure something.
One of Resolume’s best advantages that seems to either be overlooked or taken for granted is that it’s cross-platform. It’s important to me that no matter what tools and hardware I want to use, I don’t have to worry about changing the main piece of software I use to operate a show. Especially with Syphon and now Spout, a lot just comes down to user preference and project needs.
Looking forward, I’m excited to see how Resolume tackles new trends like it did with projection mapping. Things like timeline-based control data and DMX output are readily available using third party apps, but the process could be simpler.
Resolume is still a new tool to the industry as a whole and has a lot of room to grow beyond the EDM scene. As more artists embrace interactive technologies, generative show elements, and live content operators, having a powerful creative hub that can adapt to different workflows is key. Before this project I wouldn’t have expected Aerosmith to be part of this conversation, and was pleasantly surprised that even rock legends are riding the new wave of visual art.
See more of Danny's work at http://programfeed.com/
Artist profile: Peruggia
A while back we posted a video of some banging AV drumming wickedness, rocking an MPC and a projector. We loved how it shows the use of Resolume as a real visual instrument, which needs practice and time to learn how to play. Also it rocked our socks off.
[fold][/fold]
Intrigued by both the technique and the Botch sample at the end, we asked Gidon Schocken, the man behind Peruggia, to say a few words about himself, his projects and stuff in general.

Q: Say a few words about yourself and your projects.
A: When I was 16 I started up a rock/metal band with a few friends. We did quite well, but had to stop performing when most of us were drafted into the Israeli army. After being discharged I studied in a music academy and worked as a VJ. I was involved in numerous projects, ranging from performing as a solo folk artist and creating compositions for theater to playing guitar in a noise/rock band. Throughout all these activities I always enjoyed researching and experimenting with the "visual" side of things. For example, in the noise/rock band, I strapped a webcam to my guitar which filmed the audience. The images were augmented using Resolume's audio input, and the output was projected on a large screen behind the band.
At some point I began working solo. A sampler such as the MPC combined with Resolume seemed to be the perfect combination. The project I'm currently working on is called "Peruggia". It is named after Vincenzo Peruggia, the famous art thief who stole the Mona Lisa in 1911. All the samples are played live and feature some of my favorite artists such as Simian Mobile Disco, NIN, Botch, Bach, Nosaj Thing, The Gaslamp Killer, The Prodigy, Helms Alee, Korn, James Blake, Bjork, Trifonic, John Frusciante, Verdi, Deftones, Dice Raw, Sage Francis, Lorn, Service Lab, Yud Kei, The Dillinger Escape Plan, Noisia, Akimbo, Broken Note and many more. The project is still evolving and I'm presently working on my first performance.
Q: Why did you choose Resolume for the visuals and MPC2500 for the music?
A: In software and in hardware I try to look not for what yields the best result quality-wise, but for what gets me to the result I'm imagining faster and more easily. I could have used a Maschine and Modul8 and probably gotten similar results, but it would have taken me much longer. Resolume's UI has always been clear and straightforward to me; I can imagine whatever I want and get there within just a few steps, and even if I don't quite get there, it still comes out pretty cool.
Q: What is your technical setup?
A: The MPC2500 is connected to Resolume via MIDI through a MOTU UltraLite. I also have an LPD8 connected via USB that I use to switch between compositions in Resolume. The MPC has four different pad banks, each holding up to 16 different samples. Each time I move forward a bank, I tap the LPD8 to load the next composition. I've encountered several problems during the process. For example, the performance I've created is structured as a linear sequence of samples that I remember by heart. This forces me to move forward through the MPC banks and, in parallel, through Resolume's compositions. Now, if I miss a sample on the MPC which is supposed to trigger a mute of a feedback in Resolume, that feedback will keep running until the end of the composition. So, practice, practice, practice...
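The bookkeeping problem described above, keeping the MPC pad bank and the Resolume composition in lock-step, can be modeled in a few lines. This is a toy state machine, not an actual MIDI implementation:

```python
class PeruggiaRig:
    """Toy model: MPC pad banks and Resolume compositions must
    advance together; one missed tap and they drift out of step
    for the rest of the set."""
    def __init__(self):
        self.bank = 0          # current MPC pad bank
        self.composition = 0   # current Resolume composition

    def advance_bank(self):          # moving forward a bank on the MPC
        self.bank += 1

    def advance_composition(self):   # tapping the LPD8 pad mapped in Resolume
        self.composition += 1

    def in_sync(self):
        return self.bank == self.composition

rig = PeruggiaRig()
rig.advance_bank()
rig.advance_composition()
print(rig.in_sync())   # True: bank and composition moved together
rig.advance_bank()     # forgot the LPD8 tap this time...
print(rig.in_sync())   # False: the visuals now lag the music
```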
Q: What process and creative techniques did you use?
A: I've always been fascinated with feedbacks and the endless possibilities they offer, and in Resolume you can play with feedbacks for days on end. One of the goals that I set for myself in my project was creating all the visuals using Resolume's internal sources only (feedbacks, solids, effects etc. -- I reckoned that if I "borrow" the music from other artists at least I'll create the visuals myself ...). I developed a method that I call the "back in time technique" which I find both useful and simple.
Q: Can you describe this technique?
A: I go through the following routine:
1. Take a feedback source and put it in a layer.
2. Take a solid color and put it above the feedback.
3. Change the trigger style of the solid to piano.
4. Map the solid to a keyboard key or MIDI controller.
5. Change the scale of the feedback to 99%.
The more you reduce the feedback scale, the faster the solid will go "back in time".
Check out more at https://www.facebook.com/PeruggiaLive
Kid Meets Cougar meets Technology
The following info arrived on our virtual doorstep a while back, and our hearts simply melted:
Cyber-Hermits, Guilt, & How We Built Our New Live Show:
Musicians, programmers, mappers, visual artists, and all of you other wonderful creative people of the internet, I have a confession to make. Over the last few years I have been silently climbing in your forums and snatching your knowledge up, trying to collect and hoard all of the pieces we needed to make our new live projection mapped show possible.
[fold][/fold]
Fortunately, we were able to make it happen with the power of the almighty internet, the passionate communities of creative/supportive people that it connects, and your free web-wisdom.
Here’s the thing that has been bugging me though: I visit a bunch of sites on a daily basis (I owe a lot especially to the forums/blogs at Resolume, Create Digital Music & Motion, Kineme, and 1024 Architecture) but I hardly ever leave a comment, ask a question, or share anything in return. Shame on me.
In an attempt to shed some of the guilt that comes with being a thieving, greedy, info-hoarding, good-for-nothin’ cyber-hermit, I decided to give a little back and put together a geek-tastically detailed overview of how we ended up connecting and operating our new show.
Here’s a picture of our video pipeline to give you a taste:
And you can read the original Google Doc in its full glory here.
Though we love playing with technology, the most important goal of this project from the beginning was a show that visually enhances the music, creating a seamless, intensified emotional experience for our audience without feeling forced (i.e., projection-mapped visuals for the sake of projection-mapped visuals). We wanted to make something really unique, impactful, and interesting to experience. I think we've done a pretty good job so far, but it'll get better as we continue to learn and create.
Okay, so now that I’ve fully confessed and made a small payment on my debt… I beg you, please have mercy on me and don’t take my internet connection away. I promise to be a contributing member of the interweb from here on out!
Forever yours,
Brett
Check out Brett and Courtney's website for soothing tunes and great videos. And a cat. And a robot. http://www.kidmeetscougar.com/