Visual Pixation with Rick Jacobs

Our quest for excellence in the visual space has now brought us to Rick Jacobs of RJB Visuals.
me.jpg
Currently touring with Nicky Romero, and the man behind the operation and visual design of his entire show, Rick does work that is epic in its proportions. [fold][/fold]

Chile.png
What do we love about him?
He makes great, heavily detailed content which is then displayed perfectly in sync with what Nicky is doing on stage. I personally love the magnitude and depth with which he portrays infinity, space and its inexplicable wonders.

core.png
glowroom.png
ocean.png
We reached out to Rick to talk to us and shed some light on the great work he is doing.

What is touring with Nicky like? When did this great journey begin & how would you say you have grown with it?

It started 4 years ago; my first show with Nicky was Ultra 2013, on the Main Stage. I was so nervous, with everybody at home watching - my friends, my family. Before that I had only ever VJ'd at clubs with a single output. So, for Ultra, I brought 2 laptops to handle multiple outputs - being the newbie I was back then ;)

Nicky and the team were impressed with that first show and offered me the chance to tour with them. I chose to finish school first, because there were just 3 months left. I graduated as a game designer and developer, and missed my graduation ceremony as I went straight to Vegas to tour with Nicky.


electriclove_salzburg1.png
When I finished the tour I started RJB Visuals and teamed up with my brother Bob, who was studying game art. Our teamwork was put to the test immediately: we needed to conceptualize and create a 5-minute intro visual in 3 days!
Nowadays we plan a month for an intro - it has become kind of our signature.


Here are links to some intros: Novell & Omnia

It's been a really awesome journey so far; Nicky and the team trust Bob and me with the whole visualization of the show. When I started, they more or less just had the Guy Fawkes mask, so I had the freedom to design and develop a whole new visual style for his shows, which was really great!

Here is a sneak peek into the latest intro for Nicky:

new_intro_conceptart2.jpg
new_intro_conceptart1.jpg
You and the LD do a great job live. How much of a role does SMPTE play in this & how much is freestyle?

For the first 2 years that I toured with Nicky, we didn't have an LD. After that, Koen van Elderen joined the team and I couldn't have been happier! The guy is great, he programs really fast and we come up with new things while we're doing the show. We just understand each other immediately.



The whole show is freestyle, we never use SMPTE. It keeps us focused. Also, I don't link all visuals to songs. One day a song has these visuals, the next day you'll see something different - it depends on what colors Koen and I yell at each other.

s2o_2.png
For all lyrics I use cue points, so as soon as I hear Nicky mixing in a track with vocals I'll ready up the visual and start cueing it.

From on-point strobes, to perfect transitions, to super color changes - there's gotta be a lot of concentration, communication & practice involved between you and Koen.

Like I said, Koen and I are just really on the same page. We make up new stuff during the show and remember it for the next show. We normally don't receive a playlist or a lot of info on his set, so we often get some nice surprises and have to come up with something along the way.

It usually goes something like this: "If you take the TISKS, I'll take the BOEMS." "Sure thing." Or whenever there are really powerful accents in a song, we look at each other and ask "do you want to take these or shall I take them?" Haha!

It’s fun to change stuff around now and then.


s2o_1.png
DontletDaddyKnow.jpg
Also, at each outro of a song we turn to each other, one of us will say the next color, and we change it at the same time, when the song changes over. Or, if it's a familiar song with its own visuals, we both already know what to do, or I make hand gestures of the visual that is coming up next so he will know the color. Sometimes I will be stone-faced, miming a sea with my hands, and he will know which visual is coming up.

What are your favorite effects & features in Resolume that you would encourage VJs to incorporate into their show?

Mostly my effects are small flashy clips linked to buttons on my MIDI panel, but my knobs are linked to various screenshake/twitch/strobe effects. Mainly all sorts of small effects to highlight certain melodies or accentuate the bass.

What brief/thought process do you follow while designing content for the show? We see a whole range, from nature to space to waterfalls to abstract.

We try to create contrast between drops and breaks by changing the color scheme, style and pace, while at the same time trying to keep the transitions as fluid as possible. Break visuals for Nicky Romero's show are often desaturated/black-and-white, realistic-looking visuals, while the drop visuals are full of flashing neon colors and abstract shapes loosely based on the realistic-styled visual. Putting these completely different styles together in one song works as a great crowd booster.

forrest_visual_gif copy.gif
The risk of mixing these completely different styles after each other is that it could lead to too harsh a transition. We're not big fans of fade-ins, so several visuals have an actual intro clip that autopilots to the next clip, which is a loop. They're sort of a 'special fade-in': the clip starts off black and the visual's scene unfolds in a smooth transition into the loop.

Here are some Intro Clips:

intro_raindrops_gif.gif
intro_machine_gif.gif
intro_earth_gif.gif
Talk to us about your go-to software & hardware (both for content creation & operation).

Most of our content is created in Cinema 4D with the Octane renderer. For all the intros we use Autodesk Maya; since we have a history in game design and development, we were used to working in Maya and 3ds Max at school. Maya just has a few more options to get that exact look you want for the intro.

When we started creating short visual loops, we soon realized Cinema 4D is much more straightforward for creating visuals. For post we use After Effects. And, of course, for VJing: Resolume!

As for hardware, I’m using an MSI GT73VR 6RF Titan Pro and the Akai APC40MK2.


Tell us about your studio. What gear is an absolute must-have, and what would you like to change/upgrade?

My studio isn't that great actually, haha. We have a small office in Limburg at my parents' place. One of our employees is also from Limburg, so half of the week we're working in Limburg and the other half in Utrecht.

Limburg_office.jpg
We have a small setup in my apartment in Utrecht; my brother lives with me, so it's easy to work from home. In the near future we're planning to get an office, as we're expanding and looking for new people to work with.

As for an upgrade, I really need more render power, haha - with this 4K visual content, rendering is a nightmare.


heartbeat.png
novell_2.png
Any words of advice for budding visual artists out there?

Less is more! Don't layer 6 completely different visuals on top of each other and mix them every beat - it can become chaos really easily. Also, black is your friend: leave enough black space to make your visual really pop out.

Is there anything else you would like to talk about? We would love to hear it.

Our most recent development is that we're starting a collaboration called Visual Lab with Hatim. Hatim was the reason I started VJ'ing for Nicky, and over the years we built a great bond, as he is Nicky Romero's tour/production manager.

As probably all of us here know, talking and arranging gigs/assignments is the least fun part of our job, so it seems like a great idea to have someone do that for us. It also seems like the next big step for our company, and it will lead to us hiring more talented VJs and content creators.


Also, we've recently been working on creating a more generic visual pack we would like to sell on Resolume.

It's interesting creating visuals that are not for your own use, because normally we create pretty specific visuals for certain parts of the show. Now we need to forget about that and create visuals that can be used by anyone, in any situation. It's good practice. I think we have come up with a pretty cool mix of modern-styled visuals and classic kaleidoscopic visuals for your enjoyment. :)


two.zero_7.png
two.zero_3.png
And, on a last note, we are working on a VR horror escape-room game in between all the visual work. Got to keep those university skills going! :D

If you’re interested we’ll post something about it on our social media in the future.


Horror.jpg
*Shudders* Oooh this gave us chills.

Thanks for doing this interview, Rick. We all look forward to those visual packs from you, and wish you so much success with Visual Lab.

With skills like that, you’re miles ahead already :)

Check out some more of Rick’s work here

Credits:
Rick and Bob, RJB Visuals + Visual Lab
Follow them on: Instagram & their website

Make Some Noisia

Dutch electronic powerhouse Noisia has been rocking the planet with their latest album 'Outer Edges'.

noisia.png
Photo by Diana Gheorghiu

It was a wait, but one that was truly worth it. 'Outer Edges' is essentially a concept album, and they pushed the boundaries on this one by backing it up with a 'concept tour'.

An audio-visual phenomenon with riveting content, perfect sync & melt-yo-face energy, the Outer Edges show is one that could not escape our dissection.
[fold][/fold]
We visited Rampage, one of the biggest Drum & Bass gigs in the world, and caught up with Roy Gerritsen (Boompje Studio) & Manuel Rodrigues (DeepRED.tv), on video and lighting duty respectively, to talk about the levels of excellence the Noisia crew has achieved with this concept show.

Here is a look at Diplodocus, a favorite amongst bass heads:

Video by Diana Gheorghiu

Thanks for doing this guys! Much appreciated.

What exactly is a concept show and how is preparation for it different from other shows?

When Noisia approached us they explained they wanted to combine the release of their next album “Outer Edges” with a synchronized audio visual performance. It had been 6 years since Noisia released a full album so you can imagine it was a big thing.

Together, we came up with a plan to lay the foundation for upcoming shows. We wanted to focus on developing a workflow and pipeline to create one balanced and synchronized experience.
Normally, all the different elements within a show (audio, light, visual, performance) focus on their own area. There is one general theme or concept and everything then comes together in the end - during rehearsals.

We really wanted to create a show where we could focus on the total picture, and develop a workflow where we could keep refining the show and push the concept through all the different elements quickly and effectively, without overlooking the details.


What was the main goal you set out to achieve as you planned the Outer Edges show?
How long did it take to come together, from start to end?


We wanted to create a show where everything is 100% synchronized and highly adaptable, with one main control computer connecting to all elements within the show in a perfectly synchronized way. This setup gave us the ability to find a perfect balance and narrative between sound, performance, lights and visuals. Besides that, we wanted a modular and highly flexible show, making it easy and quick to adapt or add new content.

We started the project in March 2016 and our premiere was at the Let It Roll festival in Prague (July 2016).
The show is designed in such a way that it has an 'open end'. We record every show and, because of the open infrastructure, we are constantly refining it on all fronts.


noisia_crew_photo_letitroll.JPG
What are the different gadgets and software you use to achieve that perfect sync between audio/video & lighting?

Roy: Back in the day, my graduation project at the HKU was a VJ mix tool where I used the concept of 'cue-based' triggering. Instead of the widely used timecode synchronization, where you premix all the content (the lights and the video tracks), we send a MIDI trigger for every beat and sound effect. This saves a lot of time in the content production process.

The edit and mixdown of the visuals are basically done live on stage instead of in After Effects. This means we don't have to render out 8-minute video clips and can focus on just a couple of key visual loops per track. (Every track consists of about 5 clips, which get triggered directly from Ableton Live using a custom MIDI track.) Inside Ableton we group a couple of extra designated clips so they all get triggered at the same time.

For every audio clip we sequence separate MIDI clips for the video and lighting, which get played perfectly in sync with the audio. These MIDI tracks then get sent to the VJ laptop and Manuel's lighting desk.


We understand you trigger clips in Resolume from Ableton Live using the extremely handy Max for Live patches?

Yes, we sequence a separate MIDI track for each audio track. We divided the audio track into 5 different elements (beats, snares, melody, FX, etc.), which correspond to 5 video layers in Resolume.

When a note gets triggered, a Max for Live patch translates it to an OSC message and sends it off to the VJ laptop. The OSC messages get caught by a small tool we built in Derivative's TouchDesigner. In essence, this tool translates the incoming messages into OSC messages which Resolume understands - basically operating Resolume automatically with the triggers received from Ableton.

This way of triggering video clips grew out of an experiment by Martin Boverhof and Sander Haakman during a performance at an art festival in Germany a couple of years ago. Only two variables are used - triggering video files and adjusting the opacity of a clip. We were amazed at how powerful these two variables are.
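For readers who want to tinker with the same idea, here is a minimal sketch in Python of what such a bridge does, using the python-osc library instead of TouchDesigner. This is not the Noisia tool: the port numbers, the element-to-layer mapping and the /trigger address scheme are invented for illustration, and the exact OSC addresses Resolume listens to differ between Arena versions, so check the OSC map of your version first.

```python
# pip install python-osc
# Minimal sketch of a trigger bridge: OSC triggers from a Max for Live
# patch come in, Resolume-flavored OSC messages go out. All names, ports
# and addresses are illustrative, not taken from the Noisia setup.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical mapping: one musical element per Resolume layer.
ELEMENT_TO_LAYER = {"beats": 1, "snares": 2, "melody": 3, "fx": 4, "vocals": 5}

# Resolume listens for OSC input on a port set in its preferences.
resolume = SimpleUDPClient("127.0.0.1", 7000)

def on_trigger(address, clip, opacity):
    # Expects messages like: /trigger/beats [clip_number, opacity]
    element = address.rsplit("/", 1)[-1]
    layer = ELEMENT_TO_LAYER[element]
    # Arena 6+ style addresses; older versions use e.g. /layerN/clipM/connect.
    resolume.send_message(f"/composition/layers/{layer}/clips/{int(clip)}/connect", 1)
    resolume.send_message(f"/composition/layers/{layer}/video/opacity", float(opacity))

dispatcher = Dispatcher()
for element in ELEMENT_TO_LAYER:
    dispatcher.map(f"/trigger/{element}", on_trigger)

# Block and translate triggers coming from Ableton / Max for Live.
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```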


screenshot_touchdesigner.jpg

screenshot_noisia_resolumeset.jpg

screenshot_noisia_sync_tool.jpg
Regarding lighting, we understand the newer ChamSys boards have built-in support for MIDI/timecode. What desk do you use?

Manuel: To drive the lighting in the Noisia Outer Edges show I use a ChamSys lighting desk. It is a very open environment: you can send MIDI, MIDI Show Control, OSC, timecode (LTC & MTC), UDP, serial data and of course DMX & Art-Net to the desk.

ChamSys support is extremely good and the software is 100% free. Compared to other lighting desk manufacturers, the hardware is much cheaper.

A lighting desk is still much more expensive than a MIDI controller.
They might look similar, as both have faders and buttons, but the difference is that a lighting desk has a brain.
You can store, recall and sequence states, something which is invaluable for a lighting designer and is now happening in video-land more and more.

I have been researching ways to bridge the gap between Ableton Live and ChamSys for 8 years.
This research led me to M2Q, an acronym for Music-to-Cue, which acts as a bridge between Ableton Live and ChamSys. M2Q is a hardware solution designed together with Lorenzo Fattori, an Italian lighting designer and engineer. It listens to MIDI messages sent from Ableton Live and converts them to ChamSys Remote Control messages, providing cue triggering and playback intensity control.
M2Q is a reliable, easy and fast lighting sync solution, and it enables non-linear lighting sync.

When using timecode, it is impossible to loop within a song, do the chorus one more time or alter the playback speed on the fly; one is basically limited to pressing play.
Because our lighting sync system is MIDI-based, the artist on stage has exactly the same freedom Ableton's audio playback offers.
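M2Q itself is a hardware box, but the general shape of such a bridge - MIDI in, desk remote messages out - can be sketched in a few lines of Python with the mido library. To be clear, this is not M2Q's implementation: the UDP port and command strings below are assumptions modeled on the short, letter-terminated remote commands described in the MagicQ documentation, and the note-to-playback mapping is invented, so verify everything against your desk's manual.

```python
# pip install mido python-rtmidi
# M2Q-style bridge sketch: MIDI notes from Ableton in, ChamSys remote
# control strings out over UDP. NOT the real M2Q firmware - the port,
# command syntax and note mapping are assumptions to double-check
# against the MagicQ manual.
import socket
import mido

DESK = ("192.168.1.10", 6553)  # assumed MagicQ remote control UDP port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_cmd(cmd: str):
    # MagicQ-style remote commands are short, letter-terminated ASCII
    # strings; confirm the exact syntax for your desk.
    sock.sendto(cmd.encode("ascii"), DESK)

with mido.open_input("From Ableton") as port:  # loopback/virtual MIDI port
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            playback = msg.note - 35          # invented: note 36 -> playback 1
            send_cmd(f"{playback}A")          # 'A' = activate playback (assumed)
        elif msg.type == "control_change":
            level = round(msg.value / 127 * 100)
            send_cmd(f"{msg.control},{level}L")  # 'L' = set level % (assumed)
```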


Do you link it to Resolume?

ChamSys has a personality file (head file) for Resolume, which enables driving Resolume as a media server from the lighting desk. I must confess that I have been considering switching to Resolume for some time now, as it is a very cost-effective and stable solution compared to other media server platforms.


Video by Diana Gheorghiu

Tell us about the trio’s super cool headgear. They change color, strobe, are dimmable. How?!

The LED suits were custom designed and built by Élodie Laurent; they are basically 3 generic LED parcans and have similar functionality.
They are connected to the lighting desk just like the rest of the lighting rig and are driven using the same system.
Fun fact: these are the only three lights we bring with us, so the Outer Edges show is extremely tourable.


noisia_led_suits_leds.JPG

noisia_led_suits_final.jpg
The Noisia content is great in its quirkiness. Sometimes we see regular video clips, sometimes distorted human faces, sometimes exploding planets, mechanical animals - what's the thought process behind the content you create? Is it track-specific?

The main concept behind this show is that every track has its own little world in this Outer Edges universe. Every track stands on its own and has a different focus in style and narrative.
Nik (one third of Noisia and the show's art director) assembled a team of 10 international motion graphics artists, and together we took on the visualization of the initial 34 audio tracks. Cover artwork, video clips and general themes from the audio tracks formed the basis for most of them.


Screen Shot 2017-04-06 at 9.09.26 PM.png
Photo by Diana Gheorghiu

Screen Shot 2017-04-06 at 9.08.54 PM.png
Photo by Diana Gheorghiu

The lighting & video sync is so on point, we can’t stop talking about it. It must have taken hours of studio time & programming?

That was the whole idea behind the setup.
Instead of synchronizing everything in the light and video tracks, we separated the synchronization process from the design process, meaning that we sequence everything in Ableton and, on the content side, Resolume takes care of the rest. Updating a VJ clip is just a matter of dragging a new clip into Resolume.

This also resulted in Resolume being a big part of the design process, instead of only serving as a media server, as it normally would.

During the design process we run the Ableton set and see how clips get triggered; if we don't like something, we can easily replace the video clip with a new one or, for instance, adjust the scaling inside Resolume.

Some tracks which included 3D-rendered images took a bit longer, but there is one track, "Diplodocus", which took 30 minutes to make from start to finish. It was originally meant as a placeholder, but after seeing it synchronized we liked its simplicity and boldness and decided to keep it in the show.


Here is some more madness that went down:

Video by Diana Gheorghiu

Is it challenging to adapt your concept show to different, extremely diverse festival setups? How do you output the video to LED setups that are not standard?

We mostly work with our rider setup, consisting of a big LED screen in the back and an LED banner in front of the booth, but for bigger festivals we can easily adjust the mapping setup inside Resolume.
At Rampage we had the additional challenge of coming up with a solution to operate 7 full HD outputs.

Screen Shot 2017-04-06 at 9.08.32 PM.png
Photo by Diana Gheorghiu

Normally Nik controls everything from the stage and we have a direct video line to the LED processor. Since all the connections to the LED screens were located at front of house, we used 2 laptops positioned there.

It was easy to adjust the Ableton Max for Live patch to send the triggers to two computers instead of one, and we wrote a small extra tool that sends all the MIDI controller data from the stage to the FOH (to make sure Nik was still able to operate everything from the stage).
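Their forwarding tool isn't public, but shipping MIDI across a network is only a few lines. Below is a rough Python sketch with the mido library; the port names and addresses are invented, and raw UDP is used for simplicity where a real rig would more likely use rtpMIDI or similar.

```python
# pip install mido python-rtmidi
# Rough sketch of forwarding MIDI controller data from stage to FOH over
# UDP. Port names and addresses are invented placeholders.
import socket
import sys
import mido

FOH = ("10.0.0.20", 9001)

def stage():
    # Stage laptop: read the controller and ship the raw MIDI bytes out.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with mido.open_input("APC40") as port:         # hypothetical port name
        for msg in port:
            sock.sendto(bytes(msg.bytes()), FOH)

def foh():
    # FOH laptop: rebuild each message and hand it to a local MIDI port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9001))
    with mido.open_output("To Resolume") as port:  # hypothetical port name
        while True:
            data, _ = sock.recvfrom(64)
            port.send(mido.Message.from_bytes(data))

if __name__ == "__main__":
    stage() if sys.argv[1:] == ["stage"] else foh()
```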


Talk to us about some features of Resolume that you think are handy and would advise people out there to explore.

Resolume was a big part of the design process for this show. Using it almost like a little After Effects, we stacked up effects until we reached our preferred end result. We triggered scaling, rotation, effects and opacity using the full OSC control Resolume offers. This makes it super easy to create spot-on synchronized shows with a minimal amount of pre-production.
This, in combination with the really powerful mapping options, makes it an ideal tool to build our shows on!
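To give an idea of what that OSC control looks like from the outside, here is a tiny Python example using python-osc that fades a layer's opacity, the same kind of parameter message a sequenced show would fire on every beat. The address follows the newer /composition/... scheme; older Arena versions use a different map, and the IP, port and timing values are placeholders.

```python
# pip install python-osc
# Tiny example: fade a Resolume layer's opacity over one beat via OSC.
# Address format follows newer Arena versions (check your OSC map);
# IP, port and timing values are placeholders.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7000)  # Resolume's OSC input port

def fade_opacity(layer: int, beat_seconds: float = 0.5, steps: int = 25):
    # Ramp opacity from 1.0 down to 0.0 across one beat, like a
    # sequenced fade-out fired from the show's MIDI/OSC score.
    for i in range(steps + 1):
        value = 1.0 - i / steps
        client.send_message(f"/composition/layers/{layer}/video/opacity", value)
        time.sleep(beat_seconds / steps)

fade_opacity(layer=1)
```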


What a great interview, Roy & Manuel.
Thanks for giving us a behind-the-scenes understanding of what it takes to run this epic show, day after day.

Noisia has been ruling the Drum & Bass circuit for a reason: thumping, fresh & original music along with a remarkable show - what else do we want?

Here is one last video for a group rage:


Video by Diana Gheorghiu

Rinseout.

Credits:
Photo credits, Noisia setup: Roy Gerritsen
Adhiraj, Refractor, for the on-point video edits.

Mad About Madeon

Madeon is a French electronic producer who uses gadgets and technology like they're an extension of his very being.

With an on-stage setup that baffles even the best in the business, this 22-year-old producer has gotten where he is because of his focus on the audio-visual aspects of a performance as a single unit.

His stage setup should be trademarked: a diamond with arrow-like shapes on either side.
All made of LED.
mad2.jpg
Geometric.
Symmetric.
Minimalist.

We, here at Resolume, couldn't pass up the chance to understand his rig and how he perfectly triggers his visuals to the music, live.

Thanks very much for speaking to us Hugo!
[fold][/fold]
First things first - the answer many have been curious to know: can you explain your live setup to us? All the gadgets you use and their purpose?

The show is run on two laptops which are on stage with me.
One runs the audio side of things in Ableton and sends MIDI through ethernet in real time to a second, dedicated video laptop running Resolume.

I have two Novation Launchpads to play musical parts and modify existing stems, one Novation Launch Control XL to handle some additional FX and general controls (including tempo), and a Korg SV-1 keyboard.

There is also a Xone K2 plugged into Resolume to control some video effects.


Madeon-InfiniteInterview-6 (1).jpg
You do a great job of syncing your visuals to the music. Can you explain to us how you do this with Resolume?

All of the audio clips and parts in Ableton are grouped with matching MIDI clips that trigger videos and effects in Resolume.

All of the editing is done in real time; it's really useful, as it means I can easily edit the video show between shows by simply changing the MIDI score.

It also means that I can improvise, extend or shorten a section, with the knowledge that the video show will keep up.


We have noticed some LED strips being used in your setups. Do you control DMX fixtures with Resolume as well?

No, we haven't explored this yet, but I'm looking forward to it! At the moment, all of the fixtures are triggered manually (no timecode - shoutout to Alex Cerio!).

We really like the pixel mapping of the buttons on your Launchpads. Tell us about this.

This is a simple MIDI score sent to the Launchpad to animate it. Novation kindly built custom Launchpads for me with unique firmware features enabling me to switch between this type of "animation" mode and a regular functioning mode seamlessly.

Launchpad.gif
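For the curious: driving a Launchpad's LEDs really is just MIDI in the other direction - each pad is a note, and the velocity you send picks a colour from the device's palette. Hugo's custom firmware is not public, so the Python sketch below only shows the generic idea with the mido library; the port name, the 8x8 note layout and the colour numbers are model-dependent placeholders to check against your Launchpad's programmer's reference.

```python
# pip install mido python-rtmidi
# Generic sketch: animate Launchpad pads by sending note_on messages in
# which the velocity selects a colour from the device's palette. Port
# name, 8x8 note layout and colour numbers are model-dependent
# placeholders - check your Launchpad's programmer's reference.
import time
import mido

def pad_note(row: int, col: int) -> int:
    # Assumed session-mode layout (Launchpad MK2 style): note 11 is the
    # bottom-left pad, rows step by 10 and columns by 1.
    return 11 + row * 10 + col

with mido.open_output("Launchpad MK2") as lp:
    for frame in range(64):
        for row in range(8):
            for col in range(8):
                # Sweep a diagonal stripe of colour across the grid.
                color = 45 if (row + col + frame) % 8 == 0 else 0
                lp.send(mido.Message("note_on",
                                     note=pad_note(row, col),
                                     velocity=color))
        time.sleep(0.1)
```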
Audio-visual unity is so important to you - sometimes the content even looks like the Launchpad. It's gotta be intentional?

Absolutely! For the 2012 shows, there were sections where the screen matched the Launchpad completely. There were also pixel-style grid animations that were completely real-time (with 64 layers in Resolume, one for each of the 64 pads), each pad corresponding to a different MIDI note. Very fun to program!

madeon_1 (1).jpg
What thought process do you go through while creating visuals in your studio? What software do you use? How long does it take you to prepare a render/clip?

I work with a number of companies on making the content for the show, but I make the content for about a third of the show myself.

I mostly use After Effects. I'm not very familiar with 3D software, so I make 3D animations in AE polygon by polygon, which is quite excruciating!

I like to keep making content on tour as new ideas occur to me, it's always a work in progress.


mad1.jpg
Give us a rundown of your studio equipment. What is an absolute must-have? What would you like to change/upgrade?

A great computer has to be the most indispensable gear.

Whenever I upgrade, my production style always seems to adapt to use more plugins until I reach the limit again - it's constant frustration!

A zero-latency, unlimited-resources dream computer would be the best imaginable upgrade.


Why did you pick Resolume over the other software available out there?

Resolume reminded me a lot of the audio software I was already familiar with.
It's intuitive and powerful, the effects are extremely usable, and the latest updates in Arena 5 added mapping options that enabled my latest "diamond/chevron" LED setup.


mad4.jpg
With this, we come to the end of this interview.

Thanks so much for taking the time out to do this, Hugo - we are all very grateful.

Our hunger for technology and the things you can do with it has been duly satiated. For now.
Time to go try all of this out now, eh? :)

You can check out Madeon's work here:


Photo Cred: Charles Edouard Dangelser

On Tour with Zedd: Gabe Damast

Working for Resolume, we're lucky enough to see some of the most amazing VJ talent in action. One such person is Gabe Damast, whose live show for Zedd blew me away. Gabe is a true VJ, and seldom do we see a show this tight and in sync with the music. Most amazing of all, it's pure VJ skill - no SMPTE or other tricks.

Take a look at the video for an idea of how Gabe rocks it, and then read on below for what he has to say about all this.



[fold][/fold]

How did you start VJ'ing?

My introduction to the world of VJing came through music. I grew up in the San Francisco Bay Area playing saxophone and piano in a couple of different jazz and funk bands, and as my love for electronic music developed I got into beat making, record producing, and sound engineering. I spent years learning basically every major production software and set up a small studio in my parents' basement, where I'd record myself and my musician friends goofing off, and sometimes those sessions would turn into actual songs.

At the end of college, a friend of mine showed me Resolume, which was really the first time I was exposed to any visual performance software. I remember a lot of things clicking for me all at once: coming from a background using Ableton Live and FL Studio, Resolume felt like a very user-friendly video version of the DAWs I was familiar with. It wasn't long before I got ahold of a projector and started working on my first VJ sets in my tiny dark bedroom late at night. At first I would use found footage and VJ clips from Vimeo, but I eventually got into Cinema 4D and After Effects and started making my own video content, some of which is being used in the Zedd show currently!

productionclub-zedd-truecolorstour-worldwide-2015-20.jpg

Can you tell us a bit more about the Zedd tour? How does such a tour get organised when it comes to the stage design, the content, the operating of video, lights and laser? Who does what?

The True Colors tour - the latest arena tour we did with Zedd - all started more than two years ago with scribbles on a paper napkin. Many artists will hire a specific designer to conceptualize a stage production, but from the very beginning the Zedd touring team has been extremely close-knit, and we always share roles and creative ideas freely. Zedd likes to be incredibly close to pretty much every aspect of his live show, so many of the crucial design decisions would happen in a group discussion during a meal at an airport, or on a van ride on the way to a music festival. Our lighting director Stevie Hernandez would create renderings of different ideas in Vectorworks pretty much in real time, which helped different ideas evolve and change.

Video content has always been the central focus of the Zedd show (and I'm NOT just saying that because I'm a VJ!!). For the True Colors tour we wanted to give fans the most immersive experience possible, so the design we landed on was pretty much a giant 84-foot-wide LED wall, framed with all sorts of light fixtures, lasers, and special effects. We were able to use an LED wall that was fully 4K in width - a dream come true for any pixel pusher. It's been really exciting to watch the rapid development of LED technology in recent years. Bigger walls, higher resolutions - soon I'm sure we're going to be watching shows in retina quality! In the five months leading up to the start of the tour, we worked closely with Beeple (Mike Winkelmann) to create the bulk of the new show visuals, rendered in stunning 4418x1080 resolution. Scott Pagano and I also contributed to the content push, which enabled me to curate an entirely new Zedd visual show compared to our previous tour.

Read more about Production Club's process here: http://www.productionclub.net/work/truecolors



The thing that stands out most to me is how video, laser and light play the accents in the music as a team, almost like a band. Is this something that you practice?

"Practicing" is always a tricky subject in the world of live production. The cost of renting enough gear to do a proper rehearsal is so high that it only really makes sense surrounding a tour where the costs are being spread over a few months. We were lucky to have two weeks of rehearsals before our tour rolled out, where we built the full size production in a sweaty, cavernous warehouse in Las Vegas, and Zedd, myself, Ken (our tour manager AND laser operator), and Stevie (lights) spent 12+ hours a day listening to music and creating unique looks for each song Zedd wanted to play during the tour. We brought in a mobile studio for Zedd to use, and each day would usually begin with us brainstorming visual ideas, and then taking breaks where me and Stevie could program the looks, and Zedd could work on musical edits and tweaks. It was hard to leave the rehearsal space at the end of the day because we were getting so much done!

It's all live, right - no SMPTE? What would you say to people who are just starting out and are looking to get a tight sync like that?

No SMPTE! Every single video clip, strobe hit, and pyro shot is cued live. That's why our rehearsals took so long. I have a lot of respect for people who put together timecoded shows, and there are a lot of things you can do with that kind of sync that just aren't possible with live triggering, but for me, realtime performance is the only way I like to work. Music is what drives the visuals, and Zedd always DJs live, so there is a certain level of authenticity that is communicated by including some human error in the visual performance.

Whenever someone asks me how they should get into VJing, I always tell them to start by understanding music. You can definitely be a technical person and excel in the visual performance world, but in order to deliver an on-time show (with no timecode) you really have to learn music and rhythm. If you have good timing, and understand the basics of music theory, you can put on an amazing show even with the worst video content on the smallest screens.

productionclub-zedd-truecolorstour-worldwide-2015-08.jpg

What gear are you bringing with you? Is it easy to deal with airport customs?

For a normal fly-in show, I use a MacBook Pro Retina with three MIDI controllers: 2 Traktor Kontrol F1s and a Midi Fighter 3D. My whole kit fits nicely in a Pelican 1510 carry-on case, and if customs ever tries to hassle me I just say "it's for making computer music!!!" and they always leave me alone. Flying around with three laptops sometimes raises a few eyebrows, but I've never gotten seriously held up (yet! *knock on wood*).

How does Resolume fit into all this?

Resolume's simple layout makes it SUPER easy to organize our visual show. I always try to think about telling a story through our video content, and all of my Resolume compositions are arranged in a timeline that I navigate around depending on what songs are being played. Since everything is live, choosing a media server that allowed for quick re-organization was really important to me. Add in the first class customer service from the Resolume team, and it's a no brainer!

productionclub-zedd-truecolorstour-worldwide-2015-09.jpg

Where can we find you online?

You can find my work on the web at:
--- http://www.gabedamast.com ---

or other platforms like:
--- vimeo: https://vimeo.com/user5953855 ---
--- behance: https://www.behance.net/gabedamast ---

Touring Latin America - Viaje Tour Ricardo Arjona

The Viaje Tour Ricardo Arjona has been called the most successful Latin tour of 2014-2015 by Pollstar and Billboard, with an attendance of more than 1.7 million people.



The man behind the visuals on this tour is Camilo Mejia, also known as VJ Bastard. Read more about what he has to say on the touring experience below.
[fold][/fold]
We have been touring for a year using Resolume with no issues. We've been to Argentina (8 cities, 25 shows, and we're going back for at least 5 more shows in November), Mexico (16 cities, 35 shows), Uruguay, Paraguay, Panama, Costa Rica (2 shows), Chile (7 shows), Puerto Rico (5 shows), the USA (13 cities, and we're going back next month for 8 cities), Ecuador (3 cities), Venezuela (5 cities), Guatemala (2 shows, 3 cities), El Salvador, Honduras (2 cities), Nicaragua, Colombia (5 cities), and we are waiting for confirmation of the Europe tour.

RicardoArjona_1.jpg
I have been using Resolume since 2.4.1, and have a good 15 years of experience playing with video.

I was called for this tour in May of 2014 as visualist and video engineer. We rehearsed for a month in Mexico, during which we decided that the perfect system for our tour would be Resolume Arena.

First of all is the stability. I've played HUGE clips, from 2 GB to 70 or 80 GB, with no issues, so I know I will not have any problems with that. Because we don't run a backup signal, that's a serious point.

Second of all, we have a lot of stuff on our tour: props (cars, trains, bikes, chairs…), backline, consoles, screens, and everything travels with us. As you may notice, the screens on this tour are huge, and they are the first thing we prepare for the show - build-up time is around 4 hours for the screens alone. With other systems it's easy for things to go missing, so portability is really important for us.



The setup that we use for the show consists of 436 modules of 6mm-pitch LED screen. Resolume runs on a Mac Pro (12-core, 2.7 GHz) with 64 GB of DDR3 RAM, 1 TB of storage, and two AMD FirePro D700 GPUs with 6 GB of VRAM each.

The outputs are set up as follows: 1 for the main screen, 1 for the "LEDs" or totems at the sides, 1 for the backing behind the musicians (a moving door), and 1 for the tunnels. The full comp is 2115 x 1080 px, with no scalers, Folsoms or anything - I go straight to the processors over dual-link and that's it.
RicardpArjona_2.jpg
I play the show with an Akai APC40 (the older version), and some of the songs have SMPTE sync sent from Pro Tools. A Blackmagic capture card captures an HD-SDI signal that is used in some cues to show the musicians and other live shots.


Read more about Camilo at his website: http://www.vjbastard.com or check out his work on Vimeo at https://vimeo.com/visualbastard

Rocking Out and Getting Your Geek On: Negrita!

We spoke with Marino Cecada, an Italian visual designer who has been doing some out-of-the-box work for various pop and rock acts. Where most rock shows visually rely on simple live camera registration, Marino uses Arena and some custom FFGL wizardry to take things to the next level.

Negrita-ilGioco.jpg
[fold][/fold]
Tell us about yourself
I live and work in Italy, and since 2006 I have been working in video production, starting out as a cameraman and editor. In 2007, together with 2 colleagues, I worked on our first visual project for concerts: the "Soundtrack" tour for Elisa, a famous Italian singer.

There were no live cameras for that tour. We had PVC screens onto which our videos were projected, and each video was made especially for each song. Over the years, I have been more and more attracted to video art and, generally speaking, everything regarding video production in the music and concert world.

Some of the first interactive video installations I made were with Quartz Composer. I used it to work on music videos and collaborated on other tours, on which I have always supervised the visuals and, recently, also direction and broadcast.

Then, in 2012, while preparing for a tour in the US with Il Volo, my colleague Steve Sarzi proposed working with Arena. I had never heard of it, but after using it with Steve for a couple of hours, we had already set up the basics for the project I had in mind and, most of all, it worked! I was impressed by how fast and easy the software is to use. It also read SMPTE signals right away, which was extremely important, as I usually work with pre-made videos which are synced.

Tell us about your latest work
The latest job was for Negrita, an Italian rock group with a 30-year career. For spring 2015 they had an indoor stadium tour in mind. The lighting and stage designer, Jo Campana, conceived a very essential stage: the background was made of three big LED screens, 5.60 meters tall and 4.20 meters wide, occupying 16 meters in length.

Because of the space given to them, a lot of the show was centered on what was happening on these screens. Our idea was to mainly use live footage and to avoid tracks with simply a video in the background. The live images were conceived as a graphic element in support of the set design, so what we were filming had to be processed and filtered to give a different interpretation of each track.

Being a live broadcast, the result was a sequence of live videos that followed the dynamics and the energy of what was going on on stage. No pre-created video could have given the same feeling. The only part we left unprocessed was the very last song, in which all the lights were turned on and the show came back to a more "earthly" environment.

Tell us about the video system you used
To realize what we had in mind, Telemauri, a video rental company Steve and I work closely with, gave us 3 cameras with operators, three remote-controlled cameras, plus some small ones placed on stage, one of which was on the drum kit.

All cameras were routed into an ATEM switcher, which was extremely versatile. Thanks to it, we could independently control 4 SDI and 4 other input signals going to the computer running Arena.

What came out of Arena went directly to a video matrix and on to the screens. Our output was a single full HD signal; the mapping of the three screens was done directly in Arena, deciding what should go on each screen. I prefer to keep some of Arena's controls in manual mode - the master fader, for example - so we connected a Samson Graphite MF8 controller to the computer.

Did you use any particular effects?
One of the aspects that made us choose Arena over other more "prestigious" media servers is that through FFGL we could develop our own plugins. In fact, on the previous tour, Elisa's "L'Anima Vola", we created some plugins for a choreography of the singer moving on stage, while on the screens her movements were repeated several times to create a trail.

Elisa-Anima vola.png
Elisa tour 2014

Another plugin I enjoy, developed together with Davide Mania (an FFGL programmer I have been working with for years), is one we named ScrumblePict. I use it often; it allows us to have copies of the signals without having to use more of Arena's clip slots. These copies can be moved, rotated, scaled and cropped, allowing us to always create different layouts.

Elisa-Labyrinth.jpg
Elisa tour 2014

Antonacci.jpg
Biagio Antonacci tour 2014

Could you show us some of the graphic styles used for the show?
As I mentioned, I very much enjoy working with image decompositions, so on this tour we also got busy breaking down and recomposing the signal that came through the switcher.

Negrita-radioConga.jpg
Negrita live, example of the ScrumblePict effect.

For other tracks, we took advantage of the edge detection and pixel screen effects.

Negrita-Atomo.jpg

Another fantastic aspect of Arena is the ability to use DXV files with an alpha channel. This way we can create moving masks for live inputs.

Negrita-Love.jpg
Negrita live, mask with alpha channel and live footage inside.

More info and other works and productions at http://www.editart.it

Artist profile: Ghostdad

A breath of fresh air in the saturated landscape of abstract EDM visuals, Ghostdad aka Ryan Sciaino caught our eye running the impressive visuals for Porter Robinson's Worlds tour. After spending a morning scouring the Interwebs for concert footage, we figured we just might as well get in touch with the legend himself.

DM_SH01.jpgPorter Robinson - Worlds still image courtesy of Invisible Light Network
[fold][/fold]
How did you get started in the VJ game? When did you discover Resolume?

I grew up playing music, eventually DJing, and then creating visuals for my own music. At some point, digging for records and samples turned into digging for found footage from VHS tapes and dollar-store DVDs. That was around the time internet video was becoming popular too, so I'd also comb YouTube and archive.org for weird stuff.

I went to college for computer music and started learning Max/MSP while I was there. I built a video sampler I could use to switch through clips while DJing, but eventually amped it up to take on the road with my band WIN WIN when we were doing a synced-up DJ/VJ set. It was sort of a monster, with cue points and BPM sync and effects, so programming it got pretty intense!



When I started working with Porter I knew I needed something faster and easier to throw new content into on the fly. I was making lots of new looks to layer up with other clips and logos etc. It seemed like Resolume could handle anything I threw at it, and the triggering was the fastest I’d ever used, making it really fun to jam with.

Who are some of the artists that inspired you early on? Who is knocking your head back currently?

I listened to a lot of indie rock in high school, and Cornelius was one of my favourite artists from Japan. I was lucky enough to see him do his Point show in NYC. I had never seen anything like it. I grew up watching music videos and even got into film and video art, so I was used to seeing music and video together, but never with live music in person like that. The content and degree of sync were incredible. It really blew my mind.



I'm a pretty big fanboy of artists who use multimedia in a conceptual way but also keep it really clean design-wise, like Ryoji Ikeda or Rafael Rozendaal. I've found more and more of my Vimeo likes being taken up by things that have been featured at http://ghosting.tv. And I definitely try to check out other artists when I'm at festivals too. I saw Bassnectar at Buku in New Orleans a few weeks ago and that was an awesome show.

You have a very varied but distinct style. From anime characters to Mayan mysticism to abstract glitch to low-poly geometry to ponies, the list goes on and on. Where does it all come from? Didn't your mom tell you not to spend so much time on the internet?

No, actually! We didn't have the internet when I was growing up! We got a connection by the time I was in high school, but it was maybe dial-up speed at best. I'm a little older than what I consider to be the first real "internet generation", so when things got really high-speed and dazzling it made me feel like a stranger in a strange land. There was so much amazing stuff happening on Tumblr or Vimeo or Second Life that I just wanted to check it all out. I get sucked down the rabbit hole online pretty easily, especially when I want to find out more about a genre or an artist or a meme. Some design trends I see online do remind me of things I grew up with, like 8-bit video games or low-poly 3D graphics, so maybe that makes me think "I can do that!"

Tigerlily2.pngVisuals for DJ Tigerlily courtesy of Ryan

What caught my eye about the Porter Robinson 'Worlds' content is that it almost seems to be cinematic, in that it seems to be telling a story. Now our minds will always create a narrative with what we see, but is this an experience you consciously set out to create?

I think Porter's goal is to invoke a feeling rather than tell a specific story. There's definitely a tendency to connect what you're seeing on the screen and create a story in your mind, but that's also the process that pulls you in and allows you to really feel it. In programming the show we give you every audio/video/lighting cue we can for the theme and timing and mood, and as a result I think viewers get to paint their own story and put themselves into it in the process.

FF_SH01.jpgPorter Robinson - Worlds still image courtesy of Invisible Light Network

The Worlds tour content is a collaborative effort, with you playing content created by a larger group of visual artists. Who are the people that you've been working with and how has it been working with them?

We worked with Invisible Light Network on the animated looks you see in the show. They're also based in NY and had about 9 or 10 illustrators and animators working on their team. We were also able to grab additional content from some of Porter's music videos, like "Flicker" by Coyote Post.

I made content for the show as well and Porter was super in touch with everyone throughout the creation process. It was a lot of different footage to wrangle in Premiere, but I spent a week with Porter before the start of the Worlds tour where we really figured out the visual flow and style of the show while putting it all together. Porter has a tremendous amount of vision when it comes to his music which is totally inspiring.

So, when looking at YouTube videos from your shows, I came across this: https://www.youtube.com/watch?v=AdotsHAzfVA. It looks like someone has been re-creating the content they saw at the show. How do you like them apples?

Yah, we just saw that too! I think fans are dying to take home a piece of the show, and it's really cool they'd go so far as to recreate it from the bits of media that are floating around out there. I think that excitement starts with Porter's music, though, since there are practically whole new versions of songs from the album in the live set.

Someone even put the entire set together from cell phone footage taken at shows, with homemade recreations of the live music. Okay, and here's where it gets really crazy: someone even started rebuilding the live rig in a 3D game engine: https://youtu.be/kq3TcMxpcV4

The expanded presentation of Worlds as an album is what makes it special, but I think the live and communal aspects are still super important. Maybe someday we’ll all be able to log into an MMO and experience something similar but even that won’t be able to beat being there in person experiencing the show with other fans. My guess is there will be a complete version of the Worlds show you can watch at home someday but for now we try to keep certain things exclusive to the live set so you have to show up and get the full experience.


Hopefully this crowd video from the Youtubes captures some of that live experience!

Recently you've been playing with Unity to make realtime visuals. What's the main thing that makes realtime more fun than pre-rendered?

Render time is never fun and playing video games is always fun, right? I've never been very patient with 3D software. A lot of the 3D stuff I work on has a lo-fi video game aesthetic as well, so it's sort of a no-brainer to start throwing stuff into Unity. I jump in and out of Blender as well, but I figured if I was going to put my time into learning a 3D environment, I wanted it to be real-time.

edit Copy 09_changes_140817.00_46_21_15.Still004 copy.jpgPorter Robinson - Worlds still image courtesy of Ryan

Alex, my bandmate in WIN WIN, is way more under the hood with Blender and rendered some really weird stuff for our last music video. We really liked the effect of video footage height-mapped to a mesh, and the objects came out really smooth and organic-looking, in part thanks to some render time:



What are the main stumbling blocks you run into when working in realtime, as opposed to keyframing everything? What about the liberating moments of the freedom it offers?

Scripting is something I wrestle with. It's great that objects can do what you want in real time, but you still have to tell them what to do! The benefit, of course, is that you can see those changes instantly and tweak them endlessly.

Controlling things in real time keeps me a little more engaged and expressive. I think coming from a music background makes that important to me.

Tigerlily3.pngVisuals for DJ Tigerlily courtesy of Ryan

Do you like to control things in realtime during the show? Or is the appeal more during the creative process?

It's been great to get the chance to do both this year - VJing live, but also spending time editing and programming a show, I mean. There are always things that will look better when edited ahead of time, but even in a show like Worlds I leave myself a few things to do by hand. Sometimes that's so I can follow what Porter's doing live, but it's also for me to feel more involved in the performative aspect of the show. I still play guitar and keys, so I don't want to let go of that live aspect of playing visuals, also like an instrument.

People seem to get really excited when discussing realtime vs. rendered; some even get militant about it. You seem to switch seamlessly between both. Do you think one or the other has more potential? Are they mutually exclusive? Where would you like to see visuals heading in the next five years?

Part of my thinking about learning realtime is definitely about the future. Ideally, realtime processing will catch up to how good things can look when rendered. I don't mind it looking a little rough around the edges for now if I can play it to the beat.

Ryan is a prolific internet user, so you can catch him in a variety of digital media. Get started down the rabbit hole at his website: http://www.djghostdad.com/

Dream On: Rocking It Out with Aerosmith

One of the great things about hosting Resolume workshops is you get to meet so many amazing people from all over the world.

One such amazing person is Danny Purdue.

After joining us for a session last year, he showed us the impressive work he was doing for Light Nightclub in Las Vegas. Soon after that, we got word he would be running Resolume for the visuals on the Aerosmith world tour. Live visuals are common in EDM and club scenes, but still a relatively new thing on rock shows, so of course we had to get the lowdown on this.

Here's the interview with Danny himself.
[fold][/fold]

ResolumeSetupSmall.jpg

Who are you and how the heck did you land a job with Aerosmith? What other work have you done?

I’ve spent most of the last ten years touring and producing live video at concerts. I started out running camera and editing, then eventually got into directing and more of the overall show design. On the side I had an interest in VJing, and the two paths crossed when I directed the live video for Swedish House Mafia's “One Last Tour” in 2012. The camera shots needed to be stylized to complement the LED visuals, so I integrated Resolume with a broadcast switcher to mix effects with my line cut. Creatively it was really fun and being out there inspired me to pursue VJing more seriously.

When that tour was over I headed to Las Vegas for a residency at two clubs opening in Mandalay Bay. One was Daylight Beach Club, a straightforward VJ gig, and the other was Cirque du Soleil’s Light Nightclub, an ambitious project to combine the theatrics of Cirque with EDM and the nightclub environment. Light has a massive LED installation, wireless HD cameras, motion tracking equipment for generative visuals, and custom content built for the architecture of the room. It took a lot of talented people to bring all the pieces together and make Light successful.

In March I got a call about putting together a Resolume demo for the upcoming Aerosmith tour. It sounded like a cool opportunity, so I went over to LA and worked out a formula similar to the Swedish House rig with Arena running inline with broadcast video equipment. A few days later Steven Tyler came by, I demoed some looks, then we spent a couple hours trying out all kinds of effects using recordings from previous shows. He liked what he saw and asked me to join the tour.

Why was Resolume chosen over other media servers?

The choice to use Resolume came from Steven. I was pretty surprised he knew about it, and even more so when we first met that he had actually downloaded the demo, gone through footage on the website, and rattled off names of the animators he liked. The man does his homework. After seeing what VJs were doing with Resolume, Steven was excited to use the large palette of effects to create visual moments in Aerosmith's show.

We didn't have production rehearsals before the tour, so the immediate benefit of Resolume was rapidly developing ideas. Instead of a lengthy creative process with renderings and reviews, we knew a server running Arena could achieve whatever visual treatments we came up with on the road.

How is operating a rock show different from an EDM style event?

The main difference on this project was using visuals and effects to accent a show rather than drive it. What fans want to see at an Aerosmith concert isn’t graphics, it’s these rock icons playing their instruments and Steven Tyler’s wild stage presence. So it was a video-first approach where several elements had to be right for a visual moment to be effective.

After Steven and I developed a concept, I worked with our lighting designer Cosmo Wilson, video director Jeff Claire, and the camera crew to sort out the practical side of things like colors, spotlights, and filming angles. It was a much different environment than VJing at a rave where your content is the show and you’re more in control of the ambience.

How was your deck structured?

I ended up using a single deck, mostly because it simplified my workflow with live inputs. Rather than having a lot of device source clips, I stuck with two and used layer routers to get signal wherever I needed it in the composition. For one of the keying effects, this routing allowed me to send an upper layer back down in the render order, which is a feature that's hard to appreciate until you need it.

The deck was mostly effect clips and a small selection of content. Out of seven layers total, two were essentially fixed tools and the other five gave me plenty of room to stack up each look in a single column. One feature of Resolume I had rarely used for VJing but came to rely on for this project was Composition MIDI mapping. It saved a lot of time by not having to remap as I shuffled things around and tried different orders of effects.

What effects did you use to create the different looks for the songs?

Each look was a combination of multiple effects with the most significant parameters linked to dashboard controls. Here are a few of my favorites.

One of the first looks we created was for “Eat the Rich,” which started with an upbeat, tribal-esque breakdown of percussion and vocals. Steven walked downstage, faced the camera, and did some crazy dance moves with an effect we called "Shiva.” It was based on Delay RGB with some color strobing, and I had a couple knobs controlling distortion effects based on his moves that particular night.

EatTheRich.jpeg

The trick with this Edge Detection effect was keeping detail on Steven so it didn’t look gaudy, then I added the glare to give it a more elegant feel. This one really popped during “Dream On” when Steven stood on top of the piano with his hair and clothes blowing in the wind from the stage fans. The song opened with a clip of rolling clouds and those fit nicely with this look too.

PurpleGlow.jpg

The most challenging cue each show was called “Face Melt,” where a combination of keying effects made Steven's skin translucent to reveal twisting (Cosmic Strings!) graphics. Most of the time we used this at the beginning of "Sweet Emotion” under moody ultraviolet light, which is incredibly tough to key against. I had presets that got it close and dialed in the rest by hand to make sure content only played over him and didn't spill out over the rest of the image. This look was part of my original Resolume demo for Steven.

FaceMelt.jpg

What were the technical setup and signal flow like?

As VJs we're often confined to prosumer gear that fits in a carry-on case. Not here. My equipment and the main video system were provided by Screenworks NEP out of LA, giving me access to considerable resources at the beginning of the project. During the system build I was able to pull broadcast-grade hardware and converters off the shelf, test them out, and get exactly what I needed. Having a lab of sorts to experiment with integration was a real luxury. Once the spec was complete, our tech-savvy video engineer went through each piece of gear from camera to screen and shaved off every millisecond of latency possible.

My rig was located backstage with the rest of video world since I needed access to several HD sources and quick communication with the video crew. Resolume ran on a 2013 Mac Pro and captured two discrete video signals using a Blackmagic DeckLink Duo. The card took any combination of cameras and switcher feeds based on my selection with remote panels connected to the main system router. Resolume’s output passed through an Image Pro for genlock and HD-SDI conversion, then went back to the central router so we could place it upstream or downstream of anything else in the signal flow to the LED screen.

For control I used Livid Instruments’ CNTRL:R. It has both regular and “endless" knobs, small drum pads, plenty of RGB buttons, long-throw faders, and a layout that works well with how I operate. Everyone of course has their own cup of tea when it comes to MIDI controllers, but when Resolume is open I almost always have the CNTRL:R plugged in too.

The heart of the video system was a Ross Vision, a high end broadcast switcher with all kinds of mixing, keying, and automation abilities. We had one look driven by the Vision that was a grid of nine different 1080 HD sources with no drop in frame-rate or performance. For another song we had switcher buttons triggering sequences of comic book styled playback based on which band member and camera angle were being shown, then a layer of effected live video from Resolume was keyed into a panel to match the look. Top-notch hardware opens the door to some pretty imaginative workflows.

VideoWorld.jpg

Where do you see Resolume fitting into a crowded scene of media servers and VJ software?

What originally got me into Resolume is its simplicity and intuitiveness, which let me focus on being creative. This is particularly important when you're working with a high-profile artist whose time is very valuable. In a creative session you have to quickly translate ideas into repeatable cues, so you need a fast and flexible workflow. There is always time to go back and get technical with optimizing effect stacks, layering, and MIDI mapping. What doesn't work is a rock star tapping his foot waiting on you to configure something.

One of Resolume’s best advantages that seems to either be overlooked or taken for granted is that it’s cross-platform. It’s important to me that no matter what tools and hardware I want to use, I don’t have to worry about changing the main piece of software I use to operate a show. Especially with Syphon and now Spout, a lot just comes down to user preference and project needs.

Looking forward, I’m excited to see how Resolume tackles new trends like it did with projection mapping. Things like timeline-based control data and DMX output are readily available using third party apps, but the process could be simpler.

Resolume is still a new tool to the industry as a whole and has a lot of room to grow beyond the EDM scene. As more artists embrace interactive technologies, generative show elements, and live content operators, having a powerful creative hub that can adapt to different workflows is key. Before this project I wouldn’t have expected Aerosmith to be part of this conversation, and was pleasantly surprised that even rock legends are riding the new wave of visual art.

See more of Danny's work at http://programfeed.com/