Examining the Executioner: Excision
Excision. The man who has taken over the interwebz along with his mean machine: The Paradox.
For those living under a rock: Excision dropped some gigantic virtual bombs with his latest live experience, “The Paradox”, on tour since January 2017. Forget being at an actual gig; even if you’re at home watching videos on your phone, it is guaranteed to make your jaw drop. The rig looks deceptively simple. In reality, it is far, far from it. Deliciously seamless LED, a mobile DJ fascia, brilliant lighting, and slamming special effects: The Paradox is truly one of a kind. And we must say, it has Excision written all over it.
All The Paradox shows are run by visual moguls Beama. We caught up with Brady Villadsen and Butz to understand what goes down in creating & running this monster, day after day. [fold][/fold]
Thanks much for doing this, guys! *tips hat
To start off, let's get the guys to give us some background about Beama.
BUTZ: "We got started by being inspired at Shambhala. The Fractal Forrest at Shambhala had a huge video system 10 years ago. I had the opportunity to volunteer under Gordon Blunt from Blunt Factory visuals and he got me started."
"There was very little visual arts in our home town of Calgary so we picked up a few projectors, a triplehead2go and a Mac Pro Running Modul8. Our first big gig was the Pagoda Stage at Shambhala, there we met Ben Leonard who is now one of our main Animators. After he joined the team we started doing video Mapping gigs with pixel perfect content. We started doing mapping before Mapping software was available and it was much more difficult back then to map a building. We used to take the projectors onsite and map the buildings through the projector using illustrator. Then we would go home make the content and come back and try to set the projector up in the exact same place."
"In 2012 two weeks after Mad Mapper came out I hit the road with Excision with our first touring stage called X Vision and ever since we have been working year-round on Excision tours and festival shows."
"The Executioner was our first time- coded show using the D3 and Resolume. We used the D3 for its auto calibration features which became very valuable at Coachella in 2013. At Coachella we had a 15 minute change over to focus, map and blend two Christie Hd35K projectors. Vello Virkhaus was the house VJ and was very skeptical of Beama’s claims but later did a series of speeches which praised our setup. To date the Paradox is our flagship tour and employs our animators Ben Leonard and Noah Freeman full time for a year leading up to the tour. Along with other tours such as Seven lions and Datsik."
Brady: "I’m a recent addition to the Beama team, I’ve known everyone through the Shambhala music festival for years. I was hired for the paradox after DJ Shadow's the mountain will fall tour. I really enjoy how diverse Beama is with their wide range of clients. It was a natural fit but still allowed me to work with friendly companies such as V-Squared labs and VJCLA."
Brady believes that the content of the Paradox fits a large-production rock show more than a standard loop-based EDM event. He affirms that the level of excellence required by Beama and Jeff (Excision) is the highest of any tour that he has worked on.
Let’s find out why.
The Paradox is a modular set with 110 meters of 5 mm LED. Straight up, the first question that jumps to our mind is how the crew manages so many back-to-back tours, from load-in to load-out.
Turns out, there are 5-6 variations of the Paradox, worked out with some basic math. It has been specifically engineered with a self-climbing truss system over motors, so that twice as many fixtures & video panels can be crammed into a venue.
Says Brady, “The Paradox is an experience that has overwhelming properties for our audience. It's a very demanding tour with multiple 12-show runs. On the 2017 tour, we loaded in and out 360 semi trucks of equipment over 10 weeks. Everyone on this tour is multi-disciplined and wears several hats. You're expected to be up building truss, hanging panels and wiring systems with all the other crew. I was actually the LED tech as well as the media server/digital systems tech. Personally, I spend a lot of time eating healthy and not drinking. That being said, several crew members have the most impressive liquor constitutions I have ever seen.”
Aren’t we surprised? ;)
The next thing that struck us about this monster is the content. From robots shooting flames, to rampaging dinosaurs, to angry gorillas: it is a mech geek’s wildest dream come true. It’s oh so cool.
All of this has been developed by longtime Beama & Excision animator Ben Leonard, also the man behind the famous robotic T-Rex & Robo Kitty.
“I've been working with Jeff for a long time now, and he pretty much trusts me to do whatever I want when it comes to content. Most of the time when given a song to animate, I'll just listen to it several times on headphones with my eyes closed, and whatever pops into my head is what I'll run with.
After a lifetime of saturating my brain with comics, anime, graffiti, video games and cheesy horror movies, my mind can wander into some weird places. I love making robots, aliens and heavy mecha inspired designs so a lot of that goes into the Paradox. Sometimes I will base my animations around a specific movement, like the camera moving up and down. Then I will build a scene around that movement, like an elevator dropping or a spaceship blasting off. But really, when making the content, it comes down to the music and what jumps into my head while I listen to it.”
So, how much studio time did Ben spend creating this stuff, you ask?
“I'm afraid to tell you how many hours it takes to make the Paradox content each year. Mostly I'm afraid to count that high. Let's put it this way, I'm working on next year's content right now.”
Whew. As you let that sink in, we’re going to go looking for a cure for our next itch: the dinosaurs. Were our eyes fooling us, or does the fascia LED actually rise for the CO2 jets?
“Yes, you are correct: our DJ booth is motorized and actually controlled by Jeff (Excision). We make sure to have different animation looks set for each song depending upon the booth state. This is a custom stage piece that was created when the Paradox was first designed. Everything in the set has been custom designed, down to the mounting system for our LED panels. Our specs are so strict that Dave Hauss of the Hauss Collective had to engineer everything down to the nuts and bolts.
The DJ booth was separate from this design, but our specifications usually can’t be replicated by the average rental house. Because of this, Jeff had everything for the Paradox custom made. Jeff’s dedication to the show and to bringing the best experience possible means he actually owns everything but the light fixtures.”
What a guy!
Cheggit, Beama actually flew out to China and worked with a manufacturer to refine the LED wall to their touring standards and build requirements. They stuck with a 5mm panel because of the clarity it gives the crowd from a standard stage distance. Their new 4K processors certainly helped when syncing the signal sent out.
And oh, the sync is spot on. It isn’t as simple as running SMPTE timecode, though. The whole show is created with several different systems working together. Jeff (Excision) actually controls most elements of the show. The cryo, DJ booth, visuals, lasers and lights all run off an intricate system with Resolume at the center.
Brady says, “Resolume does an excellent job of taking multiple protocols and being able to route them to multiple destinations. We run a custom DVS system that feeds out to Resolume and then to a custom program I created in TouchDesigner that parses the required data. The GrandMA2 has a dedicated SMPTE AUX, but our LD Chris Pekar still runs various elements depending on the rig and what we have patched in from the house.”
“The custom program shows me the state of Jeff’s faders, song position and the deck currently controlling the rig. Behind the interface this program controls the timecode being sent out to the laser and lighting desks. It controls several pieces of audio gear and actually will run macros on the light desk. So, Jeff's faders will actually change laser or lighting settings along with audio/video.”
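Beama's actual TouchDesigner program isn't public, so here is only a minimal, hypothetical sketch of the general idea Brady describes: listen to playback and fader data coming out of the video rig, and translate it into cues for the other desks. Everything in it (the OSC addresses, ports, IPs and the macro trigger) is an assumption for illustration, written in Python with the python-osc library rather than TouchDesigner.

[code]
# Hypothetical show-control relay sketch; not Beama's code. All addresses,
# ports and thresholds below are made-up placeholders.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

lighting_desk = SimpleUDPClient("10.0.0.20", 9000)  # assumed lighting desk
laser_desk = SimpleUDPClient("10.0.0.21", 9001)     # assumed laser controller

def on_fader(address, fader_index, value):
    # Mirror a DJ fader to the laser rig and, past a threshold,
    # fire a macro on the lighting desk.
    laser_desk.send_message("/laser/intensity", float(value))
    if value > 0.9:
        lighting_desk.send_message("/macro/fire", int(fader_index))

def on_position(address, seconds):
    # Forward the current song position so the other desks can chase it.
    lighting_desk.send_message("/timecode/seconds", float(seconds))

dispatcher = Dispatcher()
dispatcher.map("/show/fader", on_fader)       # assumed incoming address
dispatcher.map("/show/position", on_position)  # assumed incoming address

# Listen for the data the video system sends out.
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
[/code]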
So, Jeff is in control of it all?
“Yep! This level of precision allowed our lights, lasers, cryo, automation, video and music to come from a single control surface. We frequently had whole shows being controlled by Jeff alone. A major requirement for the Paradox is a true sync system that allows Jeff to properly DJ and mix while controlling everything.
On stage, Jeff has your standard CDJ/DJM setup with a control panel for cryo/automation, plus monitors. The monitors show the video for each track, the master output and a dedicated camera feed. Jeff will frequently jump between cue points when setting up a track to mix in, and he needs to see the video as he scrubs through the audio, which makes traditional SMPTE timecode impractical for our video system.”
Some more interesting trivia: ‘The Paradox’ doesn’t have a single loop based visual. Every song has a video custom made for it, timed exactly for it.
Brady explains this further: “Jeff is just DJing as he normally would; if a fan at a meet and greet has mentioned an obscure song, you can damn well expect it to suddenly be in the set. There aren’t any loop-based songs; each one has a video created and edited to fit every auditory element. We’re bringing in new animations mid-tour, switching animations for songs if we’re in a city multiple days, and making edits based on Jeff’s mid-tour changes. Each video is cued by Jeff: he picks the song on the CDJ and whatever song he picks is loaded automatically. Some days I don't touch the video machines after programming. Sometimes Jeff just randomly picks whatever song he wants. We have specifically designed the system for this.”
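To make the "whatever song he picks is loaded automatically" idea concrete, here is a small, hypothetical sketch of a track-to-clip lookup that fires the matching Resolume clip over OSC. The clip table, the track-detection side and the port are placeholders, and the OSC address format differs between Resolume versions, so check it against Arena's own OSC monitor rather than taking it from here.

[code]
# Hypothetical sketch of automatic song-to-video cueing; not the Paradox system.
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)  # assumed OSC input port

# Assumed mapping from track title to (layer, clip) in the composition.
CLIP_TABLE = {
    "Example Track A": (1, 4),
    "Example Track B": (1, 5),
}

def cue_video_for(track_title):
    # Connect the clip paired with this track, if we have one programmed.
    slot = CLIP_TABLE.get(track_title)
    if slot is None:
        return  # unknown track: leave the current video running
    layer, clip = slot
    # Address shown in the Arena 6+ style; verify for your version.
    resolume.send_message(f"/composition/layers/{layer}/clips/{clip}/connect", 1)

cue_video_for("Example Track A")
[/code]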
Finally, we ask Brady about Resolume.
“We made some huge changes to our video architecture this year; Resolume was part of that foundation upgrade. Last year Beama stuck with Resolume 4 and MadMapper until Resolume 5 had matured past its initial stages. I was extremely happy with the performance and framerate gains that came from moving to R5. Resolume itself had zero issues this tour and it was fun seeing how much the Resolume team has built.
I’m a frequent contributor to the Resolume forums and feel like the layer router feature in the Advanced Output is completely overlooked. It allowed us not only to easily map decks to Jeff’s monitors, but also to do some impressive bits of mapping during festivals.
Because of the changes made at the start of the tour, we have already shifted to a new custom system that allows major elements of the Paradox control system to be used at festivals. The Paradox, in a small way, is just becoming the Excision experience. Resolume is a central part, with small obscure needs being filled by TouchDesigner and D3.”
That’s always so satisfying to hear :)
Thanks Brady, Butz & Ben for talking us through this epic show. Cheers to the hard working, crazy crew at Beama. We can hardly wait for the next visual monstrosity.
Brace yourselves. It’s coming.
Check out more of Beama's work here.
Follow them on Facebook & Instagram
Resolume Footage: Fresh Blood, Fresh Life
Hello my beautiful creaturesss of ze night. This veek ve have a vonderful selection of new visuals for you. These visuals are fresh, like a pulsing vein, teeming with ze promise of eternal bliss. So good you could almost taste it, yes? So drink up, and join me in eternity forever.
Welcome to Dumb Robot, who sets the bar high with a great theme and a lovely pastel color scheme on MachineHead.
Get MachineHead by DumbRobot from Resolume Footage
And another welcome to VisualLab, who drop the Mother Of Abstract Bombs: TwoPointZero.
Get TwoPointZero by VisualLab from Resolume Footage
And to close it off, we have VJ Galaxy, the zookeeper on a low poly diet.
Get LoPolyZoo by VJ Galaxy from Resolume Footage
Where Wolves Roam
Wolves are slightly eccentric, mega talented visual artists who are making their presence felt across the globe. With fabulous skills to boast of, Joshua Dmello & Jash Reen have been fast racing to the front of the (visual) pack.
They’ve worked with setups of all sizes, from tiny to large to omgsomassive, and have delivered to the hilt, over & over again. Be it with projection or LED.
[fold][/fold]
From Sunburn to EDC to Beyond Wonderland; Noisia to Nucleya to Flux Pavilion: Wolves have definitely carved a niche for themselves and are well on their way to achieving their dream of world domination.
[attachment=27]sb-2.jpg[/attachment]
When you look through their work, what is striking is their creativity in LED mapping.
This blog post looks to cover some of their best art, understand their psyche & give you a grand display of some of their prized possessions.
Thanks for doing this, you guys!
Tell us a little something about where it all began for you. At what point did you realize visuals was your calling?
It pretty much started when we met in high school and played a whole lot of video games. We were so obsessed with films, comic books and games growing up that, after trying to chase the proverbial ‘calling’ in the real world (Josh has a background in his dad’s light and sound business, and Jash is an audio engineer and a journalist), we somehow snapped back to playing video games again. Only we’ve got way bigger screens now and a lot more people are watching.
What do you prefer: Projection or LED? And why?
LED. We respect and admire a lot of projection mapping setups in the art installation space, but we reached a point of exhaustion with upholding that medium in larger music arenas and festivals. It felt like we were alienating the larger elements of production, like the lighting rigs, lasers, stage FX and, of course, the performing artists themselves. You can’t have them performing at the same capacity if you have to worry about the lights overshadowing the projections.
So we took the fundamentals of projection mapping and chopped and screwed our input and output maps to fit to the most luminous and stubborn of LED surfaces. Every show’s a new challenge and we never repeat a set up twice.
Tell us the process you follow for pixel mapping.
It starts with us coming up with a stage design, which we either conceptualize from scratch or are given by the client/festival. Once we get the tile sizes and resolutions, we figure out a way to create the most effective pixel map: one that can fit custom content and existing 1080p content that will look correct moving across the entire stage. Doing a 5-6 hour music festival is not feasible on custom content alone. If you have an accurate output map as well, you can do most of your mapping from the hotel room itself, aside from a few on-site tweaks.
Resolume offers mapping and playout of content in the same platform; very few other packages offer the same with that ease of access and user-friendly interface. The snapping features, keyboard shortcuts and the ability to set up virtual displays make mapping in Arena a breeze.
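As a rough illustration of the arithmetic behind a pixel map like this (not Wolves' actual layout or workflow), here is a tiny sketch that packs a handful of LED surfaces side by side into a 1080p input map and prints the source rectangle for each slice. The tile names and resolutions are invented example values.

[code]
# Illustrative only: lay example LED tiles out left-to-right in a 1920x1080
# input map so standard 1080p content sweeps across every surface.
CANVAS_W, CANVAS_H = 1920, 1080

# (name, width_px, height_px) of each LED surface at its native resolution.
tiles = [
    ("fascia", 768, 192),
    ("tower_left", 384, 960),
    ("tower_right", 384, 960),
    ("header", 256, 128),
]

x = 0
for name, w, h in tiles:
    if x + w > CANVAS_W or h > CANVAS_H:
        raise ValueError(f"{name} does not fit in the input map")
    # Centre each slice vertically so horizontal motion reads consistently.
    y = (CANVAS_H - h) // 2
    print(f"{name}: source rect x={x}, y={y}, w={w}, h={h}")
    x += w
[/code]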
From the Imperial guard to Nucleya’s crazy stage to the current tour rig with Flux, how do you come up with different LED designs? Is there a brief you follow? Or do you adapt as per the content that you visualize?
There’s never a hardcore brief in play, just an intense few weeks of conversation between us and the artists/clients we’re working with. Our inspirations personally come from larger-than-life cinematic universes: think Guillermo del Toro explaining Pacific Rim for the first time to a boardroom of studio execs, and you know the kind of take we go in swinging with. Often we get so invested in bringing our content to life that it becomes part of larger narratives we rarely get to talk about in a field like live visuals, but it definitely helps us tie it all together. We can give you three examples:
1) When Nucleya’s team approached us for his Bass Yatra Tour, we had around 600-700 sq ft of LED available and just as many different versions of the LED layout to explore. We do a few days of pure illustration work over these layouts. Sometimes things just don't click and we break it up and start again. There’s a moment where the practicality of having this layout on stage meets the insane war demon sketch you’ve been rooting for from the start, and then we all sleep well at night.
2) Some of the bigger stages we’ve done like the ‘Outer Realm’ stage at Beyond Wonderland in Los Angeles throw a real curve ball at us. The stage had multiple narrow arches extending from the stage over the audience with very little room for seamless content.
We took it upon ourselves to create two worlds, loosely themed as an Enchanted Forest and a Lost Ship, and somehow adapt them to the whole spread of LED. Keep in mind this was for a stage hosted by the incredible Bassrush crew, and we knew things were about to get heavy. Those pleasant environments gave way to a forgotten labyrinth of cogs and pipes powered by a mechanical eye and a set of makeshift Icarus wings - and it all fit!
3) Fast forward to the Flux Pavilion tour, and we’re working with a limited itinerary. The production company and label (Bigabox Productions and Circus Records) stock their LED in-house and had to scale it down to a realistic number of panels that would tour safely across all cities and fit in every venue. When they gave us the LED layout and pixel map, we were determined to follow through and not fall back on a 16:9 screen.
The other half of the conversation was with Flux Pavilion himself who had a very distinct vision of these worlds he wanted us to create for different sections of his set. It really pushed us to get a collective vision across that would fit on four sections of LED (one being the fascia) and still be immersive enough to draw audiences in every night.
We ended up with one hell of a ride: a quirky intro feature that draws people from a scenic British landscape into the strange worlds of Flux Pavilion. At one section of the set, we have a bionic ship carrier take over the LED and transport him between these worlds. At another, a mad-scientist rig of electricity pylons takes over the screens and charges up flags made of coloured electricity. It gets weirder as we go on, and we’re pretty much developing these worlds as we go along on tour. Every night’s had a great response and we can’t wait to see what the show turns into.
Tell us about your content creation process, from scratch to the final render. What software do you use, and how much time does one clip take, on average?
We throw ideas back and forth to settle on a base LED design or projection surface. Once that’s finalized, we start to sketch over it; not only does it help control our ideas, it also ensures we’re creating something that will function at every step.
The sketches are then scanned, digitized and modelled in a 3D realm. We then gather up as many parts as possible to see what we can play around with in terms of animation. We start off with Illustrator to vectorize the sketches and then move on to Cinema 4D for the 3D modeling and animation. After that, it’s taken into After Effects for final compositing and animation.
We have a great team on board that makes doing all of this such a breeze. They work round the clock and deliver spot on content. Shout out to them!
It can take anywhere from one day to around three or four days, based on how complex the clip is in terms of 3D animation, texturing and lighting.
Let's take a walk through your studio. What hardware do you use, what is your most prized possession and what would you like to change/ upgrade?
Like most VJs out there, we've realized that MacBook Pros just don’t cut it anymore. The heating issues, the performance of the AMD cards, and throw in the 'Donglepocalypse': it's a no-brainer. We've switched to Windows laptops for our tour machines.
We're currently using the Octane II from PC Specialist.
Specs:
Intel® Core™ i7-6700K quad-core processor
32 GB RAM
GeForce GTX 1070
1 TB Samsung SSD + 1 TB Seagate hybrid drive
For display outs they have 2 x Mini DisplayPort, 1 x HDMI and 1 x USB-C/Thunderbolt 3.
For our live feed needs we use the Magewell HDMI to USB 3.0 capture device.
We’re using multiple Windows servers for content generation, with dual GTX 1080 GPUs, an Intel i7-5960X CPU, 32 GB of RAM and 2 TB of SSD storage in each machine.
Our most prized possession would have to be the TMNT statues we have based on art by James Jean.
What process do you follow live? Do you prefer freestyling or is Midi/ SMPTE your friend?
We’re firm believers in doing everything live. We make our content in layers with alpha so we can explore different versions easily and no two sets end up being alike. It keeps the set interesting for us and helps us play it out with an instrument just like any performing artist would.
It also helps us build upon the content on the road and add more layers to it as our heads come up with something. If we go a bit too far we use Resolume’s crazed set of essential effects to roll out of a situation and come back with another banger. True story.
Is there anything more you would like to talk about?
We're playing around with expanding the Wolves brand because a lot of people seem to connect with it. It's all non-profit and DIY at this stage. Our first step was starting a merch line that we want to promote within an immediate community of artists before it makes its way out to a larger group of people. We started handing pieces out ourselves at shows: backstage, outside venues, on long tours and, of course, to all the wonderful crew we've had a chance to work with at front of house. It's like marking our path through the trenches of the creative industry with a sort-of-cult symbol, rather than it becoming a household brand.
For people in India who'd like to represent, we've partnered with Redwolf to stock the latest 'MK-II' designs online: http://www.redwolf.in/wolves
We've also been toying around with the tag 'New Wolf Order' to host a series of video content. At present, we're running with a vlog called Transmissions.
It's really rough, surreal first-person edits of what it's like to be at a Wolves show. Josh and I often argue that some of the videos lose the point entirely, haha, so hopefully we'll promote it better soon and start featuring more from a technical standpoint. If there's an audience for that, we're game.
Thanks guys for taking the time out to talk to us and giving us all these cool details.
Next up for Wolves: The Basspod Stage at EDC, Las Vegas.
Howl at ya’ll there.
Photo Credits: BRXVN. Follow him on Facebook & Instagram
New Footage Releases: Program patterns
Greetings programs! Hop on your lightcycles and enter the grid because this time we've got three absolute bangers for you.
WTFlow gives a new spin on a familiar theme. The slight offset on the color animation gives this pack a deliciously organic feel.
Get Metalive from Resolume Footage
Unit44 is back with the sequel to Pattern.
Get Pattern 2 from Resolume Footage
And Ostwerker gets in on that Tron action.
Get Trontastik from Resolume Footage
Visual Pixation with Rick Jacobs
Our quest for excellence in the visual space has now brought us to Rick Jacobs of RJB visuals.
Currently touring with Nicky Romero as the man behind the operation and visual design of his entire show, Rick does work that is epic on a massive scale. [fold][/fold]
What do we love about him?
He makes some great, heavily detailed content which is then displayed perfectly in sync with what Nicky is doing on stage. I, personally, love the magnitude & depth with which he portrays infinity, space and the inexplicable wonders of it.
We reached out to Rick to talk to us, and throw some light on the great work he is doing.
What is touring with Nicky like? When did this great journey begin & how would you say you have grown with it?
It started 4 years ago; my first show with Nicky was Ultra 2013, on the Main Stage. I was so nervous, with everybody at home watching: my friends, my family. Before that I had always VJ'd at clubs with just one output. So, for Ultra, I brought 2 laptops to handle multiple outputs - being the newbie I was back then ;)
Nicky and the team were impressed with that first show and offered me the chance to tour with them. I chose to finish school first, because there were just 3 months left. I graduated as a game designer and developer, and missed my graduation ceremony as I went straight to Vegas to tour with Nicky.
When I finished the tour I started RJB Visuals and teamed up with my brother Bob, who was studying game art. Our teamwork was put to the test immediately: we needed to conceptualize and create a 5-minute intro visual in 3 days!
Nowadays we plan a month for an intro; this has become kind of our signature.
Here are links to some intros: Novell & Omnia
It’s been a really awesome journey so far; Nicky and the team trust Bob and me with the whole visualization of the show. When I started, they more or less had just the Guy Fawkes mask, so I had the freedom to design and develop a whole new visual style for his shows, which was really great!
Here is a sneak peek at the latest intro for Nicky:
You and the LD do a great job live. How much of a role does SMPTE play in this & how much is freestyle?
The first 2 years that I toured with Nicky, we didn’t have an LD. After that, Koen van Elderen joined the team and I couldn’t have been happier! The guy is great, he programs really fast and we come up with new things while we are doing the show. We just understand each other immediately.
The whole show is freestyle; we never use SMPTE. It keeps us focused. Also, I don’t link all visuals to songs. One day a song has these visuals, the next day you’ll see something different; it depends on what colors Koen and I yell at each other.
For all lyrics I use cue points, so as soon as I hear Nicky mixing in a track with vocals, I’ll ready up the visual and start cueing it.
From on-point strobes, to perfect transitions, to super color changes: there’s gotta be a lot of concentration, communication & practice involved between you and Koen.
Like I said, Koen and I are just really on the same page. We make up new stuff during the show and remember it for the next show. We normally don’t receive a playlist or a lot of info on his set, so we often get some nice surprises and have to come up with something along the way.
It usually goes something like this: “If you take the TISKS, I’ll take the BOEMS.” “Sure thing.” Or whenever there are really powerful accents in a song, we look at each other and ask, “Do you want to take these or shall I take them?” Haha!
It’s fun to change stuff around now and then.
Also, at each outro of a song we turn to each other, one of us will say the next color, and we change it at the same time when the song changes over. Or, if it’s a familiar song with its own visuals, we both already know what to do, or I make hand gestures of the visual that is coming up next so he will know the color. Sometimes I will be stone-faced, miming a sea with my hands, and he will know which visual is coming up.
What are your favorite effects & features on Resolume, that you would encourage VJs to incorporate into their show?
Mostly my effects are small flashy clips linked to buttons on my MIDI panel, but my knobs are linked to various screenshake/twitch/strobe effects. Mainly all sorts of small effects to highlight certain melodies or accentuate the bass.
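Resolume's own MIDI mapping handles exactly what Rick describes, so the following is just a hypothetical sketch of the underlying idea: a knob's control-change value scaled into an effect amount. The device name, CC number and OSC address are placeholders, and the Python approach with mido and python-osc stands in for what Arena does natively.

[code]
# Hypothetical sketch only; Resolume maps MIDI knobs to effects natively.
# Reads a control-change knob and scales 0-127 into a 0.0-1.0 parameter.
import mido
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)   # assumed OSC input port
KNOB_CC = 48                                     # placeholder CC number
EFFECT_ADDRESS = "/composition/video/effects/shake/amount"  # placeholder address

with mido.open_input("MIDI Panel") as port:      # placeholder device name
    for msg in port:
        if msg.type == "control_change" and msg.control == KNOB_CC:
            resolume.send_message(EFFECT_ADDRESS, msg.value / 127.0)
[/code]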
What brief/ thought process do you follow while designing content for the show. We see a whole range from nature to space to waterfalls to abstract.
We try to create contrast between drops and breaks by changing the color scheme, style and pace, while at the same time trying to keep the transitions as fluid as possible. Break visuals for Nicky Romero's show are often desaturated, black-and-white, realistic-looking visuals, while the drop visuals are full of flashing neon colors and abstract shapes loosely based on the realistic-styled visual. Putting these completely different styles together in one song works as a great crowd booster.
The risk of mixing these completely different styles after each other is that it could lead to too harsh of a transition. We're not big fans of fade-ins, so several visuals have an actual intro clip that will autopilot to the next clip, which is a loop. They're sort of a 'special fade-in' to a loop, starting from black and having the visual's scene unfold in a smooth transition.
Here are some Intro Clips:
Talk to us about your go to software & hardware (both for content creation & operation).
Most of our content is created in Cinema 4D with the Octane renderer. For all the intros we use Autodesk Maya; since we have a history in game design and development, we were pretty used to working in Maya or 3ds Max at school. It just has a few more options to get the exact look you want for the intro.
When we started creating short visual loops, we soon realized Cinema 4D is much more straightforward for creating visuals. For post we use After Effects. And, of course, for VJing: Resolume!
As for hardware, I’m using an MSI GT73VR 6RF Titan Pro and the Akai APC40MK2.
Tell us about your studio. What gear is an absolute must have, and what would you like to change/ upgrade?
My studio isn’t that great actually, haha. We have a small office in Limburg at my parents’ place. One of our employees is also from Limburg, so half of the week we’re working in Limburg and the other half in Utrecht.
We have a small setup in my apartment in Utrecht, my brother lives with me so it’s easy to work from home. In the near future we’re planning to get an office as we’re expanding and looking for new people to work with.
As for an upgrade, I really need more render power, haha; with this 4K visual content, rendering is a nightmare.
Any words of advice for budding visual artists out there?
Less is more! Don’t layer 6 completely different visuals on top of each other and mix them every beat. It can become chaos really easily. Also black is your friend, leave enough black space to make your visual really pop out.
Is there anything else you would like to talk about? We would love to hear it.
Our most recent development is that we’re starting a joint venture called Visual Lab with Hatim. Hatim was the reason I started VJing for Nicky, and over the years we built a great bond, as he is Nicky Romero’s tour/production manager.
As probably all of us here know, talking and arranging gigs/assignments is the least fun part of our job so it seems like a great idea to have someone do that for us. It also seems like the next big step for our company and will lead to us hiring more talented vj’s and content creators.
Also, recently we’ve been working on creating a more generic visual pack we would like to sell on Resolume.
It’s interesting creating visuals that are not for your own use, because normally we create pretty specific visuals for certain parts of the show. Now we need to forget about that and create visuals that can be used by anyone in any situation. It’s good practice. I think we have come up with a pretty cool mix of modern-styled visuals and classic kaleidoscopic visuals for your enjoyment. :)
And, on a last note, we are working on a VR horror escape room game in between all the visual work. Got to keep those university skills going! :D
If you’re interested we’ll post something about it on our social media in the future.
*Shudders* Oooh this gave us chills.
Thanks for doing this interview Rick. We all look forward to those visual packs from you, and wish you so much success with Visual Lab.
With skills like that, you’re miles ahead already :)
Check out some more of Rick’s work here
Credits:
Rick and Bob, RJB Visuals + Visual Lab
Follow them on: Instagram & their website
Touring currently with Nicky Romero, and the man behind the operation and visual design of his entire show, the stuff that Rick does is epic of massive proportions. [fold][/fold]
What do we love about him?
He makes some great, heavily detailed content which is then displayed perfectly in sync with what Nicky is doing on stage. I, personally, love the magnitude & depth with which he portrays infinity, space and the inexplicable wonders of it.
We reached out to Rick to talk to us, and throw some light on the great work he is doing.
What is touring with Nicky like? When did this great journey begin & how would you say you have grown with it?
It started 4 years ago, my first show with Nicky was Ultra 2013, the Main Stage. I was so nervous, everybody at home watching, my friends, family. Before that I had vj’d at clubs with just 1 output always. So, for Ultra, I brought 2 laptops to handle multiple outputs - being the newby I were back then ;)
Nicky and the team were impressed with that first show and offered me to tour with them. I chose to finish school first, because it was just 3 months left. I graduated as a game design and developer and missed my graduation ceremony as I went straight to Vegas to tour with Nicky.
When I finished the tour I started RJB Visuals and teamed up with my brother Bob who was studying game art. Our teamwork was put to the test, immediately. We needed to conceptualize and create a 5min intro visual in 3 days!
Nowadays, we plan 1 month for an intro- This has become kind of our signature.
Here are links to some intros: Novell & Omnia
It’s been a really awesome journey so far, Nicky and the team trust Bob and I with the whole visualization of the show. When I started, they more or less had just the guy fawkes mask, so I had the freedom to design and develop a whole new visual style for his shows, which was really great!
Here is a sneakpeak into the latest intro for Nicky:
You and the LD do a great job live. How much of a role does SMPTE play in this & how much is freestyle?
The first 2 years that I toured with Nicky, we didn’t have an LD. After that, Koen van Elderen joined the team and I couldn’t have been happier! The guy is great, he programs really fast, and we come up with new things while we are doing the show. We just understand each other immediately.
The whole show is freestyle; we never use SMPTE. It keeps us focused. Also, I don’t link all visuals to songs. One day a song has these visuals, the next day you’ll see something different- it depends on what colors Koen and I yell at each other.
For all lyrics I use cue points, so as soon as I hear Nicky mixing in a track with vocals, I’ll ready up the visual and start cueing it.
From on point strobes, to perfect transitions, to super color changes- there’s gotta be a lot of concentration, communication & practice involved between you and Koen.
Like I said, Koen and I are just really on the same page. We make up new stuff during the show and remember it for the next show. We normally don’t receive a playlist or a lot of info on his set, so we often get some nice surprises and have to come up with something along the way.
It usually goes something like this: “If you take the TISKS, I’ll take the BOEMS.” “Sure thing.” Or, whenever there are really powerful accents in a song, we look at each other and ask, “Do you want to take these or shall I take them?” Haha!
It’s fun to change stuff around now and then.
Also, at the outro of each song, we turn to each other, one of us calls the next color, and we change it at the same time as the song changes over. If it’s a familiar song with its own visuals, we both already know what to do, or I make hand gestures of the visual that is coming up next so he knows the color. Sometimes I will be stone-faced, miming a sea with my hands, and he will know which visual is coming up.
What are your favorite effects & features on Resolume, that you would encourage VJs to incorporate into their show?
Mostly my effects are small flashy clips linked to buttons on my MIDI panel, but my knobs are linked to various screenshake/twitch/strobe effects. Mainly all sorts of small effects to highlight certain melodies or accentuate the bass.
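For anyone curious how that kind of knob-to-effect control could be driven from a script rather than a hardware controller, here is a minimal Python sketch using the mido library. It assumes a virtual MIDI port (an IAC bus on macOS or loopMIDI on Windows, for example) that Resolume listens to, and a CC number you have MIDI-learned onto an effect parameter- the port name and CC number below are placeholders, not Rick's actual mapping.

# Minimal sketch: drive a MIDI-learned Resolume effect parameter from a script.
# The virtual port name and CC number are placeholders, not a real show mapping.
import time
import mido

PORT_NAME = "VJ Bus"   # virtual MIDI port that Resolume is listening to
CC_NUMBER = 21         # CC you MIDI-learned onto, say, a strobe or shake amount

with mido.open_output(PORT_NAME) as port:
    # Sweep the "knob" from 0 to 127 and back, like riding a fader on a build-up.
    for value in list(range(0, 128, 8)) + list(range(127, -1, -8)):
        port.send(mido.Message("control_change", control=CC_NUMBER, value=value))
        time.sleep(0.02)

Resolume's MIDI-learn doesn't care whether the CC comes from an APC40 or a script, which is what makes this kind of experiment cheap to try.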
What brief/thought process do you follow while designing content for the show? We see a whole range, from nature to space to waterfalls to abstract.
We try to create contrast between drops and breaks by changing the color scheme, style and pace, while at the same time trying to keep the transitions as fluid as possible. Break visuals for Nicky Romero’s show are often desaturated, black-and-white, realistic-looking visuals, while the drop visuals are full of flashing neon colors and abstract shapes loosely based on the realistic visual. Putting these completely different styles together in one song works as a great crowd booster.
The risk of putting such different styles right after each other is that it could lead to too harsh a transition. We’re not big fans of fade-ins, so several visuals have an actual intro clip that autopilots to the next clip, which is a loop. They’re sort of a ‘special fade-in’: the intro starts off black and lets the visual’s scene unfold in a smooth transition into the loop.
Here are some Intro Clips:
Talk to us about your go-to software & hardware (both for content creation & operation).
Most of our content is created in Cinema 4D with the Octane renderer. For all the intros we use Autodesk Maya; since we have a history in game design and development, we were used to working in Maya and 3ds Max at school, and Maya just has a few more options to get that exact look you want for an intro.
When we started creating short visual loops, we soon realized Cinema 4D is much more straightforward for that. For post we use After Effects. And, of course, for VJing: Resolume!
As for hardware, I’m using an MSI GT73VR 6RF Titan Pro and the Akai APC40MK2.
Tell us about your studio. What gear is an absolute must have, and what would you like to change/ upgrade?
My studio isn’t that great actually, haha- we have a small office in Limburg at my parents’ place. One of our employees is also from Limburg, so half of the week we work in Limburg and the other half in Utrecht.
We have a small setup in my apartment in Utrecht; my brother lives with me, so it’s easy to work from home. In the near future we’re planning to get a proper office, as we’re expanding and looking for new people to work with.
As for an upgrade, I really need more render power, haha- with all this 4K content, rendering is a nightmare.
Any words of advice for budding visual artists out there?
Less is more! Don’t layer 6 completely different visuals on top of each other and mix them every beat- it can become chaos really easily. Also, black is your friend: leave enough black space to make your visuals really pop.
Is there anything else you would like to talk about? We would love to hear it.
Our most recent development is that we’re starting a collaboration called Visual Lab with Hatim. Hatim was the reason I started VJ’ing for Nicky, and over the years we’ve built a great bond, as he is Nicky Romero’s tour/production manager.
As probably all of us here know, talking and arranging gigs/assignments is the least fun part of our job, so it seems like a great idea to have someone do that for us. It also seems like the next big step for our company and will lead to us hiring more talented VJs and content creators.
Also, we’ve recently been working on a more generic visual pack we would like to sell on Resolume.
It’s interesting creating visuals that are not for your own use, because normally we make pretty specific visuals for certain parts of the show. Now we need to forget about that and create visuals that can be used by anyone in any situation. It’s good practice. I think we’ve come up with a pretty cool mix of modern-styled visuals and classic kaleidoscopic visuals for your enjoyment. :)
And, on a last note, we are working on a VR horror escape-room game in between all the visual work. Got to keep those university skills going! :D
If you’re interested we’ll post something about it on our social media in the future.
*Shudders* Oooh this gave us chills.
Thanks for doing this interview Rick. We all look forward to those visual packs from you, and wish you so much success with Visual Lab.
With skills like that, you’re miles ahead already :)
Check out some more of Rick’s work here
Credits:
Rick and Bob, RJB Visuals + Visual Lab
Follow them on: Instagram & their website
Ironed Out Resolume: Arena 5.1.4 & Avenue 4.6.4 Released
Ladies & Gentlemen! Please update your Resolumes. We have released Arena 5.1.4 and Avenue 4.6.4. These new versions fix a few crashes and iron out some wrinkles. Nobody likes a wrinkled Resolume while they're VJ-ing, so hit that download.
Resolume Arena 5.1.4 & Avenue 4.6.4
[FIXED] Crash when triggering clip before deck is loaded
[FIXED] Crash when contextual menu is open and active layer is switched
[FIXED] Crash when switching decks during transition
[FIXED] Desktop is shown for a moment when using cancel to exit the advanced screen setup
[FIXED] Invert toggle on AutoMask effect broken
[FIXED] Transition Mixer dropdown doesn't update when creating a new composition
[FIXED] MultiTask mixer no worky on 5.1.3 OSX 10.10
[FIXED] Right clicking parameters in the advanced output changes their range to -1.0...1.0
[FIXED] Request for RGBAW color space in DMX fixtures
[FIXED] Right clicking a slice parameter resets its value but does not apply it to the slice
[FIXED] Crash: Rotate a polygon slice in output map like a mad man
[FIXED] Distorted image when using odd sized image as a mask
[FIXED] Multiple point selection dragger gets perspective after dragging a perspective handler on a slice
The Ultimate Guide to Multiscreen Output with Resolume
“I want to connect a dozen and then some screens, what hardware should I get?”
We get this question quite often.
The question sounds simple, the answer is always complicated. It’s the same as asking: “I want to buy a house, which house should I buy?”
Consider us your multiscreen real estate agents. [fold][/fold] We like to help you make the right decisions and find the house that’s right for you. After gathering lots of use cases, possible problems and possible solutions, we came to this document, which will guide you through the overwhelming multiscreen adventure.
The adventure starts here!
You’ll find the essential do’s and don’ts of using Resolume for multiple outputs explained there. The different options are listed in order of preference, and it even contains a flowchart. Yay. Just answer the questions and you will be guided to your ideal dream home. One click and you’ll be taken to all the essential information you need on that snazzy 3-story condo with all-copper plumbing. We’ll try to avoid the shady parts of town, but if you like, we can show you some options in the extender hub ghettos as well.
Always remember, buying a large house is a big investment. Before going house hunting, you need to make sure your computer has enough pixel power in the bank to build that pretty picture. It would suck if you get all the gear together to run a beautiful 4 story pixel map and then realise your Intel Iris Pro chokes at more than a single bedroom NY apartment. When in doubt, check them benchmarks.
Here's that URL one more time, in case you missed it the first time
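To put some rough numbers on that ‘pixel power’ warning, here is a quick back-of-the-envelope calculation in Python. The output list and the 60 fps target are illustrative assumptions, not a recommendation; the benchmarks in the guide above are still what you should trust.

# Back-of-the-envelope pixel budget for a multiscreen setup.
# The outputs and frame rate below are illustrative assumptions only.
outputs = [
    (1920, 1080),  # main LED wall
    (1920, 1080),  # DJ booth banner
    (1280, 720),   # side screens fed from one output
]
fps = 60

pixels_per_frame = sum(w * h for w, h in outputs)
pixels_per_second = pixels_per_frame * fps

print(f"Canvas size: {pixels_per_frame:,} pixels per frame")
print(f"Throughput:  {pixels_per_second / 1e6:,.0f} megapixels per second at {fps} fps")

Keep in mind this only covers the output side; composition size, layer count and codec choice add to the load too, which is exactly why the guide asks about your computer before it asks about your outputs.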
Make Some Noisia
Dutch electronic megahouse Noisia has been rocking the planet with their latest album ‘Outer Edges’.
Photo by Diana Gheorghiu
It was a wait, but one that was truly worth it. ‘Outer Edges’ is essentially a concept album, and they pushed the boundaries on this one by backing it up with a ‘concept tour’.
An audio-visual phenomenon with riveting content, perfect sync & melt-yo-face energy, the Outer Edges show is one we simply could not skip dissecting.
[fold][/fold]
We visited Rampage, one of the biggest Drum & Bass events in the world, & caught up with Roy Gerritsen (Boompje Studio) & Manuel Rodrigues (DeepRED.tv), on video and lighting duty respectively, to talk about the levels of excellence the Noisia crew has achieved with this concept show.
Here is a look at Diplodocus, a favorite amongst bass heads:
Video by Diana Gheorghiu
Thanks for doing this guys! Much appreciated.
What exactly is a concept show and how is preparation for it different from other shows?
When Noisia approached us they explained they wanted to combine the release of their next album “Outer Edges” with a synchronized audio visual performance. It had been 6 years since Noisia released a full album so you can imagine it was a big thing.
Together, we came up with a plan to lay the foundation for upcoming shows. We wanted to focus on developing a workflow and pipeline to create one balanced and synchronized experience.
Normally, all the different elements within a show (audio, light, visual, performance) focus on their own area. There is one general theme or concept and everything then comes together in the end - during rehearsals.
We really wanted to create a show where we could focus on the total picture, and develop a workflow that would let us keep refining the show and push the concept across all the different elements quickly and effectively, without overlooking the details.
What was the main goal you set out to achieve as you planned the Outer Edges show?
How long did it take to come together, from start to end?
We wanted to create a show where everything is 100% synchronized and highly adaptable, with one main control computer that connects to all elements within the show in a perfectly synchronized way. This setup gave us the ability to find a perfect balance and narrative between sound, performance, lights and visuals. Besides that, we wanted a modular and highly flexible show, making it easy and quick to adapt or add new content.
We started with the project in March 2016 and our premiere was at the Let It Roll festival in Prague (July 2016).
The show is designed in such a way that it has an “open-end”. We record every show and because of the open infrastructure we are constantly refining it on all fronts.
What are the different gadgets and software you use to achieve that perfect sync between audio/video & lighting?
Roy: Back in the day, my graduation project at the HKU was a VJ mix tool built around the concept of “cue-based” triggering. Instead of the widely used timecode synchronization, where you premix all the content (the light and video tracks), we send a MIDI trigger for every beat and sound effect. This saves a lot of time in the content production process.
The edit and mixdown of the visuals are basically done live on stage instead of in After Effects. This means we don’t have to render out 8-minute video clips and can focus on just a couple of key visual loops per track. (Every track consists of about 5 clips, which get triggered directly from Ableton Live using a custom MIDI track.) Inside Ableton we group a couple of extra designated clips so they all get triggered at the same time.
For every audio clip we sequence separate MIDI clips for the video and lighting, which get played perfectly in sync with the audio. These MIDI tracks then get sent to the VJ laptop and Manuel's lighting desk.
We understand you trigger clips in Resolume from Ableton Live using the extremely handy Max for Live patches?
Yes, we sequence a separate MIDI track for each audio track. We divided each audio track into 5 different elements (beats, snares, melody, FX, etc.), which correspond to 5 video layers in Resolume.
When a note gets triggered, a Max for Live patch translates it to an OSC message and sends it off to the VJ laptop. The OSC messages get caught by a small tool we built in Derivative’s TouchDesigner. In essence, this tool translates the incoming messages into OSC messages which Resolume understands- basically operating Resolume automatically with the triggers received from Ableton.
This way of triggering video clips came out of an experiment by Martin Boverhof and Sander Haakman during a performance at an art festival in Germany a couple of years ago. Only two variables are used- triggering video files and adjusting the opacity of a clip- and we were amazed at how powerful these two variables are.
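Their TouchDesigner bridge isn't public, but the idea is simple enough to sketch. Below is a minimal Python stand-in using the python-osc library: it listens for trigger and opacity messages (the kind a Max for Live patch could send) and re-emits them as Resolume-style OSC. The incoming addresses, ports and Resolume addresses are placeholders we made up for illustration- check the OSC addresses your own Resolume version exposes before building on anything like this.

# Minimal stand-in for the Ableton -> TouchDesigner -> Resolume bridge described
# above: receive trigger/opacity messages and re-send them as OSC for Resolume.
# All addresses and ports are placeholders, not the real show patch.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)  # Resolume's OSC input port (configurable)

def trigger_clip(address, layer, clip):
    # e.g. incoming "/ableton/trigger 2 5" -> connect clip 5 on layer 2
    resolume.send_message(f"/composition/layers/{int(layer)}/clips/{int(clip)}/connect", 1)

def set_opacity(address, layer, value):
    # e.g. incoming "/ableton/opacity 2 0.8" -> set layer 2 opacity to 80%
    resolume.send_message(f"/composition/layers/{int(layer)}/video/opacity", float(value))

dispatcher = Dispatcher()
dispatcher.map("/ableton/trigger", trigger_clip)
dispatcher.map("/ableton/opacity", set_opacity)

# Listen on the port the Max for Live patch sends to (placeholder).
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()

The appeal of the setup is exactly what Roy describes: the sequencing lives in Ableton, and the bridge only has to know about two things- connecting clips and setting opacity.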
Regarding lighting, we understand the newer Chamsys boards have inbuilt support for MIDI/ timecode. What desk do you use?
Manuel: To drive the lighting in the Noisia Outer Edges show I use a ChamSys lighting desk. It is a very open environment: you can send MIDI, MIDI Show Control, OSC, timecode (LTC & MTC), UDP, serial data and of course DMX & Art-Net to the desk.
ChamSys support is extremely good and the software version is 100% free. Compared to other lighting desk manufacturers, the hardware is also much cheaper.
A lighting desk is still much more expensive than a MIDI controller, though.
They might look similar, as both have faders and buttons, but the difference is that a lighting desk has a brain.
You can store, recall and sequence states- something which is invaluable for a lighting designer, and which is now happening more and more in video land as well.
I have been researching how to bridge the gap between Ableton Live and ChamSys for 8 years.
This research led me to M2Q, short for Music-to-Cue, which acts as a bridge between Ableton Live and ChamSys. M2Q is a hardware solution designed together with Lorenzo Fattori, an Italian lighting designer and engineer. It listens to MIDI messages sent from Ableton Live and converts them into ChamSys remote control messages, providing cue triggering and playback intensity control.
M2Q is a reliable, easy and fast lighting sync solution, and it enables non-linear lighting sync.
When using timecode it is impossible to loop within a song, do the chorus one more time or alter the playback speed on the fly; you are basically limited to pressing play.
Because our lighting sync system is MIDI-based, the artist on stage has exactly the same freedom that Ableton's audio playback offers.
Do you link it to Resolume?
ChamSys has a personality file (head file) for Resolume, and this enables driving Resolume as a media server from the lighting desk. I must confess that I have been considering switching to Resolume for some time now, as it is a very cost-effective and stable solution compared to other media server platforms.
Video by Diana Gheorghiu
Tell us about the trio’s super cool headgear. They change color, strobe, are dimmable. How?!
The LED suits are custom designed and built by Élodie Laurent; they are basically 3 generic LED par cans and have similar functionality.
They are connected to the lighting desk just as the rest of the lighting rig and are driven using the same system.
Fun fact: These are the only three lights we bring with us so the Outer Edges show is extremely tour-able.
The Noisia content is great in its quirkiness. Sometimes we see regular video clips, sometimes distorted human faces, sometimes exploding planets or mechanical animals- what’s the thought process behind the content you create? Is it track specific?
The main concept behind this show is that every track has its own little world in this Outer Edges universe. Every track stands on its own and has a different focus in style and narrative.
Nik (one third of Noisia & art director) assembled a team of 10 international motion graphic artists, and together we took on the visualization of the initial 34 audio tracks. Cover artwork, video clips and general themes from the audio tracks formed the base for most of them.
Photo by Diana Gheorghiu
Photo by Diana Gheorghiu
The lighting & video sync is so on point, we can’t stop talking about it. It must have taken hours of studio time & programming?
That was the whole idea behind the setup.
Instead of baking the synchronization into the light and video tracks, we separated the synchronization process from the design process. Meaning: we sequence everything in Ableton, and on the content side Resolume takes care of the rest. Updating a VJ clip is just a matter of dragging a new clip into Resolume.
This also resulted in Resolume being a big part of the design process (instead of only serving as a media server, as it normally would).
During the design process we run the Ableton set and watch how the clips get triggered; if we don’t like something, we can easily replace a video clip with a new one or adjust, for instance, the scaling inside Resolume.
Some tracks which included 3D-rendered images took a bit longer, but there is one track, “Diplodocus”, which took 30 minutes to make from start to finish. It was originally meant as a placeholder, but after seeing it synchronized we liked its simplicity and boldness and decided to keep it in the show.
Here is some more madness that went down:
Video by Diana Gheorghiu
Is it challenging to adapt your concept show into different, extremely diverse festival setups? How do you output the video to LED setups that are not standard?
We mostly work with our rider setup, consisting of a big LED screen in the back and an LED banner in front of the booth, but in the case of bigger festivals we can easily adjust the mapping setup inside Resolume.
In the case of Rampage we had the extra challenge of coming up with a solution to operate with 7 full HD outputs.
Photo by Diana Gheorghiu
Normally Nik controls everything from the stage and we have a direct video line to the LED processor. Since all the connections to the LED screens were located at front of house, we used 2 laptops positioned there.
It was easy to adjust the Ableton Max for Live patch to send the triggers to two computers instead of one, and we wrote a small extra tool that sends all the MIDI controller data from the stage to FOH (to make sure Nik was still able to operate everything from the stage).
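That stage-to-FOH forwarder isn't described in detail, so here is one minimal way the idea could be sketched in Python, using mido and a plain UDP socket. This is not the actual Noisia tool- the port names and IP address are placeholders.

# Minimal sketch of forwarding MIDI controller data from the stage laptop to a
# FOH laptop over UDP. Not the actual Noisia tool; names and IP are placeholders.
import socket
import mido

FOH_IP, FOH_PORT = "192.168.1.20", 5005  # placeholder address of the FOH laptop

def run_stage_side(midi_in_name="Stage Controller"):
    """On stage: read the controller and push raw MIDI bytes over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with mido.open_input(midi_in_name) as port:
        for msg in port:
            sock.sendto(bytes(msg.bytes()), (FOH_IP, FOH_PORT))

def run_foh_side(midi_out_name="FOH Out"):
    """At FOH: receive the bytes and replay them into a local MIDI port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", FOH_PORT))
    with mido.open_output(midi_out_name) as port:
        while True:
            data, _ = sock.recvfrom(64)
            port.send(mido.Message.from_bytes(data))

if __name__ == "__main__":
    run_stage_side()  # run run_foh_side() on the FOH machine instead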
Talk to us about some features of Resolume that you think are handy and would advise people out there to explore.
Resolume was a big part of the design process for this show. Using it almost as a mini After Effects, we stacked up effects until we reached our preferred end result. We triggered scaling, rotation, effects and opacity using the full OSC control Resolume offers. This makes it super easy to create spot-on synchronized shows with a minimal amount of pre-production.
This, in combination with the really powerful mapping options, makes it an ideal tool to build our shows on!
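As a tiny illustration of that kind of OSC control, the snippet below ramps a layer's opacity over two seconds with python-osc. The address and port are placeholders again; in practice you would use whatever parameter addresses your Resolume version exposes, and probably drive them from your sequencer rather than a loop like this.

# Tiny illustration of driving a Resolume parameter over OSC: fade a layer's
# opacity from 0 to 1 over two seconds. Address and port are placeholders.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7000)      # Resolume's OSC input (configurable)
OPACITY = "/composition/layers/1/video/opacity"  # placeholder parameter address

steps, duration = 60, 2.0
for i in range(steps + 1):
    client.send_message(OPACITY, i / steps)
    time.sleep(duration / steps)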
What a great interview, Roy & Manuel.
Thanks for giving us a behind-the-scenes understanding of what it takes to run this epic show, day after day.
Noisia has been ruling the Drum & Bass circuit, for a reason. Thumping, fresh & original music along with a remarkable show- what else do we want?
Here is one last video for a group rage :
Video by Diana Gheorghiu
Rinseout.
Credits:
Photo credits, Noisia setup: Roy Gerritsen
Adhiraj, Refractor for the on point video edits.
New Footage Releases: Abstractacadabra
Because abstract gots that magic that looks good on all the screens all the time
Laak drops another rectangular banger with SQR.
Get SQR by Laak from Resolume Footage
Artificially Awake was up all night trying to get things into Focus.
Get Focus by Artificially Awake from Resolume Footage
And Ican Agoesdjam knows that fiber is good for you.
Get FiberGlitch by Ican Agoesdjam from Resolume Footage
Footage Release: Down the Rabbithole
A little late, but worth the wait. This month, we have 3 artists taking you down a rabbithole of dreaming computers, analog fuzzy feels and Wonderland itself.
Catmac taps into the electronic shapes of a dreaming AI in Virtual Memory
Get VirtualMemory by Catmac from Resolume Footage.
Ghosteam makes you larger, and Ghosteam makes you small.
Get Wonderland by Ghosteam from Resolume Footage.
And Ablaze Visuals knows that texture counts.
Get Lights by Ablaze Visuals from Resolume Footage.