Resolume Arena 5.1 & Avenue 4.6 Level Boss
You know how it goes.
You think you know how this game works, and that you've got the level boss' attack pattern all figured out. You know his weak points, and all you need to do is wait for him to step left so you can shoot at that gap in his armor. Then, just as he reaches half health, he mutates and is back to full health. And he's full of new tricks.
The Arena 5.1 / Avenue 4.6 release is Resolume's Boss Fight. New features, new tricks, new workflows. You know, to keep you on your toes.
Hit that download!
[fold][/fold]
New:
- ArtPoll and Art-Net Unicast
- Much lower memory usage when using still images and non-DXV movies
- Load images as guides in Input and Output mapping
- Export slices to .svg
- Clip Menu Item: Strip Audio
- Type 4 space-separated numerical values to enter a slice's top, left, width and height coordinates
- RGBA fixture colorspace
- Maintain fixture scale and proportion when switching fixtures
- Duplicate lumiverse
- Button to toggle snapping
- Multiselect and drag slices between and within screens
- Roll over rotation range for slices
- Mouse hover popup in DMX Output window that shows the channel value as a number
- When deleting a column with an active clip, make the first clip in the previous column the active one
- When moving a Slice mask completely outside of a slice, paint the slice outline red
- When creating a new fixture, automatically start editing its name
Fixed:
- Blackmagic output broken in 504
- Soft edge params are universal
- Resolume does not accept Art-Net messages coming from the same computer
- Twitch crashes on OSX when moving it from clip to layer or comp
- Dragging a slice to a differently sized screen adjusts the slice's output width & height
- RGBW(A) fixtures don't output the correct values.
- Apparently there is now also a RevB for the Enttec Mk2
- Unnecessary UndoManager memory increase
- Alpha dropdown does not show on DXV content with alpha when Quicktime is not installed
- Params missing from Slice multiselect
- Slices are snapped without moving the mouse
- Art-Net output already in use by different entry doesn't update its error message when the output is freed
- Preview flickers on active clip with alpha
- Massive FPS drop after loading a different preset or after undo
- 360 softedging places the blend in the entire area outside the comp
- Advanced output window covers all other system menus
- Snapping works only for slices on the selected screen
- Enabling autopilot after clip has looped makes it jump to the target clip immediately
- Softedge is applied on disabled slices
- SoftEdge textures don't get updated when slice is moved through params
- DMX automap default value for Master Speed is off by 1
- Soft edge has a black line.
- Fixture is missing right click options
- Use computer name as default node name instead of "Resolume"
- Y Mirror Broken in Output
- No right click context menu for masks?
- CMD/CTRL-A in the Advanced Output gets passed to deck
- Ghost rendering when resizing slice using transformer draggers
- advanced.xml can cause Arena to crash on startup
- Duplicate plugin folders are scanned twice
- Reconnecting a clip with different length results in wrong playback speed
- X,Y controls missing when editing the points of an input mask
- Performance drops with a *lot* of slices
- Windows version loads hidden osx files as blank clips
- Clip preview is straight alpha, clip-in-layer preview is premult
- Composition bypass doesn't bypass layer routed slice
- Lumiverse output appears empty
- BPM stops when resync gets triggered by DMX
- Can not move perspective point on output slice with keyboard arrow keys
- Stingy Sphere on a layer gets the wrong viewport values
- Time Remaining shows time till clip end, not till clip outpoint
Arena 5.1.1 & Avenue 4.6.1 Update:
- [FIXED] Master Fader & Composition Fade Out broken when using the output menu
- [FIXED] Solid Color Effect Show Image parameter not working
- [FIXED] Spout via Advanced output not working
Arena 5.1.2 & Avenue 4.6.2 Update:
- [FIXED] Crash during startup
- [FIXED] Advanced screen setup and outputs not working in OSX 10.8 and lower
- [FIXED] Bezier outlines rendering is broken
- [FIXED] Certain blend modes still show white
- [FIXED] SVG output does not take into account perspective warping
- [FIXED] Crash when creating and deleting a fixture on Windows
We now offer installers that do not include example footage so your download is much smaller & quicker. You'll find this link in the Other Version dropdown on the download page.
Resolume Blog
This blog is about Resolume, VJ-ing and the inspiring things the Resolume users make. Do you have something interesting to show the community? Send in your work!
Highlights
Footage Releases: Reflections on the Past and an Eye to the Future
We're proud to announce the first official release by Diana Gheorghiu. After sneaking in FerroCious under an alias, she's now back with the amazing Progressions:
Get Progressions *exclusively* from Resolume Footage.
Ghosteam is back with an amazing eye for detail and gorgeous reflections.
Get LightRooms from Resolume Footage.
And Ostwerker brings an HD update to his classic PartikelStorm.
Get PartikelStorm 2 *exclusively* from Resolume Footage.
Artist Profile: Richard De Souza // Manoeuvre
Say what you will about Facebook, it's an amazing medium to get in touch with people. You can get the skinny on what's happening on the other side of the world, and see things you would otherwise never know about. Case in point: Richard De Souza. I got connected to Richard about a year ago, and since then, my newsfeed started filling up with one jaw dropping stage design after another. Seems like the Australian continent, besides spiders, is hosting buckets of raw talent as well.
Who are you and how do you spend your day?
I am the director of MNVR (Manoeuvre); an Australian based new media company primarily focussed on concert visuals and interactive. I’m always working on something. If not in production mode I’m either VJing or developing custom VJ tools. I love tinkering and I spend most of my free time progressing my art and process.

[fold][/fold]
How and when did you get started in this whole VJ circus?
I’ve had a love of music for as long as I can remember though my strengths have always been with the visual arts. In 1999 The Big Day Out, a national touring festival here in Australia, put a callout for video artists. A friend and I threw a reel together and were selected to do visuals for the Big Day Out’s Boiler Room. That summer we were VJing in front of thousands of people for what was then Australia’s largest festival - I was hooked. It was all Betacam and VHS back then, cueing tapes and mixing cams. VJing at The Big Day Out became my summer hobby for many years and I would spend all my spare time working on show content. Live visuals in the festival dance scene were new here. There were no rules and few points of reference so I felt free to do what I liked. It was hugely exciting being there so early in the Australian scene and has given me the opportunity to operate visuals for the likes of Aphex Twin, Justice, Avicii, The Bloody Beetroots to name a few at various points in their careers.
The industry has changed so much from the early days. Visuals have become a respected and essential part of live performance. As more electronic music festivals took off I made the leap from the architectural industry to full time professional VJ, 3D artist and content producer.
You do a lot of amazing custom tour content. What's your process for coming up with the overall identity of the show and the look and feel of each song individually?
It's always about representing the artist to the best of my ability - put the show first - it sounds simple but many people get that wrong. Vision is there to strengthen or amplify the experience of the music; it should never detract or take over. My process varies on a show-to-show basis. Sometimes minimal visuals and different techniques should be employed but I decide this by getting to know the artist.
The best shows are either a great pairing of musician with like-minded visual artist or an environment that allows symbiosis. It’s really just about gauging the intent of the artist and their music, taking the time to listen and study their person, catalogue, performances and media. If you have spent some time with an artist it's pretty easy to translate their personalities and music visually. Music is highly emotive anyway; it already carries a ton of thematic content within to draw from.

When dealing with single tracks I always consider the hero shot. This is generally a compositionally strong or thematic element that identifies the peak moments of a track. By placing these hero moments throughout the set you can avoid motion graphics run on, where an hour of mograph blends into a singular moment. The idea is that the audience will identify or recall key imagery attached to a particular song strengthening the emotional impact of the show. If you orchestrate these peaks and troughs you avoid a visually flat show.
You also do incredible stage designs. What's your stage design process like? Do you have ideas for specific visuals and design a stage around that? Do you also design the light plan and overall stage deco?
I've done a fair bit of concept design over the years. With my background in architectural visualisation and animation I can flesh out the initial concept to a point where I can sell an idea pretty easily. I’ve also been fortunate enough to work on some big stages at festivals like Stereosonic and Future Music Festival so I have an understanding of what works and what doesn’t.
I tend to think in 3D regardless. Much of my VJ content is 3D animation so stage design for me is just another structural element in a larger meta 3D composition.
I work alongside some very talented lighting designers and production crew that flesh out my initial concept designs. In a larger team environment it’s a matter of letting everyone play to his or her strengths. When you have skilled people around you and everyone has input and ownership of the creative process, that’s the environment in which you get the best results.
You've got a lot of nifty TouchDesigner responsive stuff going on. Especially 'California' on the Peking Duk tour intrigues me to no end. Can you elaborate a bit on that?
The idea behind the Peking Duk show is that it's a post-EDM punk rock show. Everything in the show follows that simple design ethos - from visual design to playback and performance. Visually it’s supposed to look like a live comic book mashup of punk t-shirt illustrations and rock poster art. Instead of using the usual white outs and strobes it has a bunch of animated visual stabs and toon effects that sync up with the different visual tracks. I really wanted it to be a departure from the generic stock mograph look that many shows have.
Peking Duk (Adam Hyde & Reuben Styles) are very active on stage and perform with a rotating cast of collaborators, guest vocalists and musicians. The show is never the same twice but is always a hell of a party so the visuals needed to reflect that.

The show is split up into key tracks that Peking Duk are likely to play for which I have accompanying visuals, and then freeform sections that I adapt to on the fly.
California was initially developed in TouchDesigner to carry the freeform sections of the show but it is now used throughout. Basically I'm using TouchDesigner to do audio analysis live, then I render a trigger based texture palette of sprite animations. They are very simple solid pulses, gradient ramps, animated noise that respond to audio. I send these rendered textures over to Resolume via Spout and position them behind alpha channelled layered animations using Visution Mapio on a layer and column basis. I then launch column compositions instead of running a solely clip based show.
The generative element gives the show added life. Having an automated element sitting behind what I’m doing live broadens the scope of the show and creates a much tighter visual experience overall.

VJ'ing at a show like Godskitchen must be very different from regular VJ work. How does it differ and how do you keep a show like that fresh for a full night?
Designing for a show like Godskitchen you have multiple show arcs. Not only do you have multiple artist set arcs, but you also have a larger show arc to take into account. It’s important to consider how a night builds, with the introduction of different elements as the night progresses, from the video rig, to lighting, pyro and lasers.
I usually have some thematic show content woven into the video design that is either audio reactive, real-time or real-time 3D. These framing elements are usually persistent throughout the whole show and are treated like a video light rig rather than just a screen. These design elements tie into the other show elements like artist intros to create a larger show story.
What has your overall software and hardware journey been like? Do you tend to stick to a particular set of tools or do you switch around a lot? You also design a lot of your own software, right?
I started in 3D computer graphics. By the late ‘90s I was working on SGI machines doing real-time 3D interactive environments in VRML. It was definitely a revelation - real-time 3D, graphics acceleration, interactive and web deployment but there were many things you just couldn’t do because the hardware and software weren’t up to it.
My interests haven’t really changed but hardware and technology have caught up. The only limitations now are time, resources and budget.
I jumped on laptop VJing as soon as I could. Originally I was pushing out 3 layers of 320x240 on Resolume 2, which seems ridiculous now, but given the tape-to-tape alternative, at the time it was revolutionary. Resolume is still my core go to performance application.
I’m a bit of a control scheme nut. I have a somewhat ridiculous midi controller collection and I'm always working on new control schemes. My TouchDesigner work creating custom performance interfaces is just an extension of that. I’ve always wondered what touch screens or motion controls like Leap Motion would be like in live situations so I created my own custom tools to integrate these elements into shows. Having my own tools removes many technical limitations and allows me to focus on the creative. For me creative is always at the forefront but sometimes there are technical hurdles to navigate first.

How does Resolume fit into all this?
Resolume has been hugely influential as it’s the first real-time compositional environment I used. I’ve always used it like a real-time version of Photoshop or After Effects. The immediacy of feedback that Resolume gives is essential for my creative process. It has influenced many of the software and workflow directions I’ve taken over the years.
I actually run through compositional design sketches in Resolume, creating collages of ideas and forms from video elements and effects that I then round-trip back through the preproduction process to create final artwork.
Even when working in TouchDesigner I pass the output through Resolume via Spout. Resolume’s video playback, effects stack, layer routing, and mapping are just so extremely powerful and flexible I can’t imagine creating a show without those tools.
Who is currently blowing your mind in your field?
I have always loved UVA (United Visual Artists), their work for Massive Attack in particular the 100th Window Tour was inspirational and influenced me to follow live visuals as a career path. Their blending of sculptural architecture, vision as lighting and considered space is fantastic. To this day they still do beautiful, political and thoughtful work.
Where can we find you online?
I can be found online at http://www.manoeuvre.tv
Footage Releases: Ghosts, Dreams and Nightmares
This month we got three sets straight from your rapid eye movements.
Chromosoom drops a set with dreamlike classical dance. Classy and elegant, it's a breath of fresh air between all the abstract 3d.
Get WhiteDancer by Chromosoom, *exclusively* from Resolume VJ Footage.
And lastly, BluElk is back with a Cronenberg-inspired set, made of nightmares and itchy glitch.
Get OrganicForms by BluElk from Resolume Footage.
New Footage Releases: New Artist and Familiar Faces
This month, we're proud to welcome Nexus 6 to the label. His first release NeonMetalBeats is an LED stage banger.
Get NeonMetalBeats *exclusively* from Resolume Footage!
Unit44 is back from a long hiatus, and trust us, he's just getting warmed up.
Get Pattern *exclusively* from Resolume Footage!
And STV in Motion always brings something special to the table. Lick those luscious lippies and prepare to be teased.
Get TeaseMe from Resolume Footage.
New Footage Releases: Smorgasbord!
This month we got you covered whatever your taste is.
Catmac tunes into your psychedelic vibrations with ChaoticGeometries:
Get ChaoticGeometries, *exclusively* from Resolume Footage.
Laak is back with more minimalism in VJ Survival Kit 4:
Get VJ Survival Kit 4, *exclusively* from Resolume Footage.
And Ghosteam gets lethal by mixing water and electricity in WaterWorld:
Get WaterWorld, *exclusively* from Resolume Footage.
Review: Showjockey Art-Net devices
Arena 5 made outputting to LED strips a piece of cake. Arena can sample the video pixels and output RGB values to the strip via DMX.
DMX is great, because it’s been around since forever and has become an industry standard that most devices will be able to work with.
The downside is that it has the concept of ‘universes’, which is basically a fancy term to describe all the lights that are on a single control line. A universe is limited to 512 DMX channels. This is fine for conventional lighting setups, and was more than enough when the protocol was established way back in the eighties. But with every RGB pixel taking up 3 channels (1 for Red, 1 for Green, 1 for Blue), you can only control a maximum of 170 pixels per universe. With LED strips becoming super cheap and more and more high-res, building anything fancy can quickly require 10 or more universes.
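If you want to budget universes for your own rig, the arithmetic above boils down to two divisions. Here's a minimal sketch (the function names are ours, not anything in Resolume), assuming a pixel's channels never straddle a universe boundary, which is how pixel strips are normally patched:

```python
DMX_CHANNELS_PER_UNIVERSE = 512

def pixels_per_universe(channels_per_pixel=3):
    # Integer division: a pixel's channels stay within one universe
    return DMX_CHANNELS_PER_UNIVERSE // channels_per_pixel

def universes_needed(pixel_count, channels_per_pixel=3):
    per_universe = pixels_per_universe(channels_per_pixel)
    # Ceiling division: any leftover pixels need one more universe
    return -(-pixel_count // per_universe)

print(pixels_per_universe(3))   # 170 RGB pixels, as above
print(universes_needed(300))    # a 5m 60-LED/m strip needs 2 universes
print(universes_needed(300, 4)) # RGBW: 128 pixels/universe -> 3 universes
```

Swap in 4 channels per pixel for RGBW fixtures and the budget shrinks accordingly, which is why dense installations eat universes so fast.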

[fold][/fold]
In the past, we’ve always recommended USB to DMX devices to get started with DMX. These are great, because they’re simple in operation and relatively cheap. The trouble is they only output 1 or 2 universes at the most and don’t work via USB hubs. So using USB, you quickly run out of breathing room.
A great alternative is Art-Net. Art-Net lets you send DMX signals via a regular network connection. Especially when working with computers, Art-Net is awesome, because you can use the same Ethernet cable as you use to connect to the internet. Instead of connecting to your modem, you connect it to an Art-Net to DMX device, which will convert it to a signal that your DMX light will understand. Using a regular network switch or router, you can add as many Art-Net devices as you like, so you’re not limited by the amount of ports your computer has.
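For the curious, the wire format being sent over that Ethernet cable is plain UDP and simple enough to sketch by hand. Below is an illustrative Python sketch of an ArtDmx packet per the published Art-Net spec (opcode 0x5000, protocol version 14, UDP port 6454); the function names are our own, and Resolume of course handles all of this for you:

```python
import socket

ART_NET_PORT = 6454  # standard Art-Net UDP port

def artdmx_packet(port_address, channels, sequence=0):
    """Build a minimal ArtDmx packet (opcode 0x5000, protocol 14)."""
    data = bytes(channels)
    if len(data) % 2:
        data += b"\x00"                       # payload length must be even
    packet = bytearray(b"Art-Net\x00")        # 8-byte ID string
    packet += (0x5000).to_bytes(2, "little")  # OpCode: ArtDmx, little-endian
    packet += (14).to_bytes(2, "big")         # protocol version 14
    packet.append(sequence & 0xFF)            # sequence number (0 = disabled)
    packet.append(0)                          # physical input port
    packet.append(port_address & 0xFF)        # SubUni: low byte of address
    packet.append((port_address >> 8) & 0x7F) # Net: high 7 bits
    packet += len(data).to_bytes(2, "big")    # channel count, big-endian
    packet += data
    return bytes(packet)

def send_dmx(ip, port_address, channels):
    """Fire one DMX frame at an Art-Net node, e.g. 192.168.1.200."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(artdmx_packet(port_address, channels),
                    (ip, ART_NET_PORT))

# Build a full frame: first pixel of universe 0 to red, rest dark
frame = artdmx_packet(0, [255, 0, 0] + [0] * 507)
```

The whole thing is an 18-byte header plus up to 512 channel values per universe, which is why a cheap switch and a handful of converter boxes scale so easily.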
So the race is on to find affordable, reliable Art-Net to DMX devices. We took a look at Showjockey’s selection. Showjockey is a manufacturer based in China (and they have many interesting gadgets aside from Art-Net stuff too!). We got a chance to play with their 4 universe SJ-DMX-E4 and the crazy 16 universe SJ-DMX-E16.


The Showjockey material comes nicely packaged, but is very spartan when it comes to documentation. The package does contain some booklets, but these are just advertising material for their other products. Since setting up Art-Net can be a bit of a black box, this can be a deal breaker, especially since the Showjockey devices have a somewhat unique setup method. It would greatly benefit from a manual, or maybe just a short setup guide.
An Art-Net device is essentially a network device, so it needs an IP address in order to let the sender computer know where to send its packages. Unlike a computer, Art-Net devices generally are programmed to have a fixed IP address. We were informed that all Showjockey devices default to IP 192.168.1.200 and subnet mask 255.255.255.0. Since our office network expects all IPs to be in the 192.168.178.x range, we had to jump through some hoops to get the devices to work.
On the SJ-DMX-E16, things weren’t so complicated. It has a little onboard menu display where you can set things like the IP address, and once we figured that you need to turn it off and on again to have these changes take effect, we were blasting pixels in no time.

As long as your Art-Net device is in the right IP range, Resolume will broadcast to all IPs in that range, so no further tech setup was needed. Our test setup consisted of 12 8x8 RGB LED tiles. With each tile needing 192 channels (8 * 8 * 3), we were stuck to 2 tiles per universe. We ran a single Ethernet line to the SJ-DMX-E16. We then ran the first 6 DMX outputs to each of the 2 tiles in each universe, with 10 outputs still to spare.


We had a bit more trouble on the SJ-DMX-E4. This one doesn’t have an onboard menu, so you first need to set your computer’s ip to the 192.168.1.x range (we used 192.168.1.10) and connect the SJ-DMX-E4 directly to it with an Ethernet cable. At this point, you can access a setup page by typing 192.168.1.200 in a browser. Here you can change the IP address, the name the device has on the network and things like the universe and subnet offset. Once we set up the IP correctly and power cycled the little rascal, this device was also working correctly.

Another thing you can set via this page is the universe and subnet offset. This is really useful when you want to work with multiple devices on the same network. For instance, we can use both devices in the same network, simply by setting the first device to start on subnet 0 and the second device to start on subnet 1. This way, any Lumiverses patched to subnet 0 will always go to the first device, and Lumiverses patched on subnet 1 will always go to the second device.

All in all, the beauty of Art-Net is that once you actually have it working, it’s pretty rock solid. You can hot plug power and DMX cables, and the LEDs will just pick up right where they were. The Showjockey devices work with a web interface, which is a great way to configure the device without the need for additional software. It’s also worth to note that the SJ-DMX-E16 can be configured both via the onboard menu and the web interface, and changes on one end will be picked up on the other.
The downside of the Showjockey boxes is that they’re a bit of a black box without a setup guide. Especially when you have to jump through some hoops if your IP is not in the same range initially. They have some videos online showing you the process, but these are hilariously unprofessional. Also, there’s a status LED on the device, which we’ve never seen change from red. It would be nice if that gave some additional feedback on what’s going on.
On the upside, they’ve been very responsive to our requests, and can’t be beat on price: the 4 universe SJ-DMX-E4 is $196, the 16 universe SJ-DMX-E16 is $600, and there is also a 2 universe SJ-DMX-E2 for $120.
More info via http://www.showjockey.com
DMX is great, because it’s been around since forever and has become an industry standard that most devices will be able to work with.
The downside is that it has the concept of ‘universes’, which is basically a fancy term for all the lights on a single control line. A universe is limited to 512 DMX channels. This is fine for conventional lighting setups, and was more than enough when the protocol was established way back in the eighties. But with every RGB pixel taking up 3 channels (1 for Red, 1 for Green, 1 for Blue), you can only control a maximum of 170 pixels per universe. With LED strips becoming super cheap and more and more high-res, building anything fancy can quickly require 10 or more universes.
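That channel arithmetic is easy to script. A minimal sketch in Python (the constants come from the DMX512 spec; the example pixel counts are just illustrative):

```python
# Back-of-the-envelope: how many DMX universes does a pixel setup need?
CHANNELS_PER_UNIVERSE = 512
CHANNELS_PER_RGB_PIXEL = 3  # 1 each for Red, Green, Blue

def universes_needed(pixel_count, channels_per_pixel=CHANNELS_PER_RGB_PIXEL):
    """Universes required, assuming a pixel never spans two universes."""
    pixels_per_universe = CHANNELS_PER_UNIVERSE // channels_per_pixel  # 170 for RGB
    return -(-pixel_count // pixels_per_universe)  # ceiling division

# A 5 meter strip at 60 LEDs/m is 300 pixels -> 2 universes
print(universes_needed(300))      # → 2
# A 32x32 RGB panel is 1024 pixels -> 7 universes
print(universes_needed(32 * 32))  # → 7
```

Note the awkward remainder: 512 / 3 leaves 2 channels unused per universe, which is why 170 is the magic number for RGB pixels.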
[fold][/fold]
In the past, we’ve always recommended USB to DMX devices to get started with DMX. These are great, because they’re simple in operation and relatively cheap. The trouble is they output at most 1 or 2 universes and don’t work via USB hubs. So using USB, you quickly run out of breathing room.
A great alternative is Art-Net. Art-Net lets you send DMX signals via a regular network connection. Especially when working with computers, Art-Net is awesome, because you can use the same Ethernet cable as you use to connect to the internet. Instead of connecting to your modem, you connect it to an Art-Net to DMX device, which will convert it to a signal that your DMX light will understand. Using a regular network switch or router, you can add as many Art-Net devices as you like, so you’re not limited by the amount of ports your computer has.
So the race is on to find affordable, reliable Art-Net to DMX devices. We took a look at Showjockey’s selection. Showjockey is a manufacturer based in China (and they have many interesting gadgets aside from Art-Net stuff too!). We got a chance to play with their 4 universe SJ-DMX-E4 and the crazy 16 universe SJ-DMX-E16.
The Showjockey material comes nicely packaged, but is very spartan when it comes to documentation. The package does contain some booklets, but these are just advertising material for their other products. Since setting up Art-Net can be a bit of a black box, this can be a deal breaker, especially since the Showjockey devices have a somewhat unique setup method. It would greatly benefit from a manual, or maybe just a short setup guide.
An Art-Net device is essentially a network device, so it needs an IP address in order to let the sender computer know where to send its packets. Unlike a computer, Art-Net devices are generally programmed with a fixed IP address. We were informed that all Showjockey devices default to IP 192.168.1.200 and subnet mask 255.255.255.0. Since our office network expects all IPs to be in the 192.168.178.x range, we had to jump through some hoops to get the devices to work.
On the SJ-DMX-E16, things weren’t so complicated. It has a little onboard menu display where you can set things like the IP address, and once we figured out that you need to turn it off and on again for these changes to take effect, we were blasting pixels in no time.
As long as your Art-Net device is in the right IP range, Resolume will broadcast to all IPs in that range, so no further tech setup was needed. Our test setup consisted of 12 8x8 RGB LED tiles. With each tile needing 192 channels (8 * 8 * 3), we were limited to 2 tiles per universe. We ran a single Ethernet line to the SJ-DMX-E16, then ran each pair of tiles off one of the first 6 DMX outputs, with 10 outputs still to spare.
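For the curious, the ArtDMX packets being broadcast here are just UDP datagrams with a small header. A minimal sketch based on the published Art-Net spec (this is our own illustration, not Resolume's implementation; the tile frame and broadcast address are examples matching the setup above):

```python
import struct

ARTNET_PORT = 6454  # well-known Art-Net UDP port

def artdmx_packet(universe, dmx_data, sequence=0):
    """Build a minimal ArtDMX packet (opcode OpOutput, 0x5000)."""
    if len(dmx_data) % 2:                    # payload length must be even
        dmx_data += b'\x00'
    pkt = b'Art-Net\x00'                     # fixed 8-byte ID string
    pkt += struct.pack('<H', 0x5000)         # OpCode, little-endian
    pkt += struct.pack('>H', 14)             # protocol version, big-endian
    pkt += struct.pack('BB', sequence, 0)    # sequence counter, physical port
    pkt += struct.pack('<H', universe)       # 15-bit port address, little-endian
    pkt += struct.pack('>H', len(dmx_data))  # channel count, big-endian
    return pkt + dmx_data

# One 8x8 RGB tile, every pixel full red = 192 channels on universe 0
frame = bytes([255, 0, 0] * 64)
packet = artdmx_packet(0, frame)
print(len(packet))  # 18-byte header + 192 channels = 210

# To actually light the tile, broadcast it on the node's subnet:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
#   sock.sendto(packet, ('192.168.1.255', ARTNET_PORT))
```

Sending one of these 40-odd times a second per universe is all there is to it, which goes some way to explaining why Art-Net is so robust.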
We had a bit more trouble on the SJ-DMX-E4. This one doesn’t have an onboard menu, so you first need to set your computer’s ip to the 192.168.1.x range (we used 192.168.1.10) and connect the SJ-DMX-E4 directly to it with an Ethernet cable. At this point, you can access a setup page by typing 192.168.1.200 in a browser. Here you can change the IP address, the name the device has on the network and things like the universe and subnet offset. Once we set up the IP correctly and power cycled the little rascal, this device was also working correctly.
Another thing you can set via this page is the universe and subnet offset. This is really useful when you want to work with multiple devices on the same network. For instance, we can use both devices in the same network, simply by setting the first device to start on subnet 0 and the second device to start on subnet 1. This way, any Lumiverses patched to subnet 0 will always go to the first device, and Lumiverses patched on subnet 1 will always go to the second device.
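The reason the subnet offset works this way is that Art-Net folds Net, Sub-Net and Universe into a single 15-bit port address, so bumping a device's subnet by 1 shifts its whole block of 16 universes. A sketch of that arithmetic (the function name is ours; the bit layout is from the Art-Net spec):

```python
def port_address(net, subnet, universe):
    """Combine Art-Net Net (0-127), Sub-Net (0-15) and Universe (0-15)
    into the 15-bit port address carried in each ArtDMX packet."""
    assert 0 <= net <= 127 and 0 <= subnet <= 15 and 0 <= universe <= 15
    return (net << 8) | (subnet << 4) | universe

# First device on subnet 0, second on subnet 1:
# device 1 answers to port addresses 0-15, device 2 to 16-31.
print(port_address(0, 0, 0))   # → 0   (first universe, first device)
print(port_address(0, 1, 0))   # → 16  (first universe, second device)
print(port_address(0, 1, 15))  # → 31  (last universe, second device)
```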
All in all, the beauty of Art-Net is that once you actually have it working, it’s pretty rock solid. You can hot plug power and DMX cables, and the LEDs will just pick up right where they were. The Showjockey devices work with a web interface, which is a great way to configure the device without the need for additional software. It’s also worth noting that the SJ-DMX-E16 can be configured both via the onboard menu and the web interface, and changes on one end will be picked up on the other.
The downside of the Showjockey boxes is that they’re a bit of a black box without a setup guide, especially since you have to jump through some hoops if your IP is not in the same range initially. They have some videos online showing you the process, but these are hilariously unprofessional. Also, there’s a status LED on the device, which we’ve never seen change from red. It would be nice if it gave some additional feedback on what’s going on.
On the upside, they’ve been very responsive to our requests, and can’t be beat on price: the 4 universe SJ-DMX-E4 is $196, the 16 universe SJ-DMX-E16 is $600, and there is also a 2 universe SJ-DMX-E2 for $120.
More info via http://www.showjockey.com
On Tour with Zedd: Gabe Damast
Working for Resolume, we're lucky enough to see some of the most amazing VJ talent in action. One such person is Gabe Damast, whose live show for Zedd blew me away. Gabe is a true VJ and seldom do we see a show this tight and in sync with the music. And most amazing of all, it's pure VJ skill, no SMPTE or other tricks.
Take a look at the video for an idea of how Gabe rocks it, and then read on below for what he has to say about all this.
[fold][/fold]
How did you start VJ'ing?
My introduction to the world of VJing came through music. I grew up in the San Francisco Bay Area playing saxophone and piano in a couple different Jazz and Funk bands, and as my love for electronic music developed I got into beat making, record producing, and sound engineering. I spent years learning basically every major production software and set up a small studio in my parents' basement where I'd record myself and my musician friends goofing off, and sometimes those sessions would turn into actual songs.
At the end of college, a friend of mine showed me Resolume, which was really the first time I was exposed to any visual performance software. I remember a lot of things clicked for me all at once; coming from a background using Ableton Live and FL Studio, Resolume felt like a very user friendly video version of the DAWs I was familiar with. It wasn't long before I got ahold of a projector and started working on my first VJ sets in my tiny dark bedroom late at night. At first I would use found footage and VJ clips from Vimeo, but I eventually got into Cinema 4D and After Effects and started making my own video content, some of which is being used in the Zedd show currently!

Can you tell us a bit more about the Zedd tour? How does such a tour get organised when it comes to the stage design, the content, the operating of video, lights and laser? Who does what?
The True Colors Tour - which was the latest Arena tour we did with Zedd - all started more than two years ago with scribbles on a paper napkin. Many artists will hire a specific designer to conceptualize a stage production, but from the very beginning, the Zedd touring team has been extremely close-knit, and we always share roles and creative ideas freely. Zedd likes to be incredibly close to pretty much every aspect of his live show, so many of the crucial design decisions would happen in group discussions during a meal at an airport, or on a van ride to a music festival. Our lighting director Stevie Hernandez would create renderings of different ideas in Vectorworks pretty much in real time, which helped different ideas evolve and change.
Video content has always been the central focus of the Zedd show (and I'm NOT just saying that because I'm a VJ!!). For the True Colors Tour we wanted to give fans the most immersive experience possible, so the design we landed on was pretty much a giant 84 foot wide LED wall, framed with all sorts of light fixtures, lasers, and special effects. We were able to use an LED wall that was fully 4K in width - a dream come true for any pixel pusher. It's been really exciting to watch the rapid development of LED technology in recent years. Bigger walls, higher resolutions, soon I'm sure we're going to be watching shows in retina quality! In the five months leading up to the start of the tour, we worked closely with Beeple (Mike Winkelmann) to create the bulk of the new show visuals rendered in stunning 4418x1080 resolution. Scott Pagano and myself also contributed to the content push, which enabled me to curate an entirely new Zedd visual show from our previous tour.
Read more about Production Club's process here: http://www.productionclub.net/work/truecolors
The thing that stands out most to me is how video, laser and light play the accents in the music as a team, almost like a band. Is this something that you practice?
"Practicing" is always a tricky subject in the world of live production. The cost of renting enough gear to do a proper rehearsal is so high that it only really makes sense surrounding a tour where the costs are being spread over a few months. We were lucky to have two weeks of rehearsals before our tour rolled out, where we built the full size production in a sweaty, cavernous warehouse in Las Vegas, and Zedd, myself, Ken (our tour manager AND laser operator), and Stevie (lights) spent 12+ hours a day listening to music and creating unique looks for each song Zedd wanted to play during the tour. We brought in a mobile studio for Zedd to use, and each day would usually begin with us brainstorming visual ideas, then taking breaks where Stevie and I could program the looks, and Zedd could work on musical edits and tweaks. It was hard to leave the rehearsal space at the end of the day because we were getting so much done!
It's all live right, no SMPTE? What would you say to people that are just starting out and are looking to get a tight sync like that?
No SMPTE! Every single video clip, strobe hit, and pyro shot are all cued live. That's why our rehearsals took so long. I have a lot of respect for people who put together time coded shows, and there are a lot of things you can do with that kind of sync that just aren't possible with live triggering, but for me, realtime performance is the only way I like to work. Music is what drives the visuals, and Zedd always DJs live, so there is a certain level of authenticity that is communicated by including some human error into the visual performance.
Whenever someone asks me how they should get into VJing, I always tell them to start by understanding music. You can definitely be a technical person and excel in the visual performance world, but in order to deliver an on-time show (with no timecode) you really have to learn music and rhythm. If you have good timing, and understand the basics of music theory, you can put on an amazing show even with the worst video content on the smallest screens.

What gear are you bringing with you? Is it easy to deal with airport customs?
For a normal fly-in show, I use a MacBook Pro Retina with three MIDI controllers: two Traktor Kontrol F1s and a Midi Fighter 3D. My whole kit fits nicely in a Pelican 1510 carry-on case, and if customs ever tries to hassle me I just say "it's for making computer music!!!" and they always leave me alone. Flying around with three laptops sometimes raises a few eyebrows, but I've never gotten seriously held up (yet! *knock on wood*)
How does Resolume fit into all this?
Resolume's simple layout makes it SUPER easy to organize our visual show. I always try to think about telling a story through our video content, and all of my Resolume compositions are arranged in a timeline that I navigate around depending on what songs are being played. Since everything is live, choosing a media server that allowed for quick re-organization was really important to me. Add in the first class customer service from the Resolume team, and it's a no brainer!

Where can we find you online?
You can find my work on the web at:
--- http://www.gabedamast.com ---
or other platforms like:
--- vimeo: https://vimeo.com/user5953855 ---
--- behance: https://www.behance.net/gabedamast ---
New Footage and a New Artist
Check out this first release by WTFlow aka Florian Michel! It's called Trilt and it's a banger. This kid has a lot of tricks up his sleeve, so expect more from him soon.
Trilt by WTFlow
Dan Wise is back with more Dazzle camo tactics in RazzleDazzle2.
RazzleDazzle2 by Dan Wise
And if you can’t get enough of those Tron-tastic glowy lines and shiny surfaces then Video2000 has you covered with Construct.
Construct by Video2000
Resolume Arena 5.0.3 and Avenue 4.5.3: Marathon
Update Friday April 29th
There we were, running our marathon. We were at a good pace and we felt like we were making good time. Then, disaster struck. We tripped on an untied shoelace and landed face first in the mud.
So, with our cheeks still blushing from the embarrassment and caked with mud, we give you Resolume 5.0.4.
The 5.0.4 release fixes three silly mistakes that snuck their way into 5.0.3.
Download here!
FIXED
#6259 Midi Clock is broken
#6258 Clips don't play when opacity is at 0
#5797 Japanese font rendering.
-------------------------------------------------------------------------------------
The long distance runner's eyes lose their focus. He can't remember how long he has been running. All he knows is that he still has to go on. Sometimes he feels like he can't make it. Every step feels like lead and his feet feel like fire. Other times he catches a second wind and feels like he's soaring across the skies and leaping tall buildings in a single bound.
His eyes refocus. The finish line is not in sight yet. But he knows he's going to make it. And he's loving every step along the way.
[fold][/fold]
Working on Resolume sometimes feels like running a marathon. It's not about speed, but it takes endurance to reach the finish line. Except when creating software, there is no finish line. Releases are just milestones along the way. There is always more to improve, and so our marathon continues.
Resolume 5.0.3 is another step in our ongoing marathon. Not so many new toys, but we tightened our shoe laces and stretched our muscles. So we're ready to run those marathon shows. With you.
Hit that download and read the full release notes below...
NEW:
#6163 Fixture Editor can reveal fixture files via Show in Finder / Explorer
#6072 Copy Slice(s) (ctrl + c) in slice right-click menu
#5901 UI indicates difference between output and input masks
FIXED:
#6152 Crash in Advanced Output when rebuilding the output device list
#6134 Crash when using escape with fullscreen output on screen 1
#6097 Main interface shows empty in Chinese
#6088 ParticleSystem does not work correctly when being dragged from clip to comp when resolutions are different
#6061 Certain one frame DXV3 files don't play on PC
#6084 Selecting multiple slices sets the flip values to the same value
#6191 Resolume creates a folder named 'FixtureLibrary' in Documents
#6076 Drag and drop mask on composition doesn't work from Finder
#6043 Audio FFT input gain is not applied until preferences is opened
#6005 Buggy behaviour when switching deck and triggering clips with midi/keyboard
#5998 External control on bpm tab can result in wrong bpm calculation
#4572 Bpm Sync Random cannot be retriggered / Cue points ignore BPM Sync Random setting
#6168 Drawing a 512 by 512 fixture in the Fixture Editor is really slow
#6094 DMX Automap PDF is updated
#6050 Remove "Universe 0" from DMX Mapping info window