Orbit & Resolume at the Singapore Night Festival, 2018
What happens when a bunch of artists, architects, programmers and engineers come together? A burst of creation with a dollop of technology. Minimal yet complex. Simple yet amazing. This is what Litewerkz is.

Photo by Calvin Chan
Formed by students who met at the Singapore University of Technology and Design, Litewerkz is a young design collective with lots of vision. Gradually growing from indoor static installations to outdoor dynamic ones, they have developed themselves by welcoming new members and perspectives every year.
Their installation this year at the Singapore Night Festival really caught our eye. Named “Orbit”, it was quick to evoke our inherent space obsession. With a central orb (the sun, the primary source of energy), the installation consisted of different orbs (planets) with retroreflective cores, all wirelessly linked to each other. Very cool & scientific.

So, how did the idea of “Orbit” as an installation come to you?
We’ve worked with 3M (a global science company) on previous installations, making use of their iridescent films. These were displayed at the Singapore Night Festival in the past few years.
This year 3M came to us with a challenge to use their retroreflective materials. These materials are used on traffic signs and safety vests, so we really had to think hard about how we could use them in an art installation.
3M gave us creative freedom and provided us with custom colors of these materials, which really helped us achieve our vision for the installation.

We think they were really happy with the result, as they had never seen their product used in this way.
Tell us about the installation, from idea to production.
Orbit is an abstraction of our solar system: eight planets around a sun. Each of the planets and the sun is an individual unit and reacts when visitors spin it. The LED lights inside illuminate the planets and their unique make-up.

Within each planet is a unique polygonal gem, covered with retroreflective material. It is this gem that lights up with the use of flash photography or videography.

This gem is designed computationally with Grasshopper, cut from corrugated board on a Zünd cutter, and assembled by hand.

Grasshopper script used to create the gems
Each planet uses a Particle Photon to connect to the main server, an Intel NUC. Each Photon runs an app that lets it understand the ArtNet commands broadcast from the NUC. With an MPU6050 6-axis accelerometer and gyroscope, the app can also detect the rotational speed and let the planets react accordingly.
While stationary or spinning slowly, the planets glow at half brightness.

At high speeds, the planets glow at full brightness.
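The rotation-to-brightness rule can be sketched as follows. The actual firmware runs in C++ on the Photon and the team's exact thresholds aren't documented, so the function name and all numeric values here are illustrative:

```python
def brightness_from_spin(rate_dps, slow_threshold=90.0, max_rate=720.0):
    """Map a gyro angular rate (degrees/second) to an LED brightness, 0-255.

    Below the threshold the sphere idles at half brightness; above it,
    brightness ramps linearly up to full. All thresholds are illustrative,
    not the values Litewerkz used.
    """
    if rate_dps <= slow_threshold:
        return 128  # half brightness while stationary or spinning slowly
    # Linear ramp from half to full brightness, clamped at max_rate.
    t = min((rate_dps - slow_threshold) / (max_rate - slow_threshold), 1.0)
    return int(128 + t * 127)
```

A simple ramp like this gives visitors immediate feedback: the harder they spin, the brighter the planet.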

Resolume runs on the Intel NUC and allowed us to easily set up each planet and the sun; it allows for easy LED mapping as well as sequence control.
Tell us about the hardware that went into Orbit.
Hardware design was one of the most challenging parts for us, as this was our first time designing an interactive installation for the outdoors. There were also no off-the-shelf components for the curvature of each sphere, so we had to design some parts from scratch while using as many off-the-shelf components as possible.

We went with an aluminum tube for the shaft, as it was easy to cut and very sturdy at 40mm diameter. We opted for a brass bushing for its low maintenance and low price.
We used a slip ring connector to transmit power from the stationary shaft to the rotating spheres. This is the centerpiece of the installation: the interface between the stationary aluminum tube and the spinning sphere. Finally, the sphere holders were 3D printed in PET-G, chosen because it is more heat- and weather-resistant than PLA, which matters in the Singapore climate.
In earlier versions, we engaged a printing service to print each holder as a single piece, but that print took too long and demanded a large print bed. We then designed a split version so that each piece is printable on common print bed dimensions. The split pieces align with each other via printed pins and holes, and the sections are secured by a PVC tube section and by the attachments screwed to the sphere.
And what exactly is retroreflective material?
Retroreflectivity is a property of a material that causes light to reflect back in the direction of the incident ray. If you have seen how road signs shine at night while you are driving, it is because this material reflects the car’s headlights back toward the car.
What this meant to us as artists is that this material gave us an opportunity to design experiences unique to each individual’s perspective, and we sought to exploit this property in our installation. We encouraged visitors to turn on the flash on their camera phones while taking a photo or video, which would let them see the effect of retroreflection unique to the perspective of their camera.

Cool! So now tell us about the LED pixel strips. What purpose did they serve, how did you link them all and how did you manage to link their intensities with the speed of rotation of the orbs?
The LED pixel strips serve as a source of light for the retroreflective gems at the center of each sphere. We also intended for them to be eye-catching from afar, so that visitors nearby would be attracted to the installation.

Beyond these simple functions, we tried to see what else we could do to make the LEDs more interesting. We decided on a light show synchronized across all the spheres. This would initially have meant running data cables between the spheres, but we found a micro-controller with built-in WiFi and the relevant ArtNet and WS2812 LED libraries. A central server pushes out the ArtNet signals, and the micro-controllers listen to and display them. This gave us the ability to design light shows synchronized across all the spheres.
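An ArtDmx packet (the ArtNet message that carries DMX channel data over UDP, conventionally on port 6454) has a small fixed header, so the server side of this broadcast can be sketched in a few lines. This is a generic illustration of the protocol, not Litewerkz's actual code:

```python
import struct

def artdmx_packet(universe, data, sequence=0):
    """Build a minimal ArtDmx packet (the ArtNet opcode carrying DMX data).

    Header layout: 8-byte ID "Art-Net\\0", little-endian OpCode 0x5000,
    big-endian protocol version 14, sequence and physical-port bytes,
    little-endian universe, then a big-endian payload length followed by
    up to 512 bytes of channel data.
    """
    if not 1 <= len(data) <= 512:
        raise ValueError("DMX payload must be 1-512 bytes")
    pkt = b"Art-Net\x00"
    pkt += struct.pack("<H", 0x5000)       # OpCode: ArtDmx (little-endian)
    pkt += struct.pack(">H", 14)           # protocol version (big-endian)
    pkt += bytes([sequence & 0xFF, 0])     # sequence, physical port
    pkt += struct.pack("<H", universe)     # universe (little-endian)
    pkt += struct.pack(">H", len(data))    # payload length (big-endian)
    return pkt + bytes(data)
```

On the wire, the server would send such packets over UDP and each Photon would filter on its own universe before pushing the channel values to its LED strip.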
We used the MPU6050 6-axis accelerometer and gyroscope, which is compatible with the libraries available for our micro-controller. After much discussion, we settled on a simple increase in intensity while the sphere rotates, localized to each sphere, as we felt this would provide intuitive feedback to the audience.
So, the Particle Photon micro-controller is clearly super important to the installation. Tell us more about it.
We chose the Particle Photon as it seemed really easy to connect to the Internet and to debug with the status LEDs. Particle also has tools like the Particle CLI that make setup and debugging easy. It also helps greatly that the Particle team writes great documentation.
The Particle Photon has digital input and output pins for us to control the LED lights. It also features an I2C bus, which allows us to use I2C devices like the MPU6050 6-axis accelerometer and gyroscope module.
We especially like the status LED for debugging. The different LED colors and patterns correspond to a list of statuses for the Particle Photon. This made it very easy to identify what remedial action was needed, without a serial monitor.
Another benefit of a micro-controller that connects to WiFi is the ability to push new code onto it without connecting a USB cable. This saved us a lot of time, as connecting a cable would require us to open up the sphere and electronics housing for each of the 9 spheres.

To put all these components together, we designed and printed a PCB. This PCB sat in the centre of each sphere, in a housing, and spun together with the sphere.

The PCB design allowed us to easily put together the various electronic components. It also made the soldering and connections more consistent. We added a lithium-polymer battery backup system to prevent the micro-controller from shutting down if the power connection is lost while the planets are spinning.

The PCB design was done in EAGLE, PCB design software that is free for hobbyists and makers (the free version has limited features). The other good thing about EAGLE is that it works well with Fusion 360; PCB designs can be imported from EAGLE into Fusion 360, allowing size checks against the overall design.
And so, tell us about the role that Resolume played in this installation.
Resolume allowed us to easily set up each planet and the sun; it allows for easy LED mapping as well as sequence control.
The central Intel NUC with an integrated GPU ran Resolume, and we mapped every planet to a section of the template. We came up with a template of 31x32 pixels to represent all the pixels in the whole installation. Eight 6x16 rectangles are mapped to the eight planets, and a 7x32 rectangle is mapped to the 7 smaller spheres within the sun. Each Particle Photon has a static IP, and we simply mapped each section of the video to the corresponding IP address.
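Resolume handles the mapping from template regions to ArtNet universes internally; conceptually, routing one sphere's pixels amounts to cropping a rectangle out of the template frame and sending it to that sphere's IP. The exact coordinates of the 31x32 layout aren't documented here, so the region table below is hypothetical:

```python
def extract_region(frame, rect):
    """Crop a rectangular pixel region out of a frame.

    frame is a list of pixel rows; rect is (x, y, width, height).
    """
    x, y, w, h = rect
    return [row[x:x + w] for row in frame[y:y + h]]

# Hypothetical routing table: one template rect and one static IP per unit.
# The real layout packs eight 6x16 planet rects and one 7x32 sun rect.
REGIONS = {
    "planet_1": ((0, 0, 6, 16), "192.168.1.101"),
    "sun":      ((0, 16, 32, 7), "192.168.1.100"),
}
```

Keeping the whole installation in one small template frame means a single video clip drives every sphere at once.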

After exporting the video sequences, we loaded them all into Resolume and set them to autoplay on start, with autopilot. All this happens when the power to the computer is turned on, which meant we could let anyone turn on the installation without having to meddle with a computer.

We could have used code to run the entire installation, but the workflow of using After Effects to design the sequences, and Resolume to pixel map and convert this video into ArtNet, gave us the ability to quickly visualize new sequences.
It also democratized the designing of sequences within our team, as we no longer require coding knowledge to design sequences. Anyone can now export a sequence, put it onto Resolume, rearrange the playback sequence, change the speed, or add any other effects.
That is such a great way to use Resolume :)
So finally, considering the whole setup was wirelessly linked and controlled, I'm sure you faced challenges?
Linking everything wirelessly gave us a lot of convenience, but of course we had to deal with the downsides of WiFi. We noticed that when the festival got crowded, the micro-controllers tended to disconnect from the network and would freeze on the last received ArtNet frame. We then had to power cycle these frozen micro-controllers from time to time. It could have been interference from mobile phones with their WiFi and Bluetooth turned on, or because our low-powered WiFi router was not equipped to deal with 10 simultaneous connections (1 server, 9 spheres).
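One mitigation for that failure mode would be a stall watchdog in the firmware: track the time since the last ArtNet frame arrived and trigger a reconnect or reboot once it exceeds a timeout, instead of waiting for a manual power cycle. A minimal sketch of that bookkeeping, with hypothetical names (the team's firmware did not necessarily do this):

```python
import time

class ArtnetWatchdog:
    """Track time since the last received ArtNet frame and flag a stall.

    The clock is injectable so the logic is testable; on a real device
    it would be the system uptime counter.
    """
    def __init__(self, timeout_s=5.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_frame = clock()

    def frame_received(self):
        # Call this from the ArtNet receive handler.
        self.last_frame = self.clock()

    def is_stalled(self):
        # True when no frame has arrived within the timeout window;
        # the caller would then reconnect WiFi or reset the device.
        return self.clock() - self.last_frame > self.timeout_s
```

The main loop would poll `is_stalled()` each iteration and kick off recovery when it returns true.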
Apart from these minor problems, we felt that wireless is the way to go logistically.
Interactive installations take the biggest abuse. The PCB was initially designed with female header pins for the outputs, but when the wires kept coming out as the planets were spun, we replaced them with screw terminals to ensure they were held in place.
On the first day of the festival, we witnessed kids and adults using all their strength to spin the spheres as fast as they could.

Well, can you blame them? :)
Thank you Keith & Team Litewerkz for talking to us. We look forward to seeing more of your cool work.
You can find out more about them here, on Instagram and Facebook.
Photo Credits:
Chiasuals
Natalie Chen (Litewerkz)
Highlights
New Footage Releases by Julius Horsthuis, Ican Agoesdjam and Catmac
The general walked the empty streets of the city. The parade was over, the people had gone home. See you on the other side, private.
Get Cities by Julius Horsthuis
No need for alarms in the cities that never sleep.
Get Pop-Aganda by Ican Agoesdjam
You might think we are making propaganda here, but the only 'ganda we like is pop-aganda.
DriftNet by Catmac
Drift Gift Lift Rift Shift Swift Net.
Resolume 6.1.1 & DXV Video Conversion with Alley 2.0
Yaaaas queen! Finally! Alley 2.0.0 can now convert your video files to DXV.

We've made it ridiculously easy so your files are off blazing in two clicks, using super fast super multi-threaded super power. Of course, you can also fine tune your blend with presets and power user tricks. Check out the workflow and all the options in this Alley support article. Alley will always be free, so make sure it's part of your swiss army toolkit.
16 Bit DMX Input
You can now assign a DMX shortcut as 16 bit. Resolume will then automatically use the selected channel for coarse and the following channel for fine. Lampies will understand all the words in that sentence. Thanks Maarten!
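For non-lampies: "coarse and fine" means the 16-bit value is split across two DMX channels, the high byte on the assigned channel and the low byte on the one after it, giving 65,536 steps instead of 256. The split is simple bit arithmetic:

```python
def to_coarse_fine(value):
    """Split a 16-bit DMX value into (coarse, fine) channel bytes:
    coarse is the high byte, fine the low byte."""
    if not 0 <= value <= 0xFFFF:
        raise ValueError("16-bit DMX value must be 0-65535")
    return value >> 8, value & 0xFF

def from_coarse_fine(coarse, fine):
    """Recombine coarse and fine channel bytes into a 16-bit value."""
    return (coarse << 8) | fine
```

Resolume does this pairing automatically once the shortcut is flagged as 16 bit: you pick the coarse channel and it reads the next channel as fine.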
Way Faster Adobe DXV Exporters
While we were working on Alley, we gave the Adobe DXV exporters a speed boost as well by making them multi-threaded. The more cores you have in your CPU the faster it will be.
Gl!tchy Tǝxt No More
We squashed the glitched out characters in the Text Animator and Text Block for good this time. No more weird looking text. Although the glitchy look was kind of awesome.
Grey Output Flash on Startup?
We all love a good flashy flashy, but thanks to the logs sent in by Brandon Little, Ozaer Faroqui and Thomy Hoefer, Resolume keeps a low profile during startup again and does not flash a grey frame in the output anymore.
Transformers
Pasting effects would result in an extra Transform effect that couldn't be removed. Now I have loved Transformers since I was a little babbie in the eighties. I even kept loving them when Michael Bay took them and CGI'd them to death. But this many Transformers was a bit much even for me.
l10n
The interface, onscreen help and manual are now disponible en Español, in Deutsch verfügbar and 有中文.
Download
Check out the full fix list below, and just grab yourself a dime bag of that dank download.
Update! Thu 8 Nov, Hotfix List
#11972 Group Parameter Animation Menu does not open
#11949 Composition dashboard doesn't load stored names
#11954 Media Manager not working after save on new composition
#11969 Avenue footage installer creates wrong folder for registration file, then fails to register on OSX
#11960 Clipgrid scrollbars are not disappearing on resize, breaks vertical scrolling
Thu 1 Nov, Fix List
8458 'play to end and hold' with 'continue from last pos' seeks to different frame on clip relaunch
9416 reveal effect moves original texture slices when set to centre scaling
9618 Text Atlas corrupt / text glitch
9776 MM: "Locate All.." doesn't work on Mac with MM comp that comes from Windows
10648 Monitor panel can lose title and settings icon
10841 DMX clip, column launch relaunches clip constantly above the launch threshold value when channel is changing
11111 Recording has wrong frame rate set when recording Audio + Video
11250 having many decks slows down composition load time considerably on OS X.
11254 DXV AV clip can flash clip's first frame when set to play to end and pause
11296 SMPTE Clip offsets are based on 25 FPS on composition reload instead of actual SMPTE frame rate setting.
11331 Dropping a deck from compositions panel to your composition makes the compositions list jump to the top
11353 Double clicking on Source presets to preview makes midi controller flash
11414 File search bar clears constantly when you convert a file to a folder that is visible in files panel.
11423 Layer texture is stretched when layer opacity is 0, and layer size is different than composition size
11453 File preview breaks for files panel after contents of folder changed
11456 New fixture renames itself back in fixture editor when scrolled out of view
11529 Clip Transport shortcut selection broken if you have in, out, and playhead mapped
11537 DXV3 Adobe exporter sets video stream to 23 fps when you select 23.98 setting
11553 Renaming an envelope preset creates a duplicate preset file.
11565 Paste effects adds a non removable transform to destination clip
11607 Appcrash loading Atlas cache, when a png is missing in it.
11610 Distortion VST effect drive and tone not effective
11665 Accept playhead/ in /out position for Audio and AV clips in hh:mm:ss.ms format too, similarly to Video clip's hh:mm:ss.ff format
11668 Soloed layer in a bypassed group is not rendered to group texture.
11671 Adobe DXV plugin fails with AE CS5
11683 App crash in ra::Envelope::Data::read
11686 Appcrash removing duplicate layer in attached composition
11697 Clip In/ out points not updating any more when they have been moved together via Midi
11707 Prediction around loop point results in wrong frame
11717 Scrubbing with playhead in playing AV clip via midi CC makes clip show random frames
11744 paused Clip transport position is sometimes displaying frames, sometimes milliseconds randomly on launch
11746 Layer preview incorrect when inside a group
11747 Layer router size incorrect when layer taken from a group
11748 Layer # visible in Video Router Input dropdown
11759 Add translated PDF manuals to installers
11764 Precision for setting Playhead, In/out position via left-right arrows got very low.
11769 Panel on top left, in the same row as LayersAndClips resizes on every restart
11779 Importing still images and 1 frame DXVs does not give uniform duration
11794 Keyboard arrows on selected envelope keyframes are mixed-up
11795 Keyboard arrow moves Envelope keyframe in huge increments
11817 Clips get stuck on loading again
11851 16 Bit DMX Input
11880 Display hotkeys don't work on mac any more
11881 Display hotkeys are missing in the Output menu since 6.0.10
11909 Output can flash white/grey when output is enabled
We've made it ridiculously easy so your files are off blazing in two clicks, using super fast super multi-threaded super power. Of course, you can also fine tune your blend with presets and power user tricks. Check out the workflow and all the options in this Alley support article. Alley will always be free, so make sure it's part of your swiss army toolkit.
16 Bit DMX Input
You can now assign a DMX shortcut as 16 bit. Resolume will then automatically use the selected channel for coarse and the following channel for fine. Lampies will understand all the words in that sentence. Thanks Maarten!
Way Faster Adobe DXV Exporters
While we were working on Alley, we gave the Adobe DXV exporters a speed boost as well by making them multi-threaded. The more cores you have in your CPU the faster it will be.
Gl!tchy Tǝxt No More
We squashed the glitched out characters in the Text Animator and Text Block for good this time. No more weird looking text. Although the glitchy look was kind of awesome.
Grey Output Flash on Startup?
We all love a good flashy flashy, but thanks to the logs sent in by Brandon Little, Ozaer Faroqui and Thomy Hoefer, Resolume keeps a low profile during startup again and does not flash a grey frame in the output anymore.
Transformers
Pasting effects would result in an extra Transform effect that couldn't be removed. Now I have loved Transformers since I was a little babbie in the eighties. I even kept loving them when Michael Bay took them and CGI'd them to death. But this many Transformers was a bit much even for me.
l10n
The interface, onscreen help and manual are now disponible en Español, in Deutsch verfügbar and 有中文.
Download
Checkout the full fix list below, and just grab yourself a dime bag of that dank download.
Update! Thu 8 Nov, Hotfix List
#11972 Group Parameter Animation Menu does not open
#11949 Composition dashboard doesn't load stored names
#11954 Media Manager not working after save on new composition
#11969 Avenue footage installer creates wrong folder for registration file, then fails to register on OSX
#11960 Clipgrid scrollbars are not disappearing on resize, breaks vertical scrolling
Thu 1 Nov, Fix List
8458 play to end and hold' with 'continue from last pos' seeks to different frame on clip relaunch
9416 reveal effect moves original texture slices when set to centre scaling
9618 Text Atlas corrupt / text glitch
9776 MM: "Locate All.." doesn't work on Mac with MM comp that comes from Windows
10648 Monitor panel can lose title and settings icon
10841 DMX clip, column launch relaunches clip constantly above the launch threshold value when channel is changing
11111 Recording has wrong frame rate set when recording Audio + Video
11250 having many decks slows down composition load time considerably on OS X.
11254 DXV AV clip can flash clip's first frame when set to play to end and pause
11296 SMPTE Clip offsets are based on 25 FPS on composition reload instead of actual SMPTE frame rate setting.
11331 Dropping a deck from compositions panel to your composition makes the compositions list jump to the top
11353 Double clicking on Source presets to preview makes midi controller flash
11414 File search bar clears constantly when you convert a file to a folder that is visible in files panel.
11423 Layer texture is stretched when layer opacity is 0, and layer size is different than composition size
11453 File preview breaks for files panel after contents of folder changed
11456 New fixture renames itself back in fixture editor when scrolled out of view
11529 Clip Transport shortcut selection broken if you have in, out, and playhead mapped
11537 DXV3 Adobe exporter sets video stream to 23 fps when you select 23.98 setting
11553 Renaming an envelope preset creates a duplicate preset file.
11565 Paste effects adds a non removable transform to destination clip
11607 Appcrash loading Atlas cache, when a png is missing in it.
11610 Distortion VST effect drive and tone not effective
11665 Accept playhead/ in /out position for Audio and AV clips in hh:mm:ss.ms format too, similarly to Video clip's hh:mm:ss.ff format
11668 Soloed layer in a bypassed group is not rendered to group texture.
11671 Adobe DXV plugin fails with AE CS5
11683 App crash in ra::Envelope::Data::read
11686 Appcrash removing duplicate layer in attached composition
11697 Clip In/ out points not updating any more when they have been moved together via Midi
11707 Prediction around loop point results in wrong frame
11717 Scrubbing with playhead in playing AV clip via midi CC makes clip show random frames
11744 paused Clip transport position is sometimes displaying frames, sometimes milliseconds randomly on launch
11746 Layer preview incorrect when inside a group
11747 Layer router size incorrect when layer taken from a group
11748 Layer # visible in Video Router Input dropdown
11759 Add translated PDF manuals to installers
11764 Precision for setting Playhead, In/out position via left-right arrows on the value control got very low.
11769 Panel on top left, in the same row as LayersAndClips resizes on every restart
11779 Importing still images and 1 frame DXVs does not give uniform duration
11794 Keyboard arrows on selected envelope keyframes are mixed-up
11795 Keyboard arrow moves Envelope keyframe in huge increments
11817 Clips get stuck on loading again
11851 16 Bit DMX Input
11880 Display hotkeys don't work on mac any more
11881 Display hotkeys are missing in the Output menu since 6.0.10
11909 Output can flash white/grey when output is enabled
New Footage Releases by Unit44, Julius Horsthuis and Sebastian Purfurst
As we tend to our alien garden, we start to find patterns everywhere. We experience glitches in our reality, as the bugs take over. And I for one welcome our new insect overlords.
Get DayOfTheArthropod by Sebastian Purfurst
A footage collection of bugs & glitches in a retro vintage sci-fi look.
Get Alien Garden by Julius Horsthuis
This pack is our proof that alien gardeners exist.
Get Pattern 3 by Unit44
Now I am no math scientist, but I am seeing a pattern here.
In Pursuit of Secret Mapping Experiment
Imagine you are driving (top down, of course) along a pristine coastline.
The wind in your hair, the world at your feet. The sun is setting, the ocean is that perfect blue. The salty air is filling your lungs up with freshness.
You smile coz life is perfect and as you muse out into the water, you see this:

Didn’t your life just get so much better?
[fold][/fold]
Enter: Secret mapping experiment.
Who are they? We don’t know.
What do they do? Some mad cool shit.
How do we get to know them? Well, we tried our best.
So, who exactly are you?
We are artists with alpha channels who try to find the edges.

Yeah? That’s cool.
And, what do you do?
We do projection mapping on buildings, gigantic masterpieces and machines which are now out of use. The concept is to do projection mapping on as many abandoned and closed-down sites as possible, around the globe. This way we would like to rethink the spaces’ (hi)story and architectural details.

We try to redefine and present these buildings and objects (which had an important role and function in the past) with the utilization of the technology of animation and video mapping, shaking and dressing them up.

This way we give the buildings a ’new life’; we can look back into the past and the ghosts can repopulate the spaces that used to be full of life.
IMPORTANT: the exact locations are being kept a secret, so they can remain for the after-ages.
Wow that is some really cool s*@#$*$#@.
And why do you do this?
If you have been working on commercial projects for many years, you need to find a platform where you can experiment with your ideas.
This is a platform where you can play freely and create your own games with your own rules to find your daily excites.
Ah. Daily excites :) Don’t we all need them?

Abandoned buildings always enchanted us. Our aim is to redefine places that had an important function in the past and give them a new meaning with an entirely new perspective with our projection mappings.

We started in 2015 in an old power factory in Budapest.

There are a lot of neglected and forgotten unique constructions and natural formations and we gladly wake them up from their secret past for a night.
What sort of preparation goes into this? From identifying the right projection surface to actually making it happen?
Identifying the right projection surface is a very hard decision. It is very important to get correct photos, keeping in mind the camera angle and the eventual projection angle. We use different technologies, but sometimes we have only one photo of the surface. Sometimes, the surface itself changes, and looks very different in a photo than what it is in reality.
Thankfully, there are good methods that you can use to make 3D plans from photos.

Once this is done, we work out sizes and distances, and figure out the correct projector and lens.

Wow. All of this is pretty complex considering you project on massive icebergs & mountain faces.
Yes. The maximum distance we have had to work with is 300 meters from the surface. That is really far out and so tough to map everything to the right position.


We have got different kinds of projectors. Every surface needs a different technology. You can definitely feel the difference between DLP, LCD or laser systems when you are in the darkness.
We set our projector up in the back of a Pick-up, the top of a wind turbine or on a special trailer that we’ve built for it.
And we solve electricity problems by using sunlight, uranium or even a hamster wheel.

Ah of course. The trusted hamster wheel. Never lets you down.
And so, how long does it take for you to put together an experiment?
Sometimes a couple of weeks, sometimes only a few days. But there is one project that we haven’t been able to finish for two years. We also make interactive stuff that is created on the spot, on the actual day.

Tell us about all the different software you work with. Does good ol’ Resolume feature in your secrets?
We use so many different kinds of software - there is no one and only.
Here is one of our usual dialogues:
“F***! it’s almost midnight and we haven’t started yet!”
“Switch the damn projector on and map this shit on it. You can do this!”
Enter Resolume.
“No problem. 5 minutes.”
“Don’t forget the MASKS!”
“OK”

Haha. Just great! Good to know :)
Tell us about some of the challenges you face doing this. I’m sure there are many.
Wow. That is a good question- because every project is a challenge.
We need to do the projection during the night, in the dark, and immediately solve unexpected issues. A lot of times, we haven’t got permission. So, what we are doing is highly risky.

There is a project we have already tried twice. We travelled 2000 km with all the equipment. It is our dream spot: the top of a mountain, which seems to be always cloudy or rainy. It doesn’t matter if it’s summer or winter. The mountain just doesn’t let us do the projection. But we will go back every year and try to realize our dream. We never give up!

That is really inspiring. Before you go back into hiding, anything to keep our eyes peeled for?
We want to step into the sky and come down. Finish the top of the mountain. But there are thousands of venues that need secret mapping all around the world.
If you know some of them, please send your suggestions to secretmapping@gmail.com.
Aaaaand, let the inbox spamming begin!
Thank you for talking to us Secret Mapping Experiment. Boy, are we on the look-out for you.
Check out more of their amazing work here.
Interview Credits: Daniel Besnyo, Founder, Secret Mapping Experiment
New Footage by Video2000, Laak and Julius Horsthuis
Let's go on a journey of exploration and discovery. Explore this world's mythical tribal cultures with Laak's Ethnik 2. Go beyond this world and explore alien alchemy with Julius Horsthuis' FractalGold. And explore inner space with Video2000's zen-like meditations on minimalism, Cont.
Get Cont by Video2000
A minimal pack for the maximum of the effects.
Get FractalGold by Julius Horsthuis
Fractals look different every time you look at them. Like looking at yourself in the mirror.
Get Ethnik 2 by Laak
Start packing, cause we are going on a journey!
Resolume 6.1.0 Sync to the DJ
The 6.1.0 release is a major point release, which means we added a big feature. As we announced a few weeks ago, we made it a lot easier to sync a prepared video set to the audio coming from the DJ.
If you work with a DJ that mixes on a Denon setup, you can sync every video in Resolume to every track on the players. You can read all about that in the manual. Suffice to say, this will make banging out those DJ intros and special show moments a piece of cake.
We also have a handful of bug fixes. If you're using multiply a lot, make sure you peek at the warning below! Otherwise, just hit that download.
Multiply Mixer Warning
We fixed the jump cut when ejecting a clip from a multiply layer using a transition. The multiply blend now works the same as in Photoshop. So far so good! This also means that when using the multiply blend in combination with content that has transparency, the visual result will be slightly different, also in existing compositions. If you prefer the old look, you can disable the alpha channel in the content.
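The behavior change boils down to how multiply treats a transparent top layer. A minimal sketch in Python of a transparency-aware multiply on single-channel values (illustrative only, not Resolume's actual rendering pipeline):

```python
def multiply_blend(bottom: float, top: float, top_alpha: float) -> float:
    """Transparency-aware multiply: where the top layer is transparent,
    the bottom layer shows through instead of being multiplied to black."""
    blended = bottom * top
    # Mix between the untouched bottom and the multiplied result by alpha.
    return bottom * (1.0 - top_alpha) + blended * top_alpha

# A fully transparent top pixel leaves the bottom unchanged...
assert multiply_blend(0.8, 0.0, 0.0) == 0.8
# ...while a fully opaque top pixel gives the familiar multiply result.
assert multiply_blend(0.5, 0.5, 1.0) == 0.25
```

This is why content with an alpha channel now looks slightly different: transparent areas no longer darken the layers below, which matches how Photoshop composites multiply.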
Hotfix! September 20th.
The Opacity fader of a layer was not working correctly when it was playing a SMPTE synced clip. Fixed in Arena 6.1.0 rev61231. Simply download Arena 6.1 again to get this latest revision.[fold][/fold]
Fixed
#11629 (closed) Make multiply mixer transparency aware
#11598 (closed) DMA Textures crashes with HAP q alpha
#11405 (closed) Crash loading certain TIFF files
#11609 (closed) Hue rotate after Iterate or Fragment not processing pre multiplied colors correctly
#11574 (closed) Fix Rings generator not outputting correctly premultiplied colors
#11573 (closed) Fix LineScape generator not outputting correctly premultiplied colors
#11571 (closed) Fix Lines generator not outputting correctly premultiplied colors
#11569 (closed) Fix checkered generator not outputting correctly premultiplied colors
Resolume Arena with Denon DJ StageLinQ Integration
Touch down in Philadelphia, we’ll be at DJ Expo in Atlantic City together with Denon DJ this week to show the integration of Denon’s StageLinQ protocol into Resolume Arena. StageLinQ enables Arena to automatically video sync to Denon’s Prime DJ players and mixer.
Arena can not only play video in perfect sync with the playing audio tracks but it will also recognise the songs being cued and automatically start the corresponding video. Just plug in the ethernet cable and you’re ready to go. The DJ has full creative freedom, scratching, pitching, looping and firing cues, the video in Arena will stay in perfect sync.
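Conceptually, this kind of sync reduces to mapping the position the player reports onto the video's playhead. A hypothetical sketch (the names here are illustrative, not the StageLinQ protocol):

```python
def video_frame_for(track_pos_s: float, video_fps: float = 30.0) -> int:
    """Map a DJ player's reported track position (in seconds) to a video frame.

    Scratching, looping and pitching all simply move the position the player
    reports, so continuously following it keeps the video in sync."""
    return int(track_pos_s * video_fps)

# 12.5 seconds into the track, a 30 fps video should be showing frame 375.
assert video_frame_for(12.5) == 375
```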
We’re very excited to finally show the Denon DJ StageLinQ integration in Arena; we think this is a big step forward for synchronised shows.

Come and visit the Denon DJ stand at DJ Expo. We (Edwin and Bart) will be there for demos. Resolume Arena version 6.1 with StageLinQ integration will be publicly released in a couple of weeks.
Resolume 6.0.11 Gotta go fast!
Like beating Super Mario Bros in under five minutes, Resolume 6.0.11 is all about shaving off those milliseconds and finding those hidden shortcuts.
With the 6.0.11 update, you can play even more videos at higher resolutions than before. We dove under the hood to see where Resolume was spending most of its time, and improved that. A 2013 Mac Pro can now comfortably play 14 layers of 4K at 60 fps. Considering it had trouble reaching 30 fps with the same load before, that's a huge improvement!
Take a warp zone to download the update via the app or via the website, or take your time and get nerdy with all the techy details and the full fix list below.[fold][/fold]
So how does it work?
There is a new option called DMA Textures in the Video Preferences. By turning this ON, Resolume can pass textures to the GPU directly. This results in significantly improved performance.

There are some caveats, so by default it's still turned OFF. You can let Resolume detect if your computer supports DMA by setting it to Autodetect. There are a few edge cases where it might work, even though we can't detect it (the Mac Pro with its dual GPUs for instance). In those cases, you can force Resolume to use it by turning it ON.
When Resolume doesn't correctly autodetect the setting (so turning it ON improves performance, but Autodetect doesn't), or if turning it ON gives you crazy results, let us know!
For all of you that love your numbers, turning this option ON also gives you a nice little system statistics display on the statusbar as well.
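The three-state preference described above can be summed up as a small decision table (a hypothetical sketch of the logic, not Resolume's source):

```python
def use_dma_textures(setting: str, hardware_supported: bool) -> bool:
    """ON forces DMA texture uploads, OFF disables them, and
    Autodetect falls back to what the hardware probe reports."""
    if setting == "ON":
        return True         # user override, e.g. the dual-GPU Mac Pro
    if setting == "OFF":
        return False        # the safe default
    return hardware_supported  # "Autodetect"

# Forcing it ON wins even when detection fails.
assert use_dma_textures("ON", False) is True
assert use_dma_textures("Autodetect", True) is True
```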
New
#10782 DMA Textures option in Video Preferences
Fixed
#11410 Possible crash adjusting fixtures
#11131 Clip flashes last played frame on auto pilot launch
#11316 DMX inspector is initialized with random channel values.
#11333 Only write effect and ass preset files when they have changed
#11376 Resolume can have trouble starting without a valid audio output device.
#11139 Elgato game capture can't be opened
#11116 Alley name missing next to windows taskbar icon
#11403 OSC absolute values are broken
#11334 Launching Selected clips with Enter works only once
#11236 Layer Auto pilot on Random always launches the first clip in a deck you switched to
#11381 Possible crash on start with Intel(R) HD Graphics 4000 and windows 10
#11413 Tiff files with accented characters in the file name can't be added to the composition
#10364 Gifs with alpha have Alpha channel button disabled (greyed), switching off other channel removes alpha
#11425 Stingy sphere is black inside
#11485 Slice routed layers/groups duplicates not outputting any more unless previewed
#11351 "This.." target Midi shortcuts vs OSC "Only this..." shortcuts are behaving different
#11369 Clip Drag and drop doesn't scroll nicely
#11406 Possible crash trying to save an empty palette
#11294 Line stride is incorrect on DV capture via AV Foundation
#11415 moving clip In/out points with shift only moves the value of the one you grab
#11384 Cue points are not imported from R5 compositions.
#11455 Crash receiving an OSC message without a leading /
#11385 Midi feedback for multi option objects with multiple shortcuts is broken
#11389 Fix Distance50 mixer not outputting premultiplied colors
#11390 Fix PointGrid effect not outputting premultiplied colors
#11388 Fix DotScreen effect not outputting premultiplied colors
#11386 OSC ../connectspecificclip and ../connectspecificcolumn no worky over int 1
Chasing Greatness with Sandy Meidinger
On a fine sunny afternoon in 2014, Joris de Jong was holed up in front of his computer, of course. Apart from a full-time job serving coffee at Resolume HQ, he moonlights as a video operator. And that day he was mighty frustrated.
Joris had gotten sick of customizing and then rendering the same content, over and over again, for every show he played, with minute changes in timing and position. “There has to be an easier way!” he thought to himself, sipping on below average coffee that he had not brewed.
And so, Chaser was born. [fold][/fold]
What is Chaser?
Chaser is a plugin that serves up the perfect solution for VJs who play a lot of different shows and do not have time to render custom content for each one. It makes a job that would take hours happen in minutes. It enables you to create chase effects, screen bumps and sequences based on your input map (That’s right, INPUT) in Resolume.
Chaser converts the slices you create in your input map to “buttons” that you can toggle on & off- and so create different chase effects & sequences. Read all about the why’s & how’s here.
Once you are ready with your different sequences, you can apply chaser as an effect (to your composition, layer or clip) in Resolume and voila! You’re ready to chase that kill.
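The core idea of a chase sequence, toggling slices on and off step by step, can be sketched as a list of on/off masks (hypothetical code, not Chaser's file format):

```python
def chase_steps(num_slices: int, width: int = 1):
    """Yield one on/off mask per step, sweeping a band of `width`
    lit slices across the input map -- a basic chase sequence."""
    for start in range(num_slices):
        yield [start <= i < start + width for i in range(num_slices)]

# A four-slice chase lights one slice at a time, left to right.
steps = list(chase_steps(4))
assert steps[0] == [True, False, False, False]
assert steps[3] == [False, False, False, True]
```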
Which leads us to Coachella 2018.
Visual artist Sandy Meidinger, on duty for Illenium, served up slices as delicious as grandma’s black cherry pie. She diced that LED up nice and fine and thoroughly used (and abused) Chaser- to its full potential.
Thank you for talking to us Sandy.
Let’s start from the beginning. How did this visual journey begin for you?
In 2012, I was finishing up my undergraduate degree in Graphic Design and I had to take an After Effects class. During the first weekend of the semester I went to a rave and noticed the videos on the LED screens looked like they were made in After Effects. That night I decided that I was going to learn how to do that, and so I did it.
Living in Southern California made it easy to connect with other VJs. I’ve spent the majority of my career as the house VJ at Create Nightclub in Hollywood thanks to V Squared Labs but it was the word of mouth among the artists that got me my job with Illenium.
So, what is working with Illenium like? Tell us about his show & his set at Coachella.
I love working with Illenium. I work very closely with him and his team, and over the past 18 months we’ve become like a family. They care a lot about what the visual show looks like, which makes my job even better.
We run two shows now, a DJ set and a live show. The DJ set is Illenium on CDJs and me mixing and triggering the videos by ear. The live show, which we performed at Coachella, is run by Ableton. For the visuals, Ableton sends MIDI to Resolume. I’ve used this system for about 40 shows without fail.
Coachella was one of the later shows using this system, so almost all of the show had already been created. We added some new content for some new songs but the main thing I had to worry about was mapping the 2 x 4k outputs. I was able to upgrade my machine to one with a GTX1070 before the show.
What made you start using Chaser & what has it made easier for you?
I started using Chaser in its very early stages, during the release of Resolume 5. I remember reading the manual and being fascinated by the input mapping. Everyone I knew at the time had been using Layer Routers to route slices, and I was never able to fully understand or practice it to incorporate it into my show.
The input map made a lot of sense to me and I haven’t looked back since. Up until very recently, Chaser was the only mapping tool I used for many shows and I still use it on its own for stages with smaller outputs.
And so, we come to Chaser & Coachella. Give us all the juice, please.
Here are the video map & pixel maps of the Sahara Tent at Coachella 2018:
Since the majority of the live show is run by MIDI from Ableton, I am able to focus more on mapping and how all the content fits on the stage. For extra-large stages I use a combination of the Mapper plugin from Dvizion as well as Chaser.
Mapper handles the overall placement and look of each video and I use Chaser for some extra flair. One way I organize my looks through Chaser, is to create an extra screen that is not outputting for each look. This gives me room to play around while knowing that I will not be messing with anything on the output side.
There is a point in the show where I flash the Illenium logo in a grid that is formed by the design of the LED panels.
Because of the 2 x 4k outputs, I had A LOT of pixels to work with. I ended up with 473 slices across the whole input map. If I could redo it, I would increase the scale of the grid, because with that many slices the eye loses the pattern too quickly for the amount of time I use this part.
Other looks I create with Chaser make the content flash randomly panel by panel, or split the screens in half and flip one side to create a mirror effect.
I also use it to map the LED for our hologram DJ booth.
What is the hologram DJ booth?
The DJ booth is an acrylic structure with 3x2 6mm panels on the bottom that reflect onto a transparent film. This creates a "Pepper's Ghost" hologram effect.
We bring the DJ Booth with the live show as often as we can but because of its size it doesn't always work with the festival stage setup. Most of the time it is run from the third output of my laptop and in Resolume I have it on its own layer which is routed through Chaser. The clips are triggered through MIDI by Ableton the same way the rest of the show is.
Did any issues creep up on you while programming? How did you deal with them?
Most of the programming for the show was done at home. Since I use input maps, I had a good idea of what the content was going to look like before I got on site. I had zero issues using my map during load-in and the show. I was even able to finish my programming on site in less than an hour thanks to Chaser & Mapper.
The only issues our show had were with running our network on the VLAN over fiber, and with the Ableton MacBook Pro overheating in the sunlight.
Sigh. I can’t even count the number of “MacBook Pro overheating” situations I’ve heard of.
And so, tell us about your rig. Anything on your wish-list?
For Coachella, I was able to upgrade my two-year-old 15” Sager with a GTX 980 to the new 15” NP9155 with a GTX 1070. This machine runs perfectly with my setup of running my input map through Chaser & Mapper. I was able to test 3 x 4K outputs with my composition size of 4850 x 1200 and still got 60 fps.
One thing I’m looking forward to doing this summer is getting a 2TB PCIe SSD.
And what about your wish-list, software-update-wise?
A feature I would love to see in Resolume is the ability to drag & drop columns. In my compositions, each song is its own column, so I stack the Chaser effects above it. When Illenium changes the order of the set, I have to move each clip individually. This would help out a lot, especially in my DJ-set show file.
For Chaser, being able to select multiple slices with something like a marquee tool would be a huge time saver for me. The new update with exporting the input map as a PNG will definitely help me out for the large stages.
Finally, please drop some slices of wisdom for our budding Chaser users out there.
Just like anything you learn for the first time, it takes practice! It takes a moment to wrap your head around the concept of the input map, but once you figure it out the possibilities are endless.
The Resolume crew loves the fact that you recognize and appreciate the value of Input maps, Sandy. Keep up the great work.
For everyone who is interested in learning about input maps and other cool things you can do with Arena 6, check this video out:
It’s time to go chase those dreams, eh?