In the last blogpost, we spoke to Max about the process of content development for his AV album “Emergence”. In this second part, we dig into his equipment, live setups, life philosophies & much more.
One of the things we have been curious about is how his rig “flows” live. It takes a sweet mash of hardware and software to achieve a perfect sync and, at the same time, the flexibility to freestyle.
Let me explain it from an information-flow perspective.
First, I have my MIDI controllers, the APC40 and Lemur on iPad, and sometimes the Novation Launch Control XL, mainly for when I’m doing surround sound and/or Aether live shows:
In addition to my usual visual setup, I send MIDI control information into Ableton in order to launch clips, trigger percussive sounds, work with glitch effects, delays, reverbs etc., and to work with EQs and filters – all the normal Ableton Live controls. I also send MIDI to Ableton for some visual-only controls, such as my effects matrices, whereby I can assign any combination of many different visual effects to link to the cutoff frequency of one particular filter, for example.
All of the visual controls for my live show arrive via Ableton and OSC over ethernet cable, whether they actually do anything to the audio or not. This allows me to continually work on the audio-visual interface, so that I can constantly try to improve the link between the audio and visual.
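At its simplest, the “Ableton and OSC over ethernet” pathway means encoding a parameter value as an OSC message and firing it over UDP to the visuals laptop. Here is a minimal sketch of that idea in Python using only the standard library; the address path and port are hypothetical (Resolume’s actual OSC addresses depend on your composition and preferences), and this stands in for what Max for Live devices do in Max’s rig:

```python
import socket
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC 1.0 spec."""
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address, ',f' type tag, big-endian float32."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

# Hypothetical address and port -- in practice the target would be the
# visuals laptop's IP on the ethernet link, and a port set in Resolume.
packet = osc_message("/composition/video/effect1/opacity", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 7000))  # UDP is fire-and-forget
```

Because UDP has no handshake, the audio laptop can stream control changes at audio-rate-ish density without blocking, which is what makes this kind of live audio-visual linkage responsive.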
I’m always thinking – “OK, I want to do this particular glitch effect or audio transition with a delay, or whatever, but how should that particular sound look?”
Then, the next challenge is to figure out how I can make it work in Arena.
Luckily for me, Arena has a lot of effects and modulation options, so I’ve managed to find some nice mapping techniques which are in line with the concepts I’m trying to show – how simple building blocks come together to create complex, beautiful outcomes; in other words, emergence. This is a very old video about this, but hopefully still relevant:
There is another, more practical, reason why I send all my controls through Ableton en route to Resolume: I can use Max for Live devices to map the control curves. It may be that I want a particular graining effect to come in as I filter in a sound, but maybe a 1:1 mapping of the filter cutoff to the grain fade parameter doesn’t quite work. In fact, what I found was that 1:1 mappings rarely felt natural. So, I use hundreds of Max for Live devices for changing the mapping correspondences.
Sometimes a straight line needs to map to a shallow, or sharp, curve; or map to a limit less than the highest value on the receiving end. I use Max for Live’s old API tools for these jobs, although there are plenty more parameter-to-parameter tools out there which do the same sorts of jobs, some where you can draw in the correspondences yourself. I spent ages on this side of the setup, trying to create something I could jam with just like I was playing an audio-only set, with my usual glitching and chopping approaches, but where the visuals would also follow in sync and in style.
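The curve-and-limit reshaping described above can be captured in a few lines. This is a hypothetical sketch, not Max for Live’s actual API objects: an exponent bends the straight line into a shallow or sharp curve, and an output ceiling caps the receiving parameter below its maximum.

```python
def map_curve(x: float, exponent: float = 1.0,
              out_min: float = 0.0, out_max: float = 1.0) -> float:
    """Map a normalized controller value (0..1) onto a response curve.

    exponent > 1 gives a shallow start and sharp finish, < 1 the
    opposite; out_max below 1.0 limits the receiving parameter's range.
    """
    x = min(max(x, 0.0), 1.0)                  # clamp the incoming value
    return out_min + (x ** exponent) * (out_max - out_min)

# A 1:1 mapping vs. a shallow curve capped at 60% of the effect's range:
linear = map_curve(0.5)                              # 0.5
shaped = map_curve(0.5, exponent=2.0, out_max=0.6)   # 0.25 * 0.6 = 0.15
```

With a table of these per-parameter settings, one filter sweep can drive many visual parameters, each along its own curve, which is roughly what the “hundreds of Max for Live devices” are doing.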
That is really interesting. Tell us what made you start working with Resolume. Are there any features that you particularly like? Anything you would like to see more of?
I came to the software with little experience of using visual tools and I found it a pleasure to use, and a very powerful tool for my live shows. If I wanted to do something, I could pretty much do it.
It has mainly been the suite of effects that has enabled this. I have about 70 different effects on my composition channel that I can quickfire-trigger live for beautiful fun glitch mayhem on top of the video renders, which already contain plenty of their own glitch:
I’m also now doing more and more multi-screen immersive visual shows where I’m projecting onto 3 or more surfaces around the audience, which Arena is amply set up to achieve.
I have to admit I haven’t had time to try Arena 6 yet, and I know there is a new Ableton communications technique, which may open some doors for me. The one thing I’ve struggled with in the past has been getting a consistent and tight sync between Live and Arena, which may well have been solved with Arena 6 already.
Oh I’d definitely like to see more effects! I love my visual effects, and I’ll use as many as you can provide, all at the same time until it’s a right nice mess.
Boy do we love a good ol' effects mash.
Tell us a little more about your controllers and glitch creators. How do you manage to intricately control the effects and glitches in the visuals with the audio?
I’m using Lemur to trigger glitch sounds like live drumming, and each different sound triggers a different visual effect. The pathway runs from MIDI controller to Ableton Live, then out as OSC triggers via Max for Live mapping devices and the Resolume parameter forwarder, over an ethernet cable between the two laptops.
Then, I also have the cutoff frequencies of filters on glitch sounds linked to glitchy visual effects, so that I can smoothly introduce audio-visual glitchiness in addition to the sharp glitchiness of the live Lemur drumming. And I can assign many different combinations of visual effects to a single cutoff frequency, so that I can do similar audio glitching with very different visual glitching effects.
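That one-to-many assignment – a single cutoff frequency fanning out to a whole combination of visual effects, each with its own response – can be sketched as a small routing table. The effect names and scalings here are made up for illustration; they are not Resolume’s actual parameter names:

```python
# Hypothetical effect names and scalings -- a sketch of linking one
# filter cutoff to a combination of visual effects, each shaped differently.
EFFECT_MATRIX = {
    "grain_amount": lambda cutoff: cutoff ** 2,             # eases in slowly
    "displace_mix": lambda cutoff: min(cutoff * 1.5, 1.0),  # reaches full early
    "strobe_rate":  lambda cutoff: 0.3 * cutoff,            # capped at 30%
}

def glitch_targets(cutoff: float) -> dict:
    """Fan one normalized cutoff value (0..1) out to every linked visual effect."""
    return {name: curve(cutoff) for name, curve in EFFECT_MATRIX.items()}
```

Swapping in a different `EFFECT_MATRIX` per track gives the same audio gesture a completely different visual character, which matches the experimentation described below.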
I know particular tracks and videos are better suited to one or another type of effect, or combination of effects, and every show I experiment with these combinations to find little tricks for each part.
Tell us about your Studio. What’s on your wish list & anything in there that you would like to change/ upgrade?
At the moment, I’m mainly all about my Dave Smith Instruments and loads of random guitar effects pedals. I used to do everything digitally though, so I’m not on the analogue bandwagon – just enjoying the wagon for now. My staples are the Prophet 6, the Prophet 08, the Juno 6, the Moog Sub 37, the Moog Minitaur, and still plenty of Henke’s trusty Operator for soft synth sketches, plus plenty of NI software – Absynth, Guitar Rig, Razor etc.
Pedal-wise, I’m loving my Fairfield Circuitry units I discovered on a recent Montreal trip, and have been putting the Meet Maude and Shallow Water to lots of use.
I love the classic Roland RE-201 Space Echo tape delay too, and the Moogerfooger Ring Mod and MIDI MuRF. And for full-on analogue pedal mayhem, the Industrialectric DM-1N and Echo Degrador, and the WMD Geiger Counter. And the Strymon BigSky for a beautiful plate reverb simulation.
As for what I want to have – a Jupiter 8! But I can’t afford it; it’s gotten ridiculous how much they’re going for. So, I’m mainly focused on finding unusual pedals and experimenting with pedal combos.
My most recent upgrade was to the Genelec 8050s from the 8040s – they’re lovely monitors in my opinion, nice and full and soft and round, both physically and audibly! That’s why I upgraded directly to the next model.
Sweet. That rig sounds nice and heavy.
And finally, any pearls of wisdom for our budding AV artists out there?
I spend most of my time reading science and philosophy books rather than listening to music or reading about work in the arts. It’s those ideas which are the starting points of most of my new projects. The same goes for my video briefs, I’m mainly just trying to convey what I think is exciting and inspiring about a particular idea, with the hope that a video artist might share some of my thoughts and feelings.
For me, too much of the AV and computational art scene is based around the endpoint aesthetic, just making something look cool for the sake of it. The same goes for music. That’s why I’m trying to work with ideas that I love for a more meaningful reason, to enrich the process, harness the inherent value of nature, push me in new directions creatively, and so that I can use each project to learn more about the world.
So, to answer your question more succinctly, I don’t use storyboards most of the time, but instead just try to put across the ideas and feelings I want to convey so that the video artist can express themselves with plenty of room for experimentation. That approach also lends itself well to the sorts of ideas we’re working with, which are often abstract and without the need for characters and traditional narratives.
And my suggestion to help people grow as artists would be to find what it is that makes you, you. Art is a process of making that tangible, and everyone is different, so you can find your niche by being honest with yourself.
So well said.
Throughout Emergence, Max’s love and understanding of science is so evident. There is such a beautiful balance between hard scientific data visualizations and artistic representations of scientific theories – it’s really the sweet spot between hardcore science and artistic interpretation.
And so, before we sign off, Max, we must ask you: what comes first for you – science or art?
I’m glad you mentioned that it is “artistic representations”, as sometimes it can sound too much like a science lecture, which it absolutely isn’t. It’s about the art hiding in science, with plenty of artistic interpretation and maximal artistic license applied to scientific ideas. I actually did a lecture about all of this recently, which is online here:
It’s been a fun process and I can see that there is a lot more potential in working with these sorts of links between fields. I won’t be adding to the Emergence project specifically; instead, I’m working on some new wide-ranging concepts which drive music and visual creation, and my live shows.
Lots more to follow soon about those projects – if you want to find out as they arrive, then drop your email onto my website and I’ll send you previews of each project as it comes.
Also, one final note, all of the collaborations, credits and ideas, along with stills and videos, are on the Emergence mini-site here.
Photo by Alex Kozobolis
Speaking to Max about science, art, his thoughts & everything else in between has been nothing short of inspiring.
As our good friend & avid Resolume user Albert Einstein says, “Imagination is more important than knowledge.”
So, go ahead! Imagine. Create. And, of course, tell us about it.