
Inter-Interview with Sean Bowes

A while back we saw a pretty awesome Slice Transform tutorial by Sean Bowes.
This is how we got introduced to this talented VJ, content creator and animator.
While digging through his YouTube channel we found many cool interviews with familiar faces like Laak, Joris de Jong and Sandy Meidinger.

So we decided to flip the script and ask Sean if we could interview him, and here we are.



Hey Sean! Thanks for taking the time for this interrogation. Tell us your story, how did you get into VJ’ing?

I have always had an interest in art, technology, and music - always drawing as a kid and filming skits on my parents' camcorder with my siblings. That evolved into an interest in graphic design as I progressed into high school. After high school, I decided to pursue a BFA in Graphic Design and Studio Art.

During that time I started to see projection-mapped visuals at concerts. Specifically, I remember seeing EOTO at the beginning of 2012, and being really impressed with the design and animation of their new Lotus Stage design. The next day I looked up who was behind the visuals of that show and discovered the AV Artist Zebbler. I sent him a few emails with questions about projection mapping and how he approached making that type of content so I could try to do it myself. He set me further on the path of exploring Cinema4D and also introduced me to Resolume.

In 2015 I heard about a VJ Competition that was happening as part of the Together Festival in Boston. I had been playing with Resolume quite a bit in my free time but had never performed outside of my own bedroom before. I entered the competition and ended up winning first prize! That introduced me to more VJs in the area and opened some doors for me to start performing at some clubs in the area.



After your initial introduction into the world of VJ’ing you quickly started landing bigger gigs and festivals. Bigger gigs means more responsibility and art direction. Do you prefer the “anything goes”-vibe at clubs over art directed festivals and concerts?

I actually enjoy a mixture of both. I am a fan of variety.
I always like working on something new, and working in new situations with new people. So it is fun to switch gears between creating from my own ideas, and helping to execute someone else's. It can be refreshing and inspiring to see an idea that someone else has come up with and help to bring it to life with some of my own creativity, and I am often given access to opportunities that I wouldn't necessarily pursue on my own.

That is also why I enjoy being an animator / VJ / YouTuber. Moving from creating content, to mixing content, to educating others keeps me feeling inspired and helps me to continue learning and to think about my process in new ways.

Funny that you mentioned YouTube as we’ve been binging your channel at the Resolume HQ. How did you get started with YouTube and how does this affect your career as a VJ and animator?

I’ve had a YouTube channel for several years where I have intermittently posted videos about trips I’ve taken, daily life vlogs, and some tutorials here and there. Its current form started around the end of 2019 when I made a Getting Started in Notch tutorial. It was a piece of software I was excited about, but some of my peers seemed confused about how to get started or confused about the pricing. So I made a video that attempted to get them over the barriers of that first attempt, and a few friends watched it and liked it. I liked that feeling of actually helping people more than the usual flame emoji comment on an Instagram post.

Then Covid-19 hit and I wanted to make stuff even though I couldn’t go out VJ’ing anymore. I made a couple more tutorials and I started doing interviews with people from the VJ scene.

The interviews didn't get as many views as I think they deserved; I guess that's because the subject appeals to a very niche audience of professional VJs, and my channel was quite small at the time. So after about a dozen of them I decided to focus on tutorials, which attract more new viewers because they are more searchable. Eventually, I would like my channel to evolve into a comprehensive guide to getting started with VJ’ing. This is also why I go beyond making Resolume tutorials and cover landing gigs, buying a computer and other software packages. That has helped grow the channel and bring more eyes to the interview series as well.

I’ve met a lot of people as a result, from people I’ve looked up to and now have the privilege of interviewing, to new VJs who are excited to enter the field, and even some new clients. That's what I love about YouTube, you are building a community that you can interact with.



We are very happy to see a content creator making videos about VJ’ing in general.
Professionalization of the VJ profession is a noble cause indeed. What would you say beginner VJs need to learn to make the step up to the big leagues?


From a technical point of view: You should know the Advanced Output inside and out. Big stages come with lots of creative and technical challenges, so having a solid understanding of all of the mapping tools and techniques will really help you. Also, learn to work with cameras, capture devices, and video routers. Eventually you also want to have a basic understanding of timecode as bigger artists tend to incorporate that into their shows in some way. This one sounds basic - but an understanding of color theory for matching visuals with lighting is another essential skill.

From a professional point of view: Learn how to work with the lighting guys and art directors. Work with people, and be enjoyable to work with. Coordinate transitions, build-ups and drops with the lighting guy. You can start practicing that at small club gigs. Be humble and willing to learn from other people, and if you don’t know how something works: just ask.

From a gear point of view: Stress-test your system and make sure your machine is up to it. Create a MIDI-mapping and familiarize yourself with it. Practice, practice, practice.



Let’s talk Resolume for a bit. What Resolume feature was a game changer for you?

Groups, definitely groups.
Separating your content into groups makes larger projects more manageable.
I separate my groups by task: a group for content, a group for logos, a scenic group and/or a group for advertisements.

Separation into groups really helps when you are working with content that has different mapping requirements, or needs to be always on, or manipulated separately. You don’t accidentally mirror the event logo or apply glitch effects to ads.


Thank you so much for sharing your story, Sean. We are looking forward to new interviews and tutorials.

PS: Like and subscribe! And something with a bell.. And share.. Something! Am I doing this right Sean? …Sean?

Resolume Workshop @ St. Joost Art Academy

Last week Resolume went to the St. Joost School of Art & Design in the city of Den Bosch to share some of our Resolume knowledge with the students there. VJ Liza Renee invited us to collaborate with her on a media art-focused Resolume workshop. We are always interested in using Resolume for more than VJ’ing, so we got hyped and joined in on the workshop.

We started out with teaching the basics: composition, clips, layers, sources, effects and more effects and more effects and then some more effects. After that, we continued exploring the basics of projection mapping as most of the students were interested in this subject.

DSC00504.jpg

After the students got to grips with the basic operation of the software, they each went their own way in making an installation. The focus here was on experimentation instead of production, which led to a chill vibe in which the students could freely create while having support from us and Liza.

We had an amazing day with the students and saw some interesting results.
Each artist took their own approach to making their work, some used Arena to mix their existing video work, others jumped on projection mapping and there was even some experimentation with Wire.

We are looking forward to doing more workshops in the future.
If you are interested in a Resolume workshop at your educational institution feel free to contact us at mail@resolume.com

DSC00503.jpg

DSC00500.jpg

Visual storytelling with STVinMotion

STVinMotion is the globe-trotting company of Steve Kislov and Nadya Abra. They produce footage packs and tutorials while traveling the world. We caught up with Steve to talk about his work, visuals, tutorials and travels.

Steve hosted a workshop on our Youtube channel. The footage used in the workshop can be found here and the masks used can be found here.



Batman has a great origin story, what’s yours?

I’m originally from Israel. In 2000, after travelling and losing myself at the best psytrance festivals around the world, I returned to Israel to find out that a high school friend (VJ Masterdamus) was VJing and creating better visual experiences than what I saw at some of the top festivals abroad. I started helping him out, just out of sheer enthusiasm.

When he suddenly relocated to Australia, I had to choose if I wanted to let VJing go or become a VJ. Easy choice! I bought Masterdamus’s computer and he taught me how to create VJ content through video calls. Together, we performed at some of the best psytrance festivals abroad - the circle was completed, VJ STV was born.

The next step in my journey was STVinMotion which I founded together with my partner Nadya Abra. When we decided to leave Israel in favor of a nomadic lifestyle we added the "inMotion" to the name, symbolizing the motion of visuals and the motion of travelling from place to place.

I have retired from actively performing - in favor of focusing my attention on content creation and teaching. You could say that these days we perform vicariously through VJs who use our content.


Vunk Concert, Romania. Live Visual by Bobo Visuals

Can you tell something about your creation process? How does a loop pack come to be?

I get inspired by seeing artworks by others. Browsing Instagram or Pinterest and seeing beautiful art can really light my brain on fire. I’m an avid collector. I arrange what I like into themed boards (say, Jungle or Aztec).

When it’s time to create our next VJ pack we have themes and references already lined up. We search for an overlap between what we want to create and what VJs need. The key is making content that has the strongest impact on the audience.

We take the inspirational material and we start brainstorming about what we can make.
We look at the results as if this was a puzzle - looking for connections. The visuals have to connect conceptually and visually. The clips should tell a story and be pleasing to compose. Next, we create a couple of loops, take them into Resolume and see how they mix. We move elements around to see how they compose and react to effects. We try to have a good blend of Background Clips and Element Clips that can be mixed by the users.

From here on it’s an iterative process. We create more loops and try them in Resolume. This process is repeated until the VJ pack is complete.

From a technical perspective: We use mostly Element 3D inside After Effects. We sometimes use Illustrator for design, Cinema 4D for hard surface modeling and Daz3D for characters - all are imported into Element 3D (as paths, OBJs & OBJ Sequences) and are composed there for the final render.


Steampunk Show by ULA Projects, Russia


You guys are living nomadically, how does this influence your content?

We travel slowly, spending a couple of months in each location so we can soak up the place and its ambiance. We expose ourselves to the local artwork and customs. It changes our point of view and evokes inspiration. Travelling is an experience that takes us out of our comfort zone; it forces us to adapt to the environment and be flexible. The challenges we face help us grow and keep changing. When we reach a new place we meet new people, culture, cuisine and music.

For example, the OrnaMental VJ pack was inspired by Indian Mandalas that we saw locals painting with colorful powder in front of their doors. The Cyborgasm VJ pack idea came while practicing Vipassana Meditation in Thailand and realizing that our reactions to the world are kinda automatic - we might already be cyborgs!


We see Eastern themes, steampunk, architecture, symbolism and tribalism in your content. This sets you apart in a VJ world where abstract visuals are dominant. How did this come to be?

Having themes in our packs is a deliberate choice and its purpose is to create a stronger experience. When a VJ mixes visuals, one after the other, the audience searches for a connection between them, a story and a meaning unfolds. When all the visuals are abstract, there’s no story or connection between the visuals and after some time the viewer might become indifferent to the show.

We want VJs to become an integral part of any event by introducing themes and providing interesting content, thus becoming valuable artists rather than those who merely fill the void on the screens.


Steve's Expanse video is showing off some travel-inspired content


You have a great YouTube channel with a wealth of knowledge. Why did you start making tutorials?

I started making VJing tutorials because users of our content were not making the most out of it - not layering and compositing it the way we were envisioning it. I realized that not everyone shares my vision and has the same knowledge.

When I released the first OrnaMental VJing tutorial, it became very popular - the enthusiastic feedback from viewers made me realize that people are craving this kind of knowledge and that it makes a positive impact on their performances and careers. I enjoy sharing what I’ve learned throughout the years and helping other VJs to grow and acquire new skills. Of course, seeing the visuals we created fulfill their purpose is a huge part of it, too.

I created a Facebook group that is dedicated solely to Resolume Tutorials. Everybody is welcome to join it, learn from the tutorials there and contribute by sharing tutorials that you made yourself.


Awesome Resolume tutorial by STVinMotion

Thank you Steve for the interview. Make sure to check out the STVinMotion website and YouTube channel. They also regularly show their work process on Instagram and Facebook.

We're Hiring! UI/UX Designer | Senior C++ Developer | Product Manager

Comix-645-min.jpg
We're growing rapidly again here at Resolume HQ, thanks to the release of version 7 and a very exciting new product coming up. We're looking to expand our team and hire a new Product Manager, UI/UX Designer and Senior C++ Developer. Have a look at our open positions below and apply if you qualify, or forward this to anyone you think might be a good match for our team.
UI/UX Designer
Are you a UI/UX designer by day and VJ by night? We have the dream job for you. Join us and design how VJs interact with their visuals.

Resolume is an interface-first company. This means that when we think about a new feature in our software, we first see how it will be integrated into the Avenue or Arena interface. Only if a new feature fits well into the current interface and “feels right” do we start writing code.

As a UI/UX designer for Resolume you need to understand our audience and their environment. Resolume is a live instrument so the interface needs to be very fast and clear to work with. Yet at the same time live shows are becoming more and more complex as stages become bigger. We need to walk the line between complexity and “easy-to-use” every single day.

Responsibilities
  • Create mockups and prototypes of new features before development is started
  • Improve usability of Resolume
  • Modernize UI design of Resolume
  • Score at least 3 goals everyday in our (very passionate) table football matches
  • Expand our Adobe XD design system to quickly create prototypes and deliver the final design


Requirements
  • Experience designing desktop applications
  • Experience as a VJ, DJ or Light Jockey is not required but a huge benefit
  • Fluent in Adobe XD, Photoshop & Illustrator
  • An eye for detail to craft pixel perfect interfaces
  • Full-time (4 or 5 days per week) position
  • Dutch or English speaking
  • Location: The Hague (EU applications only)
  • Salary indication: ~40-52k



Senior C++ Developer
We’re looking to expand our team of 7 C++ developers here at Resolume HQ. We have a lot of ideas for new features in Resolume and we need extra hands and brains to actually build all this cool new stuff. Our team here at Resolume HQ is very diverse with experience ranging from 2 to 20 years in cross-platform software development. This means we can all learn from each other and together come up with the best solution to implement new features and maintain our code.

Are you a team player who wants to join a small and friendly team of developers? Then contact us; we’d love to have a chat.

Requirements
  • 5+ years experience in developing software using C/C++ on Mac OS X & Windows
  • Experience with JUCE Framework (not required but a very big plus)
  • Experience with OpenGL && Vulkan (not required but a big plus)
  • Full-time (4 or 5 days per week) position
  • Dutch or English speaking
  • Location: The Hague (EU applications only)
  • Salary indication: ~40-52k



Product Manager
Developing software requires much more than writing code. Software needs to look good. It needs to work well. It needs to do what its users want. It needs to be self-explanatory yet well documented. It needs to be promoted.

This requires a lot of different disciplines: developers, testers, support, translators and marketeers, and they all need to somehow work together on one product.

Do you know how to juggle all these balls in the air? Come join our team!

You'll be working with our team and our community to determine what we can do to make Resolume even better: what issues need to be addressed first and what features need to be implemented. You’ll be responsible for making sure all those disciplines work together towards one goal: a better Resolume.

Requirements
  • Proven experience in software product management, preferably in the visual design or music production industry
  • Full-time (4 or 5 days per week) position
  • Dutch or English speaking
  • Location: The Hague (EU applications only)
  • Salary indication: ~40-52k



About Resolume
The idea for Resolume was born in 1998. We wanted to VJ with software instead of VHS players because that would allow us to improvise and match our visuals to the music much better. Now, more than 20 years later, Resolume has become the industry standard for performing live visuals. We never imagined this would happen when we started, and we're very grateful that our long-term commitment and determination has brought us this far.

Because we have been doing this for so long, we are not a start-up anymore, which means we feel no pressure to "bring new stuff to market" as soon as possible. We have many customers who perform with our software in front of large crowds, so we feel the responsibility to only release new features and updates when they are truly well implemented and tested.

We do not do any contract work; we only work on our own products, so we set our own (realistic) deadlines. This greatly reduces the stress involved in our work. We work on improving Resolume every day and make long-term decisions with quality in mind.

Here at Resolume HQ in The Hague we work in a small (10+) team where everybody is treated equally and your opinion is respected. This way we learn from each other and find solutions to problems together.

We have a very friendly community of Resolume users who gather on our forum and on Facebook. Experienced users share their knowledge here and new users are welcomed and educated. Our development is very much driven by our community. By participating every day we learn about issues very quickly and can respond to user requests.

Apply
To apply please email your motivation, resume and portfolio to Bart van der Ploeg (bart@resolume.com).
Please do not contact us if you are a recruiter.

Keepin’ It Cool with Comix

COM1.jpeg
Over the past decade, the Comix crew has been making such great content, it is almost comical. A multimedia company specializing in live events, interactive design, motion graphics & film, they have worked with almost everyone in the dance music industry.

COMIX2.jpg
From Avicii to Alesso to Kygo to DJ Snake to Axwell Ingrosso to Swedish House Mafia, the list is a bit endless.

I especially love all the different visual styles & looks they play with. It’s amazing how much variety there is. So, of course, we caught up with Comix to talk about their art, journey and other cool things.

COMIX3.jpg
Tell us how this journey began.

Harry Bird (Founder/Director):
When I was growing up, my father owned a nightclub in the South of England, and the way he used lighting and projectors always interested me. He took a lot of influence from the Velvet Underground, one of the first groups to use projectors at their shows, and I guess his interest rubbed off on me.

COMIX4.jpg
I did a foundation course in arts and design, then followed on to do a degree in Interactive Media at The Arts Institute at Bournemouth. While there I DJ’d for club nights and began promoting my own nights. VJing caught my eye as it slowly began to take off during those years, and I decided to investigate it further.

COMIX5.jpg
I was on the same course as Sam Hodgkiss (co-founder of Comix) and was fortunate enough to meet the foundation of our talented team at university while studying Interactive Media. That combination and utilization of skill sets, I believe, was key to our success.

COMIX6.jpg
After university, and many VJ nights later, Sam and I managed to get a gig on a UK bus tour with Radio 1 DJ Kissy Sell Out. In 2010, we supported an act at Brixton Academy called “Swedish House Mafia” - after the show SHM contacted us and asked us to come along and VJ their headline EDC LA show. Obviously we said yes!

COMIX7.jpg
And the rest is history, eh?
When I look at your work, what I love the most is how dark it is. You do a great job mixing that in with the EDM flavor. Would you say this is your style? Or are you just working at what feels right for the artist?


Tom Brightman (Comix Creative Director):
I would say that it’s a mix of the two. Our main goal when working with a client on new show content is to deliver the show that they want; inevitably, during the creative process, some of our individual creative styles come out in the finished work.

COMIX8.gif
A lot of people come to us as they’ve seen another show that we’ve produced and want something with a similar tone, which is great as it’s obviously a visual area we enjoy working in.

COMIX9.jpeg
However, others approach us as they’ve heard we are good to work with, or trust in our professionalism given the number of high-end clients we have worked with, and they may want something totally different to our usual style, which is also great as it can present a bigger challenge and may result in us having to learn new techniques, software or hardware.

COMIX10.gif
It’s the reason that we use specific designers for specific jobs: if we want to produce the best work, it’s a good idea to utilize each designer’s particular strengths.

COMIX11.jpg
Alongside a lot of our more well-known shows like Alesso and Avicii, we have also produced more upbeat visuals, such as the love story we created for Kygo’s recent ‘Kids in Love Tour’ or the hand-drawn animations we’ve previously made for The Chainsmokers.

COMIX12.jpg
It’s always great to work in these alternative styles as it keeps us on our toes!

With so many artists you cater to, how do you ensure something fresh for each one of them?

Tom:
From the start of each project we aim to establish a unique visual language for each client, independently from one another. Most of us in the studio know the main bulk of our content, which allows us to alter any ideas that may appear similar to another client’s. It’s very important to us not to imitate work from another artist, or any other shows, especially as we have a large group of clients in a similar genre.

COMIX13.jpg
The creative process itself tends to do away with much need for that as we are creating unique projects for individual clients, based on their specific tastes, music styles, ideas and requirements.

COMIX14.jpg
On top of this process generally every client has their own visuals operator, usually provided by us, who will lend their own style to the show, especially as the show organically evolves on the road after rehearsals.

Tell us about your content creation process, from scratch to final render. What software do you work with?

We try to be as open-minded as possible when approaching a new project, never really having the final outcome or process fixed in our heads. The exception to this rule may be a project that is extremely time or budget sensitive.

When designing a show, we usually begin by thinking of the bigger picture: What’s the overall visual language (colours, textures, symbols, film/CGI/mixed, etc.)? What theme do we want to communicate and carry throughout the show? What restrictions do we have? And so on.


COMIX15.jpg
After we begin to get a sense of what’s working and what’s not, we will start to move on to individual looks for songs, forming a mood board with a few routes for the client to choose from, which we can then develop further.

COMIX16.jpg
Alongside this process, we will begin to think a bit about how we are going to produce it, but the level of that thought process varies massively. It’s more important that the client is happy with the core ideas, and as long as we have a rough idea of how we may translate the looks from paper to screen, we will figure the rest out later. It’s worked out well for us to let the creative evolve in its own right and not be too restricted by process.

After that it’s probably a similar process to most productions: we work out whether we need to film anything and whether we need any research and development days, then we begin the CGI process.

COMIX17.jpg
There’s always some technique, hardware or software that we need to learn to complete a job (in the office or on the road), but for us it’s not only interesting but rewarding to be able to learn something new, and as a company we’re lucky to have a team of friendly, clever and competent people capable of standing up to that challenge.

COMIX18.jpg
Software-wise the list could be endless! Pre-production often involves a lot of drawing, post-it notes and Adobe InDesign for mood boarding.

During production we mostly use Cinema 4D, X-Particles, Octane Renderer, Houdini and Adobe After Effects.

Then for touring live we obviously use Resolume, which takes care of pixel-mapping, inputs, outputs and playback.

COMIX19.JPG
Alongside After Effects, Resolume really has been the tool that we have used and loved since day 1 of Comix and I couldn’t see us changing that any time soon, especially with some of the great features that came out in version 6 and the support you guys give to the community.

Cheers to that!
Tell us a bit about the kind of prep that goes into playing massive stages.


COMIX20.jpg
Sam Hodgkiss:
Every show is different, but I’ll talk about a show like Ultra Miami.
This show is unique in that a lot of our clients like to use it to launch their new visual and lighting show, which will then normally tour throughout the festival season in the summer, sometimes with a few additional updates.

We would start by looking back over the previous show with the client and working out how much new content they would like to add. This can range from a few updates if the content is fairly new, to an entirely new show and direction.

If everything goes well, and there are always a few complications, we will get a stage design and pixel map in advance of the show. At this point, depending on time and budget, we would look at creating custom content specifically designed to take advantage of the stage design, add unique moments and do what we can to make the show feel different.

COMIX21.jpg
Once all that is complete, minus the usual last-minute amendments, we head into rehearsals. That’s when the fun part begins: sitting with the client and lighting operator and building the show.

Again, depending on time and budget this can vary from a small previz studio to having the entire stage built in a rehearsal warehouse.

This is where Resolume Arena can shine.

COMIX22.JPG
It allows us to keep the show organized, make changes quickly, drag and drop clips, and add effects and colour corrections right in front of the client. And if there are any last-minute changes to the stage design or pixel map, the Advanced Output makes it easy to fix and adapt the final output with minimal re-rendering.

And, what hardware do you guys work with?

Sam:
Unlike traditional touring, with most DJs we don’t have trucks to transport large cases, and we sometimes fly in hot, straight to stage, at shows ranging from small clubs to large arenas. So in that respect everything needs to be as portable as possible, with no compromise. Below is a list of essentials:

● A high-end show laptop with a minimum of an Nvidia 1070. A lot of new festivals now have dual 4K outputs, and the 1070 allows playback at 60fps with no frame drops.
● A second identical laptop as a backup, also used to create content on the road.
● A MIDI controller of choice with a backup; this is always personal preference.
● Every video adapter under the sun, with spares. These things always seem to go missing.


As every show is different, we also have some more specialized equipment for example:

● Datapath FX4s for those shows that love 4+ outputs
● Multi-camera rigs. A few of our clients incorporate live cameras into their set to add an extra layer to the live show.
● The camera rig is accompanied by capture cards, Roland video mixers and preview monitors, among other things.
● External sound cards if audio input or output is needed


A new addition to some of our shows is a Notch rig. This allows us to incorporate some really cool camera overlays and generative content live into the show.

COMIX23.jpg
When we are lucky enough to have a truck tour, or we need the extra power at a larger show, Coachella main stage for example, we have video server racks that can really push those extra pixels and outputs.

Tell us about your studio, where all that Comix magic is created.

Tom:
We have recently expanded our studio to accommodate the greater variety of work we now produce and our growing list of talented employees. One studio now concentrates on post production and CGI, while the other serves as a live film and rehearsal space and a test area for interactive work.

Rig-wise, in the studio we all use PC towers running Windows with at least two graphics cards (Nvidia 1080 Ti + Titans), a decent amount of RAM (32-64GB) and a good processor (Intel i7/i9), plus tonnes of storage on a server for resources and backups. We also have render nodes with Nvidia 1080 Ti graphics cards for when the 3D work needs extra muscle thrown at it. It’s also great if you need an extra heater in the office!

COMIX24.JPG
Haha! What are your favorite bits of content/shows that you’ve created?

Harry: Every Avicii show I VJ’d over the 6 years of touring with him is a memory I’ll never forget.

Comix25.jpeg
Sam: Swedish House Mafia’s One Last Tour. After being on the road for 6 months with the guys and a really amazing crew it felt like a family. Being lucky enough to travel around the world getting up every day to perform this massive show will never be forgotten. It was a truly great send off.

Comix26.jpg
And finally, any words of advice for budding visual artists out there?
Harry:


LOL. Thanks for the great interview and work, guys.
You are quite the c̶o̶m̶i̶c̶s̶ comix, yourselves :)

Orbit & Resolume at the Singapore Night Festival, 2018

What happens when a bunch of artists, architects, programmers and engineers come together? A burst of creation with a dollop of technology. Minimal yet complex. Simple yet amazing. This is what Litewerkz is.

TOT1 (credit Calvin Chan).jpg
Photo by Calvin Chan

Formed by students who met at the Singapore University of Technology and Design, Litewerkz is a newish design collective with lots of vision. Gradually growing from indoor static installations to outdoor dynamic ones, they have developed themselves by welcoming new members and perspectives every year.

SLF 2016.png [fold][/fold]
Their installation this year at the Singapore Night Festival really caught our eye. Named “Orbit”, it was quick to evoke our inherent space obsession. With a central orb as the primary source of energy (the sun), the installation consisted of different orbs (planets) with retroreflective cores, all wirelessly linked to each other. Very cool & scientific.

Orbit 1.png

So, how did the idea of “Orbit” as an installation come to you?

We’ve worked with 3M (a global science company) on previous installations, making use of their iridescent films. These were displayed at the Singapore Night Festival in the past few years.

This year 3M came to us with a challenge to use their retroreflective materials. These materials are used on traffic signs and safety vests, so we really had to think hard about how we could use them in an art installation.

3M gave us the creative freedom, and provided us with custom colors of these materials, which really helped us achieve our vision for the installation.

Orbit 2.png

We think they were really happy with the result as they have never seen their product used in this way.

Tell us about the installation, from idea to production.

Orbit is an abstraction of our solar system: 8 planets around a sun. The planets and the sun are individual units that react when visitors spin them. The LED lights inside illuminate each planet and its unique make-up.

Orbit 4.png

Within each planet is a unique polygonal gem, covered with retroreflective material. It is this gem that lights up with the use of flash photography or videography.

Orbit 3.png

This gem is modeled computationally with Grasshopper, cut from corrugated board on a Zünd cutter, and assembled by hand.

grasshopper.png
Grasshopper script used to create the gems

Each planet uses a Particle Photon to connect to the main server, an Intel NUC. Each Photon runs an app that allows it to understand the ArtNet commands broadcast from the NUC. With an MPU6050 6-axis accelerometer and gyroscope, the app can also detect the rotational speed, allowing the planets to react accordingly.

While stationary or at slow speeds, the planets glow at half brightness.

Orbit 5.png

At high speeds, the planets glow at full brightness.

Orbit 6.png
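The spin-to-brightness behaviour can be sketched in a few lines. This is an illustrative Python model of the logic, not the team’s actual Photon firmware (which runs as C++ on the device); the 180 deg/s threshold and the helper names are assumptions:

```python
# Hypothetical sketch of the spin-to-brightness logic: half brightness
# while slow or stationary, ramping up to full brightness at high speed.

def gyro_magnitude(gx: float, gy: float, gz: float) -> float:
    """Combined rotation rate (deg/s) from the MPU6050's three gyro axes."""
    return (gx * gx + gy * gy + gz * gz) ** 0.5

def brightness_scale(rotation_dps: float, threshold_dps: float = 180.0) -> float:
    """Half brightness when stationary/slow, full brightness at high speed."""
    if rotation_dps <= 0:
        return 0.5
    return min(1.0, 0.5 + 0.5 * rotation_dps / threshold_dps)

def scale_pixel(rgb: tuple, scale: float) -> tuple:
    """Apply the brightness scale to one WS2812 pixel."""
    return tuple(int(c * scale) for c in rgb)

# A stationary planet glows at half brightness...
assert brightness_scale(0.0) == 0.5
# ...and a fast spin reaches full brightness.
assert brightness_scale(400.0) == 1.0
```

Localizing the reaction per sphere, as they describe, just means each Photon runs this check on its own gyro readings.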

Resolume runs on the Intel NUC and allowed us to easily set up each planet and the sun; it makes both LED mapping and sequence control easy.

Tell us about the hardware that went into Orbit.

Hardware design was one of the most challenging parts for us, as this was our first time designing an interactive installation for the outdoors. There were also no off-the-shelf components for the curvature of the spheres we were using, so we had to design some components from scratch while still using as many off-the-shelf parts as possible.

Assembly Diagram.PNG

We went with an aluminum tube for the shaft as it was easy to cut and very sturdy at 40mm diameter. We opted for a brass bushing for its low maintenance and low price.
We used a slip ring connector to transmit power from the stationary shaft to the rotating spheres. This is the centerpiece of the installation: the interface between the stationary aluminum tube and the spinning sphere. Finally, the sphere holders were 3D printed in PET-G, chosen because it is more heat and weather resistant than PLA, which matters in the Singapore climate.




In our earlier versions, we engaged a printing service to print each holder as a single piece, but the single-piece print took too long and demanded a large print bed. We then designed a split version so that each piece is printable on common print bed dimensions. The split pieces aligned with each other via printed pins and holes, and the sections were secured with a PVC tube section and by the attachments screwed to the sphere.

And what exactly is retroreflective material?

Retroreflectivity is a property of a material that causes light to be reflected back along the direction it came from. If you have seen road signs shine at night while you are driving, it is because this material reflects the car’s headlights straight back toward the car.

What this meant for us as artists is that the material gave us an opportunity to design experiences unique to each individual’s perspective, and we sought to exploit this property in our installation. We encouraged visitors to turn on the flash on their camera phones while taking a photo or video, which would let them see the retroreflective effect unique to the perspective of their camera.

DSC_0150.jpg

Cool! So now tell us about the LED pixel strips. What purpose did they serve, how did you link them all and how did you manage to link their intensities with the speed of rotation of the orbs?

The LED pixel strips serve as a source of light for the retroreflective gems at the center of each sphere. We also intended them to be eye-catching from afar, so that visitors nearby would be drawn to the installation.

Orbit 7.png

Beyond these simple functions, we tried to see what else we could do to make the LEDs more interesting. We decided on a light show synchronized across all the spheres. This would initially have meant running data cables between the spheres, but we managed to find a micro-controller with built-in WiFi and the relevant ArtNet and WS2812 LED libraries. A central server pushes out the ArtNet signals, and the micro-controllers listen to and display them, giving us the ability to design light shows synchronized across all the spheres.
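For the curious, the receive side of that server-to-sphere link can be sketched roughly like this. A hedged Python illustration of unpacking an ArtNet OpDmx packet into RGB values for a WS2812 strip; the field offsets follow the Art-Net DMX packet layout, but the spheres themselves use ready-made libraries rather than hand parsing, and the function names here are made up:

```python
# Minimal sketch: check an incoming UDP payload is an ArtNet OpDmx packet,
# pull out the universe and channel data, and group the channels into RGB.
import struct

ARTNET_HEADER = b"Art-Net\x00"
OP_DMX = 0x5000

def parse_artnet_dmx(packet: bytes):
    """Return (universe, [(r, g, b), ...]) or None if not an OpDmx packet."""
    if not packet.startswith(ARTNET_HEADER):
        return None
    opcode = struct.unpack_from("<H", packet, 8)[0]   # opcode is little-endian
    if opcode != OP_DMX:
        return None
    universe = struct.unpack_from("<H", packet, 14)[0]
    length = struct.unpack_from(">H", packet, 16)[0]  # data length is big-endian
    data = packet[18:18 + length]
    # Three consecutive DMX channels per WS2812 pixel.
    pixels = [tuple(data[i:i + 3]) for i in range(0, len(data) - 2, 3)]
    return universe, pixels
```

On the real hardware this loop runs per incoming frame, with the resulting pixel list pushed straight to the LED strip.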

We used the MPU6050 6-axis accelerometer and gyroscope, which is compatible with the libraries available for our micro-controller. After much discussion, we settled on a simple increase in intensity while a sphere rotates, localized to each sphere, as we felt this would give the audience intuitive feedback.

So, the Particle Photon micro-controller is clearly super important to the installation. Tell us more about it.

We chose the Particle Photon as it seemed really easy to connect to the Internet and to debug with the status LEDs. Particle also has tools like the Particle CLI that make setup and debugging easy. It also helps greatly that the Particle team writes great documentation.

The Particle Photon has digital input and output pins for controlling the LED lights. It also features an I2C bus, which allows us to use I2C devices like the MPU6050 6-axis accelerometer and gyroscope module.

We especially like the status LED for debugging. The different LED colors and patterns correspond to a list of statuses for the Particle Photon, which made it very easy to identify what remedial action was needed without a serial monitor.

Another benefit of having a micro-controller that connects to WiFi is the ability to push new code onto it without connecting a USB cable. This saved us a lot of time, as connecting a cable would require opening up the sphere and electronics housing for each of the 9 spheres.

Orbit 8.png

To put all these components together, we designed and printed a PCB. This PCB sits in the centre of each sphere, in a housing, and spins together with the sphere.

Schematic Diagram.PNG

The PCB design allowed us to easily put together the various electronic components. It also made the soldering and connections more consistent. We added a lithium-polymer battery backup system to prevent the micro-controller from shutting down if the power connection is lost while the planets are spinning.

Board File.PNG

The PCB design was done in EAGLE, PCB design software that is free for hobbyists and makers (the free version has limited features). The other good thing about EAGLE is that it works well with Fusion 360; PCB designs can be imported from EAGLE into Fusion 360, allowing the sizing to be checked against the overall design.

And so, tell us about the role that Resolume played in this installation.

Resolume allowed us to easily setup each planet and the sun; it allows for easy LED mapping as well as sequence control.

The central Intel NUC with an integrated GPU ran Resolume, and we mapped every planet to a section of the template. We came up with a template of 31x32 pixels to represent all the pixels in the whole installation. Eight 6x16 rectangles are mapped to the eight planets, and a 7x32 rectangle is mapped to the 7 smaller spheres within the sun. Each Particle Photon has a static IP, and we simply mapped each section of the video to the corresponding IP address.

advance.PNG
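As a rough illustration of that routing, here is a hedged Python sketch: one rectangle of the template frame per static IP. The IP addresses and the side-by-side layout are made up for illustration; only the 6x16-per-planet slice size comes from the description above, and the real mapping is done inside Resolume rather than in code:

```python
# Hypothetical sketch of slice-to-IP routing: cut a rectangle out of the
# small template frame for each sphere's static IP address.

def crop(frame, x, y, w, h):
    """Cut a w x h rectangle out of a frame (a list of pixel rows)."""
    return [row[x:x + w] for row in frame[y:y + h]]

# Made-up mapping: planet slices laid out side by side in the template.
SLICES = {
    f"192.168.1.{10 + n}": (n * 6, 0, 6, 16)   # planet n gets a 6x16 rectangle
    for n in range(8)
}

def frames_per_photon(frame):
    """Split one template frame into the per-sphere regions to send out."""
    return {ip: crop(frame, *rect) for ip, rect in SLICES.items()}
```

The appeal of this approach is that the template frame itself can come from anywhere, a rendered video or a live effect, and the routing never changes.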

After exporting the video sequences, we loaded them all into Resolume and set them to autoplay on start, with autopilot. All of this happens as soon as power to the computer is turned on, which meant anyone could switch on the installation without having to meddle with a computer.

general.PNG

We could have used code to run the entire installation, but the workflow of using After Effects to design the sequences, and Resolume to pixel map and convert this video into ArtNet, gave us the ability to quickly visualize new sequences.

It also democratized the designing of sequences within our team, as coding knowledge is no longer required. Anyone can now export a sequence, load it into Resolume, rearrange the playback order, change the speed, or add any other effects.

That is such a great way to use Resolume :)
So finally, considering the whole setup was wirelessly linked and controlled, I'm sure you faced challenges..?


Linking everything wirelessly gave us a lot of convenience, but of course we had to deal with the downsides of WiFi. We noticed that when the festival got crowded, the micro-controllers tended to disconnect from the network and would freeze on the last received ArtNet frame, so we had to power cycle the frozen micro-controllers from time to time. It could have been interference from mobile phones with their WiFi and Bluetooth turned on, or our low-powered WiFi router not being equipped to deal with 10 simultaneous connections (1 server, 9 spheres).

Apart from these minor problems, we felt that wireless is the way to go logistically.
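That freeze mode, a controller stuck displaying its last received frame, can at least be detected with a simple staleness check. A hedged Python sketch of the idea, not something Litewerkz say they implemented on the Photons; the 5-second default timeout is an arbitrary assumption:

```python
# Sketch of a frame watchdog: if no ArtNet frame has arrived within the
# timeout, the connection is considered stale and the device could reset
# itself instead of waiting for a manual power cycle.
import time

class ArtnetWatchdog:
    """Flags the link as stale when no ArtNet frame arrives in time."""

    def __init__(self, timeout_s: float = 5.0):
        self.timeout_s = timeout_s
        self.last_frame = time.monotonic()

    def frame_received(self):
        """Call this whenever a valid ArtNet frame comes in."""
        self.last_frame = time.monotonic()

    def is_stale(self) -> bool:
        return time.monotonic() - self.last_frame > self.timeout_s
```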


Interactive installations take the biggest abuse. The PCB was initially designed with female header pins for the outputs, but when the wires kept coming out as the planets were spun, we replaced them with screw terminals to ensure they were held in place.

On the first day of the festival, we witnessed kids and adults using all their strength to spin the spheres as fast as they could.

Orbit 9.png

Well, can you blame them? :)

Thank you Keith & Team Litewerkz for talking to us. We look forward to seeing more of your cool work.

You can find out more about them here, on Instagram and Facebook.

Photo Credits:
Chiasuals
Natalie Chen (Litewerkz)

In Pursuit of Secret Mapping Experiment

Imagine you are driving (top down, of course) along a pristine coastline.
The wind in your hair, the world at your feet. The sun is setting, the ocean is that perfect blue. The salty air is filling your lungs up with freshness.
You smile coz life is perfect and as you muse out into the water, you see this:
SME_1.jpg

Didn’t your life just get so much better?
[fold][/fold]
Enter: Secret Mapping Experiment.
Who are they? We don’t know.
What do they do? Some mad cool shit.
How do we get to know them? Well, we tried our best.

So, who exactly are you?

We are artists with alpha channels who try to find the edges.
tumblr_oxii0u3QV71uevde3o7_1280.jpg

Yeah? That’s cool.
And, what do you do?


We do projection mapping on buildings, gigantic masterpieces and machines that are now out of use. The concept is to do projection mapping on as many abandoned and closed-down sites as possible, around the globe. This way we would like to rethink these spaces’ (hi)story and architectural details.
SMP_3.jpg

We try to redefine and present these buildings and objects (which had an important role and function in the past) with the utilization of the technology of animation and video mapping, shaking and dressing them up.
SME_4.jpg

This way we give the buildings a ’new life’; we can look back into the past and the ghosts can repopulate the spaces that used to be full of life.
IMPORTANT: the exact locations are being kept a secret, so they can remain for the after-ages.


Wow that is some really cool s*@#$*$#@.
And why do you do this?


If you have been working on commercial projects for many years, you need to find a platform where you can experiment with your ideas.
This is a platform where you can play freely and create your own games with your own rules to find your daily excites.


Ah. Daily excites :) Don’t we all need them?
SME5.jpg

Abandoned buildings have always enchanted us. Our aim is to redefine places that had an important function in the past and give them a new meaning, with an entirely new perspective, through our projection mappings.
SMP6.jpg

We started in 2015 in an old power factory in Budapest.
SME7.jpg

There are a lot of neglected and forgotten unique constructions and natural formations and we gladly wake them up from their secret past for a night.



What sort of preparation goes into this? From identifying the right projection surface to actually making it happen?

Identifying the right projection surface is a very hard decision. It is very important to get correct photos, keeping in mind the camera angle and the eventual projection angle. We use different technologies, but sometimes we have only one photo of the surface. Sometimes the surface itself changes and looks very different in a photo than it does in reality.
Thankfully, there are good methods that you can use to make 3D plans from photos.

02_cave.jpg

Once this is done, we work out sizes and distances, figuring out the correct projector and lens.
SME8.jpg

Wow. All of this is pretty complex, considering you project on massive icebergs & mountain faces.

Yes. The maximum distance we have had to work at is 300 meters from the surface. That is really far out, and it’s tough to map everything to the right position.
ProjectionSOFAR.png
SME9.jpg

We have got different kinds of projectors. Every surface needs a different technology. You can definitely feel the difference between DLP, LCD and laser systems when you are in the darkness.

We set our projector up in the back of a Pick-up, the top of a wind turbine or on a special trailer that we’ve built for it.
And we solve electricity problems by using sunlight, uranium or even a hamster wheel.
SME10.jpg

Ah of course. The trusted hamster wheel. Never lets you down.
And so, how long does it take for you to put together an experiment?


Sometimes a couple of weeks, sometimes only a few days. But there is one project that we haven’t been able to finish for two years. We also make interactive stuff that is created on the spot, on the actual day.
SME11.jpg

Tell us about all the different software you work with. Does good ol’ Resolume feature in your secrets?

We use so many different kinds of software - there is no one and only.

Here is one of our usual dialogues:
“F***! it’s almost midnight and we haven’t started yet!”
“Switch the damn projector on and map this shit on it. You can do this!”


Enter Resolume.

“No problem. 5 minutes.”
“Don’t forget the MASKS!”
“OK”

SecretMapping_pres_iceberg_33.jpg

Haha. Just great! Good to know :)
Tell us about some of the challenges you face doing this. I’m sure there are many.


Wow. That is a good question, because every project is a challenge.

We need to do the projection during the night, in the dark, and immediately solve unexpected issues. A lot of the time we haven’t got permission, so what we are doing is highly risky.
SME13.jpg

There is one project we have already tried twice. We travelled 2000 km with all the equipment. It is our dream spot: the top of a mountain, which seems to be always cloudy or rainy, no matter if it’s summer or winter. The mountain just doesn’t let us do the projection. But we will go back every year and try to realize our dream. We never give up!
SME12.jpg

That is really inspiring. Before you go back into hiding, anything to keep our eyes peeled for?

We want to step into the sky and come down. Finish the top of the mountain. But there are thousands of venues that need secret mapping, all around the world.
If you know some of them, please send us your suggestions at secretmapping@gmail.com.


Aaaaand, let the inbox spamming begin!

Thank you for talking to us Secret Mapping Experiment. Boy, are we on the look-out for you.

Check out more of their amazing work here.

Interview Credits: Daniel Besnyo, Founder- Secret Mapping Experiment

Chasing Greatness with Sandy Meidinger

Coachella 2018.jpg
On a fine sunny afternoon in 2014, Joris de Jong was holed up in front of his computer, of course. Apart from a full-time job serving coffee at Resolume HQ, he moonlights as a video operator. And that day he was mighty frustrated.

Joris had gotten sick of customizing and then rendering the same content, over and over again, for every show he played, with minute changes in timing and position. “There has to be an easier way!” he thought to himself, sipping on below-average coffee that he had not brewed.
And so, Chaser was born. [fold][/fold]

What is Chaser?

Chaser is a plugin that serves up the perfect solution for VJs who play a lot of different shows and do not have time to render custom content for each one. It makes a job that would take hours happen in minutes. It enables you to create chase effects, screen bumps and sequences based on your input map (that’s right, INPUT) in Resolume.

Chaser converts the slices you create in your input map to “buttons” that you can toggle on & off, and so create different chase effects & sequences. Read all about the whys & hows here.

Once you are ready with your different sequences, you can apply Chaser as an effect (to your composition, layer or clip) in Resolume and voila! You’re ready to chase that kill.

Which leads us to Coachella 2018.

Goldenvoice.jpg
Visual artist Sandy Meidinger, on duty for Illenium, served up slices as delicious as grandma’s black cherry pie. She diced that LED up nice and fine and thoroughly used (and abused) Chaser to its full potential.

Thank you for talking to us Sandy.

me.jpg
Let’s start from the beginning. How did this visual journey begin for you?

In 2012, I was finishing up my undergraduate degree in Graphic Design and I had to take an After Effects class. During the first weekend of the semester I went to a rave and noticed the videos on the LED screens looked like they were made in After Effects. That night I decided that I was going to learn how to do that, and so I did it.

Living in Southern California made it easy to connect with other VJs. I’ve spent the majority of my career as the house VJ at Create Nightclub in Hollywood thanks to V Squared Labs but it was the word of mouth among the artists that got me my job with Illenium.

So, what is working with Illenium like? Tell us about his show & his set at Coachella.

I love working with Illenium. I work very closely with him and his team, and over the past 18 months we’ve become like a family. They care a lot about what the visual show looks like, which makes my job even better.

We run two shows now, a DJ set and a live show. The DJ set is Illenium on CDJs and me mixing and triggering the videos by ear. The live show, which we performed at Coachella, is run by Ableton. For the visuals, Ableton sends MIDI to Resolume. I’ve used this system for about 40 shows without fail.



Coachella was one of the later shows using this system, so almost all of the show had already been created. We added some new content for some new songs but the main thing I had to worry about was mapping the 2 x 4k outputs. I was able to upgrade my machine to one with a GTX1070 before the show.

What made you start using Chaser & what has it made easier for you?

I started using Chaser in its very early stages, during the release of Resolume 5. I remember reading the manual and being fascinated by the input mapping. Everyone I knew at the time had been using Layer Routers to route slices, and I was never able to fully understand or practice that enough to incorporate it into my show.

The input map made a lot of sense to me and I haven’t looked back since. Up until very recently, Chaser was the only mapping tool I used for many shows and I still use it on its own for stages with smaller outputs.

And so, we come to Chaser & Coachella. Give us all the juice, please.

Here are the video map & pixel maps of the Sahara Tent at Coachella 2018:

Coachella Video Map 2018.png
Coachella 2018 Video Map
Sahara2018_Map1_v3_pixelmap.png
Coachella 2018 Pixel Map 1
Sahara2018_Map2_v3_pixelmap.png
Coachella 2018 Pixel Map 2

Since the majority of the live show is run by MIDI from Ableton, I am able to focus more on mapping and how all the content fits on the stage. For extra-large stages I use a combination of the Mapper plugin from Dvizion and Chaser.
Coachella_600000.png
Mapper handles the overall placement and look of each video, and I use Chaser for some extra flair. One way I organize my looks in Chaser is to create an extra, non-outputting screen for each look. This gives me room to play around, knowing that I will not be messing with anything on the output side.

There is a point in the show where I flash the Illenium logo in a grid that is formed by the design of the LED panels.



Because of the 2 x 4K outputs, I had A LOT of pixels to work with. I ended up with 473 slices across the whole input map. If I could redo it, I would increase the scale of the grid, because with that many slices the effect loses your eye too fast for the amount of time I use this part.
Chaser Grid.jpg
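A grid like this doesn’t have to be drawn slice by slice: the rectangles can be generated. A rough Python sketch of tiling a canvas into grid cells; the canvas and cell sizes here are hypothetical, not the dimensions of Sandy’s actual 473-slice map:

```python
# Generate (x, y, w, h) rectangles tiling a canvas left-to-right, top-down,
# the same shape of data an input-map grid of slices boils down to.

def grid_slices(canvas_w, canvas_h, cell_w, cell_h):
    """Return the grid-cell rectangles that fit fully inside the canvas."""
    return [
        (x, y, cell_w, cell_h)
        for y in range(0, canvas_h - cell_h + 1, cell_h)
        for x in range(0, canvas_w - cell_w + 1, cell_w)
    ]

# e.g. a 4K-wide LED strip cut into 120 x 120 cells
slices = grid_slices(3840, 240, 120, 120)
```

Scaling the grid up, as she suggests, is then just a matter of increasing the cell size, which drops the slice count without touching the canvas.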
Other looks I create with Chaser make the content flash randomly, each panel as a whole, or split the screens in half and flip one side to create a mirror effect.
W2 Nainoa 2.jpg
I also use it to map the LED for our hologram DJ booth.

What is the hologram DJ booth?

The DJ booth is an acrylic structure with 3x2 6mm panels on the bottom that reflect onto a transparent film. This creates a "Pepper's Ghost" hologram effect.
DJ Booth-saintlouis-thepageant-alexschenberg-processed-038.jpg
We bring the DJ booth with the live show as often as we can, but because of its size it doesn't always work with the festival stage setup. Most of the time it is run from the third output of my laptop, and in Resolume it has its own layer, which is routed through Chaser. The clips are triggered via MIDI from Ableton the same way the rest of the show is.

Did any issues creep up on you while programming? How did you deal with them?

Most of the programming for the show was done at home. Since I use input maps, I had a good idea of what the content was going to look like before I got on site. I had zero issues using my map during load-in and the show. I was even able to finish my programming on site in less than an hour thanks to Chaser & Mapper.



The only issues our show had were with our network on the VLAN over fiber, and the Ableton MacBook Pro overheating in the sunlight.

Sigh. I can’t even count the number of “MacBook Pro overheating” situations I’ve heard of.
And so, tell us about your rig. Anything on your wish-list?


For Coachella, I was able to upgrade my 2-year-old 15” Sager with a GTX 980 to the new 15” NP9155 with a GTX 1070. This machine runs my setup, with the input map going through Chaser & Mapper, perfectly. I was able to test 3 x 4Ks with my composition size of 4850 x 1200 and still got 60 fps.

One thing I’m looking forward to doing this summer is getting a PCIE 2TB SSD.

And what about your wish-list, software-update wise?

A feature I would love to see in Resolume is the ability to drag & drop columns. In my compositions each song is its own column, so I stack the Chaser effects above it. When Illenium changes the order of the set, I have to move each clip individually. This would help out a lot, especially in my DJ set show file.

For Chaser, being able to select multiple slices with something like a marquee tool would be a huge time saver for me. The new update with exporting the input map as a PNG will definitely help me out for the large stages.

Finally, please drop some slices of wisdom for our budding Chaser users out there.

Just like learning anything new for the first time, it just takes practice! It takes a moment to wrap your head around the concept of using the input map, but once you figure it out the possibilities are endless.

The Resolume crew loves the fact that you recognize and appreciate the value of Input maps, Sandy. Keep up the great work.

For everyone who is interested in learning about input maps and other cool things you can do with Arena 6, check this video out:



It’s time to go chase those dreams, eh?

Maxing Out on Science & Art (Part Two)

EmergenceLive7.jpg
In the last blogpost, we spoke to Max about the process of content development for his AV album “Emergence”. In this second part, we dig into his equipment, live setups, life philosophies & much more. [fold][/fold]

One of the things we have been curious about is how his rig “flows” live. It takes a sweet mash of hardware and software to achieve a perfect sync and, at the same time, the flexibility to freestyle.

Let me explain it from an information-flow perspective.

IMG_1356.JPG
First, I have my midi controllers, the APC40 and Lemur on iPad, and sometimes the Novation Launch Control XL, mainly for when I’m doing surround sound and/or Aether live shows:



In addition to my usual visual set up, I send midi control information into Ableton in order to launch clips, trigger percussive sounds, work with glitch effects, delays, reverbs etc., and to work with EQs and filters – all the normal Ableton live controls. I also send midi to Ableton for some visual-only controls, such as my effects matrices, whereby I can assign any combination of many different visual effects to link to the filter cut off frequency of one particular filter, for example.

Hyperform1.jpg
All of the visual controls for my live show arrive via Ableton and OSC over ethernet cable, whether they actually do anything to the audio or not. This allows me to continually work on the audio-visual interface, so that I can constantly try to improve the link between the audio and visual.

I’m always thinking: “OK, I want to do this particular glitch effect or audio transition with a delay, or whatever, but how should that particular sound look?”


Aether4.JPG
Then, the next challenge is to figure out how I can make it work in Arena.

Luckily for me, Arena has a lot of effects and modulation options, so I’ve managed to find some nice mapping techniques that are in line with the concepts I’m trying to show, i.e. how simple building blocks come together to create complex, beautiful outcomes: emergence. This is a very old video about this, but hopefully still relevant:



There is another, more practical, reason why I send all my controls through Ableton en route to Resolume: I can use Max for Live devices to map the control curves. It may be that I want a particular graining effect to come in as I filter in a sound, but a 1:1 mapping of the filter cut-off to the grain fade parameter doesn’t quite work. In fact, what I found was that 1:1 mappings rarely felt natural. So I use hundreds of Max for Live devices for changing the mapping correspondences.

Sometimes a straight line needed to map to a shallow or sharp curve, or to a limit below the highest value on the receiving end. I use Max for Live’s old API tools for these jobs, although there are plenty of other parameter-to-parameter tools out there that do the same sorts of jobs, some of which let you draw in the correspondences yourself. I spent ages on this side of the set-up, trying to create something I could jam with just like I was playing an audio-only set, with my usual glitching and chopping approaches, but where the visuals would also follow in sync and in style.
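The curve-and-cap remapping he describes can be sketched outside Max for Live in a few lines. A hedged Python illustration: a normalized control value is bent through a power curve and capped below the receiving parameter’s maximum. The curve shape and the parameter names are assumptions for illustration, not Max’s actual devices:

```python
# Bend a 0..1 control value through a power curve and cap the result,
# so a linear filter sweep maps to a shallow or sharp response.

def remap(value: float, curve: float = 1.0, ceiling: float = 1.0) -> float:
    """Map a 0..1 control value through a power curve, capped at `ceiling`.

    curve < 1 rises sharply then flattens; curve > 1 starts shallow.
    """
    clamped = min(max(value, 0.0), 1.0)
    return min(clamped ** curve, ceiling)

# A shallow take-off for a grain-fade style parameter, never past 80%:
assert remap(0.5, curve=2.0, ceiling=0.8) == 0.25
assert remap(1.0, curve=2.0, ceiling=0.8) == 0.8
```

Chaining a different `remap` per destination is essentially what stacking many small mapping devices achieves.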

Hyperform2.JPG
That is really interesting. Tell us what made you start working with Resolume. Are there any features that you particularly like? Anything you would like to see more of?

I came to the software with little experience of using visual tools and I found it a pleasure to use, and a very powerful tool for my live shows. If I wanted to do something, I could pretty much do it.

It has mainly been the suite of effects that has enabled this. I have about 70 different effects on my composition channel that I can quick-fire trigger live for beautiful, fun glitch mayhem on top of the video renders, which already contain plenty of their own glitch:



I’m also now doing more and more multi-screen immersive visual shows where I’m projecting on 3 or more surfaces around the audience, which Arena is amply set up to achieve.



I have to admit I haven’t had time to try Arena 6 yet, and I know there is a new Ableton communications technique, which may open some doors for me. The one thing I’ve struggled with in the past has been getting a consistent and tight sync between Live and Arena, which may well have been solved with Arena 6 already.

Oh I’d definitely like to see more effects! I love my visual effects, and I’ll use as many as you can provide, all at the same time until it’s a right nice mess.

Boy do we love a good ol' effects mash.
Tell us a little more about your controllers and glitch creators. How do you manage to intricately control the effects and glitches in the visuals with the audio?


I’m using Lemur to trigger glitch sounds like live drumming, and each sound triggers a different visual effect via the pathway from midi controller to Ableton Live, then on as an OSC trigger via Max for Live mapping devices and the Resolume parameter forwarder, over an ethernet cable between the two laptops.

Then, I also have the filter cut-off frequencies on glitch sounds linked to glitchy visual effects, so that I can smoothly introduce audio-visual glitchyness in addition to the sharp glitchyness of the live Lemur drumming. And I can assign many different combinations of visual effects to a single filter cut-off frequency, so that I can do similar audio glitching with very different visual glitching effects.

4DSound1.JPG
I know particular tracks and videos are better suited to one type of effect or combination of effects than another, and every show I experiment with these combinations to find little tricks for each part.

Tell us about your Studio. What’s on your wish list & anything in there that you would like to change/ upgrade?

At the moment, I’m mainly all about my Dave Smith Instruments and loads of random guitar effects pedals. I used to do everything digitally, so I’m not on the analogue bandwagon, just enjoying the wagon for now. My staples are the Prophet 6, the Prophet 08, the Juno 6, the Moog Sub 37, the Moog Minitaur, and still plenty of Henke’s trusty Operator for soft-synth sketches, plus plenty of NI software: Absynth, Guitar Rig, Razor etc.

Pedal-wise, I’m loving my Fairfield Circuitry units I discovered on a recent Montreal trip, and have been putting the Meet Maude and Shallow Water to lots of use.

I love the classic Roland RE-201 Space Echo tape delay too, and the Moogerfooger Ring Mod and MIDI MuRF. And for full-on analogue pedal mayhem, the Industrialectric DM-1N and Echo Degrador, and the WMD Geiger Counter. And the Strymon BigSky for a beautiful plate reverb simulation.

As for what I want to have – a Jupiter 8! But I can’t afford it; it’s gotten ridiculous how much they’re going for. So I’m mainly focused on finding unusual pedals and experimenting with pedal combos.

My most recent upgrade was the Genelec 8050s from the 8040s – they’re lovely monitors in my opinion, nice and full and soft and round, both physically and audibly! That’s why I upgraded directly to the next model.

Sweet. That rig sounds nice and heavy.
And finally, any pearls of wisdom for our budding AV artists out there?


I spend most of my time reading science and philosophy books rather than listening to music or reading about work in the arts. It’s those ideas which are the starting points of most of my new projects. The same goes for my video briefs, I’m mainly just trying to convey what I think is exciting and inspiring about a particular idea, with the hope that a video artist might share some of my thoughts and feelings.

For me, too much of the AV and computational art scene is based around the endpoint aesthetic, just making something look cool for the sake of it. The same goes for music. That’s why I’m trying to work with ideas that I love for a more meaningful reason, to enrich the process, harness the inherent value of nature, push me in new directions creatively, and so that I can use each project to learn more about the world.

So, to answer your question more succinctly, I don’t use storyboards most of the time, but instead just try to put across the ideas and feelings I want to convey so that the video artist can express themselves with plenty of room for experimentation. That approach also lends itself well to the sorts of ideas we’re working with, which are often abstract and without the need for characters and traditional narratives.

And my suggestion to help people grow as artists would be to find what it is that makes you, you. Art is a process of making that tangible, and everyone is different, so you can find your niche by being honest with yourself.

So well said.

Throughout Emergence, Max’s love and understanding of science is so evident. There is such a beautiful balance between hard scientific data visualizations and artistic representations of scientific theories; it’s really the sweet spot between hardcore science and artistic interpretation.
And so, before we sign off, Max, we must ask you: what comes first for you – science or art?

I’m glad you mentioned that it is “artistic representations”, as sometimes it can sound too much like a science lecture, which it absolutely isn’t. It’s about the art hiding in science, with plenty of artistic interpretation and maximal artistic license applied to scientific ideas. I actually did a lecture about all of this recently, which is online here:



It’s been a fun process and I can see that there is a lot more potential in working with these sorts of links between fields. I won’t be adding to the Emergence project specifically, but instead I am working on some new wide-ranging concepts which drive music and visual creation, and my live shows.

Lots more to follow soon about those projects. If you want to find out as they arrive, drop your email onto my website and I’ll send you previews of each project as it comes.

Also, one final note, all of the collaborations, credits and ideas, along with stills and videos, are on the Emergence mini-site here.

Photo by Alex Kozobolis

Speaking to Max about science, art, his thoughts & everything else in between has been nothing short of inspiring.

As our good friend & avid Resolume user Albert Einstein says, “Imagination is more important than knowledge.”

So, go ahead! Imagine. Create. And, of course, tell us about it :)

Maxing Out on Science & Art

Photo by J. Rosenberg

Max Cooper is not your average electronic producer. With a PhD in Computational Biology, Max is what we like to call an Audio-Visual Scientist. Through his work he tries to bridge the gap, or reinforce the deep-seated relationship, between science, art and music. One look through his work and you realize how successful he has been.

From his experiments with a 4D sound system using Max for Live & Ableton to his first album Human in 2014, Max’s work has been cutting edge, beyond meaningful, and focused on a wholesome approach to music as opposed to one that is purely auditory.

On 20th September 2018, Max is dropping his third studio album, One Hundred Billion Sparks. As per Max, each and every one of us is one hundred billion sparks – one hundred billion neurons that fire feelings and ideas, that make us different yet connected. Deep, right? You can check out the first single from this album here.
But before we dive into this one, we caught up with Max to get the scoop on his second studio project and AV show, Emergence.

Emergence as a concept is remarkable. It focuses on different properties of nature and what they can give rise to, or what can “emerge” from them – not just on a physical level but also a mathematical and functional level. It finds art in simple natural processes, something we might be quick to take for granted or disregard.

Photo by Alex Kozobolis

Emergence is divided into multiple chapters, each a different representation of the universe and its evolution from the distant past to the future. All the visual content is so well interwoven with the audio that naturally the question arises: what comes first, the audio or the visual?

The concept for the project came first, which then spawned many visual and musical ideas. The narrative form and the fact that it needed to be a live music show, meant that there was already a lot of structure imposed before I had started on the musical or visual specifics – for example, I knew that humans were eventually going to emerge later on in proceedings down the universe timeline, and that things were going to start to get darker as complex forms of subjugation, and the like, came along.

So, I knew how it needed to progress musically too, which also fitted with a live show arc of increasing musical intensity. There were these sorts of macro structures to work to from the start as I began to pull together a palette of rough ideas.

Then there were all the specific chapters, the different science-related ideas, that I thought would lend themselves to the story, and to beautiful visualization. They were designed to fit the macro-arc of the show, and each to also tell their own micro-story of emergence.

For example, the first cell structures emerge with the audio track “Origins”, which fits into the wider narrative of the emergence of life, which in turn fits into the wider story of the emergence of the universe and all its complexities. Sometimes I would create a piece of music with a particular part of the story in mind; sometimes I would send the concept to a visual artist and receive visual drafts to which I would score the music.



I’m often asked to describe the audio-visual link more specifically, people want to know what the process is. I can describe the explicit links between the scientific ideas and the visuals in detail, as we can absorb a lot of varied information from visuals and the mapping is usually quite straightforward.

But if you try to map data to music you invariably make a non-musical mess – we have tight constraints over the data-format for music. But what music can do better than data (usually) is convey emotion. And that’s how I’ve always written music anyway, I’ve never had formal training, and have always approached each piece of music by an emotive-optimization process: I have an image or idea in my head and I know what feeling I have associated with that image or idea. I then have to keep sculpting the melodic form and sound design until the feeling it creates is aligned with that of the image or idea.

It’s a bit of a mysterious process, but we all feel things which are associated with different ideas or scenes, so it’s something anyone can do, it just takes a lot of time.

Perhaps the fact that I approached music like that from the start lent itself to visual work, although again, the links are subjective, so I don’t think it’s so hard to do. The most interesting part of this process for me was the links to science and nature visually, and the research process of delving into the themes, through which I learnt a lot of artistically inspiring things (read more about this here).

From visual representations of hardcore mathematical data to artistic illustrations of real phenomena like the Big Bang; from deliciously cringe-worthy depictions of the emergence of microorganisms to quirky infographic portrayals of humans; from cool facial mapping with Kinect to a good old fractal zoom ending, with a twist – Emergence has it all.

Can you tell us about a few of the different software tools the visual artists you work with use to create content?

I can only give a generalized overview rather than getting too specific. But the main approaches were as follows:

1. Traditional video tools like Cinema4D and After Effects, as used by Nick Cobby, who also works with Processing.



Morgan Beringer also uses Adobe Creative Suite tools.



2. Programming approaches using Matlab, C++ etc.: this was when I was working with scientists or mathematicians who use these tools for their work. Dugan Hammock used Matlab (for ‘The Primes’) to render my requests for the Sacks spiral, the Riemann zeta function and the Sieve of Eratosthenes.
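To give a feel for what rendering ‘The Primes’ involves, here is an illustrative Python sketch (not Dugan Hammock’s Matlab code) combining two of the ideas named above: a Sieve of Eratosthenes to flag the primes, and the Sacks spiral layout, which places the integer n at radius √n and angle 2π√n.

```python
import math

def sieve(limit):
    # Sieve of Eratosthenes: primality flags for 0 .. limit-1
    is_prime = [False, False] + [True] * (limit - 2)
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            for m in range(n * n, limit, n):
                is_prime[m] = False
    return is_prime

def sacks_points(limit):
    # Sacks spiral: n sits at radius sqrt(n), angle 2*pi*sqrt(n), so perfect
    # squares line up along the positive x axis and primes trace the
    # characteristic curves of the famous plots.
    is_prime = sieve(limit)
    pts = []
    for n in range(limit):
        r = math.sqrt(n)
        theta = 2 * math.pi * r
        pts.append((n, r * math.cos(theta), r * math.sin(theta), is_prime[n]))
    return pts
```

Scatter-plotting the (x, y) pairs with the prime-flagged points highlighted reproduces the familiar spiral image.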

Andy Lomas used C++ for his cell morphology simulations.



Csilla Varnai from the Babraham Institute – well, I’m not sure what they used for their process of gathering DNA binding sequences from real Hi-C chromosomal conformation capture experiments, but I’d guess C++ (see next video).

3. Gaming engines, specifically Unreal: As used by Andy Lomas to map the DNA structural data to a 3D environment with which we could render video content as well as interact with the DNA molecules in VR.



4. Hand-drawn animation, as used by Henning M. Lederer and Sabine Volkert. Sabine hand-drew every frame of the Organa video!



That is just amazing. So, Emergence is the product of three years of hard work and ideation, and the fruit of the labor of a wide range of visual artists. It might be hard, but can you pick one or a couple of your favorite bits of content from the lot? What are you particularly happy with, or what didn’t you expect to turn out so well?

I was heavily involved in some of the video projects, directing the content and having long discussions over how to move forward on the ideas. Whereas for some of them I just sent the concept and brief to the visual artist and they nailed it with minimal additional input. One of my favorite examples of this, where the concept also fitted neatly into the musical form too, was the chapter/track called “order from chaos”, with the video created by Maxime Causeret.

I was playing with an explicit emergence technique musically for this part, where I had recorded random raindrop sounds during a storm, and was then gradually forcing these percussive hits towards quantized grid positions during the intro, to yield an emergent rhythm from the rain, around which the rest of the track was built – order from chaos.
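The technique described above can be sketched as a simple interpolation: each recorded onset is pulled toward its nearest grid position by a strength that ramps from 0 (the raw rain) to 1 (fully quantized) over the intro. A rough Python sketch, with made-up onset times and a hypothetical sixteenth-note grid:

```python
def quantize_onsets(onsets, grid, strength):
    # Pull each onset time toward its nearest grid position.
    # strength 0.0 = the raw recording (chaos), 1.0 = fully quantized (order);
    # sweeping strength upward gives the "order from chaos" intro effect.
    out = []
    for t in onsets:
        nearest = min(grid, key=lambda g: abs(g - t))
        out.append(t + strength * (nearest - t))
    return out

# Hypothetical sixteenth-note grid at 120 BPM (one step every 0.125 s)
grid = [i * 0.125 for i in range(32)]
# Made-up raindrop hit times in seconds
rain = [0.03, 0.19, 0.41, 0.52, 0.77]

half = quantize_onsets(rain, grid, 0.5)      # halfway between chaos and order
snapped = quantize_onsets(rain, grid, 1.0)   # fully on the grid
```

In a DAW the same ramp would be applied per bar, so the random hits audibly converge into a rhythm.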



Maxime applied the idea to early life with beautiful effect, showing complex cell structures and simple life forms, plus other emergent behaviors – murmurations, competition for resources etc. – in a very artistic and colorful manner. It’s a great merger of different worlds, and it was an exciting surprise to receive his first draft.

Personally, my favorite bits of content are those that involve science & simulation- the awesome visual representations of scientific data.

Those are definitely my favorite bits of content too. Can you tell us a little bit about working with computer-generated simulations? How much hit and miss is it, working with simulators and data entry?

Andy Lomas’ cell growth simulations already existed before my project, he is a mathematician and artist who has been working with generative art techniques for many years, and I was just lucky to have his work put in front of me by a mutual friend, upon which I started some very interesting conversations and collaborations which are still ongoing now. It just happened that Andy’s work fitted perfectly with what I was aiming for with the project.

Whereas with Dugan, the process worked in the opposite direction – there had been several animations I wanted for some time and which I had asked many visual artists about, and found that they couldn’t do what was needed, and I needed to find a mathematician instead.

One of these ideas was that of showing higher dimensional forms – structures that exist in more than 3 dimensions of space, for the part of the story about spatial dimensionality.
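As a rough illustration of how structures in more than three spatial dimensions end up on a screen (this is a generic sketch, not the animations made for the show): rotate a tesseract in one of its 4D planes, then perspective-project it down to 3D, exactly as a 3D scene is projected to 2D.

```python
import itertools
import math

def tesseract_vertices():
    # The 16 vertices of a tesseract: every combination of (+/-1, +/-1, +/-1, +/-1)
    return [list(v) for v in itertools.product((-1.0, 1.0), repeat=4)]

def rotate_xw(v, angle):
    # Rotate a 4D point in the x-w plane; 4D rotations happen in planes, not around axes
    x, y, z, w = v
    c, s = math.cos(angle), math.sin(angle)
    return [c * x - s * w, y, z, s * x + c * w]

def project_to_3d(v, camera_w=3.0):
    # Perspective projection 4D -> 3D: divide by distance along the w axis,
    # just like the familiar 3D -> 2D perspective divide
    x, y, z, w = v
    f = camera_w / (camera_w - w)
    return [f * x, f * y, f * z]

# One animation frame: rotate every vertex, then project it
frame = [project_to_3d(rotate_xw(v, 0.6)) for v in tesseract_vertices()]
```

Stepping the angle each frame gives the classic folding-in-on-itself tesseract motion seen in higher-dimensional visualizations.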



And, the other main chapter of relevance here is the first chapter, on the distribution of the prime numbers. Because I was working with a mathematician rather than a typical visual artist here, we chose to minimize and simplify the visual form to its basics, black and white wireframe representations of the data. The Chromos chapter also used real data.

But all in all, there wasn’t too much “hit and miss” involved. Nature seems to be inherently beautiful, so we just had to be true to nature’s form and it worked.

Staying true to Nature’s divine form. Believing and falling in love with Nature’s perfect imperfections. That is what Max is about :)

Photo by Alex Kozobolis

With all of this amazing content, it is only natural that Max had to develop a cracker of a live show. People often describe his show as “hypnotic” – something only possible with some great blurring of the lines between audio & video. Of course, his setup was never going to be standard. We cover all of this and more in the next part, so stay tuned.

Notes from Max:

1) A big thanks to Vimeo for being so supportive of the Emergence project.
2) All of the collaborations, credits and ideas, along with stills and videos, are on the Emergence mini-site here.
3) If you want updates on my projects as they arrive, drop your email onto my website and I’ll send you previews of each project as it comes.