The Ultimate Guide to Multiscreen Output with Resolume

Photo by Diana Gheorghiu
“I want to connect a dozen and then some screens, what hardware should I get?”

We get this question quite often.

The question sounds simple, but the answer is always complicated. It’s the same as asking: “I want to buy a house, which house should I buy?”

Consider us your multiscreen real estate managers. We’d like to help you make the right decisions and find the house that’s right for you. After gathering lots of use cases, possible problems and possible solutions, we put together this document. It will guide you through the overwhelming multiscreen adventure.

The adventure starts here!

You’ll find the essential do’s and don’ts of using Resolume for multiple outputs explained. The different options are listed in order of preference, and there’s even a flowchart. Yay. Just answer the questions and you will be guided to your ideal dream home. One click and you’ll be taken to all the essential information you need on that snazzy 3 story condo with all copper plumbing. We’ll try to avoid the shady parts of town, but if you like, we can show you some options in the extender hub ghettos as well.

Always remember, buying a large house is a big investment. Before going house hunting, you need to make sure your computer has enough pixel power in the bank to build that pretty picture. It would suck to get all the gear together to run a beautiful 4 story pixel map and then realise your Intel Iris Pro chokes on anything bigger than a single bedroom NY apartment. When in doubt, check them benchmarks.
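If you want to put rough numbers on that, here's a back-of-the-envelope sketch; the resolutions and frame rate below are made-up examples, not recommendations:

```python
# Rough pixel budget for a hypothetical multi-output rig.
# All resolutions and the frame rate here are illustrative assumptions.
outputs = [
    (1920, 1080),  # projector left
    (1920, 1080),  # projector right
    (1920, 1080),  # LED wall processor
    (1280, 720),   # DJ booth fill
]
fps = 60

pixels_per_frame = sum(w * h for w, h in outputs)
pixels_per_second = pixels_per_frame * fps

print(f"Pixels per frame:  {pixels_per_frame:,}")   # 7,142,400
print(f"Pixels per second: {pixels_per_second:,}")  # 428,544,000
```

Compare a number like that against what your machine actually manages in the benchmarks before committing to hardware.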


Posted by Joris on Tuesday April 18, 2017 at 14:35. Tags: datapath, gpu, multiscreen
Comment from jate:

Thanks for this Joris!
It would have answered a lot of questions when I was starting out on my multi-screen journey!
I'll hopefully have a bit more to contribute to this in the near future after I get my setup a bit more stable.

Comment from Arvol:

Fantastic! This is very well put together. Should be a great tool everyone could learn something from.

Comment from Zenithik:

Thanks for this guide, very helpful!

When you are able, is it possible you could do some research into eGPU solutions like the Razer Core and how they might apply here? For example, my Razer Blade can only output to one device at a time, but theoretically with an eGPU it should be able to achieve more (though I am experiencing problems with that). I don't want to start troubleshooting that in this thread obviously, but it may be a solution more VJs will be looking at in the near future.

Comment from Meptik:

I assume this answers the question of which NVIDIA GPUs you all prefer for Resolume... a GTX 1080 vs., say, a Quadro P4000 or P5000?

Comment from Arvol:

Quadro uses OpenCL and Resolume is focused around OpenGL (GTX and GeForce series)

Comment from Joris:

Quadro uses OpenCL and Resolume is focused around OpenGL (GTX and GeForce series)
I'm all in favour of making things as simple as possible. But as we say in the Netherlands, that's taking a very tight turn through the curve.

Both the Quadro and GTX series use OpenGL, and Quadro, GTX, AMD, Matrox and Intel can all use OpenCL. The primary use of OpenCL is using the GPU to do calculations that would otherwise clog up your CPU. Think large physics-based stuff for scientific purposes, or 1 million+ particle systems.

OpenCL is not so interesting for Resolume. Most of what we do is drawing pictures, so OpenGL covers everything we need.
a GTX 1080 vs, say, a Quadro P4000 or P5000?
Because we don't need any of the extras available on Quadros, there is no reason to invest in the more expensive Quadro line. A similarly specced GTX performs the same, if not better, for a lot less money.

That will probably change once we start supporting stuff like GPU Affinity, at which point this document will simply read: "Get as many Quadro cards as you need" ;)
When you are able, is it possible you could do some research into eGPU solutions like the Razer Core and how they might apply here?
eGPUs don't really add anything fundamentally different. In most cases, they either take over for your built-in GPU, in which case they fall under the "single-computer-single-GPU" category, or they're used as a second GPU next to your built-in GPU, in which case they fall under the "single-computer-two-GPUs" category.

Comment from Arvol:

Thank you for clarifying. All this time I thought the Quadro cards were an 80/20 focus on CL and GL (80% focus on CL with a small 20% focus on GL). But what you're saying is that they both provide the same focus on GL, and that the Quadro has the additional processing power of CL that pushes beyond the capabilities of the GTX and GeForce series?

eGPUs don't really add anything fundamentally different. In most cases, they either take over for your built-in GPU, in which case they fall under the "single-computer-single-GPU" category, or they're used as a second GPU next to your built-in GPU, in which case they fall under the "single-computer-two-GPUs" category.
That was always my understanding as well. I figured it would be no different than connecting, let's say, a USB video adapter. It would either become the primary driver (in which case you would want to connect your GUI monitor to this card), or it would be no different than connecting 2 PCIe cards in a desktop. The eGPU option might interest people with a cheaper laptop that has an integrated Intel GPU but also a TB3 port, right?

Comment from Joel_Dittrich:

Here's the open "standard" for eGPU: Akitio Thunder 3. I built an earlier, unofficial version with their Thunder 2.

https://www.akitio.com/expansion/node

https://twitter.com/joeldittrich/status ... 8901895168

Comment from millst:

Probably the best way of getting lots and lots of outputs is to use a Blackmagic DeckLink Quad card, which has 8 SDI outputs. You could use two to get 16 outputs, or 4 to get 32 outputs.

It saves a lot of messing around with conversions, and you can also run SDI for over 100m on cheap coax cable before running into problems.

Comment from Joris:

But what you're saying is that they both provide the same focus on GL and that the Quadro has the additional processing power of CL that pushes beyond the capabilities of the GTX and GeForce series?
The GTX series is perfectly capable of OpenCL as well. You can even use OpenCL on Intel cards. OpenCL is not inherently 'better' than OpenGL. They're just different tools, and each one has its own specific purposes that it's good at.
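To see that for yourself, here's a minimal sketch that lists every OpenCL-capable device in a machine, assuming the third-party pyopencl package is installed; on a typical system, GeForce, Radeon and Intel chips all show up:

```python
# List every OpenCL platform and device the system exposes.
# Requires the third-party pyopencl package: pip install pyopencl
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        # CPU and GPU devices both show up in this enumeration.
        print(f"  Device: {device.name} "
              f"(compute units: {device.max_compute_units})")
```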

Comment from bradg:

Joris,

What a great guide! You should sticky this thing all over the forum. When I was designing a Resolume setup for an install last year, I probably had to read a hundred threads and discuss with 2-3 people to figure out workable setups.

I think the only thing missing to make this a full picture is a discussion of EXTENDERS. We installed one room with 8 projectors fed via a low-latency H.264 network at the X4 OUTPUTS, and this was rock solid. Another room used 10 displays fed by (3) X4s with their INPUTs fed by high-quality HDBaseT. Getting 8 monitors and a user GUI from 3 GPU outputs was rock solid, but the last output just wouldn't stay locked. We had to redesign when the updated BMD support was released, and use a BMD Quad 2. This meant adding Decimator MD-HX units to feed each X4 input. This setup works rock solid (after a lot of sync setting configuration between the BMD and MD-HX units).

My point is that practical solutions on this scale often seem to require extension (outside of LED walls with easy access to sender cards). I could have assembled most of the solutions in this guide in the shop and they would have worked perfectly with only minor tinkering, but as soon as extension was required in the field, it was a totally different ballgame. I think adding this element is the difference between theory and reality in a lot of applications.

Thanks again for the guide. I stored it in my Evernote right away!

Brad

Comment from Joris:

Awesome to hear that y'all are digging this guide so much. Credits go to Bonne for narrowing things down to a flowchart and getting all the info together in a PDF.

We purposefully don't go into what happens after the signal leaves the computer. We make software, so we can only weigh in on what happens inside the computer when using different hardware setups and how that affects Resolume.

We rely on expert opinions like yours for advice on the signal flow after the signal leaves the computer.

Comment from alfaleader:

The problem with many displays is that GPUs can't always drive a screen on every output they have. A 1080, for example, can only drive 4 displays.

Had a show with a Mac Pro 2013 this weekend and couldn't get 5 screens working (4 projectors and the Resolume GUI screen). When I tested it at home it worked, but in the field with DVI outputs it didn't.

Have any of you tried 2 different GPUs? 1 main GPU for processing and another one for extra outputs?
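As a first check in situations like this, it helps to confirm how many displays the operating system itself reports before digging into Resolume. A minimal sketch, assuming the third-party screeninfo Python package:

```python
# Print every display the operating system currently reports.
# Requires the third-party screeninfo package: pip install screeninfo
from screeninfo import get_monitors

for i, m in enumerate(get_monitors()):
    print(f"Display {i}: {m.width}x{m.height} at ({m.x}, {m.y})")
```

If the fifth screen doesn't show up in a list like this, the limit is at the driver or OS level, and no output configuration in Resolume will get around it.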

Comment from jate:

The only thing, software-wise, that I would LOVE to have... is the ability to pick which GPU is the renderer.

We've already talked about this, but let's say you have a bunch of GPUs for outputs, and one that's supposed to be the renderer... the process of disabling all of the other cards, opening Resolume, and then re-enabling them can be very tedious without running a few scripts.

Comment from Arvol:

Correct me if I'm wrong, but isn't the GPU that is driving the GUI the GPU that will be rendering? If so, connecting the GUI display to whatever card you want as your primary should work as a solution, right? Does the 'detect, identify, make this display primary' option in the Windows or GPU settings panel do the trick?

Comment from jate:

dinga wrote: Correct me if I'm wrong, but isn't the GPU that is driving the GUI the GPU that will be rendering? If so, connecting the GUI display to whatever card you want as your primary should work as a solution, right? Does the 'detect, identify, make this display primary' option in the Windows or GPU settings panel do the trick?
Wouldn't it be nice :D Unfortunately, I can't find any rhyme or reason for which GPU it picks to latch on to.

I started with what you listed, then also disabled the other cards from CUDA and set the OpenGL renderer in the NVIDIA settings, both under the general 3D settings and in the program-specific tab for Resolume.

Went so far as to download a third party monitor manager that completely overrides the default Windows one, but still no luck.

The only solution currently (luckily, as suggested by Joris) is to disable all GPUs but the primary, open Resolume, then re-enable them and reassign everything.
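For anyone who wants to script that dance on Windows, here's a rough sketch of the idea, driving PowerShell's Disable-PnpDevice and Enable-PnpDevice cmdlets from Python. The adapter name and install path below are assumptions to adjust for your own rig, and it needs to run from an elevated prompt:

```python
# Sketch: disable every display adapter except the intended renderer,
# launch Resolume, then re-enable the others. Run from an elevated prompt.
# The adapter name and install path below are assumptions; adjust to taste.
import csv
import io
import subprocess
import time

RENDER_GPU = "GeForce GTX 1080"  # substring of the adapter's friendly name
RESOLUME = r"C:\Program Files\Resolume Arena\Arena.exe"

def ps(command: str) -> str:
    """Run a PowerShell command and return its stdout."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Enumerate all display adapters as CSV and parse it.
raw = ps("Get-PnpDevice -Class Display | "
         "Select-Object InstanceId, FriendlyName | "
         "ConvertTo-Csv -NoTypeInformation")
adapters = list(csv.DictReader(io.StringIO(raw)))
others = [a["InstanceId"] for a in adapters
          if RENDER_GPU not in a["FriendlyName"]]

# Disable everything except the renderer.
for dev in others:
    ps(f"Disable-PnpDevice -InstanceId '{dev}' -Confirm:$false")

# With one GPU left, Resolume has no choice but to render on it.
proc = subprocess.Popen([RESOLUME])
time.sleep(30)  # give it time to create its OpenGL context

# Bring the output GPUs back; displays then need reassigning by hand.
for dev in others:
    ps(f"Enable-PnpDevice -InstanceId '{dev}' -Confirm:$false")
```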

