No, probably not. The Macs coming out now are not yet the high-end machines suitable for VJing. But we will soon have access to a MacBook Air with the new M1 chip, so we'll do more investigation soon.
There was talk of a new "Rosetta 2" translation layer for the M1 processors. I think it was in their oh-so-grand keynote (can't remember precisely what they called it #alcohol). Anyway, they showed an existing x86 application running through this translation layer, and apparently the frame rates were not just comparable but exceeded expectations.
I'll try to locate it in the video for all to question ...
Introducing ARC - The APC40 mkII Extender.
ARC brings you 640 clip triggers, 128 faders, 128 track pots, 80 scene launches, 16 foot switches and more. https://www.theapc40extender.co.uk
Well, if you can get Resolume to work on a cell phone, it might work on the new Macs. The new chip is basically a glorified ARM chip, is it not? I wouldn't even think about running visuals on anything that doesn't have at least 4 cores, 32 GB of memory, an NVMe drive and a 6 GB GPU. Macs are lucky to get those configurations these days.
I can confirm it runs on a MacBook Pro 13" M1 with 8 GB RAM and a 256 GB SSD. Literally the first thing I installed. I believe it is running through Rosetta 2 (Intel translation); the demo files run smoothly at 60 fps @ 1080p. Not bad so far.
I'll look for the thread on Resolume benchmarking and make it sweat.
I'm not sure what you mean by dirty. Did you mean the files labelled "noise"? Whoops, I didn't use those.
It drops below 60 fps pretty quickly, but still seems pretty impressive considering.
2020 MacBook Pro M1 w/ 8 GB RAM, 256 GB SSD
4K composition, output to a single external 4K display, laptop screen (2880 x 1800) for the UI.
Using the "clean" 4K files from the benchmarks, 0 effects:
1-6 layers: 58-60 fps
7 layers: drops to 54 fps, climbs back to 60 fps in about 10 seconds
8 layers: 58-60 fps
9 layers: 54 fps, climbs back to 60 fps in about 10 seconds
10 layers: 52 fps
17 layers: 30 fps
20 layers: 26 fps
Then I added a couple of effects (Delay RGB, Fragment & Fisheye):
20 layers: 20 fps
15 layers: 25 fps
10 layers: 30 fps
5 layers: 38 fps
3 layers: 45 fps
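For a rough sense of the per-layer cost, you can turn those fps numbers into frame times and fit a straight line (frame time ≈ base + cost × layers). A minimal sketch in Python using the with-effects numbers above; the linear model is my assumption, not anything Resolume reports:

```python
# Rough per-layer cost estimate from the reported with-effects numbers
# (3 layers @ 45 fps ... 20 layers @ 20 fps).
# Assumes frame time grows roughly linearly with layer count.

data = [(3, 45), (5, 38), (10, 30), (15, 25), (20, 20)]  # (layers, fps)

# Convert fps to frame time in milliseconds.
xs = [layers for layers, _ in data]
ys = [1000.0 / fps for _, fps in data]

# Ordinary least-squares fit: frame_time = base + per_layer * layers
n = len(data)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
per_layer = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
base = mean_y - per_layer * mean_x

print(f"base frame time ~{base:.1f} ms, ~{per_layer:.1f} ms per layer")
# → base frame time ~17.7 ms, ~1.6 ms per layer
```

A 60 fps budget is ~16.7 ms per frame, so on these numbers the with-effects chain is already over budget at any layer count, which matches the reported drop below 60 fps.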
I did all of this unplugged. I'm assuming the performance is the same when plugged into the wall.
Yeah, I'm not interested in the clean files. The noisy files are closer to real-world content and would better reflect the performance of a typical show. Could you retest with those files?
Thanks
60 fps @ 7 layers of the noise 4K files, outputting at 4K over external HDMI, composition @ 4K.
I haven't used Resolume intensively for years, so I'm wondering how this stacks up. Since this is running under Intel translation, I'm not sure how stable it is, but it hasn't crashed yet...
According to the Resolume benchmark spreadsheet, that puts the GPU on par with an NVIDIA GTX 980, which is honestly a lot better than I thought it would be. The M1 benchmarks I've seen have been only slightly better than an integrated Intel GPU, so this is very useful info to share. Thanks!