2018 Razer Blade Performance Review

Arvol
Might as well join the team
Posts: 2794
Joined: Thu Jun 18, 2015 17:36
Location: Oklahoma, USA

2018 Razer Blade Performance Review

Post by Arvol »

Hi guys,
I had the opportunity to benchmark the new 2018 Razer Blade. Some people have reached out asking how its performance stacks up against the previous models, and since I don't actually own the new 2018 model, I had one sent out for a shootout.
Here are my results:
*** Sorry for the long-winded, over-detailed report. ***


2018 Razer Blade 15"
NVIDIA GTX 1070 with Max-Q (8 GB)
Intel i7-8750H, 2.2 GHz (6-core)
16 GB of RAM

2016 Razer Blade 14"
NVIDIA GTX 1060 (6 GB)
Intel i7-6700HQ, 2.6 GHz (4-core)
16 GB of RAM


Both models have 4K touchscreens and M.2 SSDs.


[Tests were made with a 4K monitor connected for the 4K composition benchmarks, and a 1080p monitor connected for the 1080p benchmarks.
With no monitor connected, you will get much different results than when you actually output pixels!]

Number of layers before dropping below 30 FPS:

[DMA: On / FPS: 60]
(2018) 4K Clean: 27 | (2016) 4K Clean: 18
(2018) 4K Noise: 16 | (2016) 4K Noise: 9
(2018) 1080p Clean: 72 | (2016) 1080p Clean: 37
(2018) 1080p Noise: 41 | (2016) 1080p Noise: 19

[DMA: Off / FPS: 60]
(2018) 4K Clean: 25 | (2016) 4K Clean: 15
(2018) 4K Noise: 14 | (2016) 4K Noise: 8
(2018) 1080p Clean: 82 | (2016) 1080p Clean: 54
(2018) 1080p Noise: 44 | (2016) 1080p Noise: 22


Setting DMA to Auto gave the same results as setting DMA to On.
[DMA = Direct Memory Access textures]
Setting FPS to Auto gave the same results as setting FPS to 60.

Adding memory-heavy FX such as Delay RGB reduces the overall layer count by 1 layer.
Changing the composition from 8-bit to 16-bit reduces the overall layer count by 2 layers.

*** Any idea why turning DMA off would give a performance boost at 1080p? DMA On helps a ton at 4K, but it seemed better to leave it off when I was testing at 1080p. Very strange. ***
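To put a number on the generational gap, here is a quick sketch in plain Python that just restates the layer counts from the tables above and computes the 2018-over-2016 speedup for each configuration (the dictionary keys are my own labels, not anything from Resolume):

```python
# Layer counts before dropping below 30 FPS, copied from the tables above.
# Keys: (resolution, content, DMA setting) -> (2018 layers, 2016 layers)
results = {
    ("4K", "Clean", "On"): (27, 18),
    ("4K", "Noise", "On"): (16, 9),
    ("1080p", "Clean", "On"): (72, 37),
    ("1080p", "Noise", "On"): (41, 19),
    ("4K", "Clean", "Off"): (25, 15),
    ("4K", "Noise", "Off"): (14, 8),
    ("1080p", "Clean", "Off"): (82, 54),
    ("1080p", "Noise", "Off"): (44, 22),
}

# Speedup = 2018 layer count divided by 2016 layer count.
speedups = {key: new / old for key, (new, old) in results.items()}
for key, s in speedups.items():
    print(key, f"{s:.2f}x")

avg = sum(speedups.values()) / len(speedups)
print(f"average speedup: {avg:.2f}x")
```

The per-configuration speedups range from about 1.5x to 2.2x, and the average works out to roughly 1.8x, which is where the "pretty much doubles the performance" claim later in this post comes from.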


The 2018 Blades now have a Mini DisplayPort directly connected to the NVIDIA GPU, in ADDITION to the existing HDMI port. That gives the 2018 Blades two outputs wired straight to the NVIDIA GPU, whereas the 2017 and earlier models ONLY have HDMI directly connected to the NVIDIA GPU.

The 2018 Blades also fix the hardware limitation of the 2017 and earlier models, where the Thunderbolt 3 port would only support a SINGLE display connection at a time. The 2018 Blades now support at least 2 display connections (possibly more?).
The 2018, 2017, and earlier Blades all still support 2 connected displays via USB-C.

Here are some additional benchmarks I took using multiple outputs:
[Tests were made using 2 monitors, both set to 720p@60.]

Number of layers before dropping below 30 FPS:


[Single monitor connected to the NVIDIA GPU]
(2018) 720p Noise: 44 | (2016) 720p Noise: 20

[Single monitor connected to the Intel GPU]
(2018) 720p Noise: 43 | (2016) 720p Noise: 19

[2 monitors connected to the NVIDIA GPU]
(2018) 720p Noise: 8 | (2016) 720p Noise: *can't, due to hardware restrictions*

[2 monitors connected to the Intel GPU]
(2018) 720p Noise: 31 (via Thunderbolt 3 AND USB-C) | (2016) 720p Noise: 16 (via USB-C)

[2 monitors connected: one via the NVIDIA GPU, one via the Intel GPU]
(2018) 720p Noise: 36 | (2016) 720p Noise: 19



DMA and FPS settings did not seem to affect these numbers.

*** I have a feeling the reason two displays connected to the NVIDIA GPU returned such poor results might be a driver issue??? ***
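For what it's worth, the dual-output penalty can be quantified from the 2018 numbers above (a quick sketch in plain Python; the values and labels just restate the 720p Noise benchmarks):

```python
# 720p Noise layer counts on the 2018 Blade, restated from the benchmarks above.
single = {"NVIDIA": 44, "Intel": 43}            # one monitor on each GPU
dual = {"NVIDIA": 8, "Intel": 31, "mixed": 36}  # two monitors connected

# Fraction of the single-display layer count kept after adding a second display.
for gpu in ("NVIDIA", "Intel"):
    kept = dual[gpu] / single[gpu]
    print(f"{gpu}: keeps {kept:.0%} of its single-display layer count")
```

Keeping only 18% of the single-display figure on the NVIDIA outputs, while the Intel path keeps 72%, does look more like a software or driver problem than a raw bandwidth limit.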

One small bug I have run across: when I have a monitor connected to both the NVIDIA and Intel GPUs and fire off 3-5 layers of 720p, the FPS will jump from 60 FPS down to 30-44 FPS every 5-7 seconds or so. Changing the V-Sync settings does not seem to fix this.

Keeping V-Sync set to Application Default still seems to give better FPS results WITHOUT tearing.

There is still NO option to disable the Intel GPU or turn off NVIDIA Optimus via the BIOS. This was something I was REALLY hoping for, but that option seems to be going away in most laptops now. ☹

Overall this is a very well-built and sexy machine!
The 2018 model pretty much doubles the performance of the 2017 and prior models.
I've said this at least a few hundred times and I'll say it again:
I've been working with PCs and laptops professionally for well over 16 years.

Having worked in some of the harshest environments, from industrial cook rooms at well over 120 degrees in the human working area to blast freezers that drop from room temperature to negative 60 degrees in under a minute, I have had to get creative in designing and building systems that can withstand these conditions for years at a time.
The twelve 2016-2017 Blades I own have been nothing short of workhorses. I have personally run outdoor festival shows where I had to give up my FOH tent to accommodate guest VJs and LDs in the 114-degree Texas heat, 16+ hours a day for 3 days, and the Blades pushed through without breaking a sweat! I physically couldn't touch the body of the laptop without burning my fingers, but with a basic laptop riser/cooler it was pushing well over 60 FPS all day!
I had artists in the tent I gave up running MBPs for under an hour, and they were thermal throttling at 2-4 FPS, to the point where I had the festival bring out BAGS of ice and towels for the MBPs to rest on, just so they could get their FPS closer to 20.....

Why are people still spending $1,000 more on MBPs when these laptops are an option? They have the same sexy body style.

My point: the 2018 models have stepped their game up even further by introducing an improved, vacuum-sealed liquid cooling system based on the successful 2016 design.
I've been told it has almost twice the cooling performance of the 2016-2017 models, and I personally hold the 2016-2017 models as the baseline all other systems are judged by.
I personally haven't tested the 2018 model in a hot festival environment, but I am eager to see what it can do as well!


There are other builds out there that are great as well, but I would like to thank Razer for letting me test out their new toys and report back to industry professionals about what their gear can do. Razer is geared towards gamers, but I have a feeling a LOT of video and photo artists will be making the switch as Apple dumbs down its hardware and removes industry-standard ports.

Now, if we can just get Microsoft to play well and stop forcing updates down our throats, and if Optimus can play nice, I have a feeling the majority of Windows-related Resolume problems would be fixed.

For most of us, the results on this build are more than enough for our day-to-day shows.

Looking forward to seeing other people's reviews of their setups.
With the hardware out right now, the future is looking bright!


https://www.razer.com/comparisons/blade-15


~A

Luxcollective
Is taking Resolume on a second date
Posts: 37
Joined: Sun Nov 17, 2013 22:33

Re: 2018 Razer Blade Performance Review

Post by Luxcollective »

Definitely curious about the horrible performance when connected to 2 monitors. That is pretty bad and suggests that the 2nd output is some kind of hacky implementation, probably not really connected to the GPU but spun off the TB3 or something like that... especially if the Intel onboard seems to be running circles around it. Bummer. I thought this would be a good upgrade to get proper extra outputs plus all that cool Razer form factor. Oh well. That's why I like my Triton 700: it doesn't mess around with any of this GPU power-saving nonsense. GTX 1080 only, with 3 discrete outputs all connected directly to the GPU (full-size DP, HDMI 2.0, and TB3), and NO Intel onboard at all, so I can avoid all that silliness. ...but dang, the form factor of the Razer is sweet.

Arvol
Might as well join the team
Posts: 2794
Joined: Thu Jun 18, 2015 17:36
Location: Oklahoma, USA

Re: 2018 Razer Blade Performance Review

Post by Arvol »

Luxcollective wrote: Fri Aug 17, 2018 23:18 Definitely curious about the horrible performance when connected to 2 monitors. That is pretty bad and suggests that the 2nd output is some kind of hacky implementation, probably not really connected to the GPU but spun off the TB3 or something like that... especially if the Intel onboard seems to be running circles around it. Bummer. I thought this would be a good upgrade to get proper extra outputs plus all that cool Razer form factor. Oh well. That's why I like my Triton 700: it doesn't mess around with any of this GPU power-saving nonsense. GTX 1080 only, with 3 discrete outputs all connected directly to the GPU (full-size DP, HDMI 2.0, and TB3), and NO Intel onboard at all, so I can avoid all that silliness. ...but dang, the form factor of the Razer is sweet.
The Mini DP is connected to the GPU. Not sure if it's a Max-Q thing or just an outdated driver? I had a crazy week with a tour flipping a semi of our gear, so I only spent about 3 hours messing with it. These are numbers straight out of the box. Possibly a quick fix for the second output, but I don't know; unfortunately, I didn't have the time :(

I could reach out to the engineers and see what they think. Since there is a free demo to test with, I'm sure they could run a few tests and get back to me.

Luxcollective
Is taking Resolume on a second date
Posts: 37
Joined: Sun Nov 17, 2013 22:33

Re: 2018 Razer Blade Performance Review

Post by Luxcollective »

I bet the FPS issue is another example of the Optimus gremlin:

https://resolume.com/forum/viewtopic.php?t=15688
