A while back, there was an anonymous (I think) poster who opined that gaming laptops were all broken by design, since there was no way that adequate cooling could be included in the limited space of a laptop’s enclosure. I disagreed, of course.
Heat is indeed a major issue in higher-performance PCs, and in laptops especially. Anonymous was not wrong that cooling is the biggest hurdle gaming laptops have to overcome compared to their larger brethren.
Recently I’ve been using my gaming laptop (a 15.6 inch Dell G3 3579 with an nVidia GTX 1050 Ti) for its stated purpose, and I’ve been trying to squeeze some more performance out of it (as you do). In the process, I noticed that the temps the GPU reports during gaming are lower than expected. That could mean some performance is being “left on the table.”
If it’s on the table, I want it!
The G3’s GTX 1050 Ti is a lower-end GPU in a laptop that’s a few years old, so there are plenty of newer and more upscale products that will leave my little G3 in the dust, but I’d like to get the most out of the hardware I already have.
I decided to run Geeks3D’s FurMark benchmark to see what the temperatures look like. FurMark is known to be brutal on GPUs in terms of the heat it generates, so I thought it would be a decent test.
There is a Linux version of FurMark, but I decided to run the Windows one (1.29.0), as it has a nice GUI and I was familiar with it from my Windows days. I ran it under WINE, using the Lutris “FShack” 7.2 build with DXVK enabled, and did the 1920×1080 fullscreen test.
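In case anyone wants to reproduce this outside of Lutris, here is a rough sketch of launching the Windows FurMark binary under WINE with DXVK’s DLL overrides in place. The prefix and install paths below are placeholders I made up for illustration, not what Lutris actually generated for me:

```python
#!/usr/bin/env python3
"""Rough sketch: launch the Windows FurMark build under WINE with DXVK.

All paths here are assumptions -- point them at your own WINE prefix and
FurMark install location.
"""
import os
import subprocess

env = dict(os.environ)
env.update({
    # Assumed prefix location; Lutris manages its own per-game prefixes.
    "WINEPREFIX": os.path.expanduser("~/Games/furmark"),
    # Prefer DXVK's native DLLs over WINE's built-in wined3d versions.
    "WINEDLLOVERRIDES": "d3d9,d3d10core,d3d11,dxgi=n",
    # Optional: DXVK's on-screen FPS overlay.
    "DXVK_HUD": "fps",
})

# Assumed install path inside the prefix (FurMark's usual default on Windows).
furmark = os.path.join(
    env["WINEPREFIX"],
    "drive_c/Program Files (x86)/Geeks3D/Benchmarks/FurMark/FurMark.exe",
)

subprocess.run(["wine", furmark], env=env, check=True)
```

Lutris sets all of this up for you behind the scenes, which is the main reason I let it handle the WINE build and prefix.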
The GPU temperature for the 60-second run didn’t exceed 69°C. That’s well below the slowdown threshold of 97°C… so if the GPU’s clock rate were pushed to the full boost clock (another 10% or so), perhaps it would still stay under the max temp while providing more performance?
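If you want to watch what the card is doing while the test runs, a quick polling loop around nvidia-smi does the job. This is just a sketch; the query fields are standard nvidia-smi ones and the one-second interval is arbitrary:

```python
#!/usr/bin/env python3
"""Log GPU temperature, clock, power draw and load once a second (Ctrl+C to stop)."""
import subprocess
import time

FIELDS = "temperature.gpu,clocks.gr,power.draw,utilization.gpu"

while True:
    line = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), line)
    time.sleep(1)
```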
The GPU is presumably hitting its power limit rather than a thermal one; I ran into the same thing with my desktop’s nVidia card. If I could raise that limit a little bit…
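nvidia-smi will at least report the current and maximum allowed power limits (on many laptop GPUs some of these come back as “N/A”, and raising the limit is usually locked out entirely):

```python
#!/usr/bin/env python3
"""Show the board power limits the nvidia driver reports."""
import subprocess

print(subprocess.run(
    ["nvidia-smi",
     "--query-gpu=power.draw,power.limit,power.min_limit,power.max_limit",
     "--format=csv"],
    capture_output=True, text=True, check=True,
).stdout)

# Raising the limit (value in watts) needs root, and on most laptop GPUs the
# driver refuses it outright -- shown here only for completeness:
#   sudo nvidia-smi -pl 80
```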
In the process of all this, I noticed a link from Geeks3D to their page of FurMark test results. I saw a few 1050 Ti results in there, like this one, and that got me thinking. I hadn’t bothered to check my own score, as I had been more interested in the temperature. The score of the 1050 Ti result I just linked was 2424 (the total number of frames rendered in one minute, so roughly 40 FPS). It came from a PC running Windows 10, and the hardware ID suggested a desktop GPU.
I’d read that a mobile 1050 Ti like mine can equal the performance of the desktop 1050 Ti under ideal circumstances, but are circumstances ever really ideal? The cooler in a desktop PC is far better than the one in a laptop, and it isn’t shared with the CPU. One way to find out!
So I ran the test again, and got this result.
I was surprised. My score of 2541 (about 42 FPS) bested all of the 1050 Ti contenders on that page (three of them had run the 1920×1080 test at the time, and at least the one I cited above was almost certainly a desktop card), and it was a laptop running a Windows benchmark without Windows that had done it.
You can tell this is a Linux result if you know what to look for. No, I didn’t use Windows 7; that’s just the default version of Windows that WINE claims to be. I can change that, but if it works, I see no reason to do so.
The nVidia driver version listed is a current-ish Linux one (the newest in the Ubuntu repo, although nVidia’s site has a newer release); current Windows nVidia drivers are 512.* rather than 510.*.
The result for my laptop also lists the second GPU as “modesetting,” the generic X driver that handles the Intel integrated graphics on the laptop. If the Windows PC had been a laptop, it would almost certainly have listed a second GPU there too, but reported as something close to what Intel calls it in Device Manager.
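On the Linux side you can see the same split directly: xrandr will list the nVidia GPU alongside a “modesetting” provider for the Intel graphics, and the proprietary driver reports its version under /proc. A quick sketch:

```python
#!/usr/bin/env python3
"""List the GPU providers X knows about and the loaded nVidia driver version."""
import subprocess

# On an Optimus laptop this typically shows an "NVIDIA-0" provider plus a
# "modesetting" provider for the Intel integrated graphics.
print(subprocess.run(["xrandr", "--listproviders"],
                     capture_output=True, text=True).stdout)

# The proprietary driver exposes its version here once the module is loaded.
try:
    with open("/proc/driver/nvidia/version") as f:
        print(f.read().strip())
except FileNotFoundError:
    print("nvidia kernel module not loaded")
```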
So in this case, the myth that a gaming laptop is broken by design (having too little cooling to be fit for purpose) is “Busted.” I have made some tweaks to enhance the cooling (liquid metal thermal interface material on the CPU and GPU, and undervolting the CPU), but there’s plenty of thermal headroom here that I would like to put to good use!
It also illustrates just how little performance is lost in translation, whether that’s WINE handing the benchmark’s OpenGL calls straight to the native Linux driver or DXVK converting Direct3D to Vulkan. I’m still amazed by that!