Hello. I know this isn’t completely related to Linux, but I was still curious about it.
I’ve been looking at Linux laptops, and one from Tuxedo that caught my eye had 13 hours of battery life at idle, or 9 hours of web browsing. The thing is, that device had a 3K display.
My question is: as someone used to 1080p who always tries to squeeze the most battery life out of a laptop, would running the display at a lower resolution help? And if so, is it even worth it, or are the benefits too small to notice?
No, the majority of the energy consumption is in the backlight.
Maybe, if it allowed you to switch from discrete to integrated graphics and put the dedicated GPU to sleep.
For just browsing, even integrated graphics has been plenty since the beginning of the internet, maybe with some exceptions when Flash gaming reached its pinnacle.
That might save a bit of power, but your dedicated GPU is usually in an idle/powered-down state until your compositor gives it specific applications to accelerate. For NVIDIA laptops, this is what the PRIME/Optimus feature does.
Using the iGPU might save power, but the resolution doesn’t need to be turned down for that.
I’d think so. 3k is so many pixels to compute and send 60 times a second.
But this video says the effect on battery life in their test was like 6%, going from 4k to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions… but what screen manufacturer would optimize energy consumption for anything but maximum resolution? 🤔 I guess the computation of the pixels isn’t much compared to the expense of having those physical dots. But maybe if your web browser was ray-traced? … ?!
Also, if you take a 2880x1800 screen and halve each dimension (to avoid fractional scaling), you get 1440x900 (which is not 1440p); in total pixel count that’s closer to 720p than to 1080p.
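Spelled out (just the arithmetic, nothing specific to any particular laptop):

```ts
// Integer scale factors for a 2880x1800 panel and the logical resolution you get.
const native = { w: 2880, h: 1800 };

[1, 2, 3].forEach((scale) => {
  console.log(`${scale}x scaling -> ${native.w / scale}x${native.h / scale} logical`);
});
// 1x scaling -> 2880x1800 logical
// 2x scaling -> 1440x900 logical
// 3x scaling -> 960x600 logical
```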
Your GPU doesn’t need to re-render your entire screen every frame. Your compositor only submits the regions of the screen that changed for rendering, and most application stacks are very efficient about laying out elements to limit the work needed.
At higher resolutions those regions will obviously be larger, but they’ll still take up roughly the same % of the screen space.
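To put toy numbers on that (the rectangle sizes here are made up, just to show the proportion staying constant):

```ts
// A damaged region grows with resolution, but its share of the framebuffer doesn't.
type Rect = { w: number; h: number };

const damageShare = (damage: Rect, screen: Rect) =>
  (damage.w * damage.h) / (screen.w * screen.h);

// A 300x200 UI update on a 1920x1080 screen...
console.log(damageShare({ w: 300, h: 200 }, { w: 1920, h: 1080 })); // ~0.029 (about 3%)
// ...covers the same share when everything is scaled up 1.5x to 2880x1620.
console.log(damageShare({ w: 450, h: 300 }, { w: 2880, h: 1620 })); // ~0.029 (about 3%)
```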
My PowerBook G4 might be a bit dated, but running resolutions other than native is quite heavy on that thing. Your built-in display can handle only one resolution natively; anything else will require upscaling.
Your GPU can probably do that upscaling for cheap. But cheaper than rendering your desktop applications? 🤷‍♂️
You’ll have to benchmark your particular device with powertop.
Isn’t rescaling usually done by the display driver? I am fairly certain this is the case for external displays. Are laptop displays any different?
Edit: with “display driver” I mean the hardware chip behind the display panel, dedicated to converting a video signal to the electrical signals necessary to turn on the individual pixels.
For an external display, I’d bet that’s the case: the scaling happens in the panel’s hardware driver.
At least my 17" PowerBook G4 with a massive 2560x1440 display does it in the software display driver. I’m sure some laptop panels do it in hardware as well, but it seems there’s some very janky shit going on, at least with laptops that have both integrated and discrete GPUs.
Unless you’re running games or 3D-intensive apps, no. Resolution is cheap power-wise under normal circumstances.
As a web developer, I’ve noticed that some elements, such as very big tables, struggle to render at 4K but are absolutely fine at 1080p. I would assume that means the CPU and/or GPU are more taxed when drawing at a higher resolution, and therefore that they draw more power. I might be mistaken. Are you speaking from experience?
I’m a Flutter dev, and I’ve seen accounts from a former Windows 98 dev about limiting the number of redraws in the shell.
There’s deffo extra overhead, but it’s not linear: 4K being 4 times as many pixels as 1080p doesn’t mean 4x the work to render after the first frame, as the browser/framework will cache certain layout elements.
The initial layout is still expensive, though, so big tables will take longer to appear, but that big table at high res will probably be less chuggy when scrolling once it’s loaded.
I’m not sure… in the case I’m referring to, they were also lagging when scrolling. But it was React, so native browser rendering. And they were actually very large tables, so we had to do some funny things like viewport culling (see react-window).
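For anyone who hasn’t seen it, that culling with react-window boils down to roughly this (a minimal sketch; the component and row data are made up, not our actual code):

```tsx
import React from 'react';
import { FixedSizeList } from 'react-window';

// Made-up row data; in the real app this came from the backend.
const rows = Array.from({ length: 100_000 }, (_, i) => `Row ${i}`);

// Only the rows inside the 400px viewport (plus a small overscan) are mounted,
// so render cost stays roughly flat no matter how long the table gets.
const BigTable = () => (
  <FixedSizeList
    height={400}            // visible viewport height in px
    width={600}             // list width in px
    itemCount={rows.length}
    itemSize={35}           // fixed row height in px
  >
    {({ index, style }: { index: number; style: React.CSSProperties }) => (
      <div style={style}>{rows[index]}</div>
    )}
  </FixedSizeList>
);

export default BigTable;
```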
For what it’s worth, I’ve never had any similar performance issues with tables in Flutter (web with the canvas-based render engine, not Android) when applying the same culling technique; they just ran fine at any resolution. Different hardware, though, so it’s not an apples-to-apples comparison.
In any case, just to be safe, I would personally assume fewer pixels = less work = less power = more battery life. My opinion is very unscientific, though.