Hello. I know this isn’t completely related to Linux, but I was still curious about it.

I’ve been looking at Linux laptops and one that caught my eye from Tuxedo had 13 hours of battery life on idle, or 9 hours of browsing the web. The thing is, that device had a 3k display.

My question is: as someone used to 1080p who always tries to squeeze the most battery life out of a laptop, would running the display at a lower resolution help? And if so, is it even worth it, or are the benefits too small to notice?

45 points

No, the majority of the energy consumption is in the backlight.

10 points

Maybe, if it allowed you to switch from the discrete GPU to integrated graphics and put the discrete one to sleep.

For just browsing, even integrated graphics has been plenty since the beginning of the internet, maybe with some exceptions when Flash gaming reached its pinnacle.

3 points

That might save a bit of power, but your dedicated GPU is usually in an idle/powered-down state until your compositor gives it specific applications to accelerate. For Nvidia laptops this is what the PRIME/Optimus feature does.
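If you want to check whether the dGPU really is asleep, you can peek at its runtime-PM state. Rough sketch, just reading standard PCI sysfs files (what gets reported depends on your kernel and driver):

```python
#!/usr/bin/env python3
# Rough sketch: list GPUs and their runtime power state via standard PCI sysfs.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    pci_class = (dev / "class").read_text().strip()
    # 0x0300xx = VGA controller, 0x0302xx = 3D controller (how dGPUs usually show up)
    if pci_class.startswith(("0x0300", "0x0302")):
        status_file = dev / "power" / "runtime_status"
        status = status_file.read_text().strip() if status_file.exists() else "unknown"
        print(f"{dev.name}: class={pci_class} runtime_status={status}")
```

If the discrete card reports `suspended` while you’re just browsing, there isn’t much left to gain there.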

1 point

Even back when I was in the laptop-repair game, this is the kind of stuff people would expect me to know about their machines, and I hated it. I saw too many features come and go over the years to keep track of even half of them on behalf of others.

2 points

Using the iGPU might save power, but the resolution doesn’t need to be turned down for that.

1 point

Depends on the iGPU, but this being a damn near brand-new laptop, I’m sure you’re right.

9 points

I’d think so. 3k is so many pixels to compute and send 60 times a second.

But this video says the effect on battery life in their test was like 6%, going from 4k to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions… but what screen manufacturer would optimize energy consumption for anything but maximum resolution? 🤔 I guess the computation of the pixels isn’t much compared to the expense of having those physical dots. But maybe if your web browser was ray-traced? … ?!

Also, if you take a 2880x1800 screen and divide by 2 (to avoid fractional scaling), you get 1440x900 (this is not 1440p), which by total pixel count is a little closer to 720p than to 1080p.
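To put rough numbers on that (quick back-of-the-envelope, nothing device-specific):

```python
# Pixel counts for the resolutions mentioned above.
modes = {
    "2880x1800 (native 3K)": 2880 * 1800,   # 5,184,000
    "1440x900 (native / 2)": 1440 * 900,    # 1,296,000
    "1920x1080 (1080p)":     1920 * 1080,   # 2,073,600
    "1280x720 (720p)":       1280 * 720,    #   921,600
}
for name, pixels in modes.items():
    print(f"{name}: {pixels:,} pixels")
```

So 1440x900 sits between 720p and 1080p, a bit nearer 720p by raw pixel count - though as the top comment points out, the backlight still lights all ~5 million physical pixels either way.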

8 points

But you don’t lower the number of physical pixels in use. Lowering the resolution just means more physical pixels get used to display each logical “pixel”. So the same amount of power is going to be used to turn those pixels on.

2 points

Your GPU doesn’t need to re-render your entire screen every frame. Your compositor only submits the regions of the screen that actually changed, and most application stacks are very efficient at laying out elements to limit the work needed.

At higher resolutions those regions will obviously be larger, but they’ll still take up roughly the same % of the screen space.
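As a toy illustration (the 5% damage fraction is made up; real numbers depend entirely on the compositor and the app):

```python
# Toy numbers: same damaged fraction of the screen, different absolute pixel counts.
def damaged_pixels(width, height, damage_fraction=0.05):
    return int(width * height * damage_fraction)

for width, height, label in [(1920, 1080, "1080p"), (2880, 1800, "3K"), (3840, 2160, "4K")]:
    print(f"{label}: ~{damaged_pixels(width, height):,} pixels to redraw per frame")
```

More absolute pixels to shade at 3K, but the work scales with what actually changed, not with the whole framebuffer.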

6 points

My PowerBook G4 might be a bit dated, but running resolutions other than native is quite heavy on that thing. Your built-in panel has exactly one native resolution - anything else will require upscaling.

Your GPU can probably do that upscaling for cheap. But cheaper than rendering your desktop applications? 🤷‍♂️

You’ll have to benchmark your particular device with powertop.
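For a rough before/after comparison you can also just sample what the battery itself reports at each resolution. Minimal sketch, assuming the battery shows up as BAT0 and exposes power_now in microwatts (some machines only expose current_now/voltage_now, so adjust accordingly):

```python
#!/usr/bin/env python3
# Minimal sketch: average the battery's reported draw over ~30 seconds (run on battery).
import time
from pathlib import Path

POWER_NOW = Path("/sys/class/power_supply/BAT0/power_now")  # microwatts on many laptops

samples = []
for _ in range(30):
    samples.append(int(POWER_NOW.read_text()))
    time.sleep(1)

print(f"Average draw: {sum(samples) / len(samples) / 1_000_000:.2f} W")
```

Run it once at native resolution and once downscaled, with the same workload and brightness, and you’ll have your answer for your particular machine.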

1 point

Isn’t rescaling usually done by the display driver? I am fairly certain this is the case for external displays. Are laptop displays any different?

Edit: with “display driver” I mean the hardware chip behind the display panel, dedicated to converting a video signal to the electrical signals necessary to turn on the individual pixels.

2 points

For an external display I’d bet it’s the hardware driver in the panel that does it.

At least my 17" PowerBook G4 with a massive 2560x1440 display does it in the software display driver. I’m sure some laptop panels do it in hardware as well, but it seems there’s some very janky shit going on, at least with laptops that have both integrated and discrete GPUs.

6 points

Unless you’re running games or 3D-intensive apps, no. Resolution is cheap, power-wise, under normal circumstances.

2 points

As a web developer, I’ve noticed that some elements, such as very big tables, struggle to render at 4K but are absolutely fine at 1080p. I would assume that means the CPU and/or GPU are more taxed drawing at the higher resolution, and therefore I assume they would draw more power. I might be mistaken. Are you speaking from experience?

2 points

I’m a Flutter dev, and I’ve seen a former Windows 98 dev talk about limiting the number of redraws in the shell.

There’s deffo extra overhead, but it’s not linear - 4k being 4 times as many pixels as 1080p doesn’t mean 4x the work to render after the first frame, as the browser/framework will cache certain layout elements.

The initial layout is still expensive, though, so big tables will take longer to appear, but that big table at high res will probably be less chuggy when scrolling once it’s loaded.

1 point

I am not sure… in the case I’m referring to, they were also lagging when scrolling. But it was React, so native browser rendering. And they were actually very large tables, so we had to do some funny things like viewport culling (see react-window).

For what it’s worth, I’ve never had any similar performance issues with tables in Flutter (web with the canvas-based render engine, not Android) when applying the same culling technique; they just ran fine at any resolution. Different hardware, though, so it’s not an apples-to-apples comparison.
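For anyone wondering what the culling trick actually is, the idea is just this (a toy sketch of the concept, not react-window’s actual API):

```python
# Toy sketch of viewport culling: only lay out the rows that intersect the visible window.
def visible_rows(scroll_top, viewport_height, row_height, total_rows):
    first = max(0, scroll_top // row_height)
    last = min(total_rows, (scroll_top + viewport_height) // row_height + 1)
    return range(first, last)

# 100,000 rows in the table, but only ~26 get rendered for a 600px-tall viewport.
rows = visible_rows(scroll_top=48_000, viewport_height=600, row_height=24, total_rows=100_000)
print(len(rows), "rows rendered:", list(rows)[:3], "...")
```

The resolution barely matters once you’re only paying for what’s actually on screen.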

In any case just to be safe I would personally assume less pixels = less work = less power = more battery life. My opinion is very unscientific though.
