fwiw, NVIDIA GPUs have behaved similarly under normal conditions on Windows forever. After following up with tech support about it, the explanation I got basically boils down to: 'the higher the resolution and refresh rate of your attached monitors, the greater the demand placed on the memory controller and scan-out, which means a higher minimum GPU and/or memory clock to drive it'.

My current RTX 20xx card can drive two 4K monitors at low clocks, but the 9xx series really struggled, and the 10 series still needed to clock up some. I suspect something similar is happening here with the AMD GPU: if you're running the laptop's integrated Retina display along with a 1440p60 external display, it's probably forced to clock up to drive that many pixels. I'm not really sure how AMD could get around basic hardware limitations here unless they have a really clever way of driving displays at a low memory clock.
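
To put rough numbers on that, here's a back-of-the-envelope sketch (my own illustration, not anything from NVIDIA or AMD; it assumes uncompressed 4 bytes/pixel scan-out, and 3072x1920 is just the 16" MBP's panel resolution):

    # Scan-out bandwidth: the display engine has to read the front buffer
    # out of VRAM once per refresh, with hard real-time deadlines.
    # Assumes uncompressed 4 bytes/pixel; real GPUs use framebuffer
    # compression and overlay planes, so treat these as rough upper bounds.

    def scanout_gbps(width, height, hz, bytes_per_pixel=4):
        return width * height * bytes_per_pixel * hz / 1e9

    configs = {
        "one 1440p60": [(2560, 1440, 60)],
        "one 4K60": [(3840, 2160, 60)],
        "two 4K60": [(3840, 2160, 60), (3840, 2160, 60)],
        "MBP 16 panel + 1440p60": [(3072, 1920, 60), (2560, 1440, 60)],
    }

    for name, displays in configs.items():
        total = sum(scanout_gbps(*d) for d in displays)
        print(f"{name}: ~{total:.1f} GB/s of steady scan-out traffic")

A few GB/s is nothing next to a boosted memory clock, but scan-out can't tolerate underruns, so the idle clock floor has to sit comfortably above whatever sustains it.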



It seems to be related to 1440p specifically. People are reporting no issues with higher resolutions.


I have a 2160p external monitor attached to my MBP 16" via Thunderbolt/DisplayPort 1.2 at 60 Hz.

I tried out the five different "Looks like" scalings the Display Preferences allow for and then eyeballed the "Radeon High Side" wattage readouts via iStat Menus:

1280x720: 18W
1920x1080: 6W
2560x1440: 6W
3200x1800: 6W
3840x2160: 6W

So it looks like it might (also?) be related to the display resolution scaling.
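
If macOS follows its usual HiDPI scheme here (render at 2x the "looks like" size, then resample to the panel's native resolution; that 2x assumption is mine, not something I've verified for these particular modes), the backing buffers work out to:

    # Illustrative backing-store math for macOS "looks like" modes on a
    # 3840x2160 panel, assuming the usual 2x HiDPI render-then-resample
    # scheme; actual mode handling may differ.

    PANEL_W, PANEL_H = 3840, 2160

    for w, h in [(1280, 720), (1920, 1080), (2560, 1440),
                 (3200, 1800), (3840, 2160)]:
        # Assume the native mode renders 1:1 rather than at 2x.
        bw, bh = (w, h) if (w, h) == (PANEL_W, PANEL_H) else (2 * w, 2 * h)
        scale = PANEL_W / bw
        kind = "1:1" if scale == 1 else ("upscale" if scale > 1 else "downscale")
        print(f'"looks like" {w}x{h}: render {bw}x{bh} '
              f'({bw * bh / 1e6:.1f} MP), {kind} to the panel')

Under that assumption, 1280x720 is the only mode that has to be upscaled to the panel rather than downsampled or passed through 1:1, which is at least consistent with it being the odd one out in the readings above.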


It would make sense if situations where the framebuffer resolution doesn't match the native panel resolution force the driver stack to maintain two framebuffers (native and non-native), do the upscale, and then feed the result to the monitor. GeForce drivers on Windows specifically have a setting to control where this happens (let the monitor scale it, or have the GPU scale it). The upscale would not only use more VRAM in that case (and as a result more memory bandwidth), but it'd also need a bit more GPU compute to perform the scaling.
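
A toy sketch of that accounting (hypothetical numbers, assuming 4 bytes/pixel and double buffering, and taking 720p rendered on a 4K panel as the example):

    # Toy cost model for GPU-side scaling: the driver keeps a
    # render-resolution buffer and a native-resolution buffer, reading
    # the former and writing the latter every frame. Ignores caches,
    # filtering cost, and framebuffer compression.

    BPP = 4  # bytes per pixel

    def buffer_mb(w, h, copies=2):  # double-buffered
        return w * h * BPP * copies / 1e6

    def scale_traffic_gbps(src, dst, hz):
        (sw, sh), (dw, dh) = src, dst
        per_frame = sw * sh * BPP + dw * dh * BPP  # read source, write dest
        return per_frame * hz / 1e9

    src, dst, hz = (1280, 720), (3840, 2160), 60
    print(f"extra VRAM for the 720p buffers: ~{buffer_mb(*src):.0f} MB")
    print(f"extra traffic at {hz} Hz: ~{scale_traffic_gbps(src, dst, hz):.1f} GB/s")

The VRAM hit is small; the per-frame read+write traffic and the filtering work are presumably what keep the clocks up.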

On my low-spec laptop from a while back, running Rising Thunder at 720p was faster than native, but it was even faster to first set my desktop resolution to 720p, because the overhead of the game engine scaling its 720p framebuffer up to native 4K was measurable. Setting the desktop resolution down also improved battery life.


This rescaling is obviously happening in every single "looks like" resolution except (perhaps) 3840x2160.



