Basic75 gives a good rundown on how things work conceptually.
This is one of those things that is blown way out of proportion. My daily use machines are a 13" M1 MBP 16GB/1TB and an M1 mini 16GB/256GB.
I run the MBP at its default 1440x900 resolution, which is a scaled (HiDPI) resolution, by the way, since the panel is natively 2560x1600: macOS renders the desktop at 2880x1800 and downsamples it to the panel, which is where the extra GPU work comes from. Using the Stats menu bar app, I find that running the display at the scaled 1440x900 resolution typically uses a few percent (2%-5%) more GPU than running at the native 2x retina 1280x800 resolution during my daily productivity use.
I use the mini with an 32" LG 32UL500 4k display. I normally run it at native 3840x2160 resolution. That resolution allows me to work without a second display. When I give remote lectures, I'll drop down to a scaled 2560x1440 resolution. The result is pretty similar, the mini uses a few percent more GPU to render the the high dpi scaled resolution.
On the laptop, Apple does not let you choose from a full list of resolutions for the internal display, but it will show you all available resolutions for an external display. The external display list gives you a choice between HiDPI scaled resolutions and low-resolution options that simply map the chosen resolution 1:1 to the display without HiDPI rendering. I find the HiDPI scaled resolutions to be well worth the GPU overhead when I want things to look larger on my display.
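If you'd rather see that full mode list programmatically than through System Settings, a quick sketch using CoreGraphics (assuming the main display; the option key just asks macOS to include the low-resolution duplicates it normally hides) looks something like this:

```swift
import CoreGraphics

let displayID = CGMainDisplayID()
// Ask for the low-resolution duplicate modes macOS normally hides from the list.
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        // A HiDPI ("scaled") mode has more backing pixels than points.
        let isHiDPI = mode.pixelWidth > mode.width
        print("\(mode.width)x\(mode.height) points, " +
              "\(mode.pixelWidth)x\(mode.pixelHeight) pixels" +
              (isHiDPI ? "  [HiDPI scaled]" : ""))
    }
}
```

The HiDPI entries are the ones where the pixel size is larger than the point size; the low-res entries report the same number for both.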
When Retina displays came out in 2012, it made no sense to me to use 4 pixels for every 1 pixel displayed, because the GPUs on the entry-level machines worked extremely hard to make it happen (I opted for the non-Retina MBP back in 2013). With the M-series chips 10+ years later, I really do not feel it is worth fussing over. If it's really a concern, then opt for a machine with a Pro, Max, or Ultra chip.