I was merely pointing out that the issue isn’t unique to the M4, and that similar issues existed with the M1 that were worked around with BetterDisplay until Apple pushed out a fix. I’m coming from a 5K iMac (purchased, I think, in December 2014) that’s now been passed down to a younger relative. Over a decade ago!
I guess for me, I really never had an interest in non-retina-class displays once the 5K iMac shipped. But I guess I’m a little different. We can criticize Apple all we want, but I’m gonna turn it around a bit… it’s been well over a DECADE at this point. Why aren’t the majority of displays being shipped retina-class by now? Why do the display manufacturers seem to be soaking their customers and stringing them along with ****** LoDPI panels, marketing only the tech meant to gloss over the fact that the panels are LoDPI? It’s not that Apple’s GPUs or macOS couldn’t handle the scaling (there’s likely no meaningful performance hit on current systems), but imagine if companies like MS and Apple could scrap all the software trickery and antialiasing needed to produce “good enough” but still less-than-optimal text and image rendering.
It’s been 10 years since HiDPI became a thing. Why support old tech that will sooner rather than later be replaced once high-pixel-density displays become mainstream? I’ll bet we’ll (finally) see a big shift to HiDPI display tech at CES this year. The fact that most panels are still LoDPI is rather amusing given how fast the industry advances with other tech.
The other question I still can’t wrap my head around (and I loathe car analogies): it’s like people who go out and buy a luxury car and then complain about the insurance and maintenance/upkeep. Or they buy an elephant and then complain about the cost to feed it. This is different, though — a display is a one-time cost. And I’ll wager a HiDPI panel will net you more mileage than the LoDPI kit the display manufacturers are still peddling. On the other hand, maybe Apple had a reason to knee-cap support for LoDPI panels that require fractional scaling. And when I say “displays,” I’m talking about units that will primarily be used as computer monitors.