So what - I don’t understand the refresh rate obsession. Do people not understand how many pixels this thing is and how many resources would literally go up in flames to drive it to 120hz or whatever for almost no gain??
With current graphics cards (at least in the Windows world, i.e. Nvidia/AMD), it would technically be no problem at all today to drive a display at 6K 120Hz or more (AND I'M NOT TALKING ABOUT GAMING, I DON'T CARE ABOUT GAMING!).

So it's a bit sad that the new 6K monitors still don't support 120Hz, and let's not kid ourselves: the next “Big Thing” will be such monitors shipping with 120/240Hz as a matter of course. So why not now?
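If you want to sanity-check the bandwidth side of that claim, here's a rough back-of-the-envelope sketch (assuming the 6016x3384 resolution this class of 6K monitor uses, 10-bit color, and ignoring blanking overhead, so real figures run a bit higher):

```python
# Rough link-bandwidth estimate for 6K at 120Hz (blanking overhead ignored).
W, H, HZ, BPP = 6016, 3384, 120, 30          # 10 bits per channel, RGB

pixels_per_sec = W * H * HZ                  # ~2.44 billion pixels/s
raw_gbps = pixels_per_sec * BPP / 1e9        # uncompressed video stream

# Approximate effective payload rates after encoding overhead:
DP14_HBR3 = 25.9      # Gbit/s (4 lanes x 8.1 Gbit/s, 8b/10b)
DP21_UHBR20 = 77.4    # Gbit/s (4 lanes x 20 Gbit/s, 128b/132b)

print(f"6K@120Hz: {pixels_per_sec / 1e9:.2f} Gpixel/s, ~{raw_gbps:.0f} Gbit/s uncompressed")
print(f"fits DP 1.4 (HBR3) uncompressed:   {raw_gbps < DP14_HBR3}")
print(f"fits DP 2.1 (UHBR20) uncompressed: {raw_gbps < DP21_UHBR20}")
print(f"with ~3:1 DSC: ~{raw_gbps / 3:.0f} Gbit/s")
```

So at 8-bit, or with DSC (widely supported on recent GPUs), the link is not the bottleneck for a modern card; the catch is that neither the panels nor the monitor electronics are on the market yet.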
 
So what - I don’t understand the refresh rate obsession. Do people not understand how many pixels this thing is and how many resources would literally go up in flames to drive it to 120hz or whatever for almost no gain??

Refresh rate is just as important as resolution or pixel density. Just so you know, RTX 50 Series GPUs can run an 8K display at 120Hz. Driving the UI and a desktop environment should be a simple task; we're not talking about running games or 3D applications. So is the M4 Max GPU actually worse than an RTX 5060? 🥹
 
So is the M4 Max GPU actually worse than an RTX 5060? 🥹
Apple Silicon has no chance at all against Nvidia's RTX cards when it comes to pure top-end performance. When it comes to frames per watt things look better, but desktop users are more interested in performance and less in power consumption.

The general public often has the preconception that Apple is the ne plus ultra when it comes to graphics. But that hasn't been the case for ~20 years!
 
Apple Silicon has no chance at all against Nvidia's RTX cards when it comes to pure top-end performance. When it comes to frames per watt things look better, but desktop users are more interested in performance and less in power consumption.

The general public often has the preconception that Apple is the ne plus ultra when it comes to graphics. But that hasn't been the case for ~20 years!

With the recent release of Cyberpunk on the Mac, it's been shown that M-series GPUs have roughly the same performance-per-watt as a mobile RTX 5090.


We can all safely say Apple is just being extremely conservative when it comes to power consumption. Like you said earlier, Apple's GPUs are barely able to handle high refresh rates in desktop mode, let alone gaming.
 
Does it also work with Linux?

Lenovo ThinkPad E14 Gen 6 14", AMD Ryzen 5 with Fedora Silverblue?
That has an integrated Radeon 660M GPU. The Lenovo specs say 5K 60Hz, but technically USB-C DisplayPort 1.4 alt mode supports up to 8K 60Hz. So maybe it would work? Your problem is nobody has the ProArt 6K yet, but you could try asking ASUS. Or maybe ask Dell — the Dell 6K has been around for two years.
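For a rough feel for whether the link itself is the limit, here's a quick check (assuming the ProArt's 6016x3384 panel, a full 4-lane DP 1.4 alt mode connection, and ignoring blanking overhead; whether the 660M's display engine and drivers actually offer the mode is a separate question):

```python
# Does 6K at 60Hz fit through USB-C DisplayPort 1.4 alt mode (HBR3)?
W, H, HZ = 6016, 3384, 60
HBR3_PAYLOAD = 25.9                      # Gbit/s effective (4 lanes, 8b/10b)

for name, bpp in [("8-bit RGB", 24), ("10-bit RGB", 30), ("with DSC (~10 bpp)", 10)]:
    gbps = W * H * HZ * bpp / 1e9
    print(f"{name:20s} ~{gbps:5.1f} Gbit/s   fits HBR3: {gbps < HBR3_PAYLOAD}")
```

Uncompressed 6K60 doesn't fit even at 8-bit, so it would hinge on both ends negotiating DSC, which is exactly the kind of detail you'd want to confirm with ASUS or an early reviewer.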

 
With the recent release of Cyberpunk on the Mac, it's been shown that M-series GPUs have roughly the same performance-per-watt as a mobile RTX 5090.


We can all safely say Apple is just being extremely conservative when it comes to power consumption. Like you said earlier, Apple's GPUs are barely able to handle high refresh rates in desktop mode, let alone gaming.
A lot, if not everything, speaks in favor of the Nvidia chip when you consider that it is produced on a 5nm process (which means higher energy requirements) while the Apple Silicon M4 Max is already running on 3nm!

So the comparison here is a rather synthetic one: neither on the MacBook nor on the Alienware will you be able to play Cyberpunk for longer than 60 minutes at maximum quality without a power supply. These triple-A games are really meant for stationary machines.
 
I'm a big Blur Busters fan, and I think super high refresh rates have more benefits than most people realize. I don't like that high refresh rate displays are always marketed as just beneficial to pro competitive gamers; I think they are also nice for casual indie games, or other basic computing.

But, at the same time, most of what I do for work is on static screens with lots of text. I scroll Excel documents row by row, not with smooth scrolling. I scroll PDFs page by page. For my work tasks, I'll take a large retina-density screen over a high refresh screen that sacrifices pixel density or size. There currently aren't really any screens that offer everything.

For gaming, I'd almost always run 1440P 120Hz over 4K 60Hz if my GPU can't keep up with 4K 120Hz.
LG makes great monitors, which I have tested, that offer high pixel density + high framerate + extended color gamut.

LG 32GS95UV-B is 32" Dual-Mode (4K 240 Hz ↔ FHD 480 Hz)
LG 27GX790A-B is 27" 2560x1440 480Hz
 
LG makes great monitors, which I have tested, that offer high pixel density + high framerate + extended color gamut.

LG 32GS95UV-B is 32" Dual-Mode (4K 240 Hz ↔ FHD 480 Hz)
LG 27GX790A-B is 27" 2560x1440 480Hz
Those are not high pixel density, and the subpixel layout makes text look worse than it does on an equivalent LCD.
I'm not aware of any widely available 120 Hz screens that are 5K or 6K 16:9 (or as tall as 5K or 6K but wider).
 
With the recent release of Cyberpunk on the Mac, it's been shown that M-series GPUs have roughly the same performance-per-watt as a mobile RTX 5090.


We can all safely say Apple is just being extremely conservative when it comes to power consumption. Like you said earlier, Apple's GPUs are barely able to handle high refresh rates in desktop mode, let alone gaming.
I'd bet the 5090m would have significantly better performance per watt if you ran it at the same maximum power level as the M4. Performance per watt is not linear when changing wattage, from my understanding.
 
How exactly would this work? Is there a switch? Two Macs but one TB4?
If it's like the PA27JCV - you could have one machine connected via TB4 (for power/video/keyboard+mouse), and another with DVI/HDMI + USB-C connectivity + attach a power cord.

Switching is a bit slow (IMO), but works fine - and I still keep a switchable BT keyboard around if one or both machines are laptops with their lids closed and they need to wake up.
 
So what - I don’t understand the refresh rate obsession. Do people not understand how many pixels this thing is and how many resources would literally go up in flames to drive it to 120hz or whatever for almost no gain??
The relationship between pixel count and refresh rate illustrates this point, as seen in the "dual mode" displays that are emerging. The pixel count is quadrupled as the refresh rate is halved. Or, the other way around, as the refresh rate is doubled, the pixel count is quartered.

You can see this in the (unreleased) Acer Predator XB323QX (5K 165Hz, 1440p 330Hz) and the Acer Predator X32 X3 (4K 240Hz, 1080p 480Hz).

And you are right about the resources -- it's not just about bandwidth -- the GPU requirements for a 6K 120Hz display would be very restrictive. That's why it will be a few years before we see it, unless Apple does it in the Pro Display.
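Interestingly, if you run the numbers on those dual-mode pairs (using their standard resolutions), the high-refresh mode isn't even pixel-rate-equivalent to the high-resolution mode; it's half:

```python
# Pixel throughput of the two modes on the dual-mode displays mentioned above.
modes = {
    "5K    @ 165Hz": (5120, 2880, 165),
    "1440p @ 330Hz": (2560, 1440, 330),
    "4K    @ 240Hz": (3840, 2160, 240),
    "1080p @ 480Hz": (1920, 1080, 480),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}  {w * h * hz / 1e9:.2f} Gpixel/s")
```

So halving the resolution in each direction while only doubling the refresh rate actually halves the pixel rate, which suggests the quartering is a panel/driver-side constraint rather than a pure bandwidth one (a point that comes up again further down the thread).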
 
And you are right about the resources -- it's not just about bandwidth -- the GPU requirements for a 6K 120Hz display would be very restrictive. That's why it will be a few years before we see it, unless Apple does it in the Pro Display.
Fortunately, the world doesn't just consist of Apple :) As far as graphics work is concerned, I turned my back on Apple a loooong time ago, because Apple simply isn't powerful enough for it, at least when power consumption isn't the primary concern, and I mean stationary work here.

Apple is good when it comes to Photoshop, for example, and everything that involves 2D... all of that is great, but a Windows system with Nvidia is never worse. It gets dramatic, however, when it comes to 3D rendering, where the wheat is separated from the chaff and the Windows PC brutally overtakes Apple.

That's why I'm not worried about high-DPI screens (6K/7K/8K) with 120+Hz arriving, because graphics cards have already been able to drive such setups for 2-3 years.
 
That's why I'm not worried about high-DPI screens (6K/7K/8K) with 120+Hz arriving, because graphics cards have already been able to drive such setups for 2-3 years.
Got a question, since I don't follow the higher performance PC market. The Windows PCs that do what you're talking about, 3D rendering, much more powerfully than Macs...how many of them have connection technology (e.g.: Thunderbolt 5, very recent HDMI or DisplayPort) that can handle 6K at 120+ Hz?

I ask because display vendors will want a pre-existing market for their wares, and I'm curious whether it exists yet, or when it will. I figure it's a matter of time, but therefore of course I ask 'when?'
 
Got a question, since I don't follow the higher performance PC market. The Windows PCs that do what you're talking about, 3D rendering, much more powerfully than Macs...how many of them have connection technology (e.g.: Thunderbolt 5, very recent HDMI or DisplayPort) that can handle 6K at 120+ Hz?

I ask because display vendors will want a pre-existing market for their wares, and I'm curious whether it exists yet, or when it will. I figure it's a matter of time, but therefore of course I ask 'when?'
Nvidia RTX 50xx and AMD RX 79xx cards have DisplayPort 2.1, which is capable of driving a single 8K display @ 165Hz.
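For what it's worth, that 8K @ 165Hz figure relies on Display Stream Compression; here's a rough estimate (10-bit color, blanking ignored) of how much compression a full UHBR20 link would need:

```python
# How much compression does 8K at 165Hz need over DP 2.1 (UHBR20)?
W, H, HZ, BPP = 7680, 4320, 165, 30           # 10-bit color
UHBR20_PAYLOAD = 77.4                         # Gbit/s effective payload

raw = W * H * HZ * BPP / 1e9                  # ~164 Gbit/s uncompressed
print(f"uncompressed ~{raw:.0f} Gbit/s -> needs ~{raw / UHBR20_PAYLOAD:.1f}:1 compression")
print(f"DSC at ~3:1 gives ~{raw / 3:.0f} Gbit/s, which fits")
```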

Does the panel technology even exist yet, outside of paper, on the display manufacturers' side? And is it worth tooling up for production yet?

Probably not.

My guess is that the first high-refresh (single-input) 8K displays we'll see will be extremely expensive TVs, and maybe some (even more) extremely expensive and extremely niche, highly calibrated pro production monitors in the $30,000-50,000 range.

Maybe I'm wrong
 
The relationship between pixel count and refresh rate illustrates this point, as seen in the "dual mode" displays that are emerging. The pixel count is quadrupled as the refresh rate is halved. Or, the other way around, as the refresh rate is doubled, the pixel count is quartered.

You can see this in the (unreleased) Acer Predator XB323QX (5K 165Hz, 1440p 330Hz) and the Acer Predator X32 X3 (4K 240Hz, 1080p 480Hz).
I wouldn't read too much into the limitations of physical panels to infer the limitations of GPUs.

And you are right about the resources -- it's not just about bandwidth -- the GPU requirements for a 6K 120Hz display would be very restrictive. That's why it will be a few years before we see it, unless Apple does it in the Pro Display.
I'm running four 4K 60Hz displays on a single 10-year-old GPU, and it's perfectly fine for typical desktop use. A 6K 120Hz display would only be 23% more pixels per second.
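(For anyone wanting to check that 23% figure, the arithmetic, assuming the standard 3840x2160 and 6016x3384 resolutions:)

```python
# Pixel throughput: four 4K/60Hz displays vs. one 6K/120Hz display.
four_4k_60 = 4 * 3840 * 2160 * 60    # ~1.99 Gpixel/s
one_6k_120 = 6016 * 3384 * 120       # ~2.44 Gpixel/s
print(f"{one_6k_120 / four_4k_60:.2f}x  (~{(one_6k_120 / four_4k_60 - 1) * 100:.0f}% more)")
```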
 
And you are right about the resources -- it's not just about bandwidth -- the GPU requirements for a 6K 120Hz display would be very restrictive.

I'm running four 4K 60Hz displays on a single 10-year-old GPU, and it's perfectly fine for typical desktop use. A 6K 120Hz display would only be 23% more pixels per second.
Putting aside the issue that 4 displays probably aren't using just one cable and port, does having the 'resolution x refresh rate' load split between, say, 2 displays vs. 1 matter in terms of the demand it puts on the GPU? I don't know why it would, but they say 'you don't know what you don't know.'

Per Apple, an M4 Mac Mini can support:
  • One display up to a native resolution of 8K (7680 x 4320) at 60Hz or 4K (3840 x 2160) at 240Hz over Thunderbolt or HDMI

Half the resolution at 4 times the refresh rate. So...why couldn't it handle 8K at 120 Hz? Or do they just not list things that don't exist yet?

After all, 8K x 120 Hz would be equivalent to 4K at 240 Hz, yes?

Perhaps not. We saw this:
The relationship between pixel count and refresh rate illustrates this point, as seen in the "dual mode" displays that are emerging. The pixel count is quadrupled as the refresh rate is halved. Or, the other way around, as the refresh rate is doubled, the pixel count is quartered.

You can see this in the (unreleased) Acer Predator XB323QX (5K 165Hz, 1440p 330Hz) and the Acer Predator X32 X3 (4K 240Hz, 1080p 480Hz).
So...why is the pixel count quartered when the refresh rate is only doubled?

I would've thought the product (multiplier) of resolution x refresh rate = pixel work load on GPU, in addition to determining the bandwidth requirement.
 
I wouldn't read too much into the limitations of physical panels to infer the limitations of GPUs.
It’s not a physical problem. It’s an engineering problem.
I'm running four 4K 60Hz displays on a single 10-year-old GPU, and it's perfectly fine for typical desktop use. A 6K 120Hz display would only be 23% more pixels per second.
Can that GPU run a 4K 240Hz display like the LG 32GS95UE? By your logic, it should be able to, right? But the system requirements are RTX 4080 or above.
 
Putting aside the issue that 4 displays probably aren't using just one cable and port, does having the 'resolution x refresh rate' load split between, say, 2 displays vs. 1 matter in terms of the demand it puts on the GPU? I don't know why it would, but they say 'you don't know what you don't know.'
I don’t know either, but in practice it does seem to matter. I don’t think the math is as simple as one might think, even in theory, and definitely not in practice.
Per Apple, an M4 Mac Mini can support:
  • One display up to a native resolution of 8K (7680 x 4320) at 60Hz or 4K (3840 x 2160) at 240Hz over Thunderbolt or HDMI
Half the resolution at 4 times the refresh rate.
It’s not half. 8K is quadruple 4K, not double. It’s doubled both horizontally and vertically.
So...why couldn't it handle 8K at 120 Hz? […] After all, 8K x 120 Hz would be equivalent to 4K at 240 Hz, yes?
You are right that a dual-mode 8K 120Hz would support 4K at 240Hz. As you know, BOE has just such a display in pre-production.

Or maybe the answer to that last question is no? Maybe the equivalence is only with regard to bandwidth, not processing power?

PS: @deevey beat me to it!
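Putting rough numbers on the modes being compared (standard UHD and 8K resolutions assumed):

```python
# Pixels per second for the modes in question.
modes = {
    "8K @  60Hz": (7680, 4320, 60),
    "4K @ 240Hz": (3840, 2160, 240),
    "8K @ 120Hz": (7680, 4320, 120),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e9:.2f} Gpixel/s")
```

So the two modes Apple lists work out to the same pixel rate, native 8K 120Hz would be double that, and a dual-mode 8K panel running its 4K 240Hz mode stays at the lower figure.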
 
It’s not a physical problem. It’s an engineering problem.
An engineering problem isn't a physical problem? I didn't mean that the laws of physics don't allow it.
Can that GPU run a 4K 240Hz display like the LG 32GS95UE? By your logic, it should be able to, right?
I was responding to this from you: "you are right about the resources -- it's not just about bandwidth"
My GPU doesn't have the bandwidth/connections for a single 6K 120Hz display, but I was just commenting on the amount of "resources" it would take.
Rendering that many pixels is not tough for any modern GPU, even fairly mediocre ones (as long as you aren't playing a demanding game or 3D modeling app).
But the system requirements are RTX 4080 or above.
Where are you getting the RTX 4080 requirement from? It should work just fine on an RTX 3050 (for desktop use, not for fancy games).
 
An engineering problem isn't a physical problem? I didn't mean that the laws of physics don't allow it.
I guess I didn't understand what you meant by "limitations of physical panels" -- to use the 4K example(s), I thought you meant the display can't do both 4K and 480Hz at the same time because of the physical limits of the panel itself. So even if all the other engineering obstacles like bandwidth and silicon design/performance were surmounted, a fundamental "physical" problem would remain. For all I know, that could be true, but I haven't heard anyone say that with regard to these dual-mode panels that can do one or the other, but not both.
I was responding to this from you: "you are right about the resources -- it's not just about bandwidth"
My GPU doesn't have the bandwidth/connections for a single 6K 120Hz display, but I was just commenting on the amount of "resources" it would take.
Rendering that many pixels is not tough for any modern GPU, even fairly mediocre ones (as long as you aren't playing a demanding game or 3D modeling app).
I see -- my main point in that comment (however sloppily worded) was that the reason why we don't have a 4K 480Hz display today is because the system requirements to run it are too restrictive. The same *was* true for 6K 120Hz until very recently, but I still think it will be a couple of years before we see that, if ever (again, unless Apple does it) -- the gaming industry has no interest in 6K (as opposed to 5K, as seen in the Predator I mentioned, the first of its kind, and not likely to be the last). Dual-mode 8K 120Hz is more likely to come first (before 6K 120Hz), and indeed it has.
Where are you getting the RTX 4080 requirement from? It should work just fine on an RTX 3050 (for desktop use, not for fancy games).
I was refuting (in the context of systems that can run a 6K 120Hz display) your assertion that your ten-year-old card could do it. You didn't qualify that assertion. There's more to a GPU than just performance; there's also the design of the display engine, for example. You are right that RTX 30 series (late 2020) appears to be the minimum for the LG, which is precisely my point -- when cards that can drive 8K 120Hz are five years old, we'll see that. But as for taking full advantage of its specs (for whatever purpose), which is the whole point of owning such a demanding display, I (perhaps a bit too quickly, it was the first thing that popped up in my search) relied on this assessment: LG 32GS95UE review: 480Hz OLED wonderland
 
I guess I didn't understand what you meant by "limitations of physical panels" -- to use the 4K example(s), I thought you meant the display can't do both 4K and 480Hz at the same time because of the physical limits of the panel itself. So even if all the other engineering obstacles like bandwidth and silicon design/performance were surmounted, a fundamental "physical" problem would remain. For all I know, that could be true, but I haven't heard anyone say that with regard to these dual-mode panels that can do one or the other, but not both.

I see -- my main point in that comment (however sloppily worded) was that the reason why we don't have a 4K 480Hz display today is because the system requirements to run it are too restrictive. The same *was* true for 6K 120Hz until very recently, but I still think it will be a couple of years before we see that, if ever (again, unless Apple does it) -- the gaming industry has no interest in 6K (as opposed to 5K, as seen in the Predator I mentioned, the first of its kind, and not likely to be the last). Dual-mode 8K 120Hz is more likely to come first (before 6K 120Hz), and indeed it has.

I was refuting (in the context of systems that can run a 6K 120Hz display) your assertion that your ten-year-old card could do it. You didn't qualify that assertion. There's more to a GPU than just performance; there's also the design of the display engine, for example. You are right that RTX 30 series (late 2020) appears to be the minimum for the LG, which is precisely my point -- when cards that can drive 8K 120Hz are five years old, we'll see that. But as for taking full advantage of its specs (for whatever purpose), which is the whole point of owning such a demanding display, I (perhaps a bit too quickly, it was the first thing that popped up in my search) relied on this assessment: LG 32GS95UE review: 480Hz OLED wonderland
My only point was to refute this claim, which you had replied to:
So what - I don’t understand the refresh rate obsession. Do people not understand how many pixels this thing is and how many resources would literally go up in flames to drive it to 120hz or whatever for almost no gain??
Even basic GPUs have plenty of pixel rendering power to drive 6K at 120Hz. They just don't have the port bandwidth.
The cheapest Mac can run two 6K and a 5K simultaneously at 60Hz. That means its GPU is fast enough to render the pixels for a single 6K 120 Hz monitor.
It also supports "one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz", so maybe it could run 6K at 120Hz, though that is more pixels per second than either of those, so probably not.
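As rough pixel-rate arithmetic (assuming 6016x3384, 5120x2880, 7680x4320 and 3840x2160 for 6K/5K/8K/4K):

```python
# Advertised multi-display load vs. a hypothetical single 6K/120Hz stream.
def px(w, h, hz):
    return w * h * hz

loads = {
    "two 6K + one 5K @ 60Hz": 2 * px(6016, 3384, 60) + px(5120, 2880, 60),
    "one 8K @ 60Hz":          px(7680, 4320, 60),
    "one 4K @ 240Hz":         px(3840, 2160, 240),
    "one 6K @ 120Hz":         px(6016, 3384, 120),
}
for name, v in loads.items():
    print(f"{name:24s} {v / 1e9:.2f} Gpixel/s")
```

The total rendering load Apple already advertises is well above one 6K 120Hz stream, but as a single stream 6K 120Hz is indeed more than either of the single-display modes listed, which squares with the "probably not".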

But to take full advantage of its specs (for whatever purpose), the whole point of owning such a demanding display
No GPU can fully take advantage of its specs in all games. I want a monitor that I can run at full specs for desktop apps, then adjust to lower settings to optimize the performance of whatever game I'm playing.
 
Rendering that many pixels is not tough for any modern GPU, even fairly mediocre ones (as long as you aren't playing a demanding game or 3D modeling app).
I would warrant that the majority of people who would want/need such a high-resolution, high-FPS display ARE involved in demanding industries, or are gamers.

Anyhow, a Mac Studio M3 Ultra is (in theory) already capable of driving an 8K screen @ 120Hz - but such displays don't exist yet in the wild.

I'm sure that as soon as the display industry has such a display that works with a single connector we'll know about it, and when it becomes viable for Apple (or anyone else) to commercially produce such a product and upsell their existing XDR user base, they will.

Who knows, maybe we'll get a surprise and they'll bring out a 4.5K iMac display with a high refresh rate first, for "all those" productivity people who use Excel. 🤷‍♂️


-----------------------------------

Anyhow, back on topic: the ASUS display is a decent prosumer 6K display that is going to suit a lot of people at that price (including me), and it may drive down the prices of existing 5K monitors, like the S9, a little.

The biggest mistake in this press release was even suggesting that this monitor is any kind of "competitor", "alternative" or even a distant, long lost cousin to the XDR. If you are truly in the market for an XDR, an XDR is still what you need.
 
Anyhow, a Mac Studio M3 Ultra is (in theory) already capable of driving an 8K screen @ 120Hz - but such displays don't exist yet in the wild.

I'm sure that as soon as the display industry has such a display that works with a single connector we'll know about it, and when it becomes viable for Apple (or anyone else) to commercially produce such a product and upsell their existing XDR user base, they will.
It's a tangent for this thread, but I'll ask anyway...given the high workload 8K at 120Hz puts on a system, and what I think is the tendency for most displays to top out at 32" size, what are the practical benefits of 8K (as opposed to 6K) at 120Hz?

The 120Hz part I get: smoother moving elements onscreen, so it looks more like moving physical objects than choppy video.

But at a 32" display size, 6K is said to be 'retina' - at a point where, at typical user eye distance, the human eye supposedly wouldn't discern a substantial difference at 8K. I asked about this in Post #235, and that concern was addressed in an answer in Post #237 (apparently, some people discern the difference).

Which leaves me to wonder what vendors will do with 8K displays. Do you anticipate they'll mainly be 32" displays for the most demanding professional users who spend long hours dealing with fine detail and discern better than most of us?

Or will the shift more likely be to use 8K in larger displays aiming for around 220 PPI 'retina' resolution? A quick online search showed one source claiming 8K would be retina at 34.5" display size. Do you think vendors like ASUS will likely keep to 32", or go to 35"?

I've never had a 32" display, so I don't have first hand experience with the 'look and feel' on my desk.
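One way to frame it is plain pixel density; this doesn't settle the viewing-distance question, it just shows where the usual ~220 PPI 'retina desktop' figure lands (6K assumed to be 6016x3384):

```python
# Pixels per inch for 6K and 8K panels at a few diagonal sizes.
from math import hypot

def ppi(w, h, diag_inches):
    return hypot(w, h) / diag_inches

panels = {"6K (6016x3384)": (6016, 3384), "8K (7680x4320)": (7680, 4320)}
for label, (w, h) in panels.items():
    for size in (32, 34.5, 40):
        print(f'{label} at {size}": {ppi(w, h, size):.0f} PPI')
```

By that measure 8K at 32" is ~275 PPI versus ~216 PPI for 6K, and 8K only comes back down to the familiar ~220 PPI around 40"; where 'retina' actually kicks in depends on how far you sit and whose threshold you use.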
 