I am not sure what to think of adding a chip inside a display. Yes, it has benefits, but one of the things I love about my 30" ACD is that I've been able to use it for over a decade. Still an awesome monitor. What I really can't tell is whether the chip will cause the monitor to become obsolete at a much faster pace. That would suck. I really intend to keep a new Mac monitor for well over a decade.
 
How about an affordable screen like the Thunderbolt Display back in 2011, Apple?

Seriously tho, I could imagine it would sell like crazy. Give it some "docking" or "thunderbolt hub" functionality and people would buy them by the dozen.
LOL, affordable… I guess compared to today's $1,000 for a stand it is considered affordable. The Thunderbolt Display was $1,000 when better monitors cost half as much. I have one and still use it, but today's market pumps out cheap monitors that have good screens, etc.

But… I would love to have an Apple "affordable" monitor to replace my Apple Thunderbolt Display and would pay $1,000 today regardless… :)
 
I wonder if this display serves as an Apple Silicon add-on for the Intel-based Mac Pro.
That could also open up options/possibilities for Intel systems with USB-C ports…

Hmm… my fully loaded Intel i9 MacBook Pro 2018 would be happy with one of those monitors.
 
Lovely, do whatever you need to do, Apple, just launch a new, large, beautiful external display!
 
Oh cause Apple is done with AMD and Nvidia. Apple Silicon GPU will be super powerful compared to those companies
The problem is that many things that rely on a GPU for processing won't use it (at least not for years, if ever). Most software that uses GPU processing relies on CUDA (Nvidia). Many games target specific GPU features from AMD and Nvidia. Apple Silicon can be the fastest GPU ever, but it will always be limited to dedicated Mac developers, and the forums for cross-platform software will now be filled with users asking "when will you support the Apple Silicon GPU?" in addition to "when will you support OpenCL?" and "when will you support Metal?".
 
It is, but each zone consists of four LEDs, with RGBW.

They're not technically pixels in that they're much larger than the panel's pixels, but you asked why there's a factor of four. This is why.

Do you really think an M1 chip would be too slow to control 4 * 2732 * 2048 LEDs at 120 Hz? It wouldn't. They're just too costly to produce so far.

That is not the reason; there are many other mini-LED panels that do 1x2 or 2x1 binning, and in some panels single mini LEDs literally light up multiple pixels. The factor of four doesn't have anything to do with RGBW…
After all, mini-LED binning always results in more blooming, and that's the reason screen manufacturers are putting more LEDs in their panels and creating FALD algorithms to control them individually (as much as possible).

And yes, the M1 is too slow to control a full FALD LED matrix in real time when the number becomes too large. There is a reason why we invented dynamic programming algorithms to work with large matrices. If the LED matrix becomes too large, the computational time and space complexity become exponentially more demanding, especially if you do it in real time as required for mini-LED monitors and televisions.
In computer science they use freaking dual-socket 128-core CPUs just to do basic matrix calculations, and depending on the type of data and algorithm it may even take 10-20 seconds to finish one operation.

My M1 MacBook Air already struggles when doing sequence alignment using 5000x5000 matrices.
 
> And yes, the M1 is too slow to control a full FALD LED matrix in real time when the number becomes too large.

I don't know why you would think this - I know that you understand that the M1 MBP is fast enough to "control" the matrix of 6016 * 3384 (so, over 20M) pixels - or 60M individual subpixel elements - of an external XDR display at 60Hz. Those pixels also have 30 bits of resolution, which is much higher than a dimming zone would need.

The reason, of course, is that the chip has a GPU in it, which is designed for exactly this type of work. Controlling 2500, or 10000, or a million dimming zones is nothing to this hardware.
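
To put rough numbers on it (a back-of-the-envelope sketch, nothing to do with Apple's actual controller): refreshing 6016 x 3384 pixels at 60 Hz is on the order of 1.2 billion pixel updates per second, while recomputing 10,000 dimming zones at 120 Hz is about 1.2 million zone updates per second. And the per-frame zone calculation is a single linear pass over the image (a reduction), not some huge matrix-matrix product, so it scales with pixel count rather than blowing up. A toy version, assuming a simple "brightest pixel in the zone" policy:

```swift
import Foundation

// Toy FALD (full-array local dimming) pass: compute one backlight level per
// dimming zone from per-pixel luminance. Assumed policy: each zone is driven
// by the brightest pixel it covers (real controllers use smarter spatial and
// temporal filtering, but the cost is the same order: one pass over the frame).
func zoneBacklight(luma: [Float], width: Int, height: Int,
                   zonesX: Int, zonesY: Int) -> [Float] {
    var zones = [Float](repeating: 0, count: zonesX * zonesY)
    let zoneW = width / zonesX
    let zoneH = height / zonesY
    for y in 0..<height {
        let zy = min(y / zoneH, zonesY - 1)
        for x in 0..<width {
            let zx = min(x / zoneW, zonesX - 1)
            let i = zy * zonesX + zx
            zones[i] = max(zones[i], luma[y * width + x])
        }
    }
    return zones
}

// Tiny demo: an 8x4 "frame" reduced to a 4x2 zone grid.
let frame: [Float] = (0..<32).map { Float($0) / 31.0 }
print(zoneBacklight(luma: frame, width: 8, height: 4, zonesX: 4, zonesY: 2))
```

Even brute-forced on a CPU core that's ~20M max() operations per full frame; on a GPU it's a trivial reduction.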
 
The problem is that many things that rely on a GPU for processing won't use it (at least not for years, if ever). Most software that uses GPU processing relies on CUDA (Nvidia). Many games target specific GPU features from AMD and Nvidia. Apple Silicon can be the fastest GPU ever, but it will always be limited to dedicated Mac developers, and the forums for cross-platform software will now be filled with users asking "when will you support the Apple Silicon GPU?" in addition to "when will you support OpenCL?" and "when will you support Metal?".

No reason Apple's macOS can't detect the eGPU and present it as the only GPU to software… in fact, that's how eGPU support with AMD works today, and games work just fine on it. Also, because most developers wrote against Apple's threading APIs and lots have already shifted to Metal or Vulkan, most of the software will get a boost if it has GPU loads.
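
For what it's worth, Metal already lets an app see every GPU in the system and check whether one is external. A minimal sketch of what that could look like, assuming a simple "prefer the eGPU if present" policy of my own (this is not how macOS itself decides where to route work):

```swift
import Metal

// List every GPU Metal can see and prefer a removable (external) one.
// MTLCopyAllDevices() returns all GPUs on macOS; isRemovable is true for
// eGPUs attached over Thunderbolt.
func preferredDevice() -> MTLDevice? {
    let devices = MTLCopyAllDevices()
    for device in devices {
        print("\(device.name)  removable: \(device.isRemovable)  low power: \(device.isLowPower)")
    }
    // Illustrative policy only: pick the first eGPU, else fall back to the default GPU.
    return devices.first(where: { $0.isRemovable }) ?? MTLCreateSystemDefaultDevice()
}

print(preferredDevice()?.name ?? "no Metal device found")
```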
 
It could function by itself as a cheap iMac, and be a great display for a mini, maybe even with a wireless connection.
Yes - but I doubt that Apple would even consider that - maybe it could expose itself as an AirPlay monitor, that would be cool.
 