I tend to agree after contemplating the possibility of this chip in a monitor... it is probably not a GPU.

It would be pointless UNLESS you can change it out (and Apple is not interested in that).

A good guess is that this is JUST a rumor (this is, after all, a website of Mac "rumors" and fake news) meant to get everyone clicking, or at least interested in monitors, since everyone is expecting or waiting for Apple to put "something" out at a lower cost than their current offering.

Their contract with LG will probably determine "if" we get an official low-end Apple monitor.
 
  • Disagree
Reactions: speedyg256
From Apple's perspective it's a good business case. You'll need to update your monitor every 5 years as well.
 
The reason why they are going to use an A13 is to be able to control the mini LEDs more accurately.

Usually, the mini LEDs in a screen's backlight matrix are grouped and controlled in blocks. Ideally, screen manufacturers would not group them at all, in order to prevent blooming, but the more mini LEDs you add (and control individually), the more processing power is required to turn the right LED on/off depending on what's happening on the screen.

An A13 will surely help a lot with that.
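The zone-control idea above can be sketched in a few lines. This is a hypothetical illustration, not Apple's actual algorithm: one common, simple heuristic drives each backlight zone from the brightest pixel it covers. The zone counts, frame size, and function name here are made-up assumptions.

```python
# Hypothetical local-dimming sketch: set each backlight zone's drive
# level to the brightest pixel it covers. Illustrative numbers only.
import numpy as np

def zone_levels(frame, zones_x, zones_y):
    """frame: (H, W) luminance array in [0, 1]; returns per-zone drive levels."""
    h, w = frame.shape
    zh, zw = h // zones_y, w // zones_x
    # Trim so the frame tiles evenly, then take the max within each zone tile.
    tiled = frame[:zh * zones_y, :zw * zones_x].reshape(zones_y, zh, zones_x, zw)
    return tiled.max(axis=(1, 3))

frame = np.zeros((2160, 3840))          # one 4K frame of luminance values
frame[100:110, 200:210] = 1.0           # a single small bright patch
levels = zone_levels(frame, zones_x=64, zones_y=32)   # 2048 zones
print((levels > 0).sum())               # only the zone under the patch lights up
```

The coarser the zone grid, the more a small highlight forces a whole zone's backlight on, which is exactly the blooming trade-off described above.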
 
The reason why they are going to use an A13 is to be able to control the mini LEDs more accurately.

Usually, the mini LEDs in a screen's backlight matrix are grouped and controlled in blocks. Ideally, screen manufacturers would not group them at all, in order to prevent blooming, but the more mini LEDs you add (and control individually), the more processing power is required to turn the right LED on/off depending on what's happening on the screen.

An A13 will surely help a lot with that.
An A13 would be absurdly overpowered for that. Even the original iPhone chip would be.
 
The reason why they are going to use an A13 is to be able to control the mini LEDs more accurately.

Usually, the mini LEDs in a screen's backlight matrix are grouped and controlled in blocks. Ideally, screen manufacturers would not group them at all, in order to prevent blooming, but the more mini LEDs you add (and control individually), the more processing power is required to turn the right LED on/off depending on what's happening on the screen.

An A13 will surely help a lot with that.
The A13 is a CPU with GPU and Neural Engine cores, which is wayyyy overpowered to be used as a controller
 
Probably only one display input. Every non-Apple monitor has multiple inputs; some even have a KVM switch built in. For example, I have three computers on my desk that I use with two monitors: 1 x Mac, 1 x laptop (work), 1 x laptop (customer). That would not work at all with an Apple monitor.

You only get a one-year warranty with Apple for that; many people forget that when comparing prices. And with Apple you'll probably get a mirror for a monitor again (a glossy panel).

Let's see if the Apple monitor will be height-adjustable and tiltable. Probably not, or only to a limited extent.

I would be surprised if things turn out differently.
 
The A13 is a CPU with GPU and Neural Engine cores, which is wayyyy overpowered to be used as a controller

Not true at all; the more local dimming zones you add to a TV/monitor, the exponentially more demanding it gets to control them individually!

Even an A13 is not powerful enough to control anything above 3000+ dimming zones.
 
  • Like
Reactions: WP31
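Whether 3000+ zones is actually beyond an A13 is partly just arithmetic. A rough back-of-the-envelope sketch (my illustrative numbers, nothing from the rumor): in a naive pipeline, the expensive part is scanning the frame to decide each zone's level, not updating the zones themselves.

```python
# Back-of-the-envelope cost of naive per-zone local dimming.
# All numbers are illustrative assumptions, not measured figures.
width, height, fps = 3840, 2160, 60   # 4K at 60 Hz
zones = 3000

pixel_reads_per_sec = width * height * fps   # scan every pixel each frame
zone_updates_per_sec = zones * fps           # then write one level per zone

print(f"{pixel_reads_per_sec:,} pixel reads/s")    # ~498 million
print(f"{zone_updates_per_sec:,} zone updates/s")  # 180,000
```

In this naive model the per-pixel scan dominates and is independent of the zone count, so the cost grows with resolution rather than exploding with zones; real controllers add per-zone temporal filtering and blooming compensation, which is presumably where the extra demand the posts are arguing about comes from.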
An A13 would be absurdly overpowered for that. Even the original iPhone chip would be.

The M1 iPad Pro still fails to control dimming zones individually, let alone an A13. Instead they bin the LEDs into groups to make it less demanding for the iPad. As a result, the iPad Pro (M1) has horrible blooming problems, even though it has more dimming zones than modern FALD LCD televisions.
 
The M1 iPad Pro still fails to control dimming zones individually, let alone an A13. Instead they bin the LEDs into groups to make it less demanding for the iPad.

That’s a cost issue, not a performance one. LEDs small and plentiful enough would be cost-prohibitive.
 
I’ve been going back and forth over the last couple of months on buying the Apple XDR Studio Display.

I really like the display quality, yet one reason has been keeping me from it: it seems it only works well with Apple hardware as a source. For a monitor at this price, that is just way too limiting.

Apple's attitude is starting to bug me. Every other one of my suppliers seems to go the extra mile for compatibility, including with Apple products. Apple is precisely the opposite... almost arrogantly and sarcastically so.
 
  • Like
Reactions: EntropyQ3
I think the reason for adding a Neural Engine is image enhancement. Streaming high-quality video from your phone or Mac can be choppy or reduced in quality, especially over WiFi. But with an image-enhancement engine similar to Nvidia's DLSS or AMD's FSR, you could make do with slower WiFi and still get decent picture quality while streaming from devices. This also means a wireless display is on the horizon.

Something like DLSS or FSR isn't going to help on a crowded WiFi band or travel better through walls. AirPlay tops out at about 30Hz; it is going to be hard to upscale stuff that is missing from the other half of a 60Hz DisplayPort video stream.
Mainstream "long distance" WiFi 5 or plain WiFi 6 is too slow, but 6E is borderline fast enough for 24"-27" screens (non-HDR) with some upscaling help. However, if you shorten the range enough....

"...Rather than relying on a connection to a smartphone or a computer, the headset CNET described would connect to a "dedicated box" using a high-speed short-range wireless technology called 60GHz WiGig. The box would be powered by a custom 5-nanometer Apple processor that's "more powerful than anything currently available." ..."
https://www.macrumors.com/roundup/apple-glasses/

Apple has been rumored for a couple of years to be prototyping with relatively short-range, non-mainstream 60GHz WiFi. I doubt this would be something that would work with 1-4 year old Apple products, though. Bringing high-end, quality "wireless display" video to old machines without a new radio probably isn't going to work.

"... 802.11ay has a transmission rate of 20–40 Gbit/s and an extended transmission distance of 300–500 meters.[6] 802.11ay should not be confused with the similarly named 802.11ax that was released in 2019. The 802.11ay standard is designed to run at much higher frequencies. .."

17-30 Gb/s is enough to do 4K at 60Hz refresh.
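That 4K60 figure is easy to sanity-check. A quick uncompressed-bandwidth calculation, assuming 10-bit RGB (30 bits per pixel) and ignoring blanking/protocol overhead:

```python
# Uncompressed video bandwidth for 4K at 60 Hz, assuming 10-bit RGB.
width, height, fps, bits_per_pixel = 3840, 2160, 60, 30
gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gb/s")   # ~14.9 Gb/s before blanking/link overhead
```

With blanking and link overhead the real requirement lands in the high teens of Gb/s, so the 17-30 Gb/s 802.11ay range quoted above is indeed enough.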

This won't go through walls or any substantial opaque object, but if you have two units sitting on the same desktop, they are relatively very close, with not much but the systems' enclosure walls between them.


There is also the substantial problem of how to get 3-4 wireless displays all working in the same room (e.g. a work area with 3-4 computers and paired displays; is there enough bandwidth, and low enough latency, to go around at that point?).

But yeah... Apple's 'holy war' against wires. It wouldn't be surprising for Apple to roll that out as a "gee whiz, insanely great" feature for a display, with an additional $300 woven into the base price.


P.S. With more time to think about it, I suspect the A13 in these prototypes is just a "stub" solution for whatever Apple is making for the VR goggles, and that stuffing that chip into the monitor for your desk is somehow coupled to the chip being placed in the monitor for your face.
1. Apple puts this new SoC in more products (to get higher economies of scale and spread out the R&D costs).
2. Aimed at goggles, it would be a low-power consumer, so no big new thermal headaches if it had to be placed close to the panel (the thinnest monitor enclosure they can get away with).
3. If you plug an old Mac into the VR video distribution base station... it could work with those too (for lots more money; it would enable hiding more wires, but not completely eliminating them: for example, a TB cable to the base station, but more monitor placement freedom).
 
  • Like
Reactions: pankajdoharey
A13!! It will be at least 2 years old by the time it launches. But it should be powerful enough. Hope Apple does not price it sky high.

The rumor at 9to5mac actually says:

"The new display is being developed under the codename J327, but at this point, details about technical specifications are unclear. According to sources, this display will have an Apple-made SoC, which right now is the A13 Bionic chip — the same one used in the iPhone 11 lineup. ..."

An A13 "right now" doesn't necessarily mean shipping with an A13. "Being developed" means it isn't in final engineering review before mass production. The only specific from the source is that the final product will have an Apple-made SoC. That's it.

Apple could use the A13 as a "stub" to test now what a future smaller, more power-efficient 5nm or 3nm chip could do, just with higher power and space consumption. Work out the bugs with radio/picture/etc. and be ready for the better hardware later. If the testing is far off the concept, they can adjust or kill the project.

The T2 was a mildly tweaked A10, or something along those lines. If it isn't running random end-user software, that doesn't matter much for some fixed, limited, embedded processing role.
 
That’s a cost issue, not a performance one. LEDs small and plentiful enough would be cost-prohibitive.

I highly advise doing some basic reading on why the screen industry is limited in increasing the effective number of local dimming zones…

The M1 iPad Pro has a total of 2,596 dimming zones, yet it has 10,000+ mini LEDs in the panel. Ever wondered why they can’t control them individually?

From a mathematical point of view, it’s the same problem as brute-force matrix multiplication. There are more efficient methods, such as dynamic matrix algorithms, but they still have large space and time complexity, which is not feasible for screen manufacturers (or even Apple) to put into a product.

Imagine a screen shipped with a 500mm² chip and a 2 kg heatsink just to drive the mini LEDs. That would significantly increase the cost and size of the screen!
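For reference, the simple arithmetic behind the iPad figures quoted above (numbers as stated in this thread, not independently verified):

```python
# Zones vs. LEDs for the iPad Pro (M1) backlight, as quoted in the thread.
leds = 10_000    # approximate mini-LED count in the panel
zones = 2_596    # independently dimmable zones
leds_per_zone = leds / zones
print(f"~{leds_per_zone:.1f} LEDs per zone")   # LEDs ganged in small groups
```

Grouping LEDs divides the number of independent drive channels (and the control workload) by the same factor, which is the trade-off the posts above are debating.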
 
sign me up for the consumer level display if it's around the same price as the old thunderbolt models ($1,000 or less).
 
sign me up for the consumer level display if it's around the same price as the old thunderbolt models ($1,000 or less).
I think pretty much everyone commenting in this thread would like to see Apple come up with a high-quality display that doesn’t break the bank. There are an awful lot of MacBooks and Minis waiting for that one. Or rather, and more concerning for the shareholders, an awful lot of owners whose monitor $$ currently go to other companies.
Who knows, if it were sufficiently nice, it might even sell on the PC side of the fence.
 
I highly advise doing some basic reading on why the screen industry is limited in increasing the effective number of local dimming zones…

The M1 iPad Pro has a total of 2,596 dimming zones, yet it has 10,000+ mini LEDs in the panel. Ever wondered why they can’t control them individually?

Because those are RGBW subpixels. Hence four LEDs per zone.
 
I'm just curious why you feel that way. There are some truly fantastic displays from ViewSonic, NEC, and even Dell and other companies in the $800-$1500 range, and pretty good displays as low as $300. What does the Apple logo bring to the space?

Don't take this as anti-Apple; I enjoy their products. But why do you care so much about a product in a space they clearly have no interest in being in, when there are so many excellent products already available?
To be honest, it's purely aesthetic: I love Apple's design, the same way I love Sony TVs and can't stand Samsung. It's a matter of taste, nothing more. I'm sure there are many wonderful options, but I don't want them on my desk. There was actually an article here not too long ago about some very attractive display (I forget the brand), but there wasn't even a drop date for it yet...
 