Apple overtly said they were (emphasis added):

"..
With regards to the Mac Pro, we are in the process of what we call ‘completely rethinking the Mac Pro.’Phil Schiller
As part of doing a new Mac Pro — it is, by definition, a modular system — we will be doing a pro display as well. Now you won’t see any of those products this year; we’re in the process of that. We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do. ..."
https://techcrunch.com/2017/04/06/t...-john-ternus-on-the-state-of-apples-pro-macs/

Note that it is "a pro display", singular. They may not limit themselves to just one, but it is not a range. Nor do they need a range.


The display's return is in the same sentence as the Mac Pro's return; if the display weren't coming back, neither would the Mac Pro. I think it is a huge mistake, though, to assume the displays are coming back only because the Mac Pro is coming back, and just as much a mistake to assume the returning Mac Pro will use the same set of I/O ports that were available in 2010.

The Mac Pro will be brought back in the current technology context. That is a context where the MBP is the dominant "Pro" system.

".. Across all of that, as we’ve said, we’re a more mobile than desktop company; of the people who use pro apps, and define themselves as pros, our largest product used by those customers are notebooks ... Second on the list is iMacs — used by pros ....
Third on the list is Mac Pro. Now, Mac Pro is actually a small percentage of our CPUs — just a single digit percent. However, we don’t look at it that way.

The way we look at it is that there is an ecosystem here that is related. ... "
[Quote from the same article as above.]

That is all pretty clear. The Mac Pro is not the primary driver here. How all these systems work with each other is a major issue. So when Apple brings back a display, it will probably have features that enhance the MBP and iMac at least as much as the Mac Pro. Apple has lots of pro users on a MBP who could use a bigger-than-laptop display as well.






If you include all of the Macs, which have a run rate of 10-14M per year, there is more than a viable display market there. Even selling displays to just 1-2% of that market is 100K-280K displays. If Apple charges $750/display, that is about $17M in profit for Apple at the low end if the mark-up is 30%. If the "Pro" market price tag is over $1,000, that would be an even fatter profit. The idea that, with a budget of $7M (as a subset of those profits), Apple can't get a couple of displays out the door is kind of crazy.

[$750/display would be a mix of something like the Apple versions of these two:
https://www.apple.com/shop/product/HKMY2VC/A/lg-ultrafine-4k-display?fnode=8a $699
https://www.apple.com/shop/product/HKN62LL/A/lg-ultrafine-5k-display?fnode=8a $1999

average $999 ]
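Just to sanity-check that arithmetic, a quick back-of-envelope sketch (my own illustration; the run rate, attach rate, price, and markup are this thread's assumptions, not Apple figures):

```python
# Back-of-envelope display economics using the numbers above.
# Assumptions (from this post, not Apple data): 10-14M Macs/yr sold,
# 1-2% display attach rate, $750 average price, 30% markup on cost.
low = (10_000_000, 0.01)
high = (14_000_000, 0.02)
price = 750.0
markup = 0.30                                   # price = cost * (1 + markup)

profit_per_unit = price - price / (1 + markup)  # ~$173 per display
for units, rate in (low, high):
    displays = units * rate
    print(f"{displays:>9,.0f} displays/yr -> ~${displays * profit_per_unit / 1e6:.0f}M profit")
# ->   100,000 displays/yr -> ~$17M profit
# ->   280,000 displays/yr -> ~$48M profit
```

So the $17M figure holds at the bottom of the range, and the upside is roughly triple that before even touching a $1,000+ "Pro" price tag.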




Failed to be an industry standard? It is a standard alt-mode of USB Type-C. Is USB not a standard?

The long-time rant in this Mac Pro forum is that USB was going to "kill" Thunderbolt. It didn't. Thunderbolt merged in and is only increasing in adoption at this point. One of the major points of resistance from the system vendors was that Intel was the only implementer. That won't be true in the next year or so.

Apple needed to allow 3rd parties to build a diverse set of peripherals. The major issue has been getting more system vendors to put the port on more systems. That is basically happening now in the $1,200+ laptop market (and at a lower rate in the next price tier down). The only place where USB has an almost absolute exclusive is the Chromebook price zone (and that will just take a couple more years).




Apple doesn't have to build every possible monitor. They just need to make 1 or 2 to "return" to the monitor business. The vast majority of Mac Pros sold probably won't use one. That doesn't make a difference to viability if Apple sells just a minor sliver to the rest of the Mac market.

Apple doesn't need to come back with a super-duper 8K HDR monitor or any other super-low-run-rate one. A single 4K and 5K would do just fine. (And iMac volume would bolster viability, since the iMac would be the real volume driver for the panel component.)

I can't see Apple doing anything for the monitors besides breaking out the 5K (and possibly 4K) panels into computer-less chassis. The displays are very good, if not perfect for people who need precisely calibrated monitors, and it'd be a lot cheaper to repurpose them.
 
Could the Pro display be a VR/AR headset? What else new could Apple offer the industry in the form of an external display? A display with a built-in GPU, or with a slot for a GPU? Unseen exceptional picture quality? A built-in Apple TV? Wireless charging on the foot?

Apple usually tends to walk ahead instead of taking steps back.
 
Hmm, the day after this leak from Apple, Intel actually releases CPU packages that Apple could use in new Macs.
[image: accidents.jpg]
 
So, as for this ARM news (nothing new there; it has been talked over and over again, even in this discussion thread...), to avoid Boot Camp problems and x86 software-emulation sluggishness, a solution could be what I wrote about two years ago:

https://forums.macrumors.com/threads/2016-nmp.1952250/page-75#post-22830123

"...Apple and AMD are working on ARM/Zen custom chip, which would be ARM native, but could run x86 compatible software on a window that is actually run on a Zen core, but user wouldn't notice the difference. It would just work. Two OS's' running on the same computer. - -. That would be a transition period solution, until all software is x86 independent. After that period Apple could charge extra $$ from those needing x86 compatibility.

It has been speculated that macOS could bring UIKit to the desktop. One purpose would be to get rid of those dozens of browser "apps", I mean tabs - like Facebook - and make them work like apps. - - These kinds of apps are pretty much processor-independent.
" - Zarniwoop 04.2016

So, what Apple could offer is two price categories:

- Macs with ARM only: macOS, software x86 emulation, no Boot Camp. Cheaper.
- Macs with both ARM and x86 cores: no x86 emulation needed, and Boot Camp is now an app that runs native Windows. No need to shut down macOS when running another OS. Expensive; the Pro solution.
 
"...Apple and AMD are working on ARM/Zen custom chip, which would be ARM native, but could run x86 compatible software on a window that is actually run on a Zen core, but user wouldn't notice the difference. It would just work. Two OS's' running on the same computer. - -. That would be a transition period solution, until all software is x86 independent. After that period Apple could charge extra $$ from those needing x86 compatibility.

It has been speculated that macOS could bring UIKit to the desktop. One purpose would be to get rid of those dozens of browser "apps", I mean tabs - like Facebook - and make them work like apps. - - These kinds of apps are pretty much processor-independent.
" - Zarniwoop 04.2016

So, what Apple could offer is two price categories:

- Macs with ARM only: macOS, software x86 emulation, no Boot Camp. Cheaper.
- Macs with both ARM and x86 cores: no x86 emulation needed, and Boot Camp is now an app that runs native Windows. No need to shut down macOS when running another OS. Expensive; the Pro solution.

It would be way easier, and far cheaper, to compile apps for both Intel and ARM instead of coming up with custom hardware to run both types of apps.
 
It would be way easier, and far cheaper, to compile apps for both Intel and ARM instead of coming up with custom hardware to run both types of apps.
But there are so many people using Windows on their Macs in one way or another. Going 100% ARM and ditching Boot Camp would be quite risky.

So those needing both worlds have to pay more, but less than the cost of owning two computers.
 
But there are so many people using Windows on their Macs in one way or another. Going 100% ARM and ditching Boot Camp would be quite risky.

So those needing both worlds have to pay more, but less than the cost of owning two computers.

I still don’t get it. They could have Intel Macs that run the same apps without an ARM processor if the apps are compiled for both. You can solve this with universal binaries; you don’t need to bolt on a second chip. Just compile every app for both ARM and x86.

Are you assuming ARM would be faster? That’s a pretty big assumption right now. And if ARM were faster, I’d be surprised if they threw on x86 cores just so people can run Windows.

Windows solved this problem with Universal Binaries; no dual-type chips necessary.
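For what it's worth, a macOS universal binary is literally one file carrying a code slice per CPU type, and the loader picks the matching slice at launch. A minimal sketch (my own illustration; the fat-header layout is the documented Mach-O format) that lists the slices in a binary:

```python
# List the architecture slices inside a Mach-O universal ("fat") binary.
import struct

FAT_MAGIC = 0xCAFEBABE                      # big-endian fat-binary magic
CPU_NAMES = {0x00000007: "i386", 0x01000007: "x86_64",
             0x0000000C: "arm",  0x0100000C: "arm64"}

def slices(path):
    with open(path, "rb") as f:
        magic, count = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            return ["thin binary (single architecture)"]
        # Each fat_arch entry: cputype, cpusubtype, offset, size, align.
        return [CPU_NAMES.get(struct.unpack(">5I", f.read(20))[0], "unknown")
                for _ in range(count)]

print(slices("/usr/bin/true"))              # e.g. ['i386', 'x86_64'] on a 2018 Mac
```

Compile once per architecture (clang's -arch flag) and glue the results together with lipo -create; the same file then runs natively on either CPU, no second processor die required.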
 
Windows does ARM.
Raise the BS flag.

Windows 10 runs a subset of the OS on ARM - but most people buy a computer to run other apps.

For the most part, those are x64 apps that won't run on an ARM processor. If there's an ARM layer to emulate x64, it will most likely be unacceptably slow.
 
Are you assuming ARM would be faster? That’s a pretty big assumption right now. And if ARM were faster, I’d be surprised if they threw on x86 cores just so people can run Windows.

I'm assuming that ARM is too slow for a Pro desktop. It would be extremely slow to emulate x86 on it. So pairing an Apple CPU with an 8-core Zen+ could solve some of the problems.

Anyhoo, the main problem is profit: Apple wants more, and Intel is a less desirable companion in that regard. Apple wants Intel's cut. MacBooks and similar Office + Internet-browser use cases are well covered by ARM. A low-cost iMac, maybe so. But for anything else, there has to be another solution. If it is dual binaries, or emulation with Boot Camp sacked, the Pro campers say adieu. I myself would need to buy a Windows laptop. Parallels could become notably slower, and more difficult to maintain = more expensive.

The past transition from PowerPC to Intel was easy, because Intel had more powerful chips. Now it is the opposite direction, and emulation just won't cut it.

AMD would be a great helper if Apple wanted a custom chip with both ARM and x86. And AMD has a business model where they create a custom chip for a customer, who can then replicate it as much as they want. With HSA, those CPUs could even share the same GPU. And both ARM and AMD are quite comfortable with the concept.
 
I'm assuming that ARM is too slow for a Pro desktop. It would be extremely slow to emulate x86 on it. So pairing an Apple CPU with an 8-core Zen+ could solve some of the problems.

Anyhoo, the main problem is profit: Apple wants more, and Intel is a less desirable companion in that regard. Apple wants Intel's cut. MacBooks and similar Office + Internet-browser use cases are well covered by ARM. A low-cost iMac, maybe so. But for anything else, there has to be another solution. If it is dual binaries, or emulation with Boot Camp sacked, the Pro campers say adieu. I myself would need to buy a Windows laptop. Parallels could become notably slower, and more difficult to maintain = more expensive.

The past transition from PowerPC to Intel was easy, because Intel had more powerful chips. Now it is the opposite direction, and emulation just won't cut it.

AMD would be a great helper if Apple wanted a custom chip with both ARM and x86. And AMD has a business model where they create a custom chip for a customer, who can then replicate it as much as they want. With HSA, those CPUs could even share the same GPU. And both ARM and AMD are quite comfortable with the concept.

Oh, I see. At this point I’m not sure it’s assured that pro desktops would even move to ARM; and if they did, it might be far enough out that everything has already moved over.

The rumor was pretty open-ended on whether everything would move over, or how long it would take. If Apple can’t get ARM fast enough for workstations, or without a clear advantage, they might just leave them be.

If you have to do an x86 co-processor, you might as well not do ARM on the pro machines at all. Apple would just leave it alone.
 
Could the Pro display be a VR/AR headset? What else new could Apple offer the industry in the form of an external display? A display with a built-in GPU, or with a slot for a GPU? Unseen exceptional picture quality? A built-in Apple TV? Wireless charging on the foot?

It still goes back to the problem that they’re asking people to accept a GPU for driving a VR headset connected by Thunderbolt, when their competition is doing VR workstations with retail GPUs on the motherboard.

AMD (assuming it would be AMD-only) is already second-rate for driving 3D game engines at high resolution; add the TB chokepoint on top of that?

I’m still betting on a 32” 8K display - the infrastructure is already in place to make the panels, Dell has a product that’s proven the panel, and process evolution can bring the wide colour gamut to it - it’s virtually a turnkey option.
 
It still goes back to the problem that they’re asking people to accept a GPU for driving a VR headset connected by Thunderbolt, when their competition is doing VR workstations with retail GPUs on the motherboard.

AMD (assuming it would be AMD-only) is already second-rate for driving 3D game engines at high resolution; add the TB chokepoint on top of that?

I’m still betting on a 32” 8K display - the infrastructure is already in place to make the panels, Dell has a product that’s proven the panel, and process evolution can bring the wide colour gamut to it - it’s virtually a turnkey option.

From all the real world stuff I’ve seen, Thunderbolt 3 is not a significant choke point.

But people also don’t want giant chains of boxes on the table. And not being able to use a Thunderbolt Display with an eGPU can be frustrating. 5K UltraFine and eGPU is not happening.
 
From all the real world stuff I’ve seen, Thunderbolt 3 is not a significant choke point.

*yet*

Is Apple really in a position to wage yet another “megahertz myth” campaign, trying to explain to people why their VR solution is just as good despite having only a quarter of the bandwidth available, when their own WWDC videos have their engineers saying Thunderbolt is a second-rate solution?


5K UltraFine and eGPU is not happening.
That’s a kicker I hadn’t thought of - the GPU in the eGPU box may be able to drive multiple big displays, but the system can’t feed it enough data for them over the TB cable.
 
*yet*

Is Apple really in a position to wage yet another “megahertz myth” campaign, trying to explain to people why their VR solution is just as good despite having only a quarter of the bandwidth available, when their own WWDC videos have their engineers saying Thunderbolt is a second-rate solution?

Did Apple engineers say that? I know they've said that Thunderbolt isn't good for powering the internal display but nothing about VR. I know the 580 isn't ideal compared to Vega. Could that be what they meant?

That’s a kicker I hadn’t thought of - the GPU in the eGPU box may be able to drive multiple big displays, but the system can’t feed it enough data for them over the TB cable.

Yep. Another reason I don't think the Apple Pro Display will be Thunderbolt. Unless Intel and AMD do something with Thunderbolt on the GPU front.

It seems unlikely Apple will ask mid-level pros to get a MacBook Pro and an eGPU, and then, oooops, our display doesn't work with that. Not when Apple could just do DisplayPort and make everyone happy.

Right now, since Apple doesn't make a display, they can disavow the UltraFine 5K and 4K whenever they want.
 
Did Apple engineers say that?

If you go back to the WWDC 2017 vids - it's the one that covers external graphics, or possibly the dedicated VR one - the presenter literally says that an eGPU isn't as good as a motherboard slot.

It seems unlikely Apple will ask mid-level pros to get a MacBook Pro and an eGPU, and then, oooops, our display doesn't work with that. Not when Apple could just do DisplayPort and make everyone happy.

Yeah, I imagine they could spin it as displays with docks in them being a good consumer option: "but pros are telling us all that stuff junking up their monitors just gets in the way, and they'd rather have a dedicated I/O dock and let their monitor be just a monitor. So here's Jony Ive to tell us about the new Pro Display..."

"We challenged ourselves to answer the question, 'what IS a display?'..."
 
Well, yes it is, by definition. A motherboard slot is better, and therefore first-rate; anything less good than that is second-rate.
Disagree (and so does the Oxford dictionary, apparently; I don't know where you made up your definition from): second-rate and first-rate are polar opposites by definition. An eGPU being 5-10% slower than a motherboard slot does not make eGPUs "substandard, below standard, below par, bad, deficient, defective, faulty, imperfect, inferior, mediocre, poor, appalling, abysmal, atrocious, awful, terrible, dismal, dreadful, unsatisfactory, low-grade, third-rate, jerry-built, shoddy, crude, tinny, trashy, rubbishy, miserable, wretched, lamentable, deplorable, pitiful, inadequate, insufficient, unacceptable, execrable, frightful". The adjective form is even clearer: https://en.oxforddictionaries.com/definition/us/second-rate
 
"substandard, etc."

All sounds like a perfectly apt description for eGPU as compared to a motherboard slot to me.

But hey, whatever helps you sleep at night; knock yourself out with all the Stockholm syndrome you like - it's still a second-rate solution, and still only offers an x4 connection to the card...

...for no benefit in a non-mobile computer.

It's a more expensive way to get lower performance, end of story.
 
If you go back to the WWDC 2017 vids - it's the one that covers external graphics, or possibly the dedicated VR one - the presenter literally says that an eGPU isn't as good as a motherboard slot.

That is more what you wanted to hear; that isn't what was said.

There is a transcript on the video's page: "VR with Metal 2".

One of the early mentions of eGPUs in the talk is
"... . And finally, by providing that foundational support for external GPUs, so that developers have a broader range of Mac hardware.Of VR capable Mac hardware to work on. ..."

Later on
"...
So, Thunderbolt 3 offers twice the theoretical bandwidth of Thunderbolt 2, which is great.

But you have to keep in mind that this is still a quarter the bandwidth of the PCI bus available to the internal GPUs in our platforms. So, this is important. ..."

This is what you are hand-waving away as "not as good as". It is just 1/4 the bandwidth. For many Mac systems, the external GPU is also "bigger than" the GPU internal to the Mac. Hence the next sentence:

"You have a choice, now, between using the internal GPU with a high bandwidth link, or a high performance external GPU with a link at about a quarter the bandwidth. ..."

The presenter goes on to explain how the developer can evaluate which choice is better in the context the program is running in.

In a new Mac Pro, that is quite likely usually going to be the internal (possibly pragmatically embedded) GPU. For a MacBook, the bandwidth tradeoff has a high likelihood of being worth it. The software has to decide which of the two things to do (because it is not always the same answer).
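To put the talk's "quarter" into rough numbers (my figures; PCIe 3.0 carries about 0.985 GB/s per lane after 128b/130b encoding):

```python
# Internal GPU slot vs. a GPU tunneled over Thunderbolt 3.
GB_PER_LANE = 0.985                  # PCIe 3.0: 8 GT/s, 128b/130b encoding
internal_x16 = 16 * GB_PER_LANE      # ~15.8 GB/s to an internal GPU
tb3_x4 = 4 * GB_PER_LANE             # TB3 tunnels a x4 PCIe link: ~3.9 GB/s
print(tb3_x4 / internal_x16)         # 0.25 -> the "quarter" from the session
```

Whether that quarter matters depends on how much data the app streams across the bus per frame, which is exactly the evaluation the presenter walks through.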

Later on the following is presented

"... The best advice that we can give you is to render on the same GPU that's driving the display your app is on. I call this the golden rule of GPU selection. So, let's extend this and build a decision tree. ..."

So the notion that TB is a limit on pumping the display output back to be rerouted doesn't really line up.
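That "golden rule" boils down to a tiny decision, something like this toy sketch (the names and structure here are mine, not Apple's API):

```python
from dataclasses import dataclass, field

@dataclass
class Gpu:
    name: str
    perf: float                          # relative compute score
    displays: list = field(default_factory=list)

def pick_gpu(gpus, target_display):
    # Golden rule: render on the GPU already driving the target display,
    # so finished frames never have to cross the Thunderbolt link.
    for g in gpus:
        if target_display in g.displays:
            return g
    # Offscreen/headless work: results come back once, so raw speed wins.
    return max(gpus, key=lambda g: g.perf)

internal = Gpu("internal", 1.0, ["built-in"])
egpu = Gpu("eGPU", 2.5, ["external-5K"])
print(pick_gpu([internal, egpu], "external-5K").name)   # -> eGPU
```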

As far as pre-caching data into the GPU VRAM/cache goes, that is a relatively common issue in gaming already. There would be a longer pause before starting on an eGPU, but once cached up for the local scene this wouldn't be a problem except in extreme cases.


Yeah, I imagine they could spin it as displays with docks in them being a good consumer option: "but pros are telling us all that stuff junking up their monitors just gets in the way, and they'd rather have a dedicated I/O dock and let their monitor be just a monitor. So here's Jony Ive to tell us about the new Pro Display..."

"We challenged ourselves to answer the question, 'what IS a display?'..."

Chuckle. The "pro" series displays from Dell (UltraSharp), HP (DreamColor or UHD Z), NEC (PA series), and Eizo (ColorEdge) all have USB hubs in them. So the notion that most pros want monitors sitting on their desks with solely video capability is at serious odds with the reality of what the leading vendors in that market are selling. Nobody asked for it, but they are all doing it.

For example, Eizo has a relatively new DCI 4K HDR monitor: https://www.eizo.com/products/coloredge/cg3145/
Yes... it has a 3-port USB hub in it.

Yes, cranking the screen size up to 5K HDR or 8K (with data-hog HDR as a cherry on top) will push the solution beyond a single TBv3 cable without compression. The market-analysis problem there: how is that the market norm?

The reality here, though, is that Apple hasn't introduced a new monitor without power-providing features since 2004. That is 14 years ago. Apple started down the power-providing monitor track in 2008, 10 years ago. In those ten years the Mac market has grown considerably, and total revenues and profits are way up. Apple is extremely unlikely to reverse direction and optimize its product lineup for the mix of Macs it sold in 2003-2004. Reality in 2018 is different. The Mac Pro is going to be an even smaller portion of the Mac line than it was in 2008-2010 (2010 is when the last "display only" 30" product was discontinued). The iMac Pro is going to skim off even more.

Apple is probably going to continue down the different track they have been on: a "Pro" monitor with a singular input, probably almost no buttons (configuration via a software control panel), and yes, Thunderbolt v3.

For folks who want something different, a new Mac Pro with 4 TBv3 sockets, 2 HDMI and/or mDP sockets, and an open x16 slot could be hooked up to almost every "pro" monitor out there through some combination of configuration options.

Even 8K:
- two TBv3 ports with Type-C to DisplayPort cables (or the mDP sockets, if present).
- one TBv3 cable to a desktop port dock for easy front-facing I/O ports (not behind the monitor either).

Three cables up to the desk.

If the internal AMD GPU doesn't pass muster for the Nvidia-focused, then a card in the 2nd GPU slot and nominal mainstream hookups via the card's mDP and the USB socket(s) off the box. Three cables up to the desk.

For 4K HDR, Thunderbolt could do one cable up to the desk with full USB controller bandwidth off the ports on the monitor (and not a sub-8 Gb/s USB hub).
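The cable math behind that split (my arithmetic; raw pixel rate only, ignoring blanking overhead):

```python
# Uncompressed video payload vs. what one link can carry.
def gbps(w, h, hz, bits_per_pixel=30):          # 30 = 10-bit RGB
    return w * h * hz * bits_per_pixel / 1e9

dp14 = 4 * 8.1 * 8 / 10        # DisplayPort 1.4 HBR3 x4 payload: ~25.9 Gbps
tb3 = 40                       # whole TB3 link, shared with PCIe data
print(gbps(3840, 2160, 60))    # 4K60:  ~14.9 Gbps -> one TB3 cable, with room
print(gbps(5120, 2880, 60))    # 5K60:  ~26.5 Gbps -> eats most of the link
print(gbps(7680, 4320, 60))    # 8K60:  ~59.7 Gbps -> two DP 1.4 cables or DSC
```

Which is why Dell's 8K panel takes two DisplayPort cables, and why a single-cable 8K solution would need compression or something beyond TBv3.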
....
Yep. Another reason I don't think the Apple Pro Display will be Thunderbolt. Unless Intel and AMD do something with Thunderbolt on the GPU front.

It seems unlikely Apple will ask mid-level pros to get a MacBook Pro and an eGPU, and then, oooops, our display doesn't work with that. Not when Apple could just do DisplayPort and make everyone happy.

How many Mac users across the whole market are going to buy eGPUs? You folks seem to have gotten lost in the weeds where 10-15% of new Mac users run off and buy eGPUs. How likely is that?

So let's put this into a more realistic context. Let's say, somewhat generously, that 5-6% of Mac users buy an eGPU. That still leaves 94% of the Mac market to sell a TB display into. If Apple sells a TB display into about 2% of that remaining base, it would still likely be profitable.
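Same back-of-envelope as before, with those assumptions made explicit:

```python
# Addressable TB-display market after a generous eGPU carve-out
# (this post's assumptions, not real attach-rate data).
macs_per_year = 12_000_000    # midpoint of the 10-14M run rate cited earlier
egpu_rate = 0.06              # generous 6% of users buy an eGPU
display_rate = 0.02           # 2% of the remainder buy the TB display
print((1 - egpu_rate) * macs_per_year * display_rate)   # -> 225600.0 displays/yr
```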

Leaving out the eGPU and the 2nd GPU (if they provide an open secondary x16 PCIe slot) isn't going to dramatically change the picture of the target market Apple can sell into. Most Mac Pro users are probably going to buy 3rd-party monitors anyway. If they solely got the Mac Pro users who don't buy third-party monitors, that would likely be even more of a market-viability issue than the custom GPUs.

Is Apple really going to jump into a market where they are trying to sell Apple monitors to HP, Dell, and Lenovo workstation owners? How likely is that? Or is Apple primarily going to target just Mac users (basically ignoring selling accessories to the general workstation market)? Is BTO or a random walk-in off the street going to be the primary sales driver?


Right now, since Apple doesn't make a display, they can disavow the UltraFine 5K and 4K whenever they want.

The 4K UltraFine option pretty much sucks when it comes to the USB ports on the monitor. They are simply USB 2.0.

As far as disavowing goes, that is hardly credible. Those two solutions have only one video input. In the rest of LG's lineup, at a roughly similar price range (> $540), can you find another model they sell that has one and only one video input?
For whatever reason, it appears that Apple abandoned the development of these monitors and made some deal with LG to finish pushing them out to market. (Apple industrial design clogged up with other, higher-priority work? Some bean counter went Scrooge McDuck and steve'd them? Something along those lines.)

The initial quality issues, I think, Apple would avoid. I suspect they have learned the lesson that nobody but Apple wants to build something quite like that, and so they should probably design it themselves. Very likely it will have a Rip Van Winkle product cycle, disappearing for 3-4 years at a time. (The 30" monitor took 6 years to replace/retire.)
Could the Pro display be a VR/AR headset?

Extremely unlikely. As much as folks want to hype VR into some vast, widely deployed market... it isn't. Neither one of those makes sense as the GUI interaction with a computer.


What else new could Apple offer the industry in the form of an external display?

Apple's track record with discrete displays has not been revolutionary at all. They have adopted changes as they appeared on the market (LCD panels, HiDPI, 5K), but there is little to show where Apple was out there all by their lonesome.

A display with a built-in GPU, or with a slot for a GPU?

The problem is that GPUs evolve rapidly. Monitors generally don't (there is a bit of higher-DPI and color-gamut expansion going on now, but that is probably going to wane soon). So coupling a "fast mover" to a "slow mover" makes what kind of sense?

A slot for a GPU means putting fans and a much bigger power supply, with more moving parts, into a monitor. One reason monitors generally last a long time is that they avoid moving parts that can fail.

You don't really need a slot. Sonnet introduced a "puck" eGPU: https://www.sonnettech.com/product/egfx-breakaway-puck.html . It is VESA-mount compliant, so you could just bolt it to the back of a monitor. As for folks who want something bigger and bulkier... that "slot" would basically become the majority of the monitor case. It would be more a GPU with a monitor attached than vice versa.

So no... Apple doing either one of those is highly dubious. Apple is extremely likely to let 3rd parties cover the eGPU product space entirely. (They have a pretty steady track record when it comes to broad coverage of Thunderbolt peripherals.)


Unseen exceptional picture quality?

The only thing here is that it wouldn't be surprising to see Apple move their 120Hz iPad Pro work up to the display. It isn't 120Hz for the sake of gaming spec-chasing; it would be more about smoothing out the screen for normal app display (not necessarily 3D stuff).

Apple has some catching up to do on HDR-compliant monitors... but that is not particularly in the "unseen before" category.


A built-in Apple TV? Wireless charging on the foot?

Nope and nope. Why would "Pros" be asking for that? Oddball just for oddball's sake is unlikely.
 