
So far, results for the Microsoft chip aren't particularly encouraging even compared to the older and slower M1 chips, but Microsoft does nothing if not iterate.
Microsoft is in an exclusive chip deal with Qualcomm, so until that agreement ends, is it Qualcomm's failure rather than Microsoft's? There will probably be an explosion of high-end ARM competitors within the next few years. This is what Nvidia's failed purchase of ARM was about. Interesting times.
 
Back when Apple introduced the last Intel-based Mac Pro, they said that those would stay on Intel for Pro users for quite a while. It's not unreasonable for users to believe that meant more than one iteration. It's also not unreasonable for Pro users to expect an affordable option that didn't cost over 10x the price of the average Mac Pro prior to the last one.
Apple introduced that machine in 2019. This will be four years on that design, and if they support it for two to three years after they sell the last one, that would be 6-7 years at a minimum. I think that qualifies as “quite a while”.
 
Back when Apple introduced the last Intel-based Mac Pro, they said that those would stay on Intel for Pro users for quite a while. It's not unreasonable for users to believe that meant more than one iteration. It's also not unreasonable for Pro users to expect an affordable option that didn't cost over 10x the price of the average Mac Pro prior to the last one.
Huh? Hardly 10x. More like 2x. What are you talking about?!
 
In what way is this unfavorable for Pro 3D users preferring to use MacOS? Do you work in any of those areas for a living?
Yes, I work in/with Visual Effects and Post Production. Those who want to use macOS would not want to do it on some third-party system (especially one with a support track record as poor as Sony’s) with mediocre drivers from nVidia. The people who would want the system you describe would want either Windows or Linux.

We will have to see where Apple is in a generation or two for its GPUs, but Unified Memory is great for having lots of stuff in GPU RAM. If we get Metal support for some of the 3D software, things will be great. If not, they would just be also-ran systems anyway.
 
and so you’re willing to pay Apple’s premium, but Apple is charging you that premium just for the brand. Not for the design, performance, device integration, etc you were looking for and found, they just charge extra for the brand.

I do not follow. "Premium just for the brand" means to me that I am paying extra for that Apple logo, nothing else. I am paying extra for the things that I listed because they are valuable. These things cost a lot more to provide than the Apple logo on a product.
 
Perhaps, but my PCIe bus graphics are still spanking the hell out of Apple Silicon graphics overall.
Yes, BUT in tasks that were designed for PCIe bus graphics. A large part of “overall” has been designed around the GPU having its own pool of memory, such that certain use cases, which are actually desirable for GPUs, aren't even evaluated because they'd be impossible within that architecture.
 
So ... wasn't the chip in the Microsoft Surface Pro X a performance ARM chip created by Qualcomm? It wasn't a runaway success, but it performed decently.

Now onto the Surface Pro 10: with Microsoft's behind-the-scenes tweaks, it's slowly getting to a decent Core i3 performance level. We'll see in another year, but I'm thinking it'll take another four years.

Does Intel have ANY decent roadmap to compete with TSMC's N5 or N3 process nodes?
The Surface actually has an ARM version and a Core i7 version. The Core i7 version has decent battery life (estimated at 18 hours by Microsoft, but around 10 hours in practice), while being much more useful than an iPad and having eGPU support.
 
So ... wasn't the chip in the Microsoft Surface Pro X a performance ARM chip created by Qualcomm? It wasn't a runaway success, but it performed decently.

Now onto the Surface Pro 10: with Microsoft's behind-the-scenes tweaks, it's slowly getting to a decent Core i3 performance level. We'll see in another year, but I'm thinking it'll take another four years.

Does Intel have ANY decent roadmap to compete with TSMC's N5 or N3 process nodes?

Supposedly Intel 4 is already ready; Intel 3 comes in the second half of 2023, and then the new high-NA lithography stuff happens with 2 nm and below in the second half of 2025. That's all theoretical at this point.
 
In what way is this unfavorable for Pro 3D users preferring to use MacOS? Do you work in any of those areas for a living?
You've imagined a business arrangement that makes no sense at all for any of the companies mentioned. Doesn't really matter if it would result in products that you think your industry would find useful, the collaboration would be a total disaster.
 
You've imagined a business arrangement that makes no sense at all for any of the companies mentioned. Doesn't really matter if it would result in products that you think your industry would find useful, the collaboration would be a total disaster.
So basically this is your UNqualified opinion. It's fine for people to have opinions, but not knowing anything about the industry, the players, or the technology, and then lobbing hyperbolic criticism-bomb terms, makes you look closed-minded and kind of desperate to keep things exactly as they are.
 
Yes, I work in/with Visual Effects and Post Production. Those who want to use macOS would not want to do it on some third-party system (especially one with a support track record as poor as Sony’s) with mediocre drivers from nVidia. The people who would want the system you describe would want either Windows or Linux.

We will have to see where Apple is in a generation or two for its GPUs, but Unified Memory is great for having lots of stuff in GPU RAM. If we get Metal support for some of the 3D software, things will be great. If not, they would just be also-ran systems anyway.
No disrespect, but it sounds like you're primarily in the area most helped by Apple's advancements, so whether Apple upgrades its 3D tech capabilities or not, you're not really going to be greatly affected.

While I understand some reticence regarding Sony's past PC efforts, assuming it would be exactly the same the second time around, with macOS and Apple as a partner, is pure speculation. Aside from that, it wouldn't alter Apple's product strategy/offerings at all, and you wouldn't even have to buy any of the products that catered more to 3D.

There's decent Metal support for some things in a lot of 3D apps now, and you may be right in terms of Apple being able to close the gap, but there are a lot of gaps for 3D users that you don't know about until you encounter them, and sometimes they're deal breakers. Having the option of being supported by industry-standard GPUs with industry-standard software would eliminate those, and that makes a HUGE difference.
 
I do not follow. "Premium just for the brand" means to me that I am paying extra for that Apple logo, nothing else. I am paying extra for the things that I listed because they are valuable. These things cost a lot more to provide than the Apple logo on a product.

You’re not getting the point. One thing is why Apple charges a premium, and another is why you’re willing to buy Apple’s products and still pay it. Apple charges a 30% premium just for the logo. If you find other things you like, such as those you listed, and still want to buy the product because of those things, great for you, but the premium isn’t being charged for those other things; those other things are included in the remainder of the price, the other 70%.

And if Apple charges a premium for the logo, that’s not because of what it costs to put the logo on the products; it’s because they want to make their brand something of a status symbol.
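To make the arithmetic of that claim concrete, here is a minimal sketch in Python. The $1,999 sticker price is a purely illustrative assumption (not a figure from Apple or from anyone's post), as is treating "brand premium" as a flat 30% slice of the price:

```python
# Hypothetical breakdown of a sticker price into a "brand premium" share and
# the remainder that would cover everything else (design, materials, software, ...).
def premium_breakdown(price: float, premium_rate: float = 0.30):
    premium = price * premium_rate
    remainder = price - premium  # the other 70% under a 30% premium
    return premium, remainder

premium, remainder = premium_breakdown(1999.0)
print(f"premium: ${premium:.2f}, remainder: ${remainder:.2f}")
# → premium: $599.70, remainder: $1399.30
```

Whether the 30% figure is real is exactly what's being disputed here; the sketch only shows that a 30% premium leaves a 70% remainder, not 60%.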
 
So basically this is your UNqualified opinion. It's fine for people to have opinions, but not knowing anything about the industry, the players, or the technology, and then lobbing hyperbolic criticism-bomb terms, makes you look closed-minded and kind of desperate to keep things exactly as they are.

I'm happy to let our respective comments stand on their own merits without trying to shore them up with some childish resume comparison. I've not mentioned my industry qualifications and you don't know them, much less my motivations which you clearly know even less about. I think your psychoanalysis skills are even more lacking than your business acumen.

None of the things you've predicted make any sense at all, namely:

Can you imagine Sony with a reborn Vaio brand selling EPYC and Threadripper based workstations, Sony Vaio EPYC Server racks with Nvidia drivers running MacOS?

When has Sony ever sold workstations or servers? What makes you think they'd be good at selling them now? Does Sony have a viable professional services and enterprise sales and support channel for workstations and servers?

You're also suggesting with a straight face that there are no political barriers to an Apple+Nvidia partnership. Surely you're aware of the currently estranged relationship between Apple and Nvidia. Nvidia macOS drivers are not a realistic expectation at all. Please reference the hundreds of MacRumors articles and threads over the past decade or two which document the decline and fall of the Apple+Nvidia relationship.

I feel confident Sony would be willing to buy back the Vaio company/brand if Apple suggested this kind of plan...

Sony's Vaio mark has never been used for enterprise products like servers or workstations. It's a luxury consumer electronics brand. I have no idea why you think it would be an attractive choice that Sony would even want to use for some theoretical EPYC server, even ignoring the fact that Sony spun it off in 2014 and it's even less appropriate now. Yet you seem to think that Sony would spend the money to re-absorb Vaio just so they could use it to brand a product unlike any Vaio product that's ever been made or sold under that mark. It's patently absurd.
 
You're also suggesting with a straight face that there are no political barriers to an Apple+Nvidia partnership. Surely you're aware of the currently estranged relationship between Apple and Nvidia. Nvidia macOS drivers are not a realistic expectation at all. Please reference the hundreds of MacRumors articles and threads over the past decade or two which document the decline and fall of the Apple+Nvidia relationship.

If Apple really wants, they can reverse engineer and even write drivers for Nvidia cards themselves. Will they have the same performance as drivers written by Nvidia? Probably not, but if the open source community can do it with far fewer resources, I don't see why a trillion-dollar company would not be able to.

Of course, this is not a technical issue. It's more about politics + a clash of egos.
 
If Apple really wants, they can reverse engineer and even write drivers for Nvidia cards themselves. Will they have the same performance as drivers written by Nvidia? Probably not, but if the open source community can do it with far fewer resources, I don't see why a trillion-dollar company would not be able to.
Even if Apple are technically capable of reverse-engineering viable Nvidia drivers, does it sound plausible that they would release a workstation product which relied on that approach? What about their ability to maintain those drivers in the face of a potentially hostile (or even just disinterested) Nvidia? Would enterprise buyers be attracted to such a tenuous product offering?

Who would buy a Mac Pro (or worse, a theoretical Sony Vaio workstation) using Apple Silicon and Nvidia GPUs if there's no guarantee of support for future Nvidia GPU cards in 2024 and beyond?

Of course, this is not a technical issue. It's more about politics + a clash of egos.

That's surely part of it, and it's not limited to just Apple either. I'm not pointing fingers, Apple and Nvidia both seem like tremendously difficult companies to partner with.
 
Even if Apple are technically capable of reverse-engineering viable Nvidia drivers, does it sound plausible that they would release a workstation product which relied on that approach?

No. That scenario would be for third-party NVIDIA cards connected to Apple Silicon products. Not as the main product.
 

So far, results for the Microsoft chip aren't particularly encouraging even compared to the older and slower M1 chips, but Microsoft does nothing if not iterate.

I know that.

My conversation with you was NOT about a comparison with Apple Silicon. I’m not that stupid; I know better, and I’m a HUGE fan of Apple’s silicon team and its rapid pace compared to anyone,

including Microsoft, whose decade-plus of work has been a joke.

I’m saying they’re making a big change that will affect the entire industry, and developers especially, and I’m hopeful for ARM code from rebuilt legacy apps, which will no longer be under Microsoft’s hold.
The Surface actually has an ARM version and a Core i7 version. The Core i7 version has decent battery life (estimated at 18 hours by Microsoft, but around 10 hours in practice), while being much more useful than an iPad and having eGPU support.
Whether it's 'much more useful than an iPad' depends on the user and the targeted use, and more importantly on the user experience.

In a corporate setting, one tends to think yes, a modern Windows tablet with eGPU support is better than an iPad:
RDP built in,
Windows 7/10/11 x86/x86-64 apps, CAT5/6 networking via adapters,
etc.

Where it fails in the consumer experience is:
sub-par touch input with or without styli;
using it in tablet form is absolutely terrible for both pen and finger input and for on-screen object manipulation;
since it's a desktop OS first, it will continually use more data on mobile networks, even at system idle with no applications running beyond core system UI elements and no interaction or app use by the end user;
from a software perspective, when issues arise there is a sea of steps to go through to resolve them: installation configurations, installation path locations, registry remnants (think add-ins when versioning is to be considered), and the proper way of installation (as admin, or only with admin credentials under an end-user installation).

Across four different corporate businesses over the last six years, supporting over 1,400 users in total across all offices, only 30 or fewer have had any version of the Surface tablet, and of those, only 13 kept theirs after the first year. By comparison, for iPad users (various models in the same time frame):

roughly 400 users (give or take, not by much),
150 corporate-paid, the rest personal.
All have kept them after the first year and have upgraded to newer models.

Guess which was more productive for their business workflows while in-office or remote client facing (including mining sites).

hint: It's not running Windows ;)

Also, Asus makes a MUCH better Windows Surface-style tablet with a high-end mobile video card for gaming, and it still has an eGPU XG Station solution; with the latter, performance rivals that of a 16" Razer Blade laptop.
 
I’m saying they’re making a big change that will affect the entire industry, and developers especially, and I’m hopeful for ARM code from rebuilt legacy apps, which will no longer be under Microsoft’s hold.
I'm not clear on what you are saying here. Can you clarify? You want Windows to make a big change to impact the industry, and ARM code for rebuilt legacy apps ... won't be under Microsoft's hold? It never has been; anyone can compile code for Windows for any supported compile target. Right now that's ARM, x86, and x86-64; at one time or another it was Itanium, PPC, DEC Alpha, MIPS... I clearly don't understand your point. Please rephrase.
 
Where it fails in the consumer experience is:
sub-par touch input with or without styli;
using it in tablet form is absolutely terrible for both pen and finger input and for on-screen object manipulation.

I don't think it fails at all.
I would take the more awkward UI with the freedom to install any software I want over only a handful of apps for content creation, reading books and browsing the Internet.

What's the point of such a "marvelous" touch experience if most of the time my experience is no better than a Chromebook?

Sure, the iPad is wonderful for drawing, audio creation (IF you are a musician) or casual video recording (IF you work with visual content creation, and only in limited workflows).

But take away those creative workflows, and there's not much else you can do.
Sure, the iPad does have Microsoft Office and the Apple suite, but if you need the HEAVY versions of MS Office, the iPad versions feel like a toy.

I promise you drawing with a Microsoft Surface is not so bad. Sure, the pen is not as good as the Apple Pencil, but you're not restricted to the stock pen to begin with. You can use any device compatible with Windows Ink – even a Cintiq, if you somehow feel like it.
 
When has Sony ever sold workstations or servers?
Sony once sold the NEWS workstations. They were MIPS-based and did not really sell very well.
What makes you think they'd be good at selling them now? Does Sony have a viable professional services and enterprise sales and support channel for for workstations and servers?
Well, given that when they did sell them they had almost no success (despite the machines being pretty nice - I had one for about six months to evaluate), and that their support was mediocre even then (I got mine from R&D and had access to engineers, but could barely get any answers), I would bet against them. This is beyond wishcasting. The VAIO division had so little success that Sony eliminated it. Now he thinks that they will reacquire the brand, design completely new systems to run an operating system with which they have no experience, build an even more niche system than what they were making before, and be completely dependent on two companies who have terrible track records of working with partners and who hate each other. Sounds like a recipe for success.
You're also suggesting with a straight face that there are no political barriers to an Apple+Nvidia partnership.

Apple has every incentive to build an nVidia-free ecosystem (as it has been doing). Anything that supported third-party GPUs would make their story of “Port to Metal and it will work on all our new systems” no longer true. It seems absurd to suggest that they would sign a deal with nVidia so that someone else could build macOS workstations that would compete with Apple’s own products. In what universe would that make sense to anyone?
Surely you're aware of the estranged relationship currently between Apple and Nvidia. Nvidia macOS drivers are not a realistic expectation at all. Please reference the hundreds of macrumors articles and threads over the past decade or two which documenti the decline and fall of the Apple+Nvidia relationship.
nVidia repeatedly asked Apple what they would have to do to get back into Apple products and were told:
  1. Pay Apple back for the costs incurred by using nVidia’s faulty chips.
  2. Provide source to their drivers to Apple.
  3. Open source Cuda.
but nVidia was never willing to do any of those, let alone all three.
Sony's Vaio mark has never been used for enterprise products like servers or workstations. It's a luxury consumer electronics brand. I have no idea why you think it would be an attractive choice that Sony would even want to use for some theoretical EPYC server, even ignoring the fact that Sony spun it off in 2014 and it's even less appropriate now. Yet you seem to think that Sony would spend the money to re-absorb Vaio just so they could use it to brand a product unlike any Vaio product that's ever been made or sold under that mark. It's patently absurd.
All in order to run an operating system with which they have no experience, while directly competing with the company from whom they were licensing it. The word delusional comes to mind.
 
No disrespect, but it sounds like you're primarily in the area most helped by Apple's advancements, so whether Apple upgrades its 3D tech capabilities or not, you're not really going to be greatly affected.
So now it is not Pro 3D users, but some even smaller subset of that market that you think would want this? Visual effects and gaming are two of the biggest markets for 3D software. What "Pro 3D" are you discussing?
While I understand some reticence regarding Sony's past PC efforts, assuming it would be exactly the same the second time around, with macOS and Apple as a partner, is pure speculation.
Your argument seems to be despite that:
  • Apple once licensed macOS to others and had a terrible experience.
  • Sony made lifestyle consumer Windows-based PCs that looked nice, but did not sell well enough to keep them from selling off the division nine years ago.
  • nVidia used to be the primary GPU vendor to Apple, but ended the relationship so badly that Apple has said it would never work with them again.
Now:
  • Apple should convince Sony to get into the Intel-based macOS Workstation business for which they have no experience, after having bought back the lifestyle consumer brand for use in this market.
  • Apple should license their OS to this new division for a small niche that you feel is not being served by Apple Silicon, despite this action having been disastrous before, and despite it sending exactly the opposite message Apple wants to send (that Apple Silicon is the future of macOS and that it will be ready to meet the needs of all Mac users).
  • Apple should ignore all the problems that nVidia caused and either reverse engineer a driver for nVidia's GPU, or beg nVidia to produce a driver into which Apple would have no visibility, given they will not provide source access to it.
Aside from that, it wouldn't alter Apple's product strategy/offerings at all, and you wouldn't even have to buy any of the products that catered more to 3D.
You have a very different definition of "product strategy" and "product offerings" than I do. Apple would have to go from saying: "Our devices are better because we control the whole experience from chips to hardware to software." to instead saying: "Our hardware is not good enough, so we had to beg two other companies to make systems that we cannot."
There's decent Metal support for some things in a lot of 3D apps now,
You know how that gets better? By making it clear that the only way forward for these companies to stay on the platform is to take advantage of all the benefits of Apple's platform, porting to Metal and using Apple's other APIs to support their hardware/silicon directly.
and you may be right in terms of Apple being able to close the gap, but there are a lot of gaps for 3D users that you don't know about until you encounter them, and sometimes they're deal breakers.
Since you clearly know about all these hidden users, why not let us all know who they are and what their use cases are such that they can only be served by a product that has never existed? You would have much more credibility if you provided some actual examples, rather than just asserting they exist.
Having the option of being supported by industry-standard GPUs with industry-standard software would eliminate those, and that makes a HUGE difference.
Which industry? What software? If this software needs nVidia GPUs, it has not run on macOS for years. The idea that any company would build a pipeline around a product that has no real champion is beyond belief. If this market is meaningful, Apple will address it with future Apple Silicon, but only if companies are forced to adopt their approach. If they are told, "Do not really worry about porting to our architecture, we will keep supporting that old stuff," it will never happen.
 
You’re not getting the point. One thing is why Apple charges a premium, and another is why you’re willing to buy Apple’s products and still pay it. Apple charges a 30% premium just for the logo.
What I think you mean to say is: “You do not agree with me, therefore you must be wrong. I think that Apple charges 30% for the logo and you do not.”
If you find other things you like, such as those you listed, and still want to buy the product because of those things, great for you, but the premium isn’t being charged for those other things; those other things are included in the remainder of the price, the other 70%.
Or maybe you are wrong, and Apple’s price includes the cost of R&D to build these products for their smaller market, the increased materials costs for using higher quality materials, the cost for all the software development that other companies with Windows computers do not need to pay.
And if Apple charges a premium for the logo, that’s not because of what it costs to put the logo on the products; it’s because they want to make their brand something of a status symbol.
You keep asserting that Apple behaves in a particular way for a particular reason, but you have provided no evidence to support your claims. You can keep repeating it, but that does not make it true.
 
  • Apple should ignore all the problems that nVidia caused and either reverse engineer a driver for nVidia's GPU, or beg nVidia to produce a driver into which Apple would have no visibility, given they will not provide source access to it.

The argument about closed-source drivers seems reasonable at first, but if you look at it more closely, you'll see it's bollocks.

Here's why: Apple doesn't have just Nvidia as an option. AMD Radeon drivers happen to be open source, so Apple could even borrow inspiration from the Linux drivers if they want to code support for third-party eGPUs.

Don't like AMD? Now we have Intel Arc too, which is also open source. It's the newest and weakest of the three options, but at least it is an option. They could even collaborate with Intel to make the drivers as a whole more stable and better.

But obviously, this has never been about proprietary drivers. It's about holding control of the whole stack.
This will eventually backfire, since we're not in the 1990s anymore.
 
The argument about closed-source drivers seems reasonable at first, but if you look at it more closely, you'll see it's bollocks.

Here's why: Apple doesn't have just Nvidia as an option. AMD Radeon drivers happen to be open source, so Apple could even borrow inspiration from the Linux drivers if they want to code support for third-party eGPUs.

Don't like AMD? Now we have Intel Arc too, which is also open source. It's the newest and weakest of the three options, but at least it is an option. They could even collaborate with Intel to make the drivers as a whole more stable and better.

But obviously, this has never been about proprietary drivers. It's about holding control of the whole stack.
This will eventually backfire, since we're not in the 1990s anymore.
It seems you think writing a device driver is easy. Something like a modern GPU is impossible to support commercially without actual documentation from the GPU manufacturer. Case in point: the open source community could not even begin to support Broadcom’s WiFi chipsets because there’s no documentation they could use to even start.

If you think it’s easy, which it seems you do judging from your posts, I don’t know what to say to you, because it is extremely hard. Please do not equate what the Asahi folks are doing with the Apple Silicon GPU to what Apple could do with an nVidia GPU, because the open source community has no obligation to support AS Macs. It is a take-it-or-leave-it kind of situation.
 