Going to Apple Silicon is a solution in search of a problem.

Losing the ability to boot into Windows to gain what? A bit more speed.

I was there for life when Apple was off doing its own thing while Intel ruled the world. It nearly killed the company. Intel architecture still rules the chip world. Going back to that way of life is the definition of insanity.
Except in this case emulation isn't crap. I remember trying to run an x86 emulator on even the PowerPC chips was agonizingly slow. That is not the case with Rosetta 2. Also, Microsoft has been trying to get Windows for ARM to become popular, and with Apple going that route, that will give Microsoft a shot in the ARM :)

Never mind that it wasn't so much Apple not using Intel that caused problems in the 1990s, but a series of totally idiotic business decisions, with the way the Mac clone program was handled topping the list. Jobs had to kill a lot of projects that were still eating resources or had turned into roads to nowhere (HyperCard 3.0 and OpenDoc being the two I remember).
 
Last edited:
Highly doubtful. For better or worse, many thousands of Minis are racked up in custom slots for the current form factor. For example:

Telling those folks they have to rip out all of their custom stuff and go to another funky new form factor will make several really big buyers unhappy. Lots of small dev shops have a Mini (or several) off in a corner enclosure/shelf/stack doing something, too. That's a major contributing reason why it didn't change much in 2018.

The current Apple TV utilises an A-series chip and comes in at almost exactly 1/4 the size, including an internal PSU... Take out the PSU and swap in USB-C instead (which many monitors already offer with power delivery) and you have plenty of extra room in that tiny box for USB, Ethernet and socketed RAM.

If you can fit 4x the number of mini servers in the same rack space while using less electricity and producing less heat, there's no way colos wouldn't lap that up. It wouldn't even need much in the way of changing the racks, beyond trays that hold 4 machines in the same footprint as the current single Mac mini arrangement.

Personally I do see a decent size market for a Mini like this.

But hey, we all have our own visions most of which will most likely turn out to be absolutely nothing like Apple have planned for us ;)
 
I have no idea why people keep chasing after Apple and a Mac for their gaming needs [emphasis added].
I could explain why, but when you go on to write:
I have owned Macs since 1989 and enjoyed some games along the way, but never deluded myself into thinking a Mac will ever be a substitute for a console or a gaming PC… [emphasis added]
I suspect that you aren't really open to learning the reasons.

It’s so much easier just to dismiss behavior we don’t understand as delusional than to do the hard mental work of actually broadening our perspectives, eh?

Edit: If you are in fact genuinely curious, let me know and I'll happily explain my own reasons. You'll still be free to conclude that I'm delusional, of course, but at least then you'll be able to do so on the basis of information rather than ignorance. :)
 
Last edited:
  • Disagree
Reactions: PickUrPoison
The successor to Lifuka will be codenamed Wefukedya, because it will fix the bugs notorious in first-gen Apple products, while those brave enough to beta test are stuck with an outdated product.
 
  • Like
Reactions: johngwheeler
Taiwanese here. Just forget about this "source". It's not a technically focused outlet, just gossip media. They don't have qualified columnists in the technology space who can write convincing stories. Their tech-related stories are either embedded advertising or something we all knew a week ago. Even DigiTimes has a better reputation.
 
The conversation about GPU in this thread is quite ridiculous. It seems some of you still don't realize what is happening.

Apple is switching the entire Mac lineup to Apple Silicon within 2 years.

This isn't an experiment, or a side project, or some curious distraction. They have announced this as the way forward, to be completed in a short amount of time.

They wouldn't be doing this at all if they weren't fully able to meet or exceed the performance of top end AMD GPUs with the chips in their pipeline.

One needs only to look at what they've been able to do in iPhone and iPad, where the name of the game is efficiency due to thermal and battery constraints, and yet the chips there are stunning in CPU and GPU performance. Just imagine what they'll be able to do when thermal and battery are less of a concern (MacBook) or not a concern at all (desktops).
 
  • Like
Reactions: BigSplash and wyarp
A desktop ARM processor by next year is extremely quick... I don't think people understand the huge leap that is happening here.
 
The complete lack of support for AMD GPUs would be a showstopper for me - I use AMD GPUs for ML work.

Given that Apple has been pushing the eGPU route for some time, not supporting cards that currently work in eGPU setups would be a big problem - and would more than likely stop me migrating over.
 
Going to Apple Silicon is a solution in search of a problem.

...go look at all the posts on this site complaining about overheating, fan noise and thermal throttling. Or delays in updating Macs because Intel hasn't released the particular wattage/iGPU combination that is needed for that Mac. Or the pathetic graphics on the Mac Mini because Intel doesn't make desktop chips with Iris/Pro/plus/whatever iGPUs... Or maybe, for many Mac users, the ability to run iPhone/iPad apps directly on the Mac is just as useful in 2020 as the ability to run Windows was in 2006. Those are the problems that are potentially getting solved. Also - with increasing interest in ARM-based servers - there's a gap in the market for a decent ARM desktop for web/server-side development (...of course, you can do that "in the cloud" or on an external ARM box- but then the same goes for x86 development on an ARM machine...).

As for the iMac - the #1 best thing Apple could do to solve the Windows problem on the ARM iMac is to include an extra HDMI/DisplayPort input so, when we absolutely have to have Windows or AAA games, we can plug in a NUC/Surface Pro/XBox/whatever and use our nice iMac screen. Not holding my breath there, unfortunately.


I was there for life when Apple was off doing its own thing while Intel ruled the world. It nearly killed the company.

Sigh. Wintel revisionist history.

Apple was nearly killed by decisions in the early/mid 90s - including a messy, over-priced and confusing range, interesting but unaffordable forays into PDAs and cameras and - in particular - the epic failure of the next-gen Copland OS. Of course, this wasn't helped by monopoly abuse by Intel and Microsoft that killed off anything that wasn't a straight Wintel PC... heck, the clones even killed off the IBM PC! No guarantee that Windows-compatible Macs would have had any effect beyond killing off MacOS for good and turning Apple into just another clone-maker.

Apple's turnaround happened in the late 90s/early 00s with the Second Coming of Jobs, the iMac, MacOS X and the iPod "halo effect" - helped by the rise of the Internet and open standards for communication, which started to chip away at Wintel's wall of proprietary standards. Apple was turned around before the switch to Intel, which was necessitated by IBM and Motorola's abandonment of mobile/personal PPC chips...

Oh, and people also forget that Intel's Pentium 4 "Netburst" space-heaters from the early 00s were a steaming (almost literally) pile, which sent Intel back to the drawing board (or, rather, back to the Pentium Pro) and the first Intel Macs were also among the first systems to use the vastly improved and more power-efficient Core chips. Pentium-4 based Macs wouldn't have been much fun.

The ability to run Windows was a useful feature - in 2006 - but it wasn't what saved the Mac. Anyway, that was 14 years ago, and today we have tiny, cheap PCs and ultrabooks, the option of running Windows in the cloud and what looks like a serious attempt by MS to make an ARM version of Windows - not to mention a software ecosystem that is 14 years more up to date and is more likely to use abstracted os-provided frameworks than hard-code support for specific hardware, making re-compiling for a new processor much less of a big deal.

One of the reasons that Windows is Windows and MacOS is MacOS is that Windows is hamstrung by backwards compatibility concerns from the corporate sector, whereas the MacOS world is accustomed to "extinction events" every decade or so (PPC, OS X, Intel, 64 bit...) that clean away the dead wood. Windows dumped its DOS-based core in favour of a solid, modern OS back around the same time as MacOS X - but the obsession with backward compatibility has kept it burdened with legacy rubbish: half of the security problems with Windows XP were not because XP couldn't handle a proper security model, but because everybody ran with permanent admin privileges so that old Win9X software would work. One "advantage" of Windows is that it will still, happily, run 25-year-old binaries. The advantage of MacOS is that it doesn't have to carry around that baggage. The messy split-personality interface of Win10 (e.g. two competing control panels) is because MS haven't been able to move developers and users on to their "modern" UI/application framework (whereas Apple killed off the "Carbon" Classic-to-OSX transition framework years ago). That's also why switching to ARM is feasible for Apple but a major hurdle for Windows (... 'modern' Windows apps should be even more trivial to port to ARM than for MacOS since they're compiled to CLR bytecode rather than x86 binaries).

Basically, the time has come when Apple has to decide whether it is more important to make better Macs than to support x86. Even so, we'll most likely get Windows 10 for ARM in a VM - and we know we're getting ARM Linux, which is already far better supported than ARM Windows.
 
A desktop ARM processor by next year is extremely quick... I don't think people understand the huge leap that is happening here.

iPads and iPhones have been challenging the lower-end Macs on performance for a couple of years now (and that's taking a cautious view of the benchmarks that show them performing at 15"MBP i7 levels). From what we've seen of the developer system, which is just a slightly tweaked iPad CPU, it is more than a match for the current Mini.

...and while developing processors is never trivial, a desktop processor has fewer design constraints than a chip designed for an ultra-thin, battery powered mobile device. Otherwise, we'd all be packing Intel-powered phones...

Plus - Apple are a $2TN company with a virtual Scrooge McDuck gold-filled swimming pool - this isn't Tim and Craig sitting down in a bar, grabbing a beer-mat and biro and saying "CPU design - how hard can it be?*" - they can afford to buy whatever expertise and IP they need.

(* although that's pretty much how the original ARM came to be, but then they have better beer in Cambridge...)
 
Who says another year till an Apple Silicon iMac is released?

(a) The GPU part is for machines that today would use a discrete GPU. Like an iMac. It has no effect on machines like the MacBook that only use the Intel built-in GPU and will do the same using the A14X's on-SoC GPU.

(b) Still no reason not to expect those first machines in late 2020/early 2021.


(c) The existence of an Apple Silicon discrete GPU is no surprise. People who understand the tech issues have been wondering exactly how this will play out, and the interesting issue is not the existence of an Apple discrete GPU, it's the question of how it is attached to the rest of the system (i.e. is it a separate PCI board? a separate die in an MCM? a chiplet?).

(d) For some reason people are assuming this GPU will only be delivered in a year or so. I don't know where this assumption comes from except the usual cluelessness that when people first hear an Apple code name they assume that means Apple only started working on it yesterday.

Nobody with any verifiable authority says, and that's the problem. Apple is still way too obsessed with secrecy about products and software. The battery fix that throttles the CPU when the battery gets low? Well, a lot of Android phones do that too, and for the same reason. But they told you about it and gave you access to the settings. You want your phone to keep chewing through the battery even when it makes your phone die in minutes instead of hours? OK, make this setting change. Apple didn't announce WHAT they were going to do, just that a fix for phones dying quickly would be implemented "soon". It was put out in an update, unannounced, and it soon started throttling back performance, and people got mad and sued - over something other phones did as well. But they told you.

And everyone thought that the HomePod was going to be an Alexa/Google Home-like device. Because it's Apple, it would probably cost $100-150. Nope: it's a serious music speaker that can't do a lot of the voice assistant functions that the other two do, but it costs $350. Yes, it can do more now, but it already has a bad rep that it hasn't recovered from.

Lack of general information about products people know Apple is developing doesn’t mean no information is being released. It just isn’t being released by Apple so “experts” step in to fill the void. Sometimes they are mostly right, sometimes they miss the boat completely. Apple doesn’t have to lay out detailed explanations, and if there are pleasant “surprises” with features or performance they don’t need to give them away. But most of the people arguing on this thread are speculating on CPU and GPU performance and none of us really have a clue at all what they will really be.

If commercial games were important to me I don't think that I would wait, I'd buy an Intel or AMD solution. I KNOW that they will work. Maybe the new processors/graphics will. Maybe they could, but no developer is bothering. We don't know, and we have no official information to form an informed opinion from. Using iPads as your reference doesn't bolster a gaming case. You can't run most games like WOW or COD on an iPad Pro even if its processor and graphics are technically able to handle the load. What game developers are working on Apple Silicon based games? Do we know? Apple is in a fight over graphics standards and so far - and this is over a number of years - they haven't broken even in any of those fights. Big titles come to Apple late, if ever.

For other software - documents, drawing, photos, streaming - Apple will be fine. They already are a major player. And maybe Apple doesn't think that the market is profitable enough to justify whatever it would take to really implement it.

If that’s the case say so.
 
  • Like
Reactions: coolspot18
It will be interesting to see how they compare with the offerings from Nvidia and AMD. Power efficiency in a desktop is usually less important than sheer grunt, so if they can pull the grunt side of things off to the level of Nvidia's upcoming chips I will be seriously, seriously impressed.

Unlikely it will come anywhere close to what Nvidia is offering - it will probably perform similarly to the high-end embedded GPUs available in Intel chips, i.e. render 3D for a midrange game at 30-45 FPS. The high-end Nvidia cards alone cost $1,000, which surely won't fit into the cost model of an iMac, plus they're also much larger and higher powered.
 
Comparing back to GPU designs from 2-4 years ago isn't saying much.

The MBP 13" 2020's Gen 11 graphics (10nm, 10th gen) scores 10519.

That is better than the Radeon Pro 450 too.

I got the numbers from https://browser.geekbench.com/metal-benchmarks. I don't know why Barefeats shows double the performance, but in that case the Intel(R) Iris(TM) Plus Graphics 645 is faster than the A12Z, with a Metal score of 10519 vs. 10244. That's even worse than I thought for the A12Z. The point of comparing to the Radeon Pro 450 was of course to show that the A12Z is as slow as a dGPU from 2016.
 
I had plans to update my iMac this year but all of this changed my mind. Let’s wait to see this beast. I’m also curious to see the plans for the “Armini”
 
As others have mentioned, the current A12Z smokes the best iGPU from Intel. It smokes the best AMD iGPUs as well.
The iPad Pro scores >50 fps in GFXBench offscreen while the RX Vega 11 does not even reach 30 fps. GFXBench is not the most demanding benchmark, but it's the only one we have on iOS.
The best AMD iGPUs struggle at ~20 fps in Shadow of the Tomb Raider under medium settings at 1080p. The A12Z sustains 30 fps at 1080p via emulation. We don't know the settings, but they're not the lowest: texture filtering appears quite low, but volumetric light is enabled and PureHair appears to be on medium.
And this is a GPU from a frickin' tablet!

So I don't think it'll be impossible for the Apple desktop GPU to beat the 5700XT of the current iMac.
 
Last edited:
  • Like
Reactions: Colonel Blimp
we have zero proof of that, plus if it's not faster running content people want and expect, it's pointless.
Of course we have proof. Every single Mac Apple releases is going to be faster than the previous one in both CPU and GPU power. There hasn't been a single exception to this so far. They will NEVER EVER release something that is not faster than the old one.
 
Of course we have proof. Every single Mac Apple releases is going to be faster than the previous one in both CPU and GPU power. There hasn't been a single exception to this so far. They will NEVER EVER release something that is not faster than the old one.
Well, Apple made references to "longer battery life" with their processors, so it might be acceptable to them to market a laptop with remarkable energy efficiency and no real speed improvement - but only for the lower-performance laptop models. For the iMac, no way: it's got to beat the 2020 Intel CPU/AMD GPU on performance. :cool:
 
Last edited:
If commercial games were important to me I don't think that I would wait, I'd buy an Intel or AMD solution. I KNOW that they will work. Maybe the new processors/graphics will. Maybe they could, but no developer is bothering. We don't know, and we have no official information to form an informed opinion from.
I have every expectation that Feral Interactive will port games to Apple Silicon, and I’d be surprised if they’re not already playing around with the Developer Transition Kit.

Why wouldn’t they? Remember, they got their start porting games to Mac OS on PowerPC, and since then they’ve continued to port a steady flow of games to the Mac on Intel 32 and 64, to Linux, to Android, and even to the Nintendo Switch.

Nor is it much of a stretch to imagine that Aspyr will bestir themselves. The market will certainly be there for game ports to macOS on Apple Silicon, since game developers and publishers will no longer be able to rely on Boot Camp to reach Mac users.

In fact, I wouldn’t be at all surprised if this transition leads to a second golden age of Mac game porting, comparable to that of the early 2000s. The transition to Apple Silicon has the potential to be a windfall for Mac game porting specialists like Feral and Aspyr.

Of course, this transition may well take a few years to get up to steam, so in the near term I suspect you’d be right not to wait.
 
All of you need to realize that the A12Z is already easily outperforming the most powerful integrated GPU in any Intel chip.

These rumors are talking about a much more powerful GPU, one that will likely outperform the 5700xt while being integrated into the SoC.

It may not beat it in raw performance but thanks to unified memory, extra tile memory, hardware accelerators and very high efficiency and utilization, it’ll give you better real-world performance.

No driver issues either.

Ah yes, you're the Apple fanatic from MaxTech, I remember. Rumors are all well and good, but to confidently predict that it will outperform a current RX 5700XT Pro when even Apple is tight-lipped is quite unwise.

Two years ago Apple seemingly broke off with Imagination, stating that within two years they would have completely abandoned Imagination's PowerVR-sourced IP. But fast forward to 2020: amongst those in the tech sector it is well known that the Apple GPU as it currently stands is nothing more than heavily modified PowerVR IP, and Apple signed a new licensing agreement, which pretty much says that Apple's custom GPU efforts were largely for naught.

At this point, while yes, Apple's custom CPUs are indeed quite powerful, there is really no need to dislodge AMD as well. But regardless, I am interested to see Apple's take on TBDR desktop graphics, which again can't be called 'custom' since the underlying IP is sourced from Imagination Technologies.
 
  • Like
Reactions: BigSplash
Then prepare to be disappointed....it is very unlikely that Bootcamp or Windows 10 (Intel) virtualization will be working by the end of the year, if ever.
As theluggage posted above, Apple have already stated that there will be no Boot Camp for Apple Silicon Macs.

And as for virtualization, you cannot virtualize hardware that doesn’t exist. While it should be possible to virtualize Windows 10 for ARM on Apple Silicon, provided that Microsoft ever decide to license it, it will never be possible to virtualize Windows 10 for Intel on Apple Silicon.

People who want to run Windows apps on Apple Silicon Macs have to hope that Windows app developers compile (and test and support) their apps for ARM, or that Microsoft release their rumored Intel 64-bit emulator for Windows 10 for ARM, or that some worthy successor to Connectix releases an Intel PC emulator for macOS on Apple Silicon (à la Virtual PC for PowerPC Macs).
 
amongst those in the tech sector it is well known that the Apple GPU as it currently stands is nothing more than heavily modified PowerVR IP, and Apple signed a new licensing agreement, which pretty much says that Apple's custom GPU efforts were largely for naught.
This is simply false. Apple-designed GPUs (which start with the A11) have been leaders in perf/Watt, as results show. The A12X/Z sit alone, miles ahead of any other tablet GPU on GFXBench. It is certainly not "well known that Apple GPUs are modified PowerVR IP". They use a new Apple design. Yes, they are TBDR, like PowerVR GPUs, but that's about it.
Apple having a recent licensing deal with Imagination certainly doesn't mean that Apple GPUs have been disappointing. We don't know the details, but some suspect it has to do with ray tracing hardware, which Apple (and AMD/Intel) GPUs currently lack, and which Imagination has been developing.

And do you think Apple are idiots, that they will ditch AMD knowing that their own GPUs cannot compete?
 
  • Like
Reactions: Azrael9
Bootcamp - very probably yes (technological gaps).

Windows 10 - probably not, if you're willing to accept it in a virtual machine (the biggest gap is Microsoft licensing, which isn't a technical problem - more a business problem). Windows 10 on ARM has increasingly been getting better virtualization support, so putting it in a virtual machine on a Mac isn't a huge jump from the Hyper-V and other virtualization that folks have done (stuffed onto QEMU). But Microsoft doesn't sell it to mainstream folks detached from a system. There are some specific virtual driver issues to be worked out for stuff like Apple's trackpad and Apple-specific features (so there may be some Microsoft - Apple - VM vendor finger-pointing as to who is supposed to fix what when there are quirks). Microsoft Azure is probably quite keen on selling time on Windows 10 on ARM virtual instances.

Now, some folks will still be disappointed with running Windows 10 only in a virtual machine context, especially if the GPU is heavily simulated (rather than close to 'raw' hardware). However, that isn't "dead on arrival". It works; they're just not happy with how it works.

The one thing that has always confused me is how the term "virtual machine" is used. In my view it means the hardware (including CPU and GPU) is emulated. The current version of SheepShaver is an example of this, as it emulates the old PowerPC Mac architecture on an Intel machine. This method tends to be slow. One of my friends went as far as to uncharitably state that trying to emulate a (then) relatively modern PC on a 680x0 Mac was akin to 'watching a dying dog take its last dump'.

Virtualization, on the other hand, I view more as a translator, where the calls to the hardware made by the OS are intercepted and turned into something the native OS can handle. WINE is the most famous piece of such translation software, and at best, in its current form, all it would be able to do is run Windows for ARM, leaving x86 code out in the cold (except for whatever x86 emulation Windows on ARM was doing). This has the advantage that calls are translated rather than actual hardware being simulated, making it far faster. I suspect this is how Rosetta 2 actually works - it doesn't simulate any hardware but translates instead.
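As a rough, purely illustrative sketch of that distinction (the toy opcodes, GuestShowMessage and host_message_box below are all invented for the example - they come from neither WINE, Rosetta nor any real emulator):

```python
# Toy illustration of "emulate the hardware" vs. "translate the calls".
# Everything here is invented for the example.

# --- Emulation: interpret guest instructions one at a time (slow) ---
def emulate(guest_program):
    registers = {"r0": 0, "r1": 0}
    for op, *args in guest_program:
        if op == "LOAD":          # LOAD reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":         # ADD dst, src
            registers[args[0]] += registers[args[1]]
        elif op == "PRINT":       # PRINT reg
            print(registers[args[0]])
    return registers

# --- Translation: intercept a guest API call and hand it to native host code ---
def host_message_box(text):
    print(f"[host window] {text}")   # stands in for a native OS facility

GUEST_TO_HOST_CALLS = {"GuestShowMessage": host_message_box}

def translate_call(name, *args):
    return GUEST_TO_HOST_CALLS[name](*args)   # no CPU is simulated here

if __name__ == "__main__":
    emulate([("LOAD", "r0", 2), ("LOAD", "r1", 3),
             ("ADD", "r0", "r1"), ("PRINT", "r0")])   # prints 5
    translate_call("GuestShowMessage", "hello from the 'guest'")
```

The emulation path has to decode and simulate every single instruction; the translation path just forwards each intercepted call to native code, which is why it tends to be far faster.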
 
The Mother Lifuka?

I hope so. It's only got the 'so last year' mid-range 5700XT to beat (from a company that can't even produce a flagship GPU).

Azrael.
 
The one thing that has always confused me is how the term "virtual machine" is used.

Unfortunately (or, maybe, fortunately), we don't have a central authority for deciding what jargon means. Arguably, "virtual machine" just means any "computer" that is somehow abstracted from the physical hardware running it so that it appears, software-wise, to be a separate machine. That's the definition used here:


...along with a handy tree diagram. However, in common usage, "virtualisation" seems to have been hijacked to refer to what that article calls "hardware assisted full virtualisation", and "emulation" to mean "software assisted full virtualisation". That is, of course, not even wrong. Who ya gonna call?

Of course, the reality of the situation is a lot messier: virtually all hypervisors are, in practice, what the article calls "hybrid virtualisation" (the first thing you do after installing an OS with Parallels etc. is to install the 'tools', including paravirtualised drivers for video etc.), and most "hardware assisted" virtualisation includes some degree of software emulation of the hardware and firmware presented to the guest system (...which is why you can run ARM Linux - and probably will be able to run ARM Windows, if MS makes it available - under Parallels on an ARM Mac, whereas they are unlikely to work on bare "Apple Silicon" metal even if Apple allows it).

Then the article misses a distinction between "binary translation" (i.e. 'just in time' converting guest binaries into host code and executing the result on the host processor - à la Rosetta - and actually pretty much how modern x86 hardware works internally anyway) vs. old-school software emulation (simulating the guest CPU in software and feeding it the guest code). I'd guess most modern emulator software is more like the former than the latter, but I'd assume that the older Virtual PC stuff was more akin to the latter.

Oh, and Android, Java, and modern Windows apps all use virtual machines as well, with applications actually distributed as bytecode...

...and then I'm not sure if things like Rosetta even count as virtual machines, because they translate "guest" code into "host" code that then runs like any other application under the host OS... From the sound of it, Rosetta 2 does this at install time - there's no reason why the result shouldn't run at near-native speed unless the original code was very lovingly optimised or called CPU-specific accelerator/vector functions directly.
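To make the install-time idea concrete, here is a minimal sketch of ahead-of-time translation versus per-run interpretation (the toy two-opcode "instruction set" and every name in it are invented for illustration - this is not how Rosetta 2 is actually implemented):

```python
# Toy sketch of install-time (ahead-of-time) binary translation.
# The "instruction set" and all names are invented for illustration.

def translate_ahead_of_time(guest_program):
    """Translate a toy guest program into host (Python) code once,
    then hand back an ordinary callable that runs as native host code."""
    lines = ["def translated(x):", "    acc = x"]
    for op, operand in guest_program:
        if op == "ADD":
            lines.append(f"    acc += {operand}")
        elif op == "MUL":
            lines.append(f"    acc *= {operand}")
    lines.append("    return acc")
    namespace = {}
    exec(compile("\n".join(lines), "<translated>", "exec"), namespace)
    return namespace["translated"]

def interpret(guest_program, x):
    """Old-school emulation: re-decode every instruction on every run."""
    acc = x
    for op, operand in guest_program:
        if op == "ADD":
            acc += operand
        elif op == "MUL":
            acc *= operand
    return acc

if __name__ == "__main__":
    program = [("ADD", 3), ("MUL", 4)]
    run_native = translate_ahead_of_time(program)   # translation cost paid once
    print(run_native(5), interpret(program, 5))     # both print 32
```

The point is only that the per-instruction decode cost is paid once, when the code is translated, rather than on every execution - which is why the translated result can run at close to native speed.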


TL;DR: complicated subject is complicated.
 
  • Like
Reactions: Colonel Blimp