So wait... the next iPad Pro and MacBook Pro will share the same chip, according to this simplified table? If that's the case then erm... no thanks. I kind of expect something beefier in an actual work machine. Not gonna place another bet on Apple's ecosystem for an "improvement" I was never asking for.
 
So wait... the next iPad Pro and MacBook Pro will share the same chip, according to this simplified table? If that's the case then erm... no thanks. I kind of expect something beefier in an actual work machine. Not gonna place another bet on Apple's ecosystem for an "improvement" I was never asking for.

Beefier? The current iPad Pro, with a 2018 SoC, already beats most Intel CPUs in the MacBook Pro lineup. So if you really want beefier, you want Apple silicon.
 
I’m guessing there might be some translation issues. After the sentence you quote, it’s pretty apparent that they’re actually talking about a discrete (separate from the SoC) GPU chip to be used instead of AMD’s for the iMac, and the 16” MBP too, I suppose. At least that’s how I read it.

The A14X should be more than sufficient for the smaller 21.5” iMac and a MacBook/MacBook Air class GPU, shouldn’t it?

I agree with you here. I think it isn't the best translation and I believe the smaller iMac will be fine with the A14X or whatever they call it for the iMac.

China Times said in April that a 23" iMac would be released in 2H20. Not sure where that's gone, but I still think Ming-Chi Kuo won't be far off when he said in June that a 24" ARM iMac would be released in 4Q20.
 
This could be the bigger iMac. The smaller 24" that replaces the 21.5" Intel model may come this year, WITHOUT a custom GPU. Why? Because I bet the iGPU in the A14-based chip for the AS iMac will already be stronger than the AMD dGPU in the current 21.5" that we have today.

Yes I think this China Times article relates to the bigger iMac. I think Apple showed with the Tomb Raider demo at WWDC that they already have the graphics performance suitable for the smaller iMac.
 
I hear you, my 2012 iMac is still chugging along just fine. I usually refresh around the 10-year mark, so the announcement of the Apple silicon Macs has been fortuitous. With my slow replacement plan there is little incentive to get the new Intel Macs. Why get something inferior that will also not be Apple's focus for the next 10 years? It doesn't make sense.
Yup. I'd like to avoid putting money into another Intel Mac unless AS Macs are not available yet.

I do understand if people need Intel chips and Bootcamp, but for my needs I should be fine with AS going forward.
 
I wouldn’t be too surprised to see it actually. The Mini is the closest thing to a tinkerer platform as you’ll get from a Mac, which makes it a good “pitch to enthusiasts” product.

I think there will be a Mini - but I hope a completely redesigned, kick all the competition in the nuts "Magical Mini" worthy of a proper "one more thing" moment.

I get the feeling the developer kit mini is just a red herring to throw everyone off the form factor that will eventually be released.
 
Back to another era of Macs being truly differentiated machines. Hedge your bets now or get ready to jump in.

Not a stretch for them to be confident in surpassing Intel in graphics. AMD/Nvidia are a different caliber though. Hope this doesn’t mean the end of eGPU support.

I thought they announced that eGPUs wouldn't be supported on Macs with Apple Silicon? Also, eGPU isn't that great a solution: if your GPU has 16 PCIe lanes but Thunderbolt only supports 4, what's the point? Your GPU is crippled.
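
To put rough numbers on that, a back-of-envelope sketch assuming PCIe 3.0 rates (in practice Thunderbolt 3 caps the PCIe payload even lower, around 22 Gb/s):

```swift
import Foundation

// Per-direction PCIe 3.0 bandwidth is roughly 0.985 GB/s per lane.
let perLaneGBps = 0.985
let x16Slot = 16.0 * perLaneGBps   // ~15.8 GB/s: GPU in a desktop slot
let tb3Link = 4.0 * perLaneGBps    // ~3.9 GB/s: best case over Thunderbolt 3
print(String(format: "eGPU link is ~%.0f%% of a full x16 slot",
             100.0 * tb3Link / x16Slot))  // prints ~25%
```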
 
Beefier? The current iPad Pro, with a 2018 SoC, already beats most Intel CPUs in the MacBook Pro lineup. So if you really want beefier, you want Apple silicon.

I don't want an oversized iPad. The MacBook Pro 16" offers a 100Wh battery; what are they gonna do with that amount of power available if consumption is that of an iPad? I don't need 100 hours of battery life, that's just past the point of diminishing returns. On the other hand I also don't need the battery to shrink so that my MacBook Pro becomes even thinner, since it's going to be large anyway, solely because of the screen real estate I need to be able to do my work with it. And it's already borderline too fragile in the first place. So what's the point of putting a low-power-consumption chip in there, if what most people need in their Pro machine is actually more computing power?
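
(For what it's worth, the runtime math is just watt-hours divided by average draw; the wattages below are illustrative guesses, not measured figures:)

```swift
// Hypothetical average draws; runtime = pack capacity (Wh) / draw (W).
let packWh = 100.0
for draw in [5.0, 15.0, 45.0] {   // iPad-ish light use, typical laptop load, heavy load
    print("\(draw) W average -> \(packWh / draw) hours")
}
```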
 
Sold! Even if it's the first gens. Folks... it's totally time to break up with Intel and AMD. We need to get away from CPUs quickly to advance. The possibilities have a larger array of combinations that would allow us to get back to 128-bit computing on personal computers. I get the jokes that people make about this... The really funny thing is, they will surpass x86 quite quickly in the generations to come. #EverythingGPU!

Advance? By removing the ability to run Windows, or even virtualize it? And how is creating a THIRD GPU platform that developers must code for to create games an advance at all? These feel like steps backward: Apple is decreasing the utility of future Macs, and thus decreasing their market share.
 
Since when has Apple been interested in (serious) Mac gaming? I don’t think devs are much interested either. If you expect that to change due to an architecture switch you might want to give that thought another shot. PCs or consoles, that’s where I’d be looking if I were a gamer.

Apple could always switch course sometime in the future, but I don’t see much evidence of it. I’m sure they’re looking at a lot of high-performance AR and VR applications though in the R&D dept.

This has always been the problem with Apple. Bill Gates would sell to anyone, but Apple just throws market share away with both hands. Why NOT target the gaming market? It can only drive Mac sales up.
 
Advance? By removing the ability to run Windows, or even virtualize it? And how is creating a THIRD GPU platform that developers must code for to create games an advance at all? These feel like steps backward: Apple is decreasing the utility of future Macs, and thus decreasing their market share.

I see where you're coming from. Those are software issues that can and will be fixed. While I have no proof that devs will fix this, I'm almost sure that Microsoft will keep developing the ARM version of Windows 10. The advancement I speak of is the forward-looking idea that GPUs are always better than CPUs, and that it's time to get away from x86 instruction sets. More complex calculations can occur that would take a CPU significantly longer to complete. When those concerns you mentioned are ironed out, you'll be fine.
 
For a variety of reasons I long ago dropped out of gaming online. Only some of those reasons were Mac related. But I no longer know enough to say company ’X’ has the best graphics cards, or what it takes to be considered above average in graphics performance.

But I have been in a similar situation: in the mid-1980s I owned Commodore Amiga computers. The graphics capacity of the Amiga far surpassed any PC or Apple computer. IBM clones (what we now call PCs) could only display 16 colors, and the 1984 Macintosh was black and white. The Amiga could display 4,096 colors, and could multitask, something no Apple or PC could do.

So why aren’t we all using Amiga or Amiga-style computers? Because few software companies wrote programs for the Amiga, and even fewer wrote programs that took advantage of the Amiga having both a separate graphics coprocessor and a separate sound processor. So the Amiga looked and sounded the same as a PC but cost more and sometimes didn’t work as well, despite physically having better graphics hardware. Having great graphics capabilities doesn’t do you much good if the programs you want to use, games or otherwise, aren’t written to take advantage of them. And having an oddball system that doesn’t use what the most popular system uses puts you on the back burner as far as software developers are concerned.
 
You seem to know something about this topic, would it in fact be integrated into the SoC? How big would that be? I guess I just assumed it would be a discrete chip at the higher performance (and wattage) levels. Thx.
That seems to be Vadim from Max Tech on YouTube.
 
So wait... the next iPad Pro and MacBook Pro will share the same chip, according to this simplified table?

Remember, the MacBook Pro range starts with the "Pro in name only" 13" models with 15W i5 CPUs. The A12X has already been shown to perform at 15" i7 MBP levels, so even if you take those benchmarks with a pinch of salt, it is not unreasonable to expect the 'updated' version to thrash a mere 15W i5 with Intel integrated graphics - and quite feasibly replace the higher-end 13" too.

As per most of the rumours - the first AS Macs are likely to replace the Air and entry-level 13" MBPs. I'd expect the 16" MBP to come a bit later, along with the iMac and the rumoured new GPU.

(Maybe Apple will take the opportunity to sort out the "MacBook Pro" branding).
 
The A12Z already has "better performance and more energy efficiency than the Intel GPU", but that shouldn't be the goal. Apple must aim at the best GPU in the current iMac or it won't look good. Right now the A12Z scores about as well as the Radeon Pro 450 from 2016. Of course that's better than the Radeon 6750M in my 2011 iMac, and they can optimize the Mac SoC better, but they still have a long way to the top. Even if they double the GPU performance of the A12Z with the A14Z, it still can't reach the score of the Vega 20 in the iMac 21.5".

Metal scores:
Mac mini (Intel HD Graphics 630): 3757
iMac 21.5" (Intel Iris Plus Graphics 640): 4929
MBP 13" (Intel Iris Plus Graphics 645): 5481
AMD Radeon Pro 450: 10215
iPad Pro 11" (A12Z): 10244
AMD Radeon Pro 560X: 17586
AMD Radeon Pro Vega 20: 24535
iMac 27" (AMD Radeon Pro 5700 XT): 74768
 
So another year before an Apple Silicon iMac is released. Add another couple of years before I consider it safe to buy myself one ...

Li'fuka :)

Who says another year till an Apple Silicon iMac is released?

(a) The GPU part is for machines that today would use a discrete GPU, like an iMac. It has no effect on machines like the MacBook that only use the Intel built-in GPU and will do the same using the A14X's on-SoC GPU.

(b) Still no reason not to expect those first machines in late 2020/early 2021.

(c) The existence of an Apple Silicon discrete GPU is no surprise. People who understand the tech issues have been wondering exactly how this will play out, and the interesting issue is not the existence of an Apple discrete GPU, it's the question of how it is attached to the rest of the system. (i.e. is it a separate PCI board? A separate die in an MCM? A chiplet?)

(d) For some reason people are assuming this GPU will only be delivered in a year or so. I don't know where this assumption comes from except the usual cluelessness that when people first hear an Apple code name they assume that means Apple only started working on it yesterday.
 
Something to remember: Apple has been pushing 4K for a while now with their silicon. The Apple TV 4K (don't know/remember all the specs) uses an A10X. The iPad Pro with the A12Z pushes 4K over USB-C, I think only at 30 fps, but we have to remember it is also powering its own Retina display.

Pushing 4K video and pictures (2-dimensional data) to the screen isn't a huge "ask".

The Amazon Fire TV 4K and the upcoming "Chromecast", $40-80 dongles, do it. Sub-$300 TVs' processors do it.

4K compressed video decode is (usually) a fixed-function block riding sidecar to the GPU. It doesn't necessarily denote much in terms of general-purpose "horsepower".

4K 3D rendering is a different "ask" of the system entirely; the "4K" prefix is about the only thing the two have in common.
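
(You can actually see that split from software: VideoToolbox will tell you whether a codec decodes on the fixed-function hardware, independent of how fast the GPU cores are. A minimal sketch:)

```swift
import VideoToolbox

// Ask whether the fixed-function media block decodes HEVC in hardware;
// this says nothing about 3D rendering horsepower.
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("HEVC decode runs on the dedicated media engine")
}
```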


So I believe for most day-to-day stuff Apple silicon can already do what needs to be done for single-screen setups that aren't majorly intensive.

But so can all the current (Intel-based) iMacs: run a web browser, handle Zoom calls, watch YouTube/Netflix, handle Word documents and some basic spreadsheets, muck around with a photo collection and tweak some 2-3 minute video clips from the phone.


So I would feel any rumor coming out of the mill would be for a much bigger and grander GPU than what's tied to their current chips (even the A13, which is only in the phones).
Personally I can’t wait to see what they bring out. 🤤

I would expect the Apple SoC to still have a basic GPU built into it, even on the outside chance they are doing a discrete (and embedded) GPU coupled to another place on the logic board.


Or this "discrete" GPU could be part of the same package [a CPU-and-other-stuff die + GPU die + HBM, all on a largish interposer package; more a "scale-up" of the die-integrated GPU, capping the die sizes at a threshold Apple didn't want to cross. Intel has a similar DG1 that is a "super-sized" version of the integrated on-die Xe-LP, but still not a mid-to-high-end GPU.]


If all Apple is trying to do is peel the Radeon Pro 555X-560X (maybe Pro Vega 20) class dGPUs off the iMac 21.5", then that wouldn't be too surprising for the 2nd half of 2021. The GPU would still be relatively small compared to mainstream desktop GPUs on add-in cards. There is going to be a point where their "unified memory", integrated on-die GPU runs out of steam when connected to the same main memory the ARM cores are using. At some point they won't be able to finesse that with a bigger system cache. (On laptops they can cover most of the models with unified memory and a bigger cache.)
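
(The bandwidth math behind that "runs out of steam" point, with illustrative figures that are assumptions rather than Apple specs:)

```swift
// Peak bandwidth = bus width (bits) x effective transfer rate (GT/s) / 8.
func peakGBps(busBits: Double, gtps: Double) -> Double { busBits * gtps / 8.0 }

let sharedLPDDR4X = peakGBps(busBits: 128, gtps: 4.266)  // ~68 GB/s, shared with the CPU cores
let midrangeGDDR6 = peakGBps(busBits: 256, gtps: 14.0)   // ~448 GB/s, dedicated to a dGPU
print(sharedLPDDR4X, midrangeGDDR6)
```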
 
An updated iMac not coming until the latter half of next year makes the newly updated Intel offering very attractive to those of us who are still using a Mac from 2012... :) My new 2020 iMac should be shipping this week!
 