Report: Apple Silicon iMac Featuring Desktop Class 'A14T' Chip Coming First Half of 2021

Based on SPECperf I can safely claim Apple Silicon is at least better than Intel's Lakes by a huge margin on general purpose computing.

BTW, SPECperf is exactly what Intel used to show its generational differences.
Thanks for this.
Do you know whether the OS and the other things the chip has to be doing will affect this benchmark?

It's why I don't take much notice of the "Wow, look how fast the iPhone runs" tests.

Well, yes. Now take that same CPU out, get it to run a full-blown Windows OS and deal with all the background stuff, THEN get it to run the test and see how fast it is now (if you see what I mean).
 
I think Apple realises that they need to enter "with a bang", and mid-spec laptops are their biggest sellers.

Well there can be bangs in other areas.

You could make it super thin, with super battery life, fast on dedicated apps, and just "OK" on other (what they will call) non-optimized apps, and put the blame on the apps.

So it may not be that fast, but they could sell its other benefits strongly.
 
WAIT...if I'm looking at the graphic correctly, Apple integrated graphics are only going to be in the iMac and NOT the MacBook (non-Pro)?

Something doesn't smell right here...
The graphic means that the MacBook and an entry-level iMac will have integrated GPU on the Apple Silicon SoC (same as current A-series). There will also be a high-spec iMac with some kind of additional or different GPU architecture (Lifuka). It may be a separate chiplet on the same SoC package, or a discrete GPU, or even just a more powerful integrated GPU.

The graphic doesn't mention the Mac Pro...but some kind of discrete GPU support seems likely.
 

In this session Apple shows that you can use reduced security mode to boot older versions of macOS.
By "old" I mean "old in the future", i.e. still Big Sur and later.

This just officially confirmed that running macOS is still not tethered to Apple's signing server, unlike iOS right now.
Ah, sorry, by "old" I thought you meant pre-Big Sur, which didn't make any sense.
 
The A14X doesn't have the port and I/O output to support a 16" laptop ...
... a Thunderbolt v4 controller onto the SoC package they'd still need pins out to provision the 4-6 ports/connections that a MBP 16" would need.
But the larger package required to support the much higher thermal envelope will have plenty of room for more (and faster) IO pins.
 
Well there can be bangs in other areas.

You could make it super thin, with super battery life, fast on dedicated apps, and just "OK" on other (what they will call) non-optimized apps, and put the blame on the apps.

So it may not be that fast, but they could sell its other benefits strongly.
I agree. My guess is that performance will be "somewhat better" than the Intel CPUs it replaces, but with much better GPU power and very good battery life (>15 hours).

I think that certain apps (e.g. Final Cut Pro) will be optimized to run very well on Apple Silicon.
 
Thanks for this.
Do you know whether the OS and the other things the chip has to be doing will affect this benchmark?

It's why I don't take much notice of the "Wow, look how fast the iPhone runs" tests.

Well, yes. Now take that same CPU out, get it to run a full-blown Windows OS and deal with all the background stuff, THEN get it to run the test and see how fast it is now (if you see what I mean).

SPECperf is mostly affected by the CPU and the compiler; the OS usually has little effect on it.
And since we are talking about a multicore CPU in 2020, background tasks usually do not affect the single-core performance of a CPU.

iOS may look like a single-task OS, but in fact it has the same Grand Central Dispatch under the hood and has been multi-tasking and multi-threaded from day one.
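To illustrate the point about multicore scheduling, here's a rough Python analogy to GCD's concurrent dispatch queues, using the standard library's thread pool (this is a sketch of the idea, not Apple's API — GCD itself is a C/Swift interface):

```python
# Rough analogy to a GCD concurrent queue: work items are handed to a
# pool of worker threads, so background chores can run on other cores
# while a benchmarked task keeps one core to itself.
from concurrent.futures import ThreadPoolExecutor

def background_task(n):
    # Simulated background work (e.g., indexing, networking).
    return sum(i * i for i in range(n))

with ThreadPoolExecutor(max_workers=4) as queue:
    futures = [queue.submit(background_task, 10_000) for _ in range(8)]
    results = [f.result() for f in futures]

print(len(results))  # 8 background items completed off the main thread
```

The scheduler spreads those items across cores, which is why a single-threaded SPEC run is largely insulated from them on a multicore chip.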
 
Well, in 4 weeks or less we'll all know.

I do hope it's more factual than the iPhone event. 5G 5G 5G 5G 5G
But Apple does seem to want to glam things up at recent events, more than the older, more honest/techy ones with Steve.

I'd rather have proper info than a glossy show, so I'm ready for that!
We'll have to wait for REAL reviewers of course to get the good stuff.
 
Faster than Intel eh?

I hope you have a hat, as I'll be asking you to eat it. These are CPUs intended for telephones. The idea of putting them into desktop PCs is frankly absurd.
Oh dear, you really need to do some research!

Think about it.

Do you really believe that Apple would commit to changing their entire Mac product line to processors that are slower than the ones they are replacing? You can bet your life that Apple would only make this transition when they are certain they can deliver.

Do you not believe the numerous benchmarks and application performance tests that show A-series chips beating Intel processors in many tasks?

I'm genuinely curious why you think Apple Silicon will be slower than Intel CPUs that they will replace. What data do you have to the contrary?
 
What's the point of an A14 in a desktop? The A14 is for saving power and extending the life of the battery. On a computer that remains plugged into AC mains power, you are best off using a more powerful CPU.

Small notebooks that are not used for high-end stuff like 3D CAD or Final Cut Pro are the logical use for these iPhone processors. It will take time for Apple to develop higher-end ARM chips, so the transition is starting at the low end of the line, where power consumption matters the most.
From what I've read, the Intel Mac minis have had heat throttling that knocks down their performance. If the ARM chips can beat the Intel ones' performance, why not use them in everything, including the Mac mini?
 
The concern, at least for me, isn’t the CPU. It’s almost certain the new products will come with a redesigned form factor. As an owner of a 2016 MBP (Gen 1 of the current design) I can say Apple burned a lot of its customers.

Mine has been in for repairs at least 5 times — the keyboard was the main issue, but one of the batteries was also faulty (because they had to add a new one every time the keyboard was replaced), and the coating on the screen was defective and created marks all over it (entire screen had to be replaced). The repairs were covered by the warranty and recall but the time and productivity lost was notable.
Same here. There are risks associated with a brand new physical design, with the added uncertainty of first-generation macOS-on-ARM and potential immaturity of 3rd-party native software or Rosetta 2 shortcomings.
 
Lifuka is discrete graphics.
But it may not be discrete in the sense we understand today. It's possible that it will be a separate module (chiplet) on the same SoC package because it needs to share memory with the CPU cores. It's possible that Apple has developed a new kind of bus between the SoC and a truly discrete GPU, but it would need to be *much* faster than PCIe if it's going to share memory.

I have also read that Lifuka may just refer to an improved (high-spec) on-die GPU, or something similar to the AMD APUs used in the PlayStation and XBox.
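For a sense of the bandwidth being argued about here, the raw per-direction rate of a PCIe x16 link can be worked out from the signaling rate and the line encoding. The figures below are the standard link rates; whether they would actually be too slow for Apple's shared-memory design is the poster's claim, not something I'm asserting:

```python
# Approximate usable per-direction bandwidth of a PCIe x16 link.
# gt_per_s is the signaling rate; encoding overhead differs by generation.
def pcie_x16_gbs(gt_per_s, payload_bits, encoded_bits):
    per_lane = gt_per_s * payload_bits / encoded_bits / 8  # GB/s per lane
    return per_lane * 16

gen3 = pcie_x16_gbs(8, 128, 130)   # PCIe 3.0: 8 GT/s, 128b/130b encoding
gen4 = pcie_x16_gbs(16, 128, 130)  # PCIe 4.0: 16 GT/s, 128b/130b encoding

print(round(gen3, 1), round(gen4, 1))  # ~15.8 and ~31.5 GB/s
```

Even PCIe 4.0 x16 tops out around 31.5 GB/s per direction, which is well below what a GPU normally gets from its own local VRAM — hence the argument that true shared memory favors an on-package GPU.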
 
Not without some haxie in there to make it work. The move to ASMacs signals a huge, possibly deadly, hit to the Hackintosh community.

You can bet Apple isn't going to do this stupidly. They will bork it to ensure iPads don't run it without major work.
I think Apple would offer it officially.
 
Uhh... Windows is already available on ARM! Been out for quite a while. Not every game has been cross compiled, that’s for sure.
Errr...kind of. Yes, Windows on ARM exists but there is not much native software for it. Until very recently only 32-bit x86 software ran under emulation...very slowly. MS announced 64-bit support, but I expect it will take at least a couple of years before performance is acceptable. Emulated games will suck. Native games will take some time to appear, if ever.
 
They’ve had plenty of time to design high end processors. You are all going to be surprised.
Even taking into account your prediction that we are going to be surprised :)

Seriously, if they come out with a chip with 8 fast and 4 power-saving CPU cores I would be positively surprised.
 
And a Ferrari driver prefers 93-octane gas. Guess gas stations should ignore the 95% of their business selling other octane grades and retool their business to service that 5%.

Reality is you represent a minority of a minority of computer users.
93-octane? You can't even buy fuel that low in many countries. My small Skoda requires a minimum of 95-octane (RON95) and I usually run 98-octane. I bet a Ferrari would run sub-optimally with only 93-octane. I have read that US fuel is often low-octane. Maybe the standards differ?
 
One aspect I'd like to address is a common view which I think is a mistake, and it goes something like this:

Wow, you have seen just how fast the current A14 chips are in the iPhone and iPad. Can you just imagine how fast they will be when they have a ton more power to feed them and large heat sinks to cool them.

I've seen the above type of comment again and again, and even I thought it. However, I'm now feeling this is wrong.

When has Apple ever really stuck a MASSIVE heatsink onto a chip in a laptop device?
Some of the latest laptops hardly have any decent heatsink at all, which is why they run so hot and on the edge of throttling.
Likewise, throwing loads of power at it?
Have you seen the batteries in a large iPad? Why is a thin and light laptop going to have vastly larger batteries?

Whilst there may be a little room to change, I feel it's wrong to expect some major power delivery and cooling changes inside an ARM laptop that aren't already in a high-end iPad.

I'd welcome your views on this.
 
Errr...kind of. Yes, Windows on ARM exists but there is not much native software for it. Until very recently only 32-bit x86 software ran under emulation...very slowly. MS announced 64-bit support, but I expect it will take at least a couple of years before performance is acceptable. Emulated games will suck. Native games will take some time to appear, if ever.

Windows on ARM's most important problem is the devices. The Snapdragon 8cx is stupidly expensive and performs much slower than a non-Pro iPad.
Native apps cannot save a device that costs $999 and runs slower than a $350 device.
 
93-octane? You can't even buy fuel that low in many countries. My small Skoda requires a minimum of 95-octane (RON95) and I usually run 98-octane. I bet a Ferrari would run sub-optimally with only 93-octane. I have read that US fuel is often low-octane. Maybe the standards differ?
They're different. Europe uses RON; the US uses (R+M)/2, also called AKI. US 91 is roughly like Europe's 95.
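The rough conversion can be sketched like this. The RON−MON spread ("sensitivity") of about 10 points is a typical value for pump fuel, not a fixed standard, so treat this as an approximation:

```python
# Convert European RON to the US pump rating: AKI = (RON + MON) / 2.
# Typical fuels have a RON-MON sensitivity of roughly 10 points, so
# AKI works out to about RON minus 5. Approximation, not a spec.
def ron_to_aki(ron, sensitivity=10):
    mon = ron - sensitivity
    return (ron + mon) / 2

print(ron_to_aki(95))  # 90.0 -- i.e., Europe's RON 95 is near US ~90-91
```

So the same physical fuel simply carries a lower number on a US pump, which is why "93-octane" in the US is actually a premium grade.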
 
macOS doesn't even support 64 cores, let alone several orders of magnitude more. Quite likely most of the Mac lineup is going to be in the sub-16-core range for quite a long period of time. The core count for the transition Mac Pro probably will not be a big jump either (e.g., 28 -> 32, or 28 -> 30, or even just an even "swap" at 28). And the Mac Pro solution is probably not coming any time soon (more likely the 2022 timeframe).
Question is how easily that limit can be changed. Maybe there is a single constant defined in the operating system which decides the maximum number of cores, and that number can be changed from 64 to 128 or 256 or something bigger. Maybe someone decided "we will never have more than 64 cores, so we don't need to change this" and that limit is built into many places. We don't know. But with 56 virtual cores available in actual machines today, I would have given someone the task to make this changeable a while ago.

On the other hand, there may be bottlenecks that are OK(ish) with 64 cores and hit hard when you have 100.
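One plausible source of such a hard limit (my speculation, not anything confirmed about XNU) is a per-CPU bitmask stored in a single 64-bit word, which caps the representable core count at 64 no matter what the scheduler could otherwise handle:

```python
# Sketch: a CPU-affinity mask kept in one 64-bit word can only name
# cores 0-63. Widening it means touching every place the word is used,
# which is why such limits tend to be baked in across a kernel.
MAX_CPUS = 64

def set_cpu(mask: int, cpu: int) -> int:
    if not 0 <= cpu < MAX_CPUS:
        raise ValueError(f"cpu {cpu} does not fit in a {MAX_CPUS}-bit mask")
    return mask | (1 << cpu)

mask = set_cpu(0, 63)    # fine: the highest representable core
print(mask == 1 << 63)   # True
# set_cpu(0, 64) raises -- the 65th core has no bit to live in
```

If the limit really is a single constant like this, raising it is easy; if the 64-bit assumption leaks into many data structures, it's the "built into many places" scenario the post describes.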
 