Oh, I was just thinking of this exact example! The old Mac mini with more cores was actually more powerful in many ways than the new one. It was a cost-savings move. This is a parallel situation: the old A10X, with all those cores, is actually a better gaming machine than the new one.



The GPU core counts above are a bit apples-to-oranges. The A10-era SoCs still had a GPU design largely based on the Imagination Technologies (PowerVR) architecture. Apple had replaced bits of the overall implementation, but it was still in the phase of trying to present as an ImgTec GPU. With the A11, Apple shifted over to its own, more explicit implementation, and when it did, it changed its approach to counting 'cores'. So it's in the realm of comparing AMD 'CUs' to Nvidia 'CUDA cores'.

The A10 has 6 GPU 'cores' and the A10X has 12 GPU 'cores'.

The A12 has 4 GPU 'cores' and the A12Z has 8 GPU 'cores'.


Apple makes substantive changes with each GPU iteration. "Cheaper to make" is easier to pin down within a common generation: the A12 is smaller than the A12X, so it is less expensive to produce (even before taking into account that the A12 has other high-volume Apple products to share production costs with, while the A12Z has none; other Apple products driving down unit costs contribute to "cheaper for Apple" as well).
 
Hi there,

after years of silent reading I decided to register here and share my thoughts.

I agree with you regarding the missed opportunity to use a more powerful chip in the new ATV4K. But I'm really not sure whether the A12 is actually less powerful than the old A10X, even for gaming.

Geekbench 5 lists the following results (already mentioned by others):
https://browser.geekbench.com/ios-benchmarks/

Geekbench 5 Test | A12  | A10X | Result
CPU Single Core  | 1112 | 831  | +34%
CPU Multi Core   | 2829 | 2273 | +24%
GPU (Metal)      | 5364 | 6971 | -23%

As you can see, the A12 performs better in the CPU tests, but worse in the GPU test.
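(Side note: the "Result" column is just the relative difference between the two scores. A quick sketch in Swift, with the scores above hard-coded, in case anyone wants to recompute it:)

```swift
import Foundation

// Illustrative only: how the "Result" column is derived from the
// Geekbench 5 scores quoted above.
let scores: [(test: String, a12: Double, a10x: Double)] = [
    ("CPU Single Core", 1112, 831),
    ("CPU Multi Core",  2829, 2273),
    ("GPU (Metal)",     5364, 6971),
]

for s in scores {
    // Relative difference of the A12 versus the A10X baseline.
    let delta = (s.a12 - s.a10x) / s.a10x * 100
    print("\(s.test): \(String(format: "%+.0f", delta))%")
}
// Prints +34%, +24% and -23%, matching the table.
```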

From my understanding, when it comes to GPU gaming performance, THE benchmark to use is 3DMark. Apple Arcade is gaming, therefore we should look at a real gaming benchmark:
https://benchmarks.ul.com/hardware/phone/Apple+iPhone+XS+review
https://benchmarks.ul.com/hardware/tablet/Apple+iPad+Pro+2017+10.5+review

3DMark Test                     | A12  | A10X | Result
Sling Shot Unl. (OpenGL)        | 6721 | 6314 | +6%
Sling Shot Extreme Unl. (Metal) | 4330 | 4430 | -2%
Wild Life Unl. (Metal)          | 6174 | 5636 | +10%

What we see here is better performance from the A12. It only shows slightly worse results (-2%) than the A10X in the Sling Shot Extreme test. That test is somewhat outdated according to 3DMark (it is meant for comparing old iOS devices against their midrange Android competitors). State of the art appears to be the Wild Life test, which was created for modern devices; maybe it is better optimized for Metal than its older sibling. I might be wrong, but it appears to be the only one of the four tests (Geekbench 5 vs. Sling Shot vs. Sling Shot Extreme vs. Wild Life) that is relevant for future applications.

Verdict: CPU-wise, the A12 clearly outperforms the A10X (Geekbench). For gaming (3DMark), the A12 is on par with, and mostly better than, its predecessor.

One last note (not to be taken too seriously): Apple didn't mention anywhere whether they use only one A12 processor in the new ATV4K. Maybe they developed an SLI-like system consisting of two A12s per box. :)
 
Hi there,

after years of silent reading I decided to register here and share my thoughts.

I agree with you regarding the missed opportunity to use a more powerful chip in the new ATV4K. But I'm really not sure whether the A12 is actually less powerful than the old A10X, even for gaming.

Same here. If you take a look at Apple's Newsroom article about the new ATV4K here, they state:
At the heart of the new Apple TV 4K is the A12 Bionic chip that provides a significant boost in graphics performance, video decoding, and audio processing.

I would think this means they've either overclocked the A12 GPU slightly, since it has active cooling, or done some other tinkering to improve GPU performance over the A10X.

Regardless, I'll probably wait for reviews on it and decide from there.
 
What I'm missing, and hoping someone here can explain, is what the real-life benefit of HDMI 2.1 is for most content on an Apple TV. Other than gaming, it seems like most movies are 24 fps? I have a TV with Dolby Vision and HDR10. I'm truly asking; a bit of a novice here. I have an Apple TV 4K and have it set to match the source frame rate and dynamic range for HDR content.
 
What I'm missing, and hoping someone here can explain, is what the real-life benefit of HDMI 2.1 is for most content on an Apple TV. Other than gaming, it seems like most movies are 24 fps? I have a TV with Dolby Vision and HDR10. I'm truly asking; a bit of a novice here. I have an Apple TV 4K and have it set to match the source frame rate and dynamic range for HDR content.
Some movies are shot at 60 fps, and 120 fps will likely appear in the future as well. I could see it doing wonders for documentaries.

See this article: https://www.pcmag.com/how-to/hdmi-21-why-it-matters-for-pcs-and-tvs-in-2021
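Slightly tangential, but since the question above mentions the match-frame-rate setting: that behavior isn't only a Settings toggle; tvOS apps can request frame-rate and dynamic-range matching per video through AVKit. A minimal sketch, assuming a tvOS app with its own player (the function and variable names here are placeholders):

```swift
import AVFoundation
import AVKit
import UIKit

// Sketch (tvOS 11.2+): ask the system to switch the display mode to match
// a video's native frame rate and dynamic range before playback starts.
func matchDisplayMode(to videoURL: URL, in window: UIWindow) {
    let asset = AVAsset(url: videoURL)
    let displayManager = window.avDisplayManager

    // Reflects the user's "Match Content" toggles in Settings; if they
    // are off, the system ignores the request.
    if displayManager.isDisplayCriteriaMatchingEnabled {
        displayManager.preferredDisplayCriteria = asset.preferredDisplayCriteria
    }
}
```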
 
I was misled by the A12 number vs. the A10 (I didn't even remember it was the X version) until I watched a YouTube video from MaxTech telling me the A12 is weaker than the A10X in the older Apple TV 4K. Then I came across this post. Luckily I did my research before hitting the buy button; thanks, Apple, for not letting me pre-order right away. Haha.

I may as well skip this generation. I don't use the remote much, as I always use my phone to control the Apple TV.
Well, it's "just" a hundred-something-dollar purchase over a 4-year period. I'll probably reconsider after the reviews are out.

Probably the only thing I'm buying from this big Apple announcement is the AirTags. Funny times!
 
Hi there,

after years of silent reading I decided to register here and share my thoughts.

I agree with you regarding the missed opportunity to use a more powerful chip in the new ATV4K. But I'm really not sure whether the A12 is actually less powerful than the old A10X, even for gaming.

Geekbench 5 lists the following results (already mentioned by others):
https://browser.geekbench.com/ios-benchmarks/

Geekbench 5 Test | A12  | A10X | Result
CPU Single Core  | 1112 | 831  | +34%
CPU Multi Core   | 2829 | 2273 | +24%
GPU (Metal)      | 5364 | 6971 | -23%

As you can see, the A12 performs better in the CPU tests, but worse in the GPU test.

From my understanding, when it comes to GPU gaming performance, THE benchmark to use is 3DMark. Apple Arcade is gaming, therefore we should look at a real gaming benchmark:
https://benchmarks.ul.com/hardware/phone/Apple+iPhone+XS+review
https://benchmarks.ul.com/hardware/tablet/Apple+iPad+Pro+2017+10.5+review

3DMark Test                     | A12  | A10X | Result
Sling Shot Unl. (OpenGL)        | 6721 | 6314 | +6%
Sling Shot Extreme Unl. (Metal) | 4330 | 4430 | -2%
Wild Life Unl. (Metal)          | 6174 | 5636 | +10%

What we see here is better performance from the A12. It only shows slightly worse results (-2%) than the A10X in the Sling Shot Extreme test. That test is somewhat outdated according to 3DMark (it is meant for comparing old iOS devices against their midrange Android competitors). State of the art appears to be the Wild Life test, which was created for modern devices; maybe it is better optimized for Metal than its older sibling. I might be wrong, but it appears to be the only one of the four tests (Geekbench 5 vs. Sling Shot vs. Sling Shot Extreme vs. Wild Life) that is relevant for future applications.

Verdict: CPU-wise, the A12 clearly outperforms the A10X (Geekbench). For gaming (3DMark), the A12 is on par with, and mostly better than, its predecessor.

One last note (not to be taken too seriously): Apple didn't mention anywhere whether they use only one A12 processor in the new ATV4K. Maybe they developed an SLI-like system consisting of two A12s per box. :)
Thanks for the additional data.

You can't discount Geekbench 5, though. They are the benchmark of benchmarks, and their Metal test was designed specifically to measure all-around graphics performance under Metal. The guys at Geekbench thought Metal was important enough to deconstruct it and figure out how best to measure it.

Ignoring Geekbench 5 is like saying that all media publications and blogs need to stop using Geekbench because they don’t know what they are doing.

Ultimately, there will probably be YouTube videos from guys like MaxTech where they put the two systems side-by-side and see which one stutters more. That’s what I’m waiting for to make my decision on whether to upgrade.
 
Thanks for the additional data.

You can't discount Geekbench 5, though. They are the benchmark of benchmarks, and their Metal test was designed specifically to measure all-around graphics performance under Metal. The guys at Geekbench thought Metal was important enough to deconstruct it and figure out how best to measure it.

Ignoring Geekbench 5 is like saying that all media publications and blogs need to stop using Geekbench because they don’t know what they are doing.

Ultimately, there will probably be YouTube videos from guys like MaxTech where they put the two systems side-by-side and see which one stutters more. That’s what I’m waiting for to make my decision on whether to upgrade.
Hi,

it was not my intention to discount Geekbench at all. It is definitely an important and highly respected benchmark. But since 3DMark has been around for ages as a trusted tool for measuring gaming performance, I considered it more relevant in that discipline and worth mentioning.

Nevertheless, as you said, side-by-side comparisons will hopefully show the performance differences...
 
There was a pretty solid rumor that there might be two ATVs released this year, one of which was supposed to be higher-end/gaming-focused... I think perhaps this is only the first release of the year, as it pretty much matches the prediction for the lower-specced model.
 
There was a pretty solid rumor that there might be two ATVs released this year, one of which was supposed to be higher-end/gaming-focused... I think perhaps this is only the first release of the year, as it pretty much matches the prediction for the lower-specced model.
Yes, agreed. They removed some gaming abilities from the new Siri Remote (e.g. no gyroscope) and moderately improved the ATV4K's CPU/GPU. This is now quite a pure high-end streaming box (with casual gaming abilities); maybe the beast for serious gaming will come later.
 
Hi,

it was not my intention to discount Geekbench at all. It is definitely an important and highly respected benchmark. But since 3DMark has been around for ages as a trusted tool for measuring gaming performance, I considered it more relevant in that discipline and worth mentioning.

Nevertheless, as you said, side-by-side comparisons will hopefully show the performance differences...

Well, I don't consider 3DMark "more relevant", because I see Geekbench everywhere, including from journalists who choose to ignore 3DMark. These guys could easily reference Geekbench for the processor and 3DMark for graphics, but most don't. And a 3DMark sighting is rarer now than five years ago.

I'm guessing the A10X, with its 12 graphics cores, is going to be at least equal to the A12, if not significantly better. But I'd be happy to be wrong, because I'd love to upgrade.
 
Easy marketing sales for Apple.
12 is a bigger number than 10...and a better remote. People not in the know will buy this new 2021 model.

Those in the know will sit on the sidelines with their current ATV, but they'll buy the new remote separately.

Maybe they’re not interested in the hardcore gaming market.
 
Yes, agreed. They removed some gaming abilities from the new Siri Remote (e.g. no gyroscope) and moderately improved the ATV4K's CPU/GPU. This is now quite a pure high-end streaming box (with casual gaming abilities); maybe the beast for serious gaming will come later.

Don't forget support for the PS5 / Xbox Series X controllers, added in tvOS 14.5. The message is that you shouldn't try to game with a Siri Remote.
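(For developers: once paired, these controllers surface through the standard GameController framework. A minimal detection sketch; the handler and logging here are just illustrative:)

```swift
import Foundation
import GameController

// Sketch: observe controller connections on tvOS. As of tvOS 14.5 the
// system also pairs DualSense and Xbox Series X|S controllers.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { notification in
    guard let controller = notification.object as? GCController else { return }
    print("Connected: \(controller.vendorName ?? "unknown controller")")

    // The extended gamepad profile covers the full stick/button layout.
    controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed { print("A / Cross pressed") }
    }
}
```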
 
This feels mostly right to me? Marketing, for sure. I don't think Apple was *ever* interested in 'hardcore' AAA gaming.

To what extent do some iOS /  Arcade games become 'more sophisticated' than dumb phone games? I mean, I don't know; that's sometimes obvious (BaSS) and sometimes not so obvious. I think hoping for a gaming-focused device (or variant of the  TV) is seductive wish-casting (I would totes love one!), but also sadly not a very realistic reading of Apple's behavior and demonstrated interest level.

If Apple wanted to make that gaming device, I think they would have either put the A14 / M1 into the  TV 4K (instead of the A12), or they could have offered a step-up option with the A14 / M1 and maybe even more memory. (They clearly aren't afraid of managing a ****-ton of SKUs.) But, as we saw, they conspicuously didn't do that.

I think the choice of the A12 is more prosaic: it is the lowest-end processor they are willing to keep manufacturing in bulk for the next couple or several years, across the  TV, the iPhone XR, and the bottom two iPads. Maybe they have a warehouse full of A8s for use in the  TV HD and A10s for the iPod touch? But those are obviously very low-volume products. If/when they step those up — maybe quite soon? — I bet they go to the A12 chip…

So, I guess my point is that choosing the A12 is even MORE insulting from a capability perspective than it appears at first, considering my supposition that it is the ********* chip Apple is putting in anything *new* from here on. And the cost delta between an A13 or A14 and an A12 can't be THAT much?! On a device that is grotesquely overpriced to begin with! Ugh.

If the specs are a decent improvement (TBD, obviously) I'll get a new 4k, but ****balls what a missed opportunity to do something better... [sigh]

Easy marketing sales for Apple.
12 is a bigger number than 10...and a better remote. People not in the know will buy this new 2021 model.

Those in the know will sit on the sidelines with their current ATV, but they'll buy the new remote separately.

Maybe they’re not interested in the hardcore gaming market.
 
MaxTech had a very convincing argument that this is not the AppleTV update we are looking for:

TL;DR: This is just a spec-bump update for what is going to be the entry-level AppleTV, with a newer AppleTV Arcade / AppleTV soundbar yet to come.
 
Thanks for the link. That's a really interesting video; I certainly hope he's correct! It makes a lot of sense, if you accept the theory that  is interested in or planning to make a more gaming-oriented media box. (Which they should!)

But we've been disappointed more than once in the recent past by rumored products not appearing. I hate to get too anticipatory only to be dashed later. Fingers crossed…
 
He kind of goes off the rails with the HomePod-integration theory. That path makes a lot of sense for a particular future version of the  TV, but personally, I think that pathway and the gaming-centered pathway are two very different roads to different product destinations. [shrug] I hope we find out!

It is entirely likely, of course, that  is building a bunch of prototypes trying to figure out where they should go in the living room. Maybe even testing out manufacturing with some of them. I'd wager the most modest possible thing is the most likely — so hopefully an ' TV Arcade' with an A14X/M1 class chip, more memory, and a *really* nice  designed controller. (I mean, I hope the controller is awesome, because otherwise it will probably be a total catastrophe! Apple's controller/remote track record is… not great.)

That sounds very appealing to me, but imagining very appealing  gear is not hard. :)

I don't understand why they wouldn't have either offered such a device last week, alongside the new 4K, or alternatively held the new 4K until whenever they are ready to show the gaming box. That we got the new 4K alone makes me wonder — as a realist/pessimist — whether they killed the gaming box…?
 
What I'm missing, and hoping someone here can explain, is what the real-life benefit of HDMI 2.1 is for most content on an Apple TV. Other than gaming, it seems like most movies are 24 fps? I have a TV with Dolby Vision and HDR10. I'm truly asking; a bit of a novice here. I have an Apple TV 4K and have it set to match the source frame rate and dynamic range for HDR content.
Well, sports and concerts, for one, will benefit from 2.1.

Also, nobody stops anybody from shooting a movie at 60 fps. Peter Jackson shot The Hobbit at 48. We're not in the film era anymore; it's just Hollywood being lazy.
 
Well, sports and concerts, for one, will benefit from 2.1.

Also, nobody stops anybody from shooting a movie at 60 fps. Peter Jackson shot The Hobbit at 48. We're not in the film era anymore; it's just Hollywood being lazy.
No, no, no!
24Hz is the Cinematic Experience! At any other frame rate, the magic is lost. It lives between the frames! If the frame rate is higher you won’t be able to catch it.
 
No, no, no!
24Hz is the Cinematic Experience! At any other frame rate, the magic is lost. It lives between the frames! If the frame rate is higher you won’t be able to catch it.
Funny how people believe there is a magic number for film speed. In the old days they chose 24 fps only because it was the slowest speed at which motion wouldn't look jagged, and, of course, to save film stock. It stuck because we used film for so long, and in the digital era it's still somewhat needed because of those old films. But why someone shooting a movie today wouldn't choose a faster speed is beyond me; 60 fps would be ideal.
 
Well, I don't consider 3DMark "more relevant", because I see Geekbench everywhere, including from journalists who choose to ignore 3DMark. These guys could easily reference Geekbench for the processor and 3DMark for graphics, but most don't. And a 3DMark sighting is rarer now than five years ago.

I'm guessing the A10X, with its 12 graphics cores, is going to be at least equal to the A12, if not significantly better. But I'd be happy to be wrong, because I'd love to upgrade.
I don't think GB5 is testing rasterization performance with its GPU test; it's measuring compute performance (and the two can differ). When talking about games, most of the time rasterization performance matters more than compute.
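To make the distinction concrete: a compute benchmark drives the GPU through kernels like the sketch below, with no vertex processing and no rasterizer in the loop. This is only an illustration (the kernel source and buffer sizes are made up), not what GB5 actually runs:

```swift
import Metal

// Sketch: the flavor of work a GPU *compute* benchmark exercises:
// a kernel dispatched over a buffer, with no vertices and no rasterizer.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)

var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```

A rasterization benchmark (3DMark-style) instead drives MTLRenderCommandEncoder with vertex/fragment shaders, depth testing, and blending, which exercises different parts of the GPU; that is one plausible reason the two benchmarks rank these chips differently.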
 
I don't think GB5 is testing rasterization performance with its GPU test; it's measuring compute performance (and the two can differ). When talking about games, most of the time rasterization performance matters more than compute.
So do we have a sense of whether the new A12  TV 4K is meaningfully more performant (especially for games) than the old one? Apple's marketing pablum says it is, but how reliable is that?

Gameplay gets more realistic with the A12 Bionic, which powers smoother motion and greater responsiveness.
That could pretty easily be a nothingburger, or it could be something meaningful, like a +50% improvement…
 
Truthfully, we will likely never know; no one makes a GPU benchmark for AppleTV. Maybe playing Oceanhorn 2 with an external capture device to get the frame rate could work (see the in-app sketch below for an alternative), but I don't think there is really anything demanding on AppleTV, games-wise, that would benefit from more power.
So do we have a sense of whether the new A12  TV 4K is meaningfully more performant (especially for games) than the old one? Apple's marketing pablum says it is, but how reliable is that?

Gameplay gets more realistic with the A12 Bionic, which powers smoother motion and greater responsiveness.
That could pretty easily be a nothingburger, or it could be something meaningful, like a +50% improvement…
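(If you could instrument the game yourself, which of course you can't for App Store titles, the usual in-app trick is a CADisplayLink frame counter rather than an external capture rig. A rough sketch:)

```swift
import Foundation
import QuartzCore

// Sketch: a crude frames-per-second counter driven by CADisplayLink.
// Only usable if you can add code to the app being measured.
final class FPSCounter: NSObject {
    private var link: CADisplayLink?
    private var frames = 0
    private var windowStart = CACurrentMediaTime()

    func start() {
        link = CADisplayLink(target: self, selector: #selector(tick))
        link?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        frames += 1
        let now = CACurrentMediaTime()
        if now - windowStart >= 1.0 {
            print(String(format: "%.1f fps", Double(frames) / (now - windowStart)))
            frames = 0
            windowStart = now
        }
    }
}
```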
 
From what has been stated, the 60 fps HDR support in AirPlay is for video from the new iPhone 12s. It is not clear, however, whether this will be enabled on the old Apple TVs. Regardless, if AirPlay 2, the iPhone 12, your TV, AND a new Apple TV all support 4K 60 fps HDR, that could make for some mighty fine home videos...
 