Snow Leopard was a fantastic, stable, reliable, solid OS at the end, likely because the core of Mac OS X was being optimized behind the scenes for the impending, but not yet publicly known, iPhone.

Yep, Snow Leopard pretty much killed Vista, aka ‘Shorthorn’, lol. Win7 was sound, a nice upgrade for business but nothing really better for consumers, IMO. There are many articles about this.

There were a fair number of years where you could see that the dictator at Apple cared about details.

Yes he did, and I absolutely agree, with an exception or two …. I wish that we had him reviewing software now. How many iOS upgrades require 1-3 extra ‘clicks’ to accomplish the same thing?

The exceptions? One is the classic ‘You’re holding it wrong.’ No, he was designing it wrong, as evidenced by his customers’ input.

Today, the OSes are bloated, …

Absolutely. And bloated OSes are also less secure.
 
Snow Leopard was a fantastic, stable, reliable, solid OS at the end, likely because the core of Mac OS X was being optimized behind the scenes for the impending, but not yet publicly known, iPhone.

Snow Leopard shipped two months after iOS 3. Its predecessor Leopard shipped four months after iOS 1. That was indeed delayed to get iOS 1 out the door.
 
These things would've happened very early in the iPhone planning cycle. I'm not sure what the point of this news is, other than to plant some seeds to drum up demand for the 15 lineup. "Oh, did you hear? They had to scale back the A16 somehow. That means the A17 in the iPhone 15 Pro should be awesome!" And next year Apple will see huge demand for their iPhone 15 Pro with a price increase, and Tim Cook is happy.
 
No need for this whatsoever, unless you are a big-time gamer. Apple needs to worry about modem speeds. My used S21 blows away my coworker's iPhone 13 and his M1 iPad Pro on our network at school. And it gets worse at distance: transfers are 2 to 3 times faster with the Samsung. The iPhone CPU and GPU reached "fast enough" for most people back in the A13 models. Apple needs to make a more well-rounded device, as transfer speeds over the network are becoming a real hindrance.
 
Nobody cares! It's all about the integrated graphics performance of Apple Silicon.
You asked why nobody cares. It is because it isn't happening.

The only way they can increase generalized graphics performance by significant margins is to actually dedicate more of their power and silicon budget to gaming performance. This is simply not possible for mobile phone chips - because the batteries can only get so large before they can no longer be taken on planes, and the phones can only get so hot until people can no longer hold them and they start to have to throttle to keep components from failing.
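To put rough numbers on that wall, here's a back-of-envelope sketch; every figure below is an illustrative assumption, not an Apple spec:

```swift
import Foundation

// Back-of-envelope look at the mobile GPU power wall.
// All numbers are illustrative assumptions, not Apple specifications.
let batteryWh = 12.7                         // assumed iPhone-class battery capacity (Wh)
let airlineLimitWh = 100.0                   // typical carry-on limit for lithium batteries (Wh)
let sustainedDrawsW = [4.0, 6.0, 15.0, 60.0] // assumed sustained GPU draws, phone to desktop-class

for watts in sustainedDrawsW {
    let hours = batteryWh / watts
    print(String(format: "%4.1f W sustained -> %.1f h on a %.1f Wh battery", watts, hours, batteryWh))
}

// Growing the battery toward the ~100 Wh airline ceiling only scales runtime
// linearly, and does nothing for the heat a handheld chassis can shed.
print(String(format: "Battery headroom to the airline limit: %.1fx at most", airlineLimitWh / batteryWh))
```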
Again. We're talking about the iPhone GPU, not the Mac Pro.
If you want to limit "Apple Silicon" to just the iPhone GPU, then no. There is zero chance there are going to be significant increases to catch up with desktop-class or console-class GPUs.

You need to stop pretending as if there's any competition for Apple. We're not talking about the PC gaming market, we're talking about gaming on Apple Silicon. iOS, iPadOS and macOS.
There is no competition in the market of Apple Silicon running on Apple devices, because only Apple can manufacture them :p
There are no trade offs. Apple invests in their own chip technology and the customers reap the benefits for free.
Architecturally there are a ton of trade offs, some of which I listed.

Of course it [PC gaming market] is too small, but luckily Apple owns the largest gaming market that ever was, the App Store. And the same Metal engine runs on everything.
The App Store does not have PC gaming titles, both because of porting difficulty and because the systems do not have sufficient horsepower.

I'll need a citation for Apple owning the largest gaming market that ever was, given that there are more Android devices than iPhones by a fair margin.
Nobody cares about existing games. And that very special task is to create realism through natural lighting. That's useful for every virtual world.
Lots of people care about high quality, existing games.

Very few people care about "metaverse" virtual worlds.
The next process node shrink will add a few billion more transistors to the chip in need of a good purpose. We already have a Neural Engine, which most people don't need. A Gaming Engine is a no-brainer. Of course you add it if you can.
Process shrinks do not "add" transistors. Adding transistors is a cost-and-yield equation. Newer processes shrink the die, but typically come with lower yield and higher per-area costs. It is the improved power efficiency (and, in the CPU's case, the reduced infrastructure needed for signal travel time) that allows them to scale to more cores or a higher clock.
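A minimal sketch of that cost-and-yield equation, using the classic Poisson yield model; the wafer costs and defect densities are made-up illustrative inputs, not foundry figures:

```swift
import Foundation

// Cost per *good* die under a simple Poisson yield model.
func goodDieCost(waferCost: Double, waferAreaMM2: Double,
                 dieAreaMM2: Double, defectsPerMM2: Double) -> Double {
    let diesPerWafer = waferAreaMM2 / dieAreaMM2      // ignores edge loss for simplicity
    let yield = exp(-defectsPerMM2 * dieAreaMM2)      // Poisson yield model
    return waferCost / (diesPerWafer * yield)
}

// Mature node: cheaper wafer, lower defect density, bigger die.
let mature = goodDieCost(waferCost: 9_000, waferAreaMM2: 70_000,
                         dieAreaMM2: 120, defectsPerMM2: 0.001)
// Leading edge: the shrink makes the die smaller, but the wafer costs more
// and early defect density is higher, so a good die can still cost more.
let leading = goodDieCost(waferCost: 17_000, waferAreaMM2: 70_000,
                          dieAreaMM2: 100, defectsPerMM2: 0.002)

print(String(format: "Mature node:  $%.2f per good die", mature))
print(String(format: "Leading edge: $%.2f per good die", leading))
```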

It will be interesting when you realize that there is no differentiation anymore between an iPhone SE and the Mac mini with M1 Ultra. They are meant to be the same platform, with the same basic chip architecture. A few more cores here and there make no difference.

There is no M1 Ultra Mac mini, and there never will be.

If you are talking about the Mac Studio with M1 Ultra, then the difference is that there is a different balance of performance vs. efficiency, a much wider memory architecture, additional infrastructure for Rosetta, additions of PC hardware infrastructure (e.g. USB4, SSD, Thunderbolt, HDMI), 8x the RAM, and 4-6x the number of cores.

There are other features where it is harder to tell whether they are present in the A15 vs. the M1, such as virtualization/paravirtualization support and GPU backward compatibility for non-tile families in Metal.

The max TDP of the iPhone SE processor and GPU is 6W. The Ultra is 60W. It's literally an order of magnitude difference to support the additional features and processing power.
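As an aside on the A15-vs-M1 feature question: some of this can be probed at runtime. The Metal queries below (supportsRaytracing, supportsFamily, hasUnifiedMemory) are real API; which answers a given chip returns is exactly the point under discussion:

```swift
import Metal

// Probe the default GPU's feature set at runtime.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Hardware ray tracing: \(device.supportsRaytracing)")
    print("Apple 7 family (A14/M1 tier): \(device.supportsFamily(.apple7))")
    print("Mac 2 family (Mac feature set): \(device.supportsFamily(.mac2))")
    print("Unified memory: \(device.hasUnifiedMemory)")
}
```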
What about 'a feature for every iPhone' do you not get?

The pictures are only prettier because ray tracing support makes the necessary calculations so much faster that you can actually switch the feature on. You could always calculate pretty pictures on any computer; ray tracing support on the chip is what makes the frame rates acceptable.
The frame rates are not acceptable with zero ray tracing. Adding some zero-cost hardware ray tracing might make the shading look nicer, but it won't improve the texel count. Best-case scenario: the current 10 fps for a demanding title will be maintained, but now those frames will be pretty enough to frame.
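For a sense of what "the necessary calculations" actually are, here is a minimal software ray-sphere intersection test. This is purely illustrative; real tracers add acceleration-structure traversal on top, which is the part dedicated RT units speed up:

```swift
import simd

// Classic quadratic-formula ray-sphere intersection: returns the nearest
// hit distance along the ray, or nil on a miss. A software ray tracer runs
// millions of tests like this (plus BVH traversal) per frame.
func intersect(origin: SIMD3<Float>, direction: SIMD3<Float>,
               center: SIMD3<Float>, radius: Float) -> Float? {
    let oc = origin - center
    let a = simd_dot(direction, direction)
    let b = 2 * simd_dot(oc, direction)
    let c = simd_dot(oc, oc) - radius * radius
    let discriminant = b * b - 4 * a * c
    guard discriminant >= 0 else { return nil }         // ray misses the sphere
    let t = (-b - discriminant.squareRoot()) / (2 * a)  // nearer of the two roots
    return t >= 0 ? t : nil
}

// One ray fired down -z at a unit sphere three units away: hits at t = 2.
let hit = intersect(origin: .zero, direction: SIMD3(0, 0, -1),
                    center: SIMD3(0, 0, -3), radius: 1)
print(hit.map { "hit at t = \($0)" } ?? "miss")
```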
 
There are no trade offs. Apple invests in their own chip technology and the customers reap the benefits for free.
This is a bizarre statement.
R&D carries costs. Silicon area costs directly, and as it impacts yields.
And of course doing computational work costs energy, which impacts battery life, cooling requirements, et cetera.
Putting stuff into the SoC that sees little to no actual use is a waste. A waste whose consequences are ultimately pushed to the end user.
Nobody cares about existing games. And that very special task is to create realism through natural lighting. That's useful for every virtual world.
You’ve got that completely backwards.
People ONLY care about the games they can actually play. What algorithms might be used for ambient occlusion on hypothetical games in the future is utterly insignificant.
The next process node shrink will add a few billion more transistors to the chip in need of a good purpose.
They can be used to improve cache latencies, video codec support, or a plethora of other things that see actual use for all or a large number of users. Or saved, to cut costs and reduce power draw, improving battery life. It’s not as if there aren’t options.
We already have a Neural Engine, which most people don't need. A Gaming Engine is a no-brainer. Of course you add it if you can.
The bizarre thing is that you equate gaming and RT, when RT hardware was added by Nvidia to their GPUs in an attempt to lock in the rendering market, and what little adoption there has been in games has typically been funded by them directly.

Last time I looked, RTX 3070 desktop GPUs and up were 5% of the Steam hardware base. (And that’s not to say that these people actually use that hardware feature; most seem to do what I do: flick the switch if available in settings, check it out, and turn it off. Not worth it even with the required hardware.)
To be counted in the Steam survey you have to both have an account and actively use it, so those 5% represent a sampling from the more active PC gamers. And PC gamers represent roughly a quarter of gaming software revenue, so those 5% are roughly 1% of the gaming software market. And again, most of us in that percent just have capable hardware; it doesn’t mean we actually care.
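Making that back-of-envelope arithmetic explicit (both inputs are my own rough figures from above, not measured data):

```swift
import Foundation

// Rough share of the overall gaming software market held by RT-capable high-end PCs.
let rtCapableShareOfSteam = 0.05   // RTX 3070-and-up share of the Steam survey
let pcShareOfGameRevenue = 0.25    // PC's rough share of gaming software revenue
let shareOfMarket = rtCapableShareOfSteam * pcShareOfGameRevenue
print(String(format: "RT-capable high-end PCs: about %.2f%% of the gaming software market",
             shareOfMarket * 100))
// Prints 1.25%, i.e. "roughly 1%" as stated.
```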

The pictures are only prettier because ray tracing support makes the necessary calculations so much faster that you can actually switch the feature on. You could always calculate pretty pictures on any computer; ray tracing support on the chip is what makes the frame rates acceptable.
That’s debatable. There is always a compromise.

Presence or absence of RT hardware is irrelevant in a market that is dominated by games that aren’t even 3D. The hardware on iPhones is capable of vastly more sophisticated rendering than is actually utilized. Pretending that the gaming market on iOS looks like it does graphically due to technical limitations of the hardware is ridiculous.

And promoting a very inefficient way to address lighting seems like a bad call given the cost/complexity/density wall that lithographic technology faces. Efficiency is the order of the day. Intelligent upscaling is a good example, not RT.

RT might make for a nice marketing checkbox, at least if you are targeting the demographic that has been affected by Nvidia’s marketing budget. I don’t think that really describes iPhone users in general, though.
 
The GPU in an iPhone really doesn't need to be more powerful unless they allow it to have USB-C and then let it connect to an external display like a computer.

Perhaps not, but those chips go into other devices like iPads and Apple TV too. If Apple TV wants to have true console-competitive gaming, for example, then it will need a more powerful GPU one day.
 
Perhaps not, but those chips go into other devices like iPads and Apple TV too. If Apple TV wants to have true console-competitive gaming, for example, then it will need a more powerful GPU one day.
I look at the library of games the Nintendo Switch has, even though it's still using a years-old chip. If Doom Eternal can be optimised for that, it can run on a modern-day iOS device without any issues. You don't need a high-end console to play a game like Pokémon or Animal Crossing either.

I think the problem is that people have become conditioned to freemium games on iOS, and developers find it a challenge to release AAA titles on the iOS game store for fear that consumers simply won't pay $60 for such a title. It may also be a market that Apple is willing to give up (i.e., the freemium market and the AAA gaming market are mutually exclusive, and at the moment Apple is happy to dominate the mobile gaming market at the expense of the latter, because that's where the money is, predatory as these games tend to be).
 
R&D carries costs.
Divided by 200+ million iPhones in the first year. It's a rounding error.
Silicon area costs directly, and as it impacts yields.
Nah, you pay a premium for iPhones anyway. The A-series chips will always spend huge amounts of die area on special tasks. The only question is what they are going to improve: photography, neural networks, or gaming?
And of course doing computational work costs energy, which impacts battery life, cooling requirements, et cetera.
There's a lot of gaming going on on the iPhone right now. It will just run more efficiently with better details.
Putting stuff into the SoC that sees little to no actual use is a waste.
No, it's a luxury.
You’ve got that completely backwards. People ONLY care about the games they can actually play.
People want and need iPhones and they care only about games which run on iPhones.
What algorithms might be used for ambient occlusion on hypothetical games in the future is utterly insignificant.
Right now. But this is a rumors site to discuss future developments.
[Transistors] can be used to improve cache latencies, video codec support, or a plethora of other things that see actual use for all or a large number of users. Or saved, to cut costs and reduce power draw, improving battery life. It’s not as if there aren’t options.
And among these options Apple chose Ray Tracing as an area in which to improve Apple Silicon.
The bizarre thing is that you equate gaming and RT, when RT hardware was added by Nvidia to their GPUs in an attempt to lock in the rendering market, and what little adoption there has been in games has typically been funded by them directly.
Nobody cares what Nvidia is doing or why. The iPhone only works because Apple is doing everything themselves. They don't even let others help with their chip design anymore. All the priorities which influence performance are set by Apple. We're only talking about why Apple might add ray tracing to their own chips.
Last time I looked, RTX 3070 desktop GPUs and up were 5% of the Steam hardware base. (And that’s not to say that these people actually use that hardware feature; most seem to do what I do: flick the switch if available in settings, check it out, and turn it off. Not worth it even with the required hardware.)
Nobody cares! iPhones and iPads are 100% of the App Store gaming market.
That’s debatable. There is always a compromise.
The compromise is that German cars aren't cheap ... and neither are Apple devices.
And promoting a very inefficient way to address lighting seems like a bad call given the cost/complexity/density wall that lithographic technology faces. Efficiency is the order of the day.
No, iPhones and iPads are already way too powerful. Using Apple's performance surplus for something totally amazing, which Android phones can't even dream of, is the name of the game.
RT might make for a nice marketing checkbox, at least if you are targeting the demographic that has been affected by Nvidia’s marketing budget. I don’t think that really describes iPhone users in general, though.
Ray Tracing was cool long before Nvidia began to sell it. In fact they are selling it in a totally unattractive package even for the vast majority of 1st world hedonist gamers. PC gaming is a dead end, because it is tied to awful PCs. Nobody wants a PC.
 
These things would've happened very early in the iPhone planning cycle. I'm not sure what the point of this news is, other than to plant some seeds to drum up demand for the 15 lineup. "Oh, did you hear? They had to scale back the A16 somehow. That means the A17 in the iPhone 15 Pro should be awesome!" And next year Apple will see huge demand for their iPhone 15 Pro with a price increase, and Tim Cook is happy.
Bingo bango! This is just typical Apple PR: “We had to pull our bleeding-edge advancement at the very last minute to perfect it for our customers, who we love. When this is ready, we can’t wait to see the incredible things our customers will do with it!”
 
I have not read all the comments in this thread, so I am not sure if I am repeating anything already volunteered. If I am, I apologize in advance.

In the 'old' Apple days, Apple would not have released a product that was not ready as planned; they would have held it until it was ready. It is a sad fact that those days are long gone and Apple is forced to march to the need for revenue at specific times of the year. The idea that the iPhone 14 could have been delayed by some time (say 6 to 12 months) is now unthinkable. I would have waited, to be honest, but Apple being driven by competition and revenue rather than product integrity is now a fact and has been for a while.
 
In the 'old' Apple days, Apple would not have released a product that was not ready as planned; they would have held it until it was ready. It is a sad fact that those days are long gone and Apple is forced to march to the need for revenue at specific times of the year. The idea that the iPhone 14 could have been delayed by some time (say 6 to 12 months) is now unthinkable.

How is the iPhone 14 “not ready”? Cutting a feature to meet a deadline is precisely how you make something ready.

Mac OS X often did this. You think Apple wanted to ship 10.0 without disc burning and DVD playback? Or iPhoneOS 1.0 without copy and paste? No, they did it to meet a deadline.
 
Again, you totally missed the part that described how the GPU upgrade wasn’t nearly as significant as in previous years. The Kool-Aid must taste pretty good; otherwise, why would you do this nonsense?

You’re like the Trump people who will say anything, make up anything and believe anything – and attack journalists – in the name of the belief that Apple Is Perfect And Can Do No Wrong, Ever.
I don’t know how you took that Apple can do no wrong from my post. They’re one of the biggest tax cheats on the planet.

Anywho, the barrier here is that these features are designed for the 3nm process node.
I have not read all the comments in this thread, so I am not sure if I am repeating anything already volunteered. If I am, I apologize in advance.

In the 'old' Apple days, Apple would not have released a product that was not ready as planned; they would have held it until it was ready. It is a sad fact that those days are long gone and Apple is forced to march to the need for revenue at specific times of the year. The idea that the iPhone 14 could have been delayed by some time (say 6 to 12 months) is now unthinkable. I would have waited, to be honest, but Apple being driven by competition and revenue rather than product integrity is now a fact and has been for a while.
Apple has always operated in a capitalistic system.
 
The only way they can increase generalized graphics performance by significant margins is to actually dedicate more of their power and silicon budget to gaming performance. This is simply not possible for mobile phone chips - because the batteries can only get so large before they can no longer be taken on planes, and the phones can only get so hot until people can no longer hold them and they start to have to throttle to keep components from failing.

Yep. For a portable-device GPU, you want energy efficiency, and the path to useful processing power is pruning away the nice-to-have features while leaving the must-have features. (You could draw an analogy with RISC itself, which uses the same principle at an instruction-set level.)

For desktop gaming, it's okay if your GPU needs a nuclear submarine to power it. Raytrace to your heart's content. For something like an F1 racing simulator, seeing the reflections on your hood update live is probably a nice effect.

Does a phone need raytracing hardware? No f'n way. Ask most users, "would you trade this niche capability (super graphics on games pretty much nobody actually plays on their phones) for longer battery life and less heat" and I suspect the latter would win pretty handily.

There is a slice of the gamer world that thinks only AAA games are games, and only GPUs that require a nuclear submarine to power them are GPUs. They are wrong. Apple is laughing all the way to the bank in casual gaming, a lot of which is 2D anyway.
 
In the 'old' Apple days, Apple would not have released a product that was not ready as planned; they would have held it until it was ready.
You mean like they hold back the large iMac? Apple is still Apple, but the iPhone is the iPhone and there will be a new one every year, with the features which are ready now. Other manufacturers release a dozen new phones every quarter. So Apple is still limiting itself to what they think is doable. Not every development is an instant success.
 
For a portable-device GPU, you want energy efficiency, and the path to useful processing power is pruning away the nice-to-have features while leaving the must-have features.
No! Everything Apple sells is a luxury product brimful of nice-to-haves. For must-haves you buy a cheap Android phone. To achieve energy efficiency you make one way of graphics acceleration mandatory and optimize this path along the full hardware and software trajectory.
For desktop gaming, it's okay if your GPU needs a nuclear submarine to power it. Raytrace to your heart's content. For something like an F1 racing simulator, seeing the reflections on your hood update live is probably a nice effect.
For desktop gaming, first you throw out the desktop and build an iMac that's basically a large iPad with an integrated stand. Then you inherit all the efficiency of mobile computing and you advance performance from there.
Does a phone need raytracing hardware? No f'n way. Ask most users, "would you trade this niche capability (super graphics on games pretty much nobody actually plays on their phones) for longer battery life and less heat" and I suspect the latter would win pretty handily.
There is no meaningful difference between an ARM iPhone and an ARM Mac. It's all one platform with a bunch of different screen sizes and performance levels. This entire platform benefits from raytracing at every level. Do we need smooth scrolling, shadows and reflections, jiggling and bouncing? Or even rounded corners! Nobody needs anything of what makes using an Apple device nice. It's all a luxury we want. So the real question is: do you want more realism in gaming? Hell, yeah! I do.
 
Same processor as last year but overclocked. And a bit more energy efficient than last year to make up for the overclocking. Makes you rethink all the drama of the iPhone 14 having the same processor as last year… 🤣

If nobody posted it already:
 
Same processor as last year but overclocked. And a bit more energy efficient than last year to make up for the overclocking. Makes you rethink all the drama of the iPhone 14 having the same processor as last year… 🤣

If nobody posted it already:
The A16 is pretty different in the CPU department and in memory support.

 
No! Everything Apple sells is a luxury product brimful of nice-to-haves. For must-haves you buy a cheap Android phone. To achieve energy efficiency you make one way of graphics acceleration mandatory and optimize this path along the full hardware and software trajectory.
That's what they did. They optimized raytracing out because it's (a) almost completely irrelevant in almost every iPhone use case and (b) doesn't synergize with the one way of graphics acceleration Apple *did* make mandatory, tile-based deferred rendering via Metal 2.

A-series and M-series chips are two different series. But even with M-series, you've still got energy efficiency as a central goal because of their use in laptops.

Your argument seems to be that one size must fit all, and therefore iPhones must play AAA games, no matter how scorchingly hot they get and how few minutes a charge lasts, despite the fact that people don't buy iPhones in the mistaken belief that they're AAA game consoles in disguise.
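For what it's worth, the TBDR synergy mentioned above is concrete in Metal: intermediate render targets can be declared memoryless, so they live entirely in on-chip tile memory and never touch DRAM. A minimal sketch (the resolution is just an illustrative number; memoryless storage is only available on tile-based GPUs like Apple's):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

// A depth buffer that exists only in on-chip tile memory: zero DRAM bandwidth.
let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .depth32Float,
                                                    width: 1170, height: 2532,
                                                    mipmapped: false)
desc.storageMode = .memoryless   // tile memory only; contents never hit DRAM
desc.usage = .renderTarget       // memoryless textures must be render targets

let depthTarget = device.makeTexture(descriptor: desc)
print("Created memoryless depth target: \(depthTarget != nil)")
// Bandwidth saved this way is the kind of efficiency win that competes with
// spending the same transistors on ray-tracing hardware.
```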
 
Setbacks are always a part of the R&D process. When you try to really push a technology you’ll inevitably hit a few walls.
I will say, real-time ray tracing on a mobile processor is pretty ambitious. 🤔

Also, losing talent always results from bad management [which affects culture and morale] and poor incentives.
 
No! Everything Apple sells is a luxury product brimful of nice-to-haves. For must-haves you buy a cheap Android phone. To achieve energy efficiency you make one way of graphics acceleration mandatory and optimize this path along the full hardware and software trajectory.

For desktop gaming, first you throw out the desktop and build an iMac that's basically a large iPad with an integrated stand. Then you inherit all the efficiency of mobile computing and you advance performance from there.

There is no meaningful difference between an ARM iPhone and an ARM Mac. It's all one platform with a bunch of different screen sizes and performance levels. This entire platform benefits from raytracing at every level. Do we need smooth scrolling, shadows and reflections, jiggling and bouncing? Or even rounded corners! Nobody needs anything of what makes using an Apple device nice. It's all a luxury we want. So the real question is: do you want more realism in gaming? Hell, yeah! I do.
There’s one major difference: battery size.

That doesn’t seem to matter to you, but power dictates what you can do with it. There isn’t the power or die budget available to do this on 5nm, hence the entire premise of this article is techno-illiterate ********.

My guess as to what actually happened: this was supposed to be 3nm, and that node isn’t ready given the supply chain issues of the last two years. Apple looked at the feasibility of doing this on 5nm and quickly abandoned it given heat and power concerns, so we got the A16 as-is.

The “journalist”, who doesn’t understand anything about chip manufacturing, got his wires crossed; even at face value, the story he handed in doesn’t make sense if you understand what goes into chip manufacturing, yet everyone now believes it because it’s been “reported”.

Btw, remember Bloomberg’s big fat lie about secret Chinese chips being in everyone’s data centers? That was “reported” as well, and it was complete BS.
 
That's what they did. They optimized raytracing out because it's (a) almost completely irrelevant in almost every iPhone use case and (b) doesn't synergize with the one way of graphics acceleration Apple *did* make mandatory, tile-based deferred rendering via Metal 2.

No, they left it out because they didn’t finish it in time, and didn’t consider it high-priority.

A-series and M-series chips are two different series.

They have exactly the same GPU cores. Not the same clock and not the same amount, but the same feature set.

 
This reminds me of how Intel does/did things. It's like Apple is starting to believe their own PR/hype. A smart company would start to lengthen the design cycles instead of letting their useless managers keep pushing unrealistic expectations.
 