The framing of the issue as a "setback" and a failure is so sensationalising, and it doesn't make clear that research and development HAS to involve experimentation (DUH!) and plenty of trial and error. I wouldn't necessarily call it a "setback"; that word is too sensational and tries to steer the audience's mind in a particular way. Boo!

Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches are part and parcel of the work.

Setback, delay, reversal, what word do you want? It’s all the same thing. They planned a feature and didn’t ship it due to problems.

Failure is your word, not the article’s.
 


Apple planned a major generational update for the iPhone 14 Pro's graphics capabilities, but was forced to scrap plans for the new GPU late in development after "unprecedented" missteps were discovered, according to The Information.

[Image: A16-iPhone-14-Pro.jpeg]

In a paywalled report, The Information claimed that Apple engineers were "too ambitious" in adding new features to the graphics processor designed for the iPhone 14 Pro, including ray tracing – a lighting technique that achieves a greater level of realism in games. As a result, iPhone 14 Pro prototypes were found to draw much more power than expected, impacting the device's battery life and thermal management.

According to several individuals who claim to have first-hand knowledge of the incident and spoke to The Information, Apple discovered the flaw in the iPhone 14 Pro's GPU late in the device's development cycle, forcing it to hastily revert to a design based largely on the GPU from the A15 Bionic chip in the previous year's iPhone 13 lineup.

The incident is reportedly unprecedented in Apple's chip design history and explains why the iPhone 14 Pro shows only small improvements in graphics performance compared to the leaps made by previous iPhone generations. The error resulted in Apple restructuring its graphics processor team and moving some managers away from the project, including the exit of key figures who apparently contributed to Apple's emergence as a chip design leader.

The report goes on to reveal how Apple's chip design team has been forced to contend with a loss of talent in recent years, with the company having lost dozens of key people to various silicon design companies since 2019, along with interpersonal feuds and lawsuits involving chip startups.

Article Link: iPhone 14 Pro Faced 'Unprecedented' Setback Leading to Removal of New Graphics Processor
R&D is not a setback, it is an investment. There is a reason Apple leads the field.
 
The framing of the issue as a "setback" and a failure is so sensationalising, and it doesn't make clear that research and development HAS to involve experimentation (DUH!) and plenty of trial and error. I wouldn't necessarily call it a "setback"; that word is too sensational and tries to steer the audience's mind in a particular way. Boo!

Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches are part and parcel of the work.

What are you talking about? The issue became apparent well beyond the R&D stage, when everything should be locked in to start the massive production cycle. It's unprecedented; calling it a setback is not sensationalism, it's actually downplaying it.
 
I find it extremely unlikely that the team, when experimenting with a feature that everyone knows is power hungry, would overlook power/thermal management testing until late in the development cycle. So late, in fact, that a near-emergency, last-minute change had to be made.

I guess I don't have the same flavor of Kool-Aid that you're drinking?
The Kool-Aid that doesn’t try to create a nonsense defence and apologia for Apple? Do you not know what that term even means? You know that Apple doesn’t love you back, right?

“Everyone knows”? Well, apparently they didn’t, actually. There are infinite possibilities as to why. You’re just contradicting reported facts based on nothing. You’re so weird, man.
 
The Kool-Aid that doesn’t try to create a nonsense defence and apologia for Apple? Do you not know what that term even means?

“Everyone knows”? Well, apparently they didn’t, actually. There are infinite possibilities as to why. You’re just contradicting reported facts based on nothing. You’re so weird, man.
“Facts” that don’t line up with the actual timeframe and process of developing and manufacturing chips are not facts.

This whole thing reads like standard tech journalism these days. Someone who is slightly more tech literate, but still way out of their depth, has pieced together a narrative that doesn’t make any sense whatsoever when the details are examined, but it will be lapped up as “fact” by those who read it.

You don’t just swap out that much of the design of a chip that’s going to be produced in the hundreds of millions, you’d have to redesign the entire thing.

This entire story doesn’t pass the sniff test and it’s a damn shame so many here are using it to extrapolate further.

3nm is delayed, that’s the beginning and the end of what happened this year.
 
There is no need to read so much into this. Come on, there are many people still using an iPhone SE 2 or an iPhone X. This was not life changing; let's move along. It's better they pulled back than shipped a buggy product. When they fix this by the iPhone 16 or 17, they will still be ahead of the competition.

Where I think Apple is likely falling down with the iPhone is the area they have been putting a lot of emphasis on over the years: the camera. MKBHD's blind photo comparison voting contest showed just how lacking the iPhone camera is; my blind votes showed the iPhone 14 winning just 7. What is giving Apple its edge right now is the App Store and the tightly integrated ecosystem. But if you are really just buying the iPhone for its camera, you could actually get better value elsewhere. I'm not even talking about flagships.
 
Look, I like to make snarky comments as much as anybody. And a rumor about a scrapped feature is not as good as a reliable report of a successful implementation. But why aren't more people excited at even the slight possibility that Apple is working on vastly improving the gaming performance of Apple Silicon? Isn't ray tracing the one hardware feature Mac users envy the PC Master Race for? I, for one, salute the efforts of the Apple Silicon team! 🫡
Apple’s focus on improving GPU performance has been known for some time because it is crucial to their AR and VR products in development. Other products using Apple silicon will benefit, but they aren’t the focus.

This report might even be part of the reason Apple’s rumored glasses have apparently seen delays…
 
Look, I like to make snarky comments as much as anybody. And a rumor about a scrapped feature is not as good as a reliable report of a successful implementation. But why aren't more people excited at even the slight possibility that Apple is working on vastly improving the gaming performance of Apple Silicon? Isn't ray tracing the one hardware feature Mac users envy the PC Master Race for? I, for one, salute the efforts of the Apple Silicon team! 🫡
I don't seriously game on my Macs or PCs, and I'm not going to need anything beyond simple games on a phone, so a ray tracing engine just won't do anything for me. Just like a 48 MP camera. But I think for creators it's a whole different ballpark, so yes, it's exciting that some mythical M? processor will eventually get a serious GPU. I probably won't need it, though.

One thought, though: they may need it for AR/VR; our eyes get so many clues about our surroundings from light and shadows...
 
There is no need to read so much into this. Come on, there are many people still using an iPhone SE 2 or an iPhone X. This was not life changing; let's move along. It's better they pulled back than shipped a buggy product. When they fix this by the iPhone 16 or 17, they will still be ahead of the competition
Don't think it will be accomplished by next September on the iPhone 15? That's a year later as far as design/manufacturing goes. We also have iOS 17 in that same timeframe, with its numerous changes. :)
 
With ray tracing having come to ARM, perhaps this is why we have seen unreleased MacBooks appear in Steam logs. I am cautiously optimistic the upcoming M2 MacBook Pro will have ray tracing capabilities for better gaming and Blender performance. As it is right now, an M-series chip is not well suited for Blender, Maya, gaming, or other graphics-heavy applications.
 
I would respectfully choose ‘different’ rather than ‘better’ in your statement, but I understand your sentiment. In the ‘olden days’ Apple was a small company run by an authoritarian - a brilliant one, but still an authoritarian with the final say on most everything. Jobs threw a lot of ideas and products, some brilliant, some not so much, at the wall and saw what stuck. He was learning too.

A three-trillion-dollar company can’t do that: the supply chain is too large, costly, and complex, not to mention that a few really - let’s be charitable and say ‘odd’ - products can erase hundreds of billions of dollars in company value. I’m not saying that the current state of the company is better than the old state of the company, but I think few would deny that the market, problems, and economics are quite different today.

I could see Apple spinning off a group that would be modeled after the old Apple, willing to do just what the old Apple did, but without crashing a trillion dollars of value if it failed - and the old Apple did have its failures and successes. They could even fly the old Apple pirate flag!
The era that saw OS X being developed deserved criticism for how bad Mac OS was at the time, but the situation suddenly got a lot better, especially for users once more developers stopped being lazy about updating their software for the new APIs.

Snow Leopard was a fantastic, stable, reliable, solid OS at the end, likely because the core of Mac OS X was being optimized behind the scenes for the impending, but not yet publicly known, iPhone.

There were a fair number of years where you could see that the dictator at Apple cared about details. Apple’s own software offerings started being competitive with the big names… until someone at Apple made the decision to trash native Mac OS apps and back-port their iOS half-siblings over to Mac OS and consider them “upgrades”.

[insert lots of other things here, that I’ve talked about endlessly before]

Today, the OSes are bloated, mostly to serve iPhone sales and corporate “services” mentalities, and it is all so terribly buggy… to an extreme that makes me feel like my escape from the mess that was/is Windows was but a brief respite from the hell of the computer industry; a brief taste of reasonable reliability. A brief time where one company (Apple) actually forced its larger competitors to improve things, in general, for all of us.

That time is over. That Apple is gone. I stick with today’s Apple because it’s the least bad option. I miss when it was the superior choice for humane and reliable computing.
 
“Facts” that don’t line up with the actual timeframe and process of developing and manufacturing chips are not facts.

This whole thing reads like standard tech journalism these days. Someone who is slightly more tech literate, but still way out of their depth, has pieced together a narrative that doesn’t make any sense whatsoever when the details are examined, but it will be lapped up as “fact” by those who read it.

You don’t just swap out that much of the design of a chip that’s going to be produced in the hundreds of millions, you’d have to redesign the entire thing.

This entire story doesn’t pass the sniff test and it’s a damn shame so many here are using it to extrapolate further.

3nm is delayed, that’s the beginning and the end of what happened this year.
This whole article is classic Apple PR. Get everyone excited about a “bleeding edge” feature and then claim you had to “pull it at the last minute” for the sake of the user experience. Apple gets to have their cake and eat it too: claim they are on the “bleeding edge,” meaning some things may not be perfect, while also claiming they always put the customer experience over everything else, which means they don’t release anything until it is absolutely perfect. In reality, Apple never planned this feature for the iPhone 14 Pro, but this story is a nice way to set the narrative about how “customer focused” and “nothing ships until it’s perfect” Apple is, in the face of some major iOS 16, iPhone 14 Pro (camera focus, etc.), and Ventura (WiFi bug on M-series Macs) problems.
 
The report goes on to reveal how Apple's chip design team has been forced to contend with a loss of talent in recent years, with the company having lost dozens of key people to various silicon design companies since 2019
If only there was a way this could have been avoided.
 
O.M.G.
An interesting rumor!?
It's a Christmas Miracle!

While it's unfortunate they couldn't make it for the iPhone 14 launch, this is actually incredibly exciting news if true, especially on the Mac side.

As great as the A14/A15-derived GPUs have been, Apple's platforms, especially the Mac, are long overdue for a new GPU architecture that can better compete with Ampere/RDNA2 on features (mainly ray tracing) and Ada Lovelace/RDNA3 on performance, while hopefully maintaining Apple's massive lead in power efficiency.
 
Explains why we got last year's phone repackaged as a new phone this year.
As I pointed out in another thread, the SoCs are running into other issues as we try to go to 3 nm.
see
 
But why aren't more people excited at even the slight possibility that Apple is working on vastly improving the gaming performance of Apple Silicon?

Because it isn't difficult to create an Nvidia or AMD-class gaming GPU. Just create a card that eats up nearly 500W max, and tell system integrators they gotta cope with that power requirement and the resulting heat. If you're Apple making it for your own machines, it becomes Mac Pro exclusive, and you can support two of them before your machine needs to move to dedicated power infrastructure in the home or business.

Instead, Intel and Apple don't have competing GPUs because they don't have a business reason to do so. The trade-offs don't make sense for the majority of their market, and there's not enough volume for the high-end-gaming market to justify more competitors. In fact, I suspect the entire PC gamer market is too small for Apple to care about, even if gamers were willing to consider switching to Apple en masse.

Hardware ray tracing makes a very specialized task more efficient. It won't do anything to accelerate the vast majority of existing game titles. For that, you just need more horsepower (input power, dedicated silicon, and resulting heat). If you aren't willing to dedicate more power budget and silicon budget to the problem, you will just see a game of inches on optimization of the graphics architecture and on performance improvements due to newer silicon processes.
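To put that concretely: ray tracing is an explicit, opt-in path in the graphics API rather than a blanket speedup, which is why existing titles see nothing from it unless they are rewritten to use it. A minimal Swift sketch (purely illustrative, assuming Metal on an Apple GPU) of the capability check an app has to make before it can even take that path:

Code:
import Metal

// Illustrative only: ray tracing in Metal is something an app has to ask for
// explicitly, so titles that never adopt it get no benefit from the hardware.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

// Does this GPU expose Metal's ray tracing support at all?
print("Ray tracing supported: \(device.supportsRaytracing)")

// Apple7 corresponds to the A14/M1 GPU generation and newer.
print("Apple7-family GPU: \(device.supportsFamily(.apple7))")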

What will be interesting is if a future Apple architecture takes advantage of more of a chiplet-style design, and they pile GPU cores into desktop and workstation-class machines where it makes sense. I suspect they could scale very high with the same performance-per-watt due to the ridiculous unified memory architecture. They just need a way to actually accomplish that scale without it being a new processor for a few expensive BTO SKUs.
Isn't ray tracing the one hardware feature Mac users envy the PC Master Race for? I, for one, salute the efforts of the Apple Silicon team! 🫡

Not really. It is pretty, for the few titles that take advantage of it. Prettier at an unacceptable frame rate still means the gameplay is unacceptable.
 
I use After Effects, Cinema 4D, and the entire Adobe suite daily on my MacBook Pro and it’s incredible.

Can’t comment on Unity, but the Unity real-time renderer in Blender works fine.
Yes, I can comment on Unity. As of 2021.3.16f1 they have fixed serious performance issues (mostly related to Ventura, though) and it’s been a joy.

I recently bought a refurbished M1 Ultra (impulse buy, didn’t need it), but man… every time my PC colleague friends hit play on their computers (infinite-core AMD/Intel something, RTX this and that) to show something in play mode, a dreaded popup progress bar appears that can take up to a minute in our current project (even with a Fast Play mode option). Hitting stop to go back to editor mode can also take another minute.

On the M1 Ultra? Seconds… sometimes 4, sometimes 8. Hitting stop? Maybe a couple of seconds.

Sure, the M1 Ultra can’t draw more than ~100W (I think?), and doesn’t have all those RT cores (yet) overclocked and fed by an 800W power supply to chase the maximum benchmark score of the day… but for day-to-day use - hitting play and pause often, switching applications to model something in Blender (everything is faster except Cycles rendering, in practice), switching to FCP to edit some captured in-game video, Substance Designer, Affinity, etc. - working at the speed of thought hands down beats all those theoretical synthetic scores.

EDIT: forgot to add Houdini to that list too. I don’t do crazy millions-of-polygons things, though.
 
Because it isn't difficult to create an Nvidia or AMD-class gaming GPU. Just create a card that eats up nearly 500W max, and tell system integrators they gotta cope with that power requirement and the resulting heat.
Nobody cares! It's all about the integrated graphics performance of Apple Silicon.
If you're Apple making it for your own machines, it becomes Mac Pro exclusive, and you can support two of them before your machine needs to move to dedicated power infrastructure in the home or business.
Again. We're talking about the iPhone GPU, not the Mac Pro.
Instead, Intel and Apple don't have competing GPUs because they don't have a business reason to do so.
You need to stop pretending as if there's any competition for Apple. We're not talking about the PC gaming market, we're talking about gaming on Apple Silicon: iOS, iPadOS, and macOS.
The trade-offs don't make sense for the majority of their market, and there's not enough volume for the high-end-gaming market to justify more competitors.
There are no trade-offs. Apple invests in their own chip technology and the customers reap the benefits for free.
In fact, I suspect the entire PC gamer market is too small for Apple to care about, even if gamers were willing to consider switching to Apple en masse.
Of course it is too small, but luckily Apple owns the largest gaming market that ever was: the App Store. And the same Metal engine runs on everything.
Hardware ray tracing makes a very specialized task more efficient. It won't do anything to accelerate the vast majority of existing game titles.
Nobody cares about existing games. And that very specialized task is to create realism through natural lighting. That's useful for every virtual world.
For that, you just need more horsepower (input power, dedicated silicon, and resulting heat).
Not if you hardware-accelerate the task with a dedicated engine on the chip. Then the result is less heat and power consumption than doing it on the CPU or GPU.
If you aren't willing to dedicate more power budget and silicon budget to the problem, you will just see a game of inches on optimization of the graphics architecture and on performance improvements due to newer silicon processes.
The next process node shrink will add a few billion more transistors to the chip, in need of a good purpose. We already have a Neural Engine, which most people don't need. A Gaming Engine is a no-brainer. Of course you add it if you can.
What will be interesting is if a future Apple architecture takes advantage of more of a chiplet-style design, and they pile GPU cores into desktop and workstation-class machines where it makes sense.
It will be interesting when you get that there is no differentiation anymore between an iPhone SE and the Mac mini with M1 Ultra. They are meant to be the same platform, with the same basic chip architecture. A few more cores here and there make no difference.
I suspect they could scale very high with the same performance-per-watt due to the ridiculous unified memory architecture. They just need a way to actually accomplish that scale without it being a new processor for a few expensive BTO SKUs.
What part of “a feature for every iPhone” do you not get?
Not really. It is pretty, for the few titles that take advantage of it. Prettier at an unacceptable frame rate still means the gameplay is unacceptable.
The pictures are only prettier because ray tracing support makes the necessary calculations so much faster that you can actually switch the feature on. You could always calculate pretty pictures on any computer; ray tracing support on the chip is what makes the frame rates acceptable.
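For anyone wondering why dedicated hardware moves the needle on frame rates so much: every ray has to be tested against scene geometry, over and over, per pixel and per bounce. A toy Swift sketch (simplified to a single sphere, purely illustrative and not any vendor's actual hardware design) of the kind of intersection test that ray tracing hardware runs billions of times per frame, and that otherwise falls on the CPU or general-purpose GPU cores:

Code:
import simd

// Toy example: one ray tested against one sphere. A real frame is millions of
// rays against millions of triangles, which is why doing this in dedicated
// silicon instead of general-purpose cores is what makes frame rates usable.
struct Ray {
    var origin: SIMD3<Float>
    var direction: SIMD3<Float>   // assumed normalized
}

func hitSphere(_ ray: Ray, center: SIMD3<Float>, radius: Float) -> Float? {
    let oc = ray.origin - center
    let b = simd_dot(oc, ray.direction)
    let c = simd_dot(oc, oc) - radius * radius
    let discriminant = b * b - c
    guard discriminant >= 0 else { return nil }   // ray misses the sphere
    let t = -b - discriminant.squareRoot()
    return t >= 0 ? t : nil                       // distance to nearest hit
}

let ray = Ray(origin: .zero, direction: SIMD3<Float>(0, 0, -1))
if let t = hitSphere(ray, center: SIMD3<Float>(0, 0, -5), radius: 1) {
    print("hit at distance \(t)")                 // prints: hit at distance 4.0
} else {
    print("miss")
}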
 