I returned my iPhone 14 Pro. There was literally no noticeable difference between it and the 13 Pro.
The 48MP raw photos are amazing (YOLO, so capture it now), the reduced power consumption helps prevent throttling and display shutoffs when shooting in warm weather, and Bluetooth connectivity/responsiveness is improved.
 
That’s someone else’s case/meme, but I’ve had two situations where 100% chances to hit were missed :)
“That’s XCOM, baby”
Once something similar happened when a soldier was right next to an enemy with only a 70% chance to hit; luckily, tango down.

Anyway, I recommend it on iPad for sure; for me it was a better experience than the lagging, frame-dropping XCOM 2 on PS5.

Yeah well, I got the first one when it was new on PC, and I can't stand the turn limit in 2. And it looks like Feral has ported a few other games that I already have or I'm not interested in.

Anything nicely designed *for* these touch devices?
 
People should stop thinking that Apple designs phones or any other hardware hoping that people upgrade every year. They are targeting people who are using an iPhone 11 or 12; very few people upgrade from a 13 to a 14.

As the primary target for unit volumes? No. Does Apple target some? Yes.


(You have to give your not-yet-paid-off phone back for essentially little credit, and perpetual AppleCare is buried into the program. But they are playing to those who ‘have to have’ the latest ‘shiny’ or suffer FOMO withdrawal sickness.) Apple does not discourage this kind of churn at all. But they are not counting on it as the primary fiscal basis of phone development either.


There is a certain stream of refurbished units they need to hand out as warranty replacements. This is a likely predictable, steady supply of product and/or refurb components (and insurance revenues cover what they need to buy new).
 
Nvidia just got acceptable ray tracing performance in its recent 500W+ monstrosities. Trying to do the same on a battery-powered phone chip is completely nuts… what were they thinking?
Remember when people said you couldn't use a phone processor in a desktop computer? They were thinking outside the box, not constrained by dogma.
 
I would much rather have longer battery life than better graphics on an iPhone. The current iPhone graphics are more than capable. Please make the battery better. My iPhone 14 Pro Max (Always-On Display off, Low Power Mode on) only lasts about an hour longer than my 2020 iPhone, which is over two years old.
 
It’s laughable that people dismissed key figures leaving Apple’s chip team as insignificant. They left with all their experience, which Apple now has to find someone to replace. That’s never easy if your chip designs are pushing boundaries; you need the best of the best talent to do that year after year. Such talent is not easily, if even possibly, replaceable.
 
The framing of the issue as a "setback" and failure is so sensationalizing; it doesn't make clear that part of research and development HAS to involve experimentation (DUH!) and various trials and errors. I wouldn't necessarily call it a "setback", which is too sensational and tries to steer the audience's mind in a particular way. Boo!

Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches is part and parcel of the work.
Pulling the GPU design at the last moment and substituting the previous generation after discovering a flaw in the design is unquestionably a setback and a failure. You don't get to the front door of production, pull the GPU, and call it anything less than that.

All of the things you mention are done and completed before you go to production on your device. You are not going to make millions of units while you are still experimenting and doing R&D, are you? I wouldn't, and it seems Apple wouldn't either.

The article also makes it clear heads rolled and the team was broken up with some engineers either canned or shifted to other projects. This is not about R&D and experimentation or "trials and errors". That's what comes before you get set to produce.

It is pretty clear Apple's leadership team decided it was a failure and time to go back to the drawing board. I think most people would agree with that assessment.
 
When Steve Jobs was running Apple, he focused on fewer features that just worked.
But people want a new UI design every year now, so they are forced to update the UI and add new features every year; if they don't, people will say Apple is not innovating.
As they add more features, they break stuff.
Apple should stop adding features and focus on making the OS stable.
This is nonsense. With the exception of Snow Leopard, every major OS release under Jobs, both Classic Mac OS and Mac OS X, was filled with loads of new features, many of them half-baked, and a whole host of bugs and issues. Mac OS X went through multiple different UI designs, from the original Aqua to Brushed Metal to Leopard's Unified look. That's to say nothing of all the half-baked features and design choices Apple rolled out under Jobs' tenure: the round iMac mouse, the buttonless iPod shuffle, the implementation of iPod photos, MobileMe, iTunes Ping (and the general monstrosity iTunes grew into over that period), QuickTime 4 and Sherlock 2's interfaces... I could go on.

Don't get me wrong there are plenty of issues with Apple's current management, but let's not pretend Apple's execution under Jobs was perfect.
 
Wow, this news is rather shocking to me. After several years of enjoying a Bionic series that "just works" and soundly beats the competition at being reliable, efficient, and top of benchmarks, it didn't occur to me they'd produce one that threatened to sink an entire generation of iPhones this way.
They didn't "produce one that threatened to sink an entire generation of iPhones this way." They pulled the design prior to production. Nothing was sunk.
 
For the device that I, at most, want to charge once per day, preferably less, I would be much more impressed by an iPhone that's maybe not significantly more graphically capable than a 12-14 Pro but instead has far better battery life.
 
It would be nice to know this prior to the phone going on sale! Let Apple work out the bugs and then earn my money when the ambitious tech enters mass production.
 
I'm really shocked that Apple would get rid of so many people who were involved. Mistakes happen, and this report makes Apple sound like a toxic workplace.
This was not a case of making a mistake. This was a major screw up that probably cost them tens of millions. A problem of this magnitude requires strong corrective action and that involves holding people who screwed up responsible. And I don't agree that the report gives the impression of a toxic workplace.

I am not sure a lot of the commenters here who seem to think management's response was too harsh understand how big of a screw up this was and what was required to fix it and revert to a previous design. It was huge.
 
Well, Dynamic Island is a game changer for an iPhone tho. A 48MP camera is great too and A16 Chip is blazing fast.
Like I said there may be a speed increase but in day to day I didn’t notice it. I’m more than happy with the photo quality of the 13 pro. Barely notice the difference either there. And dynamic island is such a small addition. It’s cool for sure but not enough to warrant an upgrade for me.
 
I’m more than happy with the photo quality of the 13 pro. Barely notice the difference either there.
The improvement in the 14 Pro cameras won't be noticeable except in suboptimal light. But the 48MP raw photos are outrageously good; no comparison with the 13 Pro. While raw isn't what one will use the majority of the time, for special occasions it's wonderful.
 
I find it extremely unlikely that the team, when experimenting with a feature that everyone knows is power hungry, would overlook doing power / thermal mgmt testing until late in the development cycle. So late in fact, that a near emergency last minute change had to be made.

I guess I don't have the same flavor of Kool-Aid that you're drinking?
Which may go some distance toward explaining why people would be removed from teams or from employment.

🤷🏽‍♂️
 
The framing of the issue as a "setback" and failure is so sensationalizing; it doesn't make clear that part of research and development HAS to involve experimentation (DUH!) and various trials and errors. I wouldn't necessarily call it a "setback", which is too sensational and tries to steer the audience's mind in a particular way. Boo!

Anyone who has done any serious research and development will recognize that testing, development, and various failed approaches is part and parcel of the work.
I'd agree with this characterization.
This is part of the larger issue that most people can only think in terms of narratives, i.e., simple stories with clear morals and obvious good guys vs. bad guys. Reality (physical reality, social reality, history, ...) is much richer than this.

Even if you have all the facts straight, you may be crippling your ability to think clearly by forcing them into a narrative, whether that narrative is "Apple gets what they deserve" or "Apple is betrayed by its engineers" or "Pride comes before a fall" or whatever...

Everything genuinely new is also genuinely hard (otherwise it would have already been done!); or, to put it differently, if you don't fail occasionally, you're not trying hard enough. In an efficient organization, delays like this are not permanent setbacks, they just rearrange the schedule.
 
Everything genuinely new is also genuinely hard (otherwise it would have already been done!); or, to put it differently, if you don't fail occasionally, you're not trying hard enough.
Agree; if you don't keep pushing the boundaries (and failing many times) you become like BlackBerry
 
I assume we are talking about real time ray tracing or hardware accelerated ray tracing because ray tracing already exists on A series chips with Metal.
"Ray Tracing" has two main parts: building a structure, generally called a BVH, from the geometry, that allows for rapid ray intersection tests; and then performing tests of the rays against the structure.

"Accelerating ray tracing" or "HW ray tracing" is a vague term that can mean anything from
- using the CPU to construct the BVH and existing GPU HW to perform the intersection testing, to
- providing GPU HW to construct the BVH and augmenting the GPU HW to perform multiple intersection tests in one GPU "core cycle", to
- providing even better GPU HW that can make minor changes to the BVH based on minor changes to the geometry from frame to frame (so that the full BVH doesn't have to be rebuilt every time geometry changes).

Right now Apple is (as far as we know) at the first of these levels. If you look at the Metal Feature Sets PDF that's essentially what's meant by what they offer ("Ray tracing in render pipelines").
BUT it's worth noting that the Metal Feature Set tables only go up to A15/M2... So who knows what was planned (and what is available) in A16?
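To make the two halves above concrete, here's a toy sketch in Python (purely illustrative; it is not Apple's or Metal's implementation, and all names are made up): build a BVH over axis-aligned boxes by median split, then traverse it using a slab test per node so one box test can cull an entire subtree.

```python
# Toy BVH: (1) build phase over axis-aligned boxes, (2) traversal phase
# with a slab test per node. Illustrative only, not any vendor's design.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AABB:
    lo: Vec3
    hi: Vec3

def union(a: AABB, b: AABB) -> AABB:
    return AABB(tuple(min(x, y) for x, y in zip(a.lo, b.lo)),
                tuple(max(x, y) for x, y in zip(a.hi, b.hi)))

@dataclass
class Node:
    box: AABB
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    leaf_id: Optional[int] = None        # primitive index at a leaf

def build(boxes: List[AABB], ids: Optional[List[int]] = None) -> Node:
    """Build phase: median split along the longest axis of the bounds."""
    ids = list(range(len(boxes))) if ids is None else ids
    bound = boxes[ids[0]]
    for i in ids[1:]:
        bound = union(bound, boxes[i])
    if len(ids) == 1:
        return Node(bound, leaf_id=ids[0])
    axis = max(range(3), key=lambda a: bound.hi[a] - bound.lo[a])
    ids = sorted(ids, key=lambda i: boxes[i].lo[axis] + boxes[i].hi[axis])
    mid = len(ids) // 2
    return Node(bound, build(boxes, ids[:mid]), build(boxes, ids[mid:]))

def hit_box(box: AABB, origin: Vec3, inv_dir: Vec3) -> bool:
    """Slab test: does origin + t*dir (t >= 0) pass through the box?"""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box.lo, box.hi):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def intersect(node: Node, origin: Vec3, direction: Vec3) -> List[int]:
    """Traversal phase: one failed box test culls a whole subtree."""
    inv = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    hits, stack = [], [node]
    while stack:
        n = stack.pop()
        if not hit_box(n.box, origin, inv):
            continue                     # skip this entire subtree
        if n.leaf_id is not None:
            hits.append(n.leaf_id)
        else:
            stack.extend((n.left, n.right))
    return sorted(hits)
```

A ray fired down the x-axis from (-1, 0.5, 0.5) through boxes spanning x∈[0,1] and x∈[5,6] reports both; reverse the direction and it reports neither. Hardware support, at whichever of the levels above, is about making `build` and the inner loop of `intersect` fast (and, in the fanciest case, incremental), not about changing this basic picture.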

There's also the larger point that you have to remember that
- the journalists who report these things are not tech experts AND
- the engineers they spoke to are not making formal reports.
We did not have the engineering team give a presentation to some journalist titled "Postmortem on what went wrong with the A16". Instead we have someone talking to someone else in vague terms and, quite possibly, halfway through realizing "OMG, I've said too much" and starting to redirect the conversation to cover their tracks and obscure the issue.

It is probably true that the A16 GPU is less than what was hoped. But why?
- Perhaps it was designed for N3, then investigated for N4, considered not feasible, so they reverted to the A15 GPU.
- Perhaps it was designed assuming some other aspect of the chip design (SLC, NoC, Coherence protocol) would be in place but they were delayed so the GPU was delayed.

I find the story as presented ("GPU was badly designed, runs far hotter than expected [implied because of ray tracing], this was only discovered late in design") to be very unlikely. More likely I think is that different facts have been put together in an attempt to create a story; the facts are kinda true, the story not at all.
Probably there is HW ray tracing in the original A16 GPU.
Probably there are multiple other features to make it faster.
Both those are obvious. But why the delay? As I said, unlikely to be the fault of the GPU per se, most likely to be because the GPU targeted a process that was not feasible at the time a shipping decision had to be made.
 
You know, I turn that off when playing games on my PC rig. I view it in most cases as a gimmick. I was interested to see it in a game called Control and my reaction was a big giant “meh”.

The Witcher 3 had a recent large update just for ray tracing and for me it was the same reaction. I turned it off.

I've been upgrading every generation since the 2080 Ti, then the 3090, and now the 4090 just to have faster ray tracing at 4K. I honestly can never play games that have a ray tracing option with it turned off; it just looks like crap.

The Witcher 3 is absolutely night and day between ray tracing on and off, lighting is a very important part of making the world feel more natural and off looks terrible, especially in this game.

Just from these quick comparisons, how the hell does anyone like it off? It looks very unnatural and washed out, and the colours look terrible. The Witcher 3 has always had a blue tint, and ray tracing finally fixes that.

Honestly can't wait to get some form of usable ray tracing running on an iPhone.

[Attached comparison screenshots: 2.jpg, 3.jpg, 4.jpg, 5.jpg]
 
This is all based on comments from individuals that “claim” they have direct knowledge. Given that Apple plans products and services years in advance, I seriously doubt that any of this is true. The capability could have been planned for the future regardless — this doesn’t mean there was a pivot from a planned release schedule.

I have both the 13PM and 14PM — the 14PM is a nice upgrade in fit and finish, the DI, and especially the camera system.
 
I’m suspicious of how much of this story is accurate and how much is creative writing, from the ‘failure’ standpoint. It kind of sounds trollish, and face it, Apple is a troll magnet, lol. Remember Dvorak (whatever happened to him?) and FCN who basically got paid by Apple’s competitors to write disparaging articles?

Apple likely tries many things that don’t work and have to be either scrapped or delayed.

Apple has also issued incremental update models of the iPhone, for example the ‘S’ models. So if this phone had been called the ‘iPhone 13S’, no one would have batted an eye with respect to the GPU, I think. But if it’s called a ‘14’, then maybe certain writers/commentators can sell a piece that would otherwise not get printed.

The iPhone 14 Pro and Pro Max are in extremely high demand and have sold extremely well, and I’m sure that Apple’s competitors likely will pay up for some trash talk.

The piece just seems to have that ‘flavor’.
 
This is nonsense. With the exception of Snow Leopard, every major OS release under Jobs, both Classic Mac OS and Mac OS X, was filled with loads of new features, many of them half-baked, and a whole host of bugs and issues. Mac OS X went through multiple different UI designs, from the original Aqua to Brushed Metal to Leopard's Unified look. That's to say nothing of all the half-baked features and design choices Apple rolled out under Jobs' tenure: the round iMac mouse, the buttonless iPod shuffle, the implementation of iPod photos, MobileMe, iTunes Ping (and the general monstrosity iTunes grew into over that period), QuickTime 4 and Sherlock 2's interfaces... I could go on.

Don't get me wrong there are plenty of issues with Apple's current management, but let's not pretend Apple's execution under Jobs was perfect.
It wasn’t perfect. It was better.
 