OpenAI and Jony Ive can try and force things all they like but once the market crashes and the idea of investing in more centralised models is on ice it will be model distillations that find their way into every conceivable piece of technology.
Maybe we should look beyond "OpenAI and Jony Ive." There are other players in the game. The market might crash in the US, but not in Canada or Mexico, the US's neighbours. There are also other mobile operating systems besides Android and iOS that are growing rapidly, much faster than one might expect. Some of them can also serve as desktop operating systems, which neither Android nor iOS can.
 
Sadly all innovation in the OS market died when Microsoft killed off Windows Phone 8.
 
Back then it was the tech press saying Apple was doomed, despite the high quality of its products. Now it's Apple's own users complaining that Apple's software has gotten noticeably worse, and that it doesn't bode well for their future.

And also, who is saying Apple is doomed? I'm saying the lower quality isn't affecting their sales yet, but it could in the future. That doesn't mean they're doomed.
Maybe you haven't said Apple is "doomed" as such, but it's all the same genre of post. The more things keep changing, the more people will complain about it. (Note: they will also complain if things don't change enough.)

It won't find it. If this forum had a 'required reading' list, then The One Device by Brian Merchant would be at the top of it. The iPhone was a once-in-a-century crux point for technology that brought together dozens, if not hundreds, of ideas dating back to the Victorian era into a single converged device. It was an organic process, in the right place at the right time. Maybe somebody else might have gotten there first, but they didn't. It took a few more years to perfect things by adding the App Store and changing the design, but every manufacturer on the planet has been making copies of the iPhone 4 ever since.

LLMs are an interesting concept, yet they are still a feature trying to disguise itself as a product. OpenAI and Jony Ive can try to force things all they like, but once the market crashes and the idea of investing in more centralised models is on ice, it will be model distillations that find their way into every conceivable piece of technology. In this way AI will become even more of a feature, helping to break down accessibility barriers across all consumer electronics. In 10 years' time you'll be conversing with a vending machine about the best beverage for a hot day or asking your car about the tyre pressure. ChatGPT will be a stepping stone on the way to that future, but it is far from being that future itself.

Maybe there is no next device. Even with all the wild technology they had on The Expanse, arguably the most believable science fiction of the current age, they still used an evolution of the smartphone.
I dread the day I have to chat with my things to get them to work. That's the day I will truly be left behind and embrace my inner grumpy old man. I want to do things myself, not tell virtual slaves to do it for me! I don't want to ask my car how much power or wiper fluid it has left! That's what gauges are for!!!

Despite my creeping horror, your post is the best take on AI I've seen for a while, and it's true smartphones are a mature technology now. There's very little room left for real innovation. All the low-hanging fruit has been picked, as well as a good bit of the harder-to-reach fruit!
 
The holy grail is Star Trek level tech.
 
If you've ever seen it, I imagine the vending machines off Red Dwarf.
 
Actually, we may have reached 'good enough', where marketing needs force cosmetic changes but the functionality is already good enough for most people. Think of the last system crash you had while saving a document (in my case it was a document in Word on System 7). I suspect that lack of disaster moments, plus easier peripheral setup, means we are at 'good enough'.
 

Yes, but there is a silver lining: Linux/open source.

Linux is free and open source, meaning anyone can see and fix the code.
Apple's software is closed source; only Apple can see and fix the code.

Open source software is the future, and it's already happening:
Valve/SteamOS/Proton is now winning BIG TIME, thanks to open source.

As the OP said, it doesn't matter how much money Apple has when trust is gone.
 

The Haus of Apple OS: How a Thousand Paper Cuts Killed “It Just Works”


There’s an old saying: you can’t fix everything by throwing money at it.




Apple has money. They have more than enough money. But what money cannot buy—at least not directly—is time, discipline, and humility. And those are exactly what Apple’s operating systems are missing right now.


Most people don’t sit around dissecting OS architectures, QA pipelines, or release cadences. What they do notice—slowly, and subtly—is their eyebrows creeping up as they ask themselves: “Didn’t this just work before?”




A feature that worked the same way every single time in iOS 6 has now been either silently deprecated or altered in such a weird way that you can no longer get it to work as intended in iOS 7. A Mac that had ridiculously good uptime, and hardly ever showed error report dialogs, suddenly starts showing them after you upgrade to the latest version, and not because apps are crashing but because key system components of macOS are crashing. You find yourself doing multiple workarounds and saying “I’ll just avoid that feature for now until it gets fixed in an update”, and before you know it, you’re living in a digital house held together with chewing gum and duct tape.


It’s not one catastrophic failure.


It’s a thousand paper cuts.


And I don’t think this happened by accident.


What follows is a connected web I put together—from CPU transitions to OS pricing, from OS development, to AI marketing—that, in my view, explains how Apple slowly dug itself into the hole it’s in now… and how, if they’re smart, they can still climb out.



1. The eras of transitions


Let’s start with the hardware context, because macOS doesn’t exist in a vacuum— it runs on silicon.


Roughly speaking, the Mac has lived through three major CPU transitions:


  • 68K to PowerPC:
    • 68K Macs: 1984
    • PowerPC transition: 1994
    • Duration: ~10 years
  • PowerPC to Intel:
    • First PowerPC Macs: 1994
    • Intel transition announced: 2005
    • First Intel Macs: 2006
    • Duration: ~12 years
  • Intel to Apple Silicon:
    • First Intel Macs: 2006
    • Apple Silicon announced: 2020
    • First Apple Silicon Macs: 2020
    • Duration: ~14 years

Each era had a transition window, too. Those transitions didn’t happen overnight; they took very meticulous planning and had to be navigated with great precision:
  • 68K to PowerPC: ~2 years from announcement to completed transition.
  • PowerPC to Intel: completed in just 210 days, wrapping up in 2006.
  • Intel to Apple Silicon: the bulk of the lineup moved within about 13 months, and by mid-2023, with the Mac Pro, the transition was complete.

Now fold macOS into that timeline.


Tiger, Intel, Leopard, Snow Leopard: The “We Have One Job” Era


  • Mac OS X 10.4 Tiger was announced in 2004 and released in 2005 for PowerPC.
  • In June 2005, Apple announced the transition to Intel.
  • A special Universal Binary version of Tiger shipped for Intel Macs when those computers launched in 2006.
  • That same year, 2006, the Mac lineup completed its transition to Intel, in just 210 days.
  • Mac OS X 10.5 Leopard arrived in autumn 2007 as a Universal binary release, the last to support both Intel and PowerPC.
  • In summer 2009, Mac OS X 10.6 Snow Leopard shipped as Intel-only. It included Rosetta, the translation layer for PowerPC apps, for the last time.

In other words: by 2009, macOS was effectively Intel-only in public. The old world was gone.


From 2009 to 2011, OS X 10.7 Lion was in development—a ~2-year gap between 10.6 and 10.7. After that? Annual releases, every single year.


That shift—from multi-year cycles to annual releases—sits right on top of another pivotal change: how macOS was priced.




When macOS Had a Price Tag, and Why It Mattered


For most of the Mac’s history, Mac OS (later Mac OS X) was very clearly a product that complemented the hardware it ran on—with a price that reminded you of the work behind it.


Historically:


  • Mac OS X 10.0–10.5
    • USD $129 for a single license
    • USD $199 for a family pack of five

Then came one important exception: Mac OS X 10.1 Puma (2001), which shipped as a free update.




Puma was free not because Apple suddenly felt generous, but because 10.0 Cheetah was, frankly, unfinished. 10.1 was the apology. Apple knew it, users knew it, reviewers knew it. So they made the fix free.


After that:
  • 10.2 Jaguar, 10.3 Panther, 10.4 Tiger, 10.5 Leopard: back to the USD $129 pricing model.

Then Snow Leopard (10.6, 2009) shook things up:


  • USD $29 for a single license
  • USD $49 for a family pack
  • USD $9.95 under the “Up-to-Date Program” if you’d just bought a Mac
  • USD $19.95 for a family pack under that program

This was Apple signalling, in effect, that a refinement-focused release did not warrant the full price.




Then came:


  • OS X 10.7 Lion (2011):
    • USD $29 via the Mac App Store (the first digital-download-only release of Mac OS X)
    • USD $69 on a USB install drive, single license only
  • OS X 10.8 Mountain Lion (2012):
    • USD $19.99 via the Mac App Store
    • No USB option, no family pack—your Apple ID handled multi-Mac installs.

Then, in 2013, the price dropped to $0:


  • OS X 10.9 Mavericks → free.
  • Every macOS release after that → free.

And right around that same time—autumn 2013—iOS 7 is released, with a massive redesign and massive instability to match.


You can draw a line right through 2011–2013 and watch the quality begin its roll downhill.




From Complementary Product to “Part of the Package”


When macOS had a price tag:


  • There was a clear economic signal:
    “This took effort. It funds a team. It justifies taking our time.”
  • Apple could justify longer development cycles and refinement releases like Snow Leopard.
  • The OS was treated as a flagship product, not just a side menu item.

Once macOS became free:


  • It stopped being a revenue line.
  • It became a support pillar for hardware and services.
  • macOS releases aligned more tightly with marketing dates and event calendars.
  • The internal incentive shifted from “make it worth the money” to “make it ready in time for the keynote and for the hardware release.”

And at almost the same moment, iOS went through its own trauma:


  • iOS 7: a complete visual and UX overhaul.
  • Infamous for instability, bugs, and “we’ll fix it in the point release.”

You can now clearly see and feel the change in the air:




The ethos of “it just works” took its first major blow.




iOS: “Free” on the Surface, Paid for Behind the Scenes


A lot of people point out, correctly, that iOS has always been “free.”




Yes and no.


Technically, you never bought a disc or license key for iOS. But:


  • You couldn’t install iOS on anything but an iPhone (and later an iPad), unlike the Mac, where on both PowerPC and Intel you could technically install alternate operating systems.
  • The cost of iOS development is baked into the iPhone’s MSRP.
  • iPhones are deeply integrated with carrier financing, contracts, and upgrades.

You can technically use an iPhone without cellular service, sure—but almost nobody buys it for that. Realistically, the ecosystem looks like this:


  • Carriers subsidize or finance devices.
  • Users pay monthly.
  • Apple gets stable, predictable hardware revenue.
  • iOS evolves as the software layer that justifies the price of the next iPhone.

So iOS doesn’t need a separate line item. Its budget is wrapped in:

  • hardware profits
  • carrier deals
  • App Store revenue
  • services like iCloud, Apple Music, Apple TV…

macOS, on the other hand:

  • Runs on machines people often keep for 7–10 years.
  • Lives in a world where customers might skip hardware upgrades.

When macOS stopped being a product you buy to complement your Mac and became “a de facto default part of the Mac experience,” it lost the internal leverage that says: this took effort, and it justifies taking our time.




The software didn’t suddenly become cheaper to make.


It just became easier to justify rushing.




“You Can’t Fix Everything by Throwing Money at It”


This leads into one of my core theories, the one this section is named after: you can’t fix everything by throwing money at it.




Engineers, designers, QA testers—these are highly skilled people building the most central part of the computing experience. When you charge for a major upgrade, you’re not just paying their salaries with that money; you’re acknowledging the value of their work.


When macOS went free, I think something happened that is easy to overlook but has had widespread implications:


  • The OS lost its identity as a standalone product that complemented the very hardware it runs on.
  • The people building it risked feeling like they were working on “support infrastructure” rather than a flagship product.
  • The company’s internal narrative moved away from “this is something people will choose to buy” toward “this is something people expect to get every year.”

Do I think Apple literally paid their OS teams entirely out of OS X license sales? No, that’s too simplistic.


But symbolically—and culturally—paid upgrades enforced discipline. They made polish part of the promise.


Without that discipline, the temptation to ship “good enough for now” becomes stronger.


And there’s another uncomfortable angle:




If that’s true, even partially, it would help explain the growing sense of inconsistency and fragility across all their platforms.




Annual Everything: When the Schedule Becomes the Master


We’ve had hints over the years, but let’s say it bluntly: the schedule has become the master, and the software serves it.




Once macOS went annual and iOS was already annual, it was only a matter of time before:


  • iPadOS
  • watchOS
  • tvOS
  • visionOS

…all fell under the same gravitational pull.


WWDC isn’t just a developer conference anymore. It’s a deadline. The September iPhone event isn’t just a showcase. It too is a deadline.


When the date is immovable, you only have two levers left:


  • scope
  • quality

And what we’ve seen is:


  • Scope rarely shrinks (because marketing wants “new, big, and exciting”).
  • So quality takes the hit.

That’s how iOS and macOS got to where they are today.






The Version Numbers are telling: iOS 7, 8, and Beyond


Take a closer look at iOS version histories and you can see the internal struggle reflected in the dots.


Before iOS 7 / 8, you’d typically get:


  • X.0 → main release
  • X.0.1, .0.2 → urgent bugfixes
  • X.1 → one major follow-up
  • Maybe X.2, rarely beyond that

With iOS 7, we get up to:


  • 7.1.2 as the final version—still relatively modest.

But with iOS 8, the pattern shifts:


  • iOS 8.0: released 17 September 2014
  • iOS 8.1: released 20 October 2014
  • iOS 8.4.1: the final release in the 8.x series

From there, .1, .2, .3, .4 become routine, not special. And .0.x patches remain for immediate fires.
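
To make the pattern concrete, here is a tiny, purely illustrative Swift sketch (not anything Apple ships) that splits a version string into its "dots"; the two inputs are simply the final builds named above, 7.1.2 and 8.4.1.

```swift
// Purely illustrative: pull the "dots" out of a version string so the drift
// past .0 is easy to see. Not an Apple API; just a sketch for this argument.
struct OSVersion {
    let major: Int
    let minor: Int
    let patch: Int

    init?(_ string: String) {
        let parts = string.split(separator: ".").compactMap { Int($0) }
        guard (1...3).contains(parts.count) else { return nil }
        major = parts[0]
        minor = parts.count > 1 ? parts[1] : 0
        patch = parts.count > 2 ? parts[2] : 0
    }
}

// The final builds cited above: iOS 7 stopped at 7.1.2, iOS 8 ran on to 8.4.1.
for v in ["7.1.2", "8.4.1"].compactMap(OSVersion.init) {
    print("iOS \(v.major) ended at minor \(v.minor), patch \(v.patch)")
}
// Prints:
// iOS 7 ended at minor 1, patch 2
// iOS 8 ended at minor 4, patch 1
```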


What does that tell us?


  • Features and changes are landing too late in the cycle.
  • .0 releases are going out the door under pressure.
  • Stabilization is happening after millions of devices are already running the software.

There isn’t only a rollout-strategy problem at the public-facing level; there is also a profoundly deep internal problem at Apple that has seeped out to the public level, leaving them in a hole they keep digging deeper.


More point releases don’t automatically mean more love and care.


Sometimes they mean the opposite: more scrambling to get things right.




Public Betas: Tech enthusiasts, Not Real QA


On paper, public betas look like a gift:


  • More coverage
  • More devices
  • More feedback
  • Fewer bugs slipping through

In reality? The people installing public betas are mostly enthusiasts who want the new features early.




They’re not:


  • running structured test plans
  • tracking regressions carefully
  • filing detailed bug reports with logs and reproduction steps

They’re installing betas because it’s cool.


That does not make them bad people. It just makes them terrible QA substitutes.


If internal QA and partner testing are already time-compressed, leaning on public betas to catch what you didn’t catch internally is not responsible—it’s desperate.


And when you combine that with annual deadlines, you get the modern Apple OS cycle:


  • Dev betas
  • Public betas (often still rough)
  • .0 release with visible issues
  • .0.1, .0.2 hotfixes
  • .1 through .4 carrying both fixes and features that missed .0

It feels less like a carefully staged rollout and more like live surgery.




A Thousand Paper Cuts and a Bleeding Edge


None of this means Apple’s software is “trash” or “broken.” That’s not honest.


But it does mean:


  • Features that used to feel magical now feel fragile.
  • “Upgrade day” is no longer universally exciting; for many, it’s a coin toss.
  • You accumulate small annoyances:
    • AirDrop behaving inconsistently
    • Notifications stacking strangely
    • iCloud sync pausing for no clear reason
    • HomeKit not detecting your connected devices
    • Reduction in battery life after updating (not because of Spotlight re-indexing everything in the background either)
    • UI animations stuttering unpredictably

It starts that way. Before you know it, you have a thousand paper cuts—each one from painstakingly working around that one “magical” feature that used to “just work” on your iPhone or Mac until an update broke it, forcing you to wait for yet another update to fix it and make it work again.


No single cut kills you.


But after a thousand paper cuts it’s a river of blood and you’ve got no more pulse.




Apple Intelligence + Siri 2.0: Just the Tip of the Iceberg


Enter the current era: Apple Intelligence, Siri 2.0, and AI everything.


The ongoing Apple Intelligence + Siri 2.0 debacle in my opinion is just the tip of the iceberg and not the straw that broke the camel’s back.


AI is incredibly demanding:


  • heavy system integration
  • privacy constraints
  • on-device processing
  • cross-platform coordination
  • expectation of “intelligence” and reliability

If the underlying OS development is already strained—racing to ship every year, leaning on public betas, juggling six-plus platforms—trying to layer a massive AI strategy on top will strain everything to the point of grinding metal on metal.


But, paradoxically, this might be the shock that Apple needs. Quite frankly, I hope it gets all of Apple to straighten up, summon the important people into the boardroom immediately, wake up, and stop dodging the elephants in the room.


The real danger here is Apple losing its customers’ trust.




IBM, ThinkPad, and the Myth of “Too Big to Fail”


People have said similar things before:




And in the literal sense, they were right. IBM still exists.


But when was the last time anyone purchased an IBM PC? People are purchasing Lenovo ThinkPads; they weren’t always Lenovo, and the ThinkPad was all IBM until it wasn’t.


That’s the cautionary tale.


The company didn’t vanish.


Its relevance in the consumer space did.


Apple isn’t in danger of going bankrupt anytime soon. That’s not the point.


The danger is:


  • drifting from being the industry standard to just being another option
  • slowly eroding the “it just works” reputation
  • forcing users to ask themselves, “Why am I putting up with this?”

Relevance and, most importantly, trust, once lost, are almost impossible to get back.


And no amount of cash on the balance sheet can fix the moment when people collectively decide they are done putting up with it.






“If Apple Slows Down, They’ll Fall Behind” — A False Fear


There’s a belief floating around that if Apple slows down, they’ll fall behind.




I don’t buy that. In fact, I believe the opposite.


Slowing down cadence isn’t the same as slowing down innovation. It's honing the blade instead of swinging it relentlessly until it loses its edge.


And here’s the key difference:


  • Android slowing down risks fragmentation because OEMs and carriers control pieces of the stack.
  • Windows slowing down risks chaos because OEMs ship their own bloatware on top and driver ecosystems drift.

But Apple has the advantage of total vertical integration.


It controls:


  • the hardware
  • the silicon
  • the OS
  • the frameworks
  • the App Store
  • the update mechanism

They are in the best possible position to slow OS releases without causing fragmentation.


They could:


  • Ship new iPhones, iPads, Macs, Watches, and so on on schedule.
  • Keep all platforms unified with a simple version scheme—like they’ve now begun with iOS 26 / macOS 26 versioning.
  • Move to a two-year major OS cycle with interim, smaller feature drops.
  • Declare some years as foundation years, quietly focusing on stability, performance, and cleanup.

You don’t have to fragment version numbers or create “confusing options” to do this. Apple historically detests confusing options; Steve Jobs was famously blunt about keeping the product line simple.




That doesn’t have to change.


What needs to change is the pace and structure behind the scenes.




The Version 26 Hint: Model Years, Not Fragmented Numbers


With the recent move in 2025 to a unified version numbering scheme across all of their platforms, using the two-digit year as the version number (macOS 26, iOS 26, watchOS 26, visionOS 26, etc.) instead of iOS 19 being the nineteenth release of iOS, Apple may have hinted at something bigger that is in the works…


It’s like cars:


  • A 2026 Toyota Camry arrives in 2025.
  • A 2027 Toyota Camry model arrives in 2026.

The naming follows the model year, not literal chronology.


Aligning OS names to the year:


  • resets psychological expectations,
  • makes it easier to treat the OS as the "iOS of the current year" instead of “v18 vs v19 vs v20 etc,”
  • gives Apple room to make a given year more about refinement without looking “lazy.”

No one panics when a new car model year is mostly subtle updates. They actually like that. It means the design is mature and the kinks are worked out.


Why shouldn’t software have the same luxury?


One can only hope this versioning shift is a sign of internal restructuring ahead of something like a WWDC 2026 reset.




Learning from Debian and Ubuntu—Without Scaring Users


I’ve argued Apple should take a page from Debian and Ubuntu:


  • Debian:
    • unstable → testing → stable
  • Ubuntu:
    • 6-month interim releases
    • 2-year LTS releases that focus heavily on stability

Apple could do something similar internally (a rough sketch of the idea follows this list):


  • Use rings and branches (unstable/testing/stable)
  • Maintain long-lived, foundation branches for deep refactors.
  • Reserve certain years as major under-the-hood cleanup cycles.
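
To show what that could look like, here is a rough, purely hypothetical sketch in Swift. The ring names and the 14-day soak threshold are my own illustration, not anything Apple or Debian actually uses; the point is simply that a build only climbs from one ring to the next after a quiet soak.

```swift
// Hypothetical sketch of internal "rings": a build is promoted from
// unstable to testing to stable only after a quiet soak with no blockers.
enum Ring: Int {
    case unstable, testing, stable
}

struct Build {
    let version: String
    var ring: Ring = .unstable
    var daysSoaked = 0
    var openBlockers = 0

    // Illustrative promotion rule: 14 quiet days in the current ring.
    mutating func promoteIfReady() {
        guard openBlockers == 0, daysSoaked >= 14,
              let next = Ring(rawValue: ring.rawValue + 1) else { return }
        ring = next
        daysSoaked = 0 // the soak clock restarts in the new ring
    }
}

var candidate = Build(version: "27.1", daysSoaked: 21, openBlockers: 0)
candidate.promoteIfReady()
print(candidate.ring) // testing; another quiet soak is still needed for stable
```

Users would never see any of this; the rings would exist purely so that whatever finally reaches the public .0 has already survived two promotions.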

But for users?




No “iOS 27 Stable vs iOS 27 Testing” labels.


No multiple channels.


No fragmentation of features on the same hardware.


If my iPhone and your iPhone are both on iOS 27.2, we should see the same features and the same experience.


All the complexity belongs back at Apple Park, not in the Settings app.




FOMO, Branches, and the “Single OS Strategy”


Apple hates user-visible complexity:


  • They don’t want five different “editions” of macOS or iOS.
  • They don’t want customers comparing “branches” like a Linux distro list.
  • They know most people are in a constant background state of FOMO as it is.

So the answer is not to expose unstable, testing, and stable channels to end users.




That would be a nightmare.


Instead:


  • Keep developer betas truly for developers (and serious testers).
  • Delay public betas until the builds are fairly solid.
  • Avoid creating a permanent user-facing notion of “tracks” or “channels.”
  • Use point releases (27.1, 27.2, etc.) as clearly communicated feature waves, not as silent triage.

From a user perspective:


  • You buy an iPhone, you get iOS 27.
  • Updates appear.
  • Features arrive in well-explained waves.
  • You’re not made to feel like you’re missing out because you didn’t sign up for some cryptic branch.

From an internal perspective:


  • Apple gets the freedom to restructure development like Debian/Ubuntu—unstable/testing/stable, long-lived branches, and slower foundational churn—without turning the product lineup into a versioning circus.



What Apple Needs to Do to Fix the “Haus of Apple OS”


If Apple truly wants to prevent their “Haus of operating systems” from crumbling, I think they need to:


1. Freeze earlier and harder


No more “stuff this huge feature in two weeks before .0.”


If it’s not ready, push it to 27.1 or 28.


A clean freeze buys QA and stability.


2. Treat public betas as polish, not phase-two QA


By the time the public sees it, the OS should already be basically stable.


Public betas should be about edge cases, not “Wi-Fi doesn’t work.”


3. Developer betas should go back behind a paywall.


Not because Apple wants to be stingy or gatekeep, but because developer betas were never meant for hobbyists, clout-chasers, or the “I want the new emoji early” crowd.


They were intended for:


  • people who actually build apps,
  • who know how to file proper bug reports,
  • who understand logs,
  • who test on secondary devices,
  • and who won’t scream on social media when a beta (shocker!) acts like a beta.

Bringing back the paywall does a couple of healthy things:


4. It drastically reduces unqualified testers.


The moment you put a price tag on something, even a small one, the “beta tourists” scatter like dust bunnies under a Roomba.


Who remains?


People who actually use the beta as intended.


5. It stops the public beta cycle from becoming a social flex.


Developer betas are supposed to be raw, unpolished, and sometimes a little feral.


That’s perfectly fine… when you’re a developer.


6. Reduce marquee features per cycle


Fewer “headline” features means:


  • fewer moving pieces
  • fewer regressions
  • more attention on each new capability

You don’t need 20 big things per year. You need 5 really good ones.


7. Plan feature waves through point releases


Let .0 focus on fundamentals.


Let .1, .2, .3 be pre-planned feature drops, not emergency repairs disguised as updates.


8. Have explicit “foundation years”


Even if they never say it out loud in the keynote, internally they should mark some cycles as foundation years.




Performance. Stability. Reliability. Framework cleanup. No shame in that.


9. Respect and honor the psychological contract with users


When people update, they shouldn’t be bracing for breakage.


They should feel confident, maybe even a little excited, that things will simply keep working.








Right now, too many updates feel like gambles.




The Bottom Line: Trust Is the Real Operating System


For many of us, Apple once represented the closest thing to software that truly just works, and works all the time.


No software is perfect. Nothing ever will be. Software development is not monolithic; it’s layered, complex, full of trade-offs and trial-and-error.


But to pretend Apple’s software hasn’t drifted away from that ideal would be a lie.


I don’t think the situation is hopeless. Far from it.


  • Apple still has incredible engineers.
  • They still control the full vertical stack.
  • They still have a loyal user base.
  • They still have the power to set standards the rest of the industry copies.

But they need to wake up before the waterline gets too high.


They need to drink the coffee while it’s still in the pot, not after it’s spilled and burned them.


Because without “the rest of us,” Apple has nothing to offer “the rest of us.”


If they keep dodging the elephants in the room, keep using the public as unpaid QA, and keep treating OS releases like a never-ending, bleeding-edge rolling release cycle, they risk going the way of IBM in the consumer mind:


Respected. Historic. Still around.


But no longer the place where personal computing lives.


Apple is not too big to fail at the level that actually matters:


our trust.


And that is something no amount of money can simply patch in macOS, iOS, watchOS, tvOS, HomePod Software, or visionOS 27.1.2.

I have an alternative theory: it's not that Apple quality has gone downhill, quite the opposite. Apple quality used to apply to computers, an easy task in comparison (but they still screwed up). Now it's integrating computers, phones, tablets, TVs, headphones, AR visors, and no doubt some I have forgotten. What has changed is that the user community now thinks fingerprints are a major issue worth complaining about, and that color is the deciding factor in a purchase.
 