One factor that has allowed the decline to continue is that consumers have failed to punish Apple for it. There are various indications of this. For example, on this site if you were to complain about Apple, you'd always get responses like: "Are you Tim Cook?" "Tim Cook made it a trillion dollar company!" "Are you an Apple engineer?" "Do you think you know better?" "It's a beta! Do you know what beta means?" All such responses miss the point entirely. Tim Cook made it a trillion dollar company by building it ON THEIR MONEY AND SUFFERING -- THEY are the ones who made it a trillion dollar company -- and consumers happily dance off the cliff to his song.

Your point is a very good one.
If Apple were a private company, one wonders how different the narratives and support (or not) might be.

Making money off the stock performance very much muddies up any objective analysis purely about the products.
 
I have a Pixel 10 and the version of Android for that device is excellent: user-friendly, smooth, and fast. Even with a processor that doesn't come close to matching Apple's A18 or A19, I don't feel the performance suffering at all.
I think you misunderstood the point I was trying to make, or maybe I didn't word it quite right... The positive experience with the Pixel line of phones has been a result of Google creating their own hardware to complement their software harmoniously, which is why your experience with your Pixel 10 is so good. Compare your experience to that of someone who owns a Samsung Galaxy S25 series phone and the experiences will vary person to person: some will say they can't notice any difference at all in the performance or stability of Android with One UI on top, others will say they will move to a Pixel device because Samsung's One UI takes away from the pure Android experience, while still others will say they would never go to a Pixel device because their experience with Samsung and One UI is the reason they remain with the Galaxy line. This has a lot to do with the fact that the variety of phone brands (at least in the U.S.) as far as Android is concerned has shrunk significantly, to the point where the fragmentation is manageable for Google and their partners...
 
It’s also possible that Apple intentionally made OS 26 worse in order to draw attention away from the AI issues. Then they’ll fix the usability bugs and the design in OS 27, and everyone will praise how amazing and fast it is, thanking Apple for something that should have been there in the first place, but feels great in contrast. And suddenly all the AI problems are forgotten.
 
I don’t even think the software quality has degraded. There were TONS of bugs even during the Jobs era.

We sometimes dream back to Snow Leopard. But that was just one release. It was buggy before SL, and it was buggy after SL.
The first release of Snow Leopard had bugs too. Apple made a concerted effort to get them out though and I think by 10.6.4 it was pretty stable.
 
One factor that has allowed the decline to continue is that consumers have failed to punish Apple for it. There are various indications of this. For example, on this site if you were to complain about Apple, you'd always get responses like: "Are you Tim Cook?" "Tim Cook made it a trillion dollar company!" "Are you an Apple engineer?" "Do you think you know better?" "It's a beta! Do you know what beta means?" All such responses miss the point entirely. Tim Cook made it a trillion dollar company by building it ON THEIR MONEY AND SUFFERING -- THEY are the ones who made it a trillion dollar company -- and consumers happily dance off the cliff to his song.
But 99% of customers don’t think in those terms. They’re not punishing or helping Apple. They’re just buying what they like. A simple explanation is that Apple customers are not “suffering”. In fact, there are more happy Apple customers than ever. They don’t have a magic wand to attract more customers while creating worse products.
 


When macOS had a price tag, and why it mattered.


For most of the Mac’s history, Mac OS (later Mac OS X) was very clearly a product that complemented the hardware it ran on—with a price that reminded you of the work behind it.


.....


Then came:


  • OS X 10.7 Lion (2011):
    • USD $29 via the Mac App Store (the first digital-download-only release of Mac OS X)
    • USD $69 on a USB install drive, single license only
  • OS X 10.8 Mountain Lion (2012):
    • USD $19.99 via the Mac App Store
    • No USB option, no family pack—your Apple ID handled multi-Mac installs.

Then, in 2013, they dropped the price to $0:


  • OS X 10.9 Mavericks → free.
  • Every macOS release after that → free.

And right around that same time—autumn 2013—iOS 7 is released, with a massive redesign and massive instability to match.


You can draw a line right through 2011–2013 and watch the quality begin its roll downhill.

Two large things are grossly missed here. The macOS upgrade rate also radically changed as the prices went down. Lower prices meant that more users upgraded to the newer OS substantially quicker. Releasing a product that folks don't buy until 3-5 months before the next one comes out is also a problem in the OS space. It leads to a highly fragmented user base. In a relatively small ecosystem, that leads to development issues for 3rd-party developers, since they are often aiming at the 'lowest common denominator' group of end users.

[ Case in point: Microsoft had to make Windows 10 'free' to try to get some folks off of Windows 7 (multiple versions back). ]

Second, macOS was not particularly licensed as a separate product at all. It was always tied to use on Apple Mac hardware. (Yes, there were Hackintosh folks in the Intel era, but that was NOT part of Apple's formal 'product' model.) Apple sells systems, not bare hardware or bare software. The macOS license is directly coupled to the system that end users buy. That initial version and the 5 +/- 2 years of updates Apple assigns are all charged up front. The OS costs get paid for by Mac (the whole thing, including the preinstalled software) sales. The upgrades are pragmatically 'charged out' as they are rolled out.

The deeper Apple got onto a more commodity platform, the more that was going to be the case. Hackintosh folks waving their CD-ROM of Mac OS X at Apple and saying "the software is entirely separate from the hardware ... I can run this on my Hackintosh" is entirely not Apple's business model. Nor is it the legal foundation they were/are trying to lay down. Their position is that the two are one thing (in part to get around 'bundling constraint' laws, and in part because they are not trying to make everything for everybody).

Apple doesn't need a wacky DRM scheme because if someone is operating a Mac (of a supported age), it doesn't matter whether they bought it from Apple, or it was transferred, or it took whatever funky road of ownership flip-flopping. It dials home for an update and it gets it or not. Done.





From Complementary Product to “Part of the Package”


When macOS had a price tag:


  • There was a clear economic signal:
    “This took effort. It funds a team. It justifies taking our time.”
  • Apple could justify longer development cycles and refinement releases like Snow Leopard.
  • The OS was treated as a flagship product, not just a side menu item.

Nope. It is more an attempt to copy "monkey see, monkey do" techniques over from Microsoft... when Apple is not Microsoft. Apple's OS is only supposed to run on Apple's hardware. Microsoft didn't particularly sell computers, so of course they had a 'separate' product. What is the point of a separate product when it only runs on the Apple hardware it is sold with?

This became even more nonsensical as macOS was slimmed down and ported to phones and iPads. Whose other OS is going to run on this custom hardware? Nobody's. And there is an even bigger threat of a 'monopoly bundling' slap-down in the iPhone market.



Once macOS became free:


  • It stopped being a revenue line.

Buzzz.... not true.


  • It became a support pillar for hardware and services.

Not really true. You are separating things that Apple is trying very hard not to separate.

  • macOS releases aligned more tightly with marketing dates and event calendars.

ROTFLMAO. Like macOS wasn't coupled to WWDC all along. You are almost completely detached from history here. WWDC is in part a marketing event. The first day of the conference certainly is. The rest is more technical. But hand-waving that WWDC is all technical is smoke.

Go back and look at the macOS release dates and go back and look at WWDC dates. There is some +/- 2 months of slack on the earlier ones, but mostly we are talking about August-October for the releases.


[ NOTE: end dates are largely relative to WWDC too, usually after the new release has had an x.x.1 or x.x.2 bug-fix release. ]


WWDC has been in June from 2003 forward (typically a bit behind the big Taiwan computer show at the end of May).

The initial version of Mac OS X shipped in March 2001, but to a large extent it wasn't the main "macOS" at that point. 10.4 was the only other 'odd duck' (April 2005). And surprise, surprise, surprise: WWDC 2005 announced the switch over to Intel and released an Intel developer transition kit.







iOS: “Free” on the Surface, Paid for Behind the Scenes

....
  • You couldn’t install iOS on anything but an iPhone (and later, an iPad), unlike the Mac, where on both PowerPC and Intel you could technically install alternate operating systems.
  • The cost of iOS development is baked into the iPhone’s MSRP.
  • iPhones are deeply integrated with carrier financing, contracts, and upgrades.

Somehow phones can bake it in, but it's impossible to do with Mac hardware? There is no difference in terms of real sales aligned with the terms of the license. Hackintosh 'sales' are bigger on the Mac, but that is not Apple's business model.



macOS, on the other hand:

  • Runs on machines people often keep for 7–10 years.
  • Lives in a world where customers might skip hardware upgrades.

Folks don't skip iPhone upgrades? Not really true. (When phones were only sold on 'loan' contracts it might have appeared so, but it's not true.)

Similarly, the 7-10 year thing isn't true. Apple's support window does not go 10 years. I can't find it at the moment, but there was a WWDC in the past where Phil Schiller bragged about Mac upgrade cycles for average users being 1-2 years shorter than for Windows PC users. Which is probably true, in part because Apple doesn't directly sell into the lowest-margin zone of the general PC market. Most folks with disposable income will upgrade on a faster cycle than that if the product hardware is substantively better (so 4-6 years, sooner for some if the equipment is written off as a business expense and facing real computational constraints).



When macOS stopped being a product you buy to complement your Mac and became “a de facto default part of the Mac experience,” it lost the internal leverage that says:

There is no Mac without macOS. If Apple shipped just RAW hardware, nothing installed, and you plugged it in ... it would not "just work" at all. The only reason a nominal end user has a functional system is that there IS software and hardware there. It is a system, not one or the other.






Annual Everything: When the Schedule Becomes the Master


We’ve had hints over the years, but let’s say it bluntly:




Once macOS went annual and iOS was already annual, it was only a matter of time before:


  • iPadOS
  • watchOS
  • tvOS
  • visionOS

…all fell under the same gravitational pull.

They effectively have the same kernel. Mostly the same libraries. The file system is converged. The GUI-level gaps are more like looking at the 'above the waterline' part of an iceberg. The Finder and the window 'look and feel' are not an operating system. That is far closer to being a facade than a foundation.

WWDC isn’t just a developer conference anymore. It’s a deadline. The September iPhone event isn’t just a showcase. It too is a deadline.

It never was just a developer conference. And it very quickly became a deadline, even before Mac OS X came along. WWDC being in May-June goes all the way back to the early 1990s.





When the date is immovable, you only have two levers left:


  • scope
  • quality

And what we’ve seen is:


  • Scope rarely shrinks (because marketing wants “new, big, and exciting”).
  • So quality takes the hit.

Scope has at least two factors. The 'new and shiny' is more user-facing. There is another substantive factor that is being completely missed here: the scope of the hardware the OS has to cover. A bigger, broader user base with a wider set of devices ... is scope creep. An OS kernel running on 2-3 different CPU instruction-set platforms ... scope creep. Random, unvetted app software being downloaded into the system through a web browser ... scope creep (security-vector expansion).

The other issue is the size of the overall operating system. One that consists of 50 libraries and apps is an entirely different scope than one that consists of 500 libraries and apps. With one or two orders of magnitude more complexity, the likelihood that everything ends up maturing at the same rate is small.
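As a rough back-of-the-envelope illustration of that last point (the component counts and the 99% figure here are purely hypothetical, not Apple's actual numbers): even if each individual piece hits "mature" on schedule 99% of the time, the chance that everything lands together collapses as the OS grows.

```python
# Rough illustration: if each library/app independently reaches "mature"
# quality by release day with probability p, the chance that *every* one
# of N components does so is p**N, which collapses quickly as N grows.
def chance_all_mature(p: float, n: int) -> float:
    return p ** n

for n in (50, 500):
    print(f"N = {n:3d}: P(all mature) = {chance_all_mature(0.99, n):.3f}")

# N =  50: P(all mature) = 0.605
# N = 500: P(all mature) = 0.007
```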









What Apple Needs to Do to Fix the “Haus of Apple OS”


If Apple truly wants to prevent their “Haus of operating systems” from crumbling, I think they need to:


1. Freeze earlier and harder


No more “stuff this huge feature in two weeks before .0.”


If it’s not ready, push it to 27.1 or 28.


A clean freeze buys QA and stability.

Some stuff may appear ready if the QA testing is not comprehensive. If Apps A, B, and C depend upon library Y, then yanking Y means a lot more than just Y collapses, and all of those folks have to reschedule. Some aspects of the OS are more critical than others. If Y ships but the bugs in Apps A, B, and C can be cleaned up later, then 'freezing' A, B, and C gets questionable.
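A minimal sketch of that dependency argument (the app/library names and the graph shape are made up for illustration, not anything from Apple's actual build system): pulling one library out after a freeze drags every transitive dependent back onto the schedule with it.

```python
# Toy reverse-dependency walk: if a library slips past the freeze date,
# everything that depends on it (directly or transitively) is affected
# and has to be rescheduled too.
from collections import deque

# component -> what it depends on (hypothetical names)
deps = {
    "AppA": ["LibY"],
    "AppB": ["LibY", "LibZ"],
    "AppC": ["AppB"],      # depends on LibY only indirectly
    "LibY": [],
    "LibZ": [],
}

def affected_by(slipped: str) -> set[str]:
    """Return every component that (transitively) depends on `slipped`."""
    hit, queue = set(), deque([slipped])
    while queue:
        cur = queue.popleft()
        for comp, requires in deps.items():
            if cur in requires and comp not in hit:
                hit.add(comp)
                queue.append(comp)
    return hit

print(sorted(affected_by("LibY")))   # ['AppA', 'AppB', 'AppC']
```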







3. Developer betas should go back behind a paywall.


Not because Apple wants to be stingy or gatekeep, but because developer betas were never meant for hobbyists, clout-chasers, or the “I want the new emoji early” crowd.


They were intended for:


  • people who actually build apps,
  • who know how to file proper bug reports,
  • who understand logs,
  • who test on secondary devices,
  • and who won’t scream on social media when a beta (shocker!) acts like a beta.

Sorry, but more than a few "clout chasers" have money. The paywall isn't going to stop them. If they get ad revenue of $200 and the beta access costs $20 ... done deal.

Also, you're not going to get qualified beta testing from professionals if they are on a different schedule than Apple. If they have their own development cycle to contend with, then farming out cycles mainly to test Apple's stuff probably won't happen. If there is no education to go along with the new OS/libraries/etc. (e.g., WWDC), how are they going to use the new stuff to test it? You're not really asking them to write more QA test inputs for a harness they've already been given. You're asking them to test new code on new libraries. They have to write something new to test something that didn't exist before.

At best you're getting some regression testing over 'old' stuff (or at least 'old' APIs).
 
One other thing, written from the perspective of a hardware guy (albeit not in the computer field; instead telecom, which is more software than you might like these days).

My observation is that deep in the software mindset, at least for management, there's the idea that you can always change things in simple, downloadable updates. So, more features get shoved in and less testing is done before release, just because it's possible. Not because it's a good idea, but because it's possible. Downloadable software is disposable and there are no inventory costs for ones and zeros. There's no lead time for actual delivery of tangible stuff. That changes the thinking.

Hardware guys generally HAVE to get things right from the start. Why? Field recalls are painful and expensive. They're especially painful for the executives and bean counters because it affects the actual bottom line in a clear way. Want to bring fear into those guys' faces? Use the words "forklift upgrade". Customers hate that. That hate often comes with financial penalties of some kind. Because of the transitive property of hardware, everybody in the chain gets to share that pain. So, you tend to focus more on getting it right. Or on looking for a new job.

When upgrades were delivered through physical objects like CDs, other discs, and USB thumb drives, it was more like hardware. The cost associated with pulling that all back, scrapping the obsolete stock, and so on was painful and awful. So, there was almost inherently more of an emphasis on getting it right. With millions of discs being made and sent out, you couldn't very well change much at the last minute. With downloads, you can.
I've been saying this, too, for years. The ability for software updates to be delivered fast and at low/no cost has led to the reduction in quality. It started with Windows Update, and sadly, with ubiquitous broadband, has permeated the industry.

It's also tied in with modern Agile and CI/CD methodologies where there's no concept of finishing or "shipping" anymore. ("Good artists ship," right?) It's just constant iteration. But when nothing is ever "done," there's just this constant drumbeat of putting together "minimum viable product," which will assuredly be full of bugs, but it checks the boxes that marketing and sales said would be delivered. Building MVP now to make the sale has become the norm -- with unprofitable bug fixes taking a back seat because "why have an expensive SWE fixing bugs that the sales and marketing teams can't use to generate more revenue?"
 
I haven’t noticed any drop in quality over the recent years.

Apple have always been changing, adapting, removing, refactoring their software. Especially in the days when they were homogenising iOS and macOS.

It all works flawlessly for me. Are you sure you're not just remembering the golden years when everything was better than perfect?
It really depends on the hardware. I installed iPadOS 26 on my ageing iPad Air 3. It ran really slowly but there were hardly any bugs. Then I got an iPad Pro M5 as a hardware upgrade and boy, did I see glitches & bugs on a regular basis, more than once daily. Same OS, same software installed, different hardware. Half of them disappeared with 26.1, but there's still plenty around. And I don't mean visual annoyances like the background of the Home app flickering when changing the temperature of a heater. I mean things like the keyboard disappearing, window content suddenly appearing squashed, or a hammering audio noise that randomly appears when playing back videos in Safari and only goes away when the iPad is force restarted.
 
But 99% of customers don’t think in those terms. They’re not punishing or helping Apple. They’re just buying what they like. A simple explanation is that Apple customers are not “suffering”. In fact, there are more happy Apple customers than ever. They don’t have a magic wand to attract more customers while creating worse products.
But that's exactly what I mean by consumers failing to punish Apple. They are satisfied when they shouldn't be. They pay when they shouldn't pay. Mind you, I include myself in this -- not the satisfied part, but the paying part.
 
Apple’s software is incredibly more complex than what we had some years ago. Just think about anything regarding AirPods, Sidecar, many more viewports on iOS/iPadOS, etc. This results naturally in more bugs.

Also, they have maybe 30x more customers than 20 years ago. Naturally, more people will find bugs — even if software remains exactly the same.

Finally, a result of the first point: since they’re offering many more products, alignment becomes way more difficult. For example, Vision Pro was created as a “start-up” project and (besides other issues) it works as well as you could expect software-wise. You just can’t do that with iOS or macOS.

So, of course, there might be some degradation of culture (which is partially just a result of becoming big), but I think it’s still as good as it can get for a company the size of Apple. Also, I think there’s some memory bias when it comes to comparing how things were some years ago. I think it was Phil Schiller who mentioned some years ago that, when looking at cold data reported to Apple, the proportion of logged issues had decreased over time.
 
But that's exactly what I mean by consumers failing to punish Apple. They are satisfied when they shouldn't be. They pay when they shouldn't pay. Mind you, I include myself in this -- not the satisfied part, but the paying part.
But what’s your theory behind this? They’re paying when they shouldn’t… does this imply they’re buying products they don’t like?
 
I've always blamed two factors: (1) people seem to want their computers to talk to their "fill in the blank", and (2) the relative cost of software development versus hardware cost (the current costs lead to large, cycle-wise inefficient code on fast processors with enormous amounts of memory, both of which are largely wasted on the tasks at hand).
 
But what’s your theory behind this? They’re paying when they shouldn’t… does this imply they’re buying products they don’t like?
Just speaking from my perspective, I keep buying Apple because I still like it more than the alternatives, but I am not totally happy with a number of things. It is like how one might continue going to a restaurant but he is not totally satisfied with the food or happy with the price. But he still goes. If everybody really stops going, the restaurant will either have to improve or go out of business. But if it's not bad enough or if it is still better than the competitors, then it's not going to happen.

I believe the best way to maximize profit is indeed this approach -- increase your margins to the point just before the consumers will be unwilling to indulge you any longer. The so-called equilibrium. But that point is quite a distance from where the consumers have stopped being totally satisfied. So business-wise, I think Apple did the right thing, but as a consumer, I don't like it. And even then, you can only do it so much and for so long before the consumers are finally fed up or a viable alternative appears.

The OP's topic is the decline of Apple quality, and my comment had to do with one reason why I think it continued.
 
Software like Keynote, Pages, Numbers, GarageBand, iMovie, Mail, etc. have clearly not seen much attention in a while. I don't have enough expertise in Final Cut Pro or Logic Pro to speak for them, but I suspect a similar problem affects them as well.

Actually, Logic is fine when compared to Final Cut Pro. Logic has a lot of "deep" functionality, a ton of features, and gets significant updates every year. FCP, not so much. After 14 years it is still lagging behind the competition, new features get added very sparingly and late, and when they are, they are behind and incomplete (e.g., you can't really edit keyframes of a magnetic mask, the magnetic mask in-window UI looks like it was designed by software engineers and never crossed the desk of a UI designer, keyframe editing in general is pretty lackluster, you can't edit from transcribed text, ...)
 
Alan Kay famously said “People who are really serious about software should make their own hardware.” Microsoft took their shot at this with their Surface line of computers and it yielded them positive results.
All those countless computer manufacturers, parts suppliers, and so on work alongside Microsoft, not against it. They provide real-world feedback to Microsoft and profit because of it, creating a win-win situation. In contrast, there are no comparable computer manufacturers to offer Apple that kind of feedback.

Microsoft's Surface series complements the vast Windows computer market, which, in turn, supports the sales of all types of Windows computers. Not only that, those "competitors" also contribute to the growth of Windows.
 
But that's exactly what I mean by consumers failing to punish Apple. They are satisfied when they shouldn't be. They pay when they shouldn't pay. Mind you, I include myself in this -- not the satisfied part, but the paying part.
It's not just about being satisfied, or not. There's another component involved, the ability to go elsewhere.

I was never satisfied with Apple's walled garden, but after Microsoft threw in the towel with Windows Phone, my choices became Apple or Google or become a luddite. Options 2 and 3 are out, so Apple wins by default. And as long as there aren't any other competitors, they will keep on winning by default.
 
I believe the quality started slipping soon after Jobs passed and then Forstall got kicked out. They were the ones who gave a hoot about the user experience.

Jobs is the one who released the "you are holding it wrong" iPhone, and several "web services" updates that collapsed on day one.

The Apple Maps dumpster fire (and distancing himself from responsibility for it) was supposedly one reason Forstall was let go. Apple Wallet (Passbook) needed a rename after a fumbled rollout. Forstall oversaw the 'lickable' Aqua interface and the shift to "skeuomorphic" revisions, but that is all user-facing, not core foundational work. Forstall had a big win in leading the stripped-down macOS that beat out the Linux alternative to become the "iPhone/iPad operating system". The issue with the software stack (iOS, macOS ... eventually iPadOS, watchOS, etc.) is whether you can manage a larger group of people with less 'drama'. Forstall was 'drama': dust-ups with other executives. You're mainly fooling yourself if you think drama queens/kings lead to higher-quality software over time.


Then soon after, Apple Silicon arrived and the rate at which new hardware was pushed out increased, thereby lowering the amount of time developers had to get work done. Instead of every 18 months, it was now every 12, sometimes less. So software lost its two main QA guys, then the output cycle got ramped up, so naturally quality is going to drop. It is now falling off a cliff.

There is no way something the size and complexity of Mac OS X could get by with two QA guys. Those two were not plugging the holes. Apple has scope-creep problems more so than 'the executives were doing all the work' problems.
 
It's not just about being satisfied, or not. There's another component involved, the ability to go elsewhere.

I was never satisfied with Apple's walled garden, but after Microsoft threw in the towel with Windows Phone, my choices became Apple or Google or become a luddite. Options 2 and 3 are out, so Apple wins by default. And as long as there aren't any other competitors, they will keep on winning by default.
Same for me. And around the same time, I became more conscious of the data harvesting behaviors that were still becoming prevalent among tech companies.

Nowadays, Apple is basically the only option for a privacy-centric platform. So, as others have alluded to, it's not that I'm completely satisfied with Apple as much as it being that it's the "least worst" option available to me.
 
There are also other factors that aid in growth, one of which is suppression. When there’s a lack of something, one strives to achieve it, and with a massive, ever-growing technological base, even better innovations emerge. When you're unable to obtain chips, you create them. When you're blocked from certain operating systems, you develop your own. Companies like Huawei, Xiaomi, and a few others have created their own chips and operating systems. Given their understanding of existing systems, they can develop improved alternatives. We mustn’t forget about HarmonyOS, HyperOS, ColorOS, OxygenOS, and similar platforms that are growing rapidly. Moreover, all of them are creating something that isn't even Android.

We say that Apple creates macOS, iOS, and so on, but we don't really know who the actual programmers and maintainers behind all of that are. I'm certain it isn't machines doing that work.
 
I guess I’ll be the dissenting voice then. There are paper cuts, and I understand that people don’t like certain aspects of Liquid Glass.

On the other hand, as someone who deals with many Macs on a daily basis, stability has never been better. We see very few kernel panics and the core of the os is very solid. That doesn’t mean it’s perfect, but it is much better than it was even 10 years ago.
 
The software quality has gone downhill for me on iOS.

I used to daily-drive an iPad Air 1. It was fine on iOS 12, and all of the features that that iPad supported did work, and as far as I can remember nothing that wouldn't work there would show up in the interface.

I now have an iPad 7 with iOS 18 on it. It's fine most of the time, if not a bit slow throughout the main interface and most apps.

But some of the built-in apps don't work right anymore. Measure will crash after a wee bit of usage. Magnifier wouldn't even load at one point. And Photos is slow and has some really bad frame drops when opening things like the search box.

It also seems Apple has left behind some elements that could have been removed. If you try and access Siri while offline, it tells you to "connect to the internet to download Siri data", a feature that isn't available on that iPad.

For my use of that iPad, it's fine. But it's annoying that some features it has won't work because Apple didn't put enough effort into ensuring they'd work across all of the devices they say the software supports.
 
I haven’t had the time to read the OP completely, so I’m not aiming this at the OP specifically, but there are some things I’ve been noticing lately. People are evaluating things in ever more forensic detail, and rose-coloured glasses are as strong as ever. I’ve seen people criticise iOS/iPadOS/macOS for having interface elements misaligned by a pixel, and that being touted as evidence of the end of Apple. Even discounting that as hyperbole, applying that level of scrutiny to the remembered halo versions of the past would, I’m sure, turn up much bigger problems.

Apple isn’t perfect, and criticism of their flaws and failures is valid, but the thing I keep coming back to is that I’ve read versions of these arguments for decades! It’s been the same story trotted out since the original iPod! If there were any truth to it, Apple would have been dead a decade ago.
 
Snow Leopard literally launched with a bug that completely erased people’s home folders if they logged out of a guest account. And that bug persisted all the way until 10.6.3.

Also, iOS 4 went all the way up to 4.3.5, and had on average an update every month, not to mention cutting off support for the iPhone 3G halfway through the update cycle at 4.2.1 because the experience was just that atrocious.
A couple of fine points made, but overall quite a bit of looking through rose-tinted glasses.
 
I guess I’ll be the dissenting voice then. There are paper cuts, and I understand that people don’t like certain aspects of Liquid Glass.

On the other hand, as someone who deals with many Macs on a daily basis, stability has never been better. We see very few kernel panics and the core of the os is very solid. That doesn’t mean it’s perfect, but it is much better than it was even 10 years ago.

I agree. The way I’d see it, where Apple currently has issues are usability and UI/UX consistency. Which is very noticeable, since these are their traditional strength.

Software stability or compatibility, on the other hand, has never been Apple's strength, and it has been improving in recent years. Which is why I don't quite understand the calls to adopt a Windows-like development model. Windows can afford backward compatibility because the base OS is very minimal and generally crappy. Apple ships a massive amount of frameworks and software as part of the standard OS distribution. These are not really comparable. Not to mention that Windows has the legacy market share. Nobody, literally nobody would use Windows if it were a new system on the market, because it offers nothing noteworthy besides legacy support. That's not Apple's game.
 
So many chicken littles here, it's amusing. It doesn't matter HOW MUCH money a company has - there are always going to be problems & issues. I remember when Mac OS X first came out there were a ton of glitches & bugs, so much so that they issued Puma a mere six months later.

I recall (I think) Jim Dalrymple & John Gruber going over this and this question came up:

Is Apple's software quality actually getting worse, or is it that more and more people are coming over to the Mac platform, so various issues are more noticeable?
 