When macOS had a price tag, and why it mattered.
For most of the Mac’s history, Mac OS (later Mac OS X) was very clearly a
product that complemented the hardware it ran on—with a price that reminded you of the work behind it.
.....
Then came:
- OS X 10.7 Lion (2011):
  - USD $29.99 via the Mac App Store (the first digital-download-only release of Mac OS X)
  - USD $69 on a USB install drive, single license only
- OS X 10.8 Mountain Lion (2012):
  - USD $19.99 via the Mac App Store
  - No USB option, no family pack; your Apple ID handled multi-Mac installs.
Then, in 2013, they dropped the price to $0:
- OS X 10.9 Mavericks → free.
- Every macOS release after that → free.
And right around that same time, in autumn 2013, iOS 7 was released, with a massive redesign and massive instability to match.
You can draw a line right through 2011–2013 and watch the quality begin its roll downhill.
Two large things are grossly missed here. First, the macOS upgrade rate also radically changed as the prices went down. Lower prices meant that more users upgraded to the newer OS substantially quicker. Releasing a product that folks don't buy until 3-5 months before the next product comes out is also a problem in the OS space. It leads to a highly fragmented user base. In a relatively small ecosystem, that leads to development issues for the 3rd-party developers, since they are often aiming at the 'lowest common denominator' group of end users.
[ Case in point: Microsoft had to make Windows 10 a 'free' upgrade to try to get some folks off of Windows 7 (multiple versions back). ]
Second, macOS was not particularly licensed as a separate product at all. It was always tied to use on Apple Mac hardware. (Yes, there were Hackintosh folks in the Intel era, but that was NOT part of Apple's formal 'product' model.) Apple sells systems, not bare hardware or bare software. The macOS license is directly coupled to the system that end users buy. That initial version and the 5 +/- 2 years of updates Apple assigns are all charged up front. The OS costs get paid for by sales of the Mac (the whole thing, including the preinstalled software). The upgrades are pragmatically 'charged out' as they are rolled out.
The deeper Apple got onto a more commodity platform, the more that was going to be the case. Hackintosh folks waving their CD-ROM of Mac OS X at Apple saying "the software is entirely separate from the hardware ... I can run this on my Hackintosh" is entirely not Apple's business model. Nor is it the legal foundation they were/are trying to lay down. Their position is that the two are one thing. (In part to get around 'bundling constraint' laws, and in part because they are not trying to make everything for everybody.)
Apple doesn't need a wacky DRM scheme, because if someone is operating a Mac (of a supported age), it doesn't matter whether they bought it from Apple, it was transferred, or it took whatever funky road of ownership flip-flopping. It dials home for an update and it gets it or not. Done.
From Complementary Product to “Part of the Package”
When macOS had a price tag:
- There was a clear economic signal:
“This took effort. It funds a team. It justifies taking our time.”
- Apple could justify longer development cycles and refinement releases like Snow Leopard.
- The OS was treated as a flagship product, not just a side menu item.
Nope. That is more trying to copy 'monkey see, monkey do' techniques over from Microsoft ... when Apple is not Microsoft. Apple's OS is only supposed to run on Apple's hardware. Microsoft didn't particularly sell computers, so of course they had a 'separate' product. What is the point of a separate product when it only runs on the Apple product it is sold with?
This became even more nonsensical as macOS was slimmed down and ported to the phones and iPads. Whose other OS is going to run on this custom hardware? Nobody's. And there is an even bigger threat of a 'monopoly bundling' slap-down in the iPhone market.
Once macOS became free:
- It stopped being a revenue line.
Buzzz.... not true.
- It became a support pillar for hardware and services.
Not really true. You are separating things that Apple is trying very hard not to separate.
- macOS releases aligned more tightly with marketing dates and event calendars.
ROTFLMAO. Like macOS wasn't coupled to WWDC all along. You are almost completely detached from history here. WWDC is in part a marketing event. The first day of the conference certainly is. The rest is more technical. But hand-waving that WWDC is all technical is smoke.
Go back and look at the macOS release dates, and go back and look at the WWDC dates. There is some +/- 2 months of slop in the somewhat earlier years, but mostly we are talking about August-October for the releases.
[ NOTE: end-of-support dates are largely relative to WWDC also, usually landing after the newer version has had an x.x.1 or x.x.2 bug-fix release. ]
WWDC has been in June from 2003 forward (typically a bit behind the big Taiwan computer show at the end of May).
The initial version (10.0) was March 2001, but to a large extent it wasn't the main "macOS" at that point. 10.4 was the only other 'odd duck' (April 2005). And surprise, surprise, surprise: WWDC 2005 announced the switch over to Intel and released a Developer Transition Kit.
iOS: “Free” on the Surface, Paid for Behind the Scenes
....
- You couldn’t install iOS on anything but an iPhone (and later, iPad). Unlike the Mac, where on both PowerPC and Intel you technically could install alternate operating systems.
- The cost of iOS development is baked into the iPhone’s MSRP.
- iPhones are deeply integrated with carrier financing, contracts, and upgrades.
Somehow phones can bake it in, but it is impossible to do with Mac hardware ... There is no difference in terms of real sales aligned with the terms of the license. Hackintosh installs are a bigger issue on the Mac side, but that is not Apple's business model.
macOS, on the other hand:
- Runs on machines people often keep for 7–10 years.
- Lives in a world where customers might skip hardware upgrades.
Folks don't skip iPhone upgrades? Not really true. (When the phones were only sold on 'loan' contracts it might have appeared so, but it is not true.)
Similarly, the 7-10 year thing isn't true. Apple's support window does not go 10 years. I can't find it at the moment, but there was a WWDC in the past where Phil Schiller was bragging about how Mac upgrade cycles for average users were 1-2 years shorter than those of Windows PC users. Which is probably true, in part because Apple doesn't directly sell into the lowest-margin zone of the general PC market. Most folks with disposable income will upgrade on a faster cycle than that if the product hardware is substantively better. (So 4-6 years. Sooner for some, if the equipment is written off as a business expense and facing real computational constraints.)
When macOS stopped being a product you buy to complement your Mac and became “a de facto default part of the Mac experience,” it lost the internal leverage that says:
There is no Mac without macOS. If Apple shipped just RAW hardware, nothing installed, and you plugged it in ... it would not "just work" at all. The only reason a nominal end user has a functional system is that there IS software and hardware there. It is a system; not one or the other.
Annual Everything: When the Schedule Becomes the Master
We’ve had hints over the years, but let’s say it bluntly:
Once macOS went annual and iOS was already annual, it was only a matter of time before:
- iPadOS
- watchOS
- tvOS
- visionOS
…all fell under the same gravitational pull.
They effectively have the same kernel. Mostly the same libraries. The file system is converged. The GUI-level gaps are more like looking at the 'above the waterline' part of an iceberg. The Finder and the window 'look and feel' are not an operating system. That is far closer to being a facade than a foundation.
WWDC isn’t just a developer conference anymore. It’s a deadline. The September iPhone event isn’t just a showcase. It too is a deadline.
It never was just a developer conference. And it very quickly turned into a deadline, even before Mac OS X came along. WWDC being in May-June goes all the way back to the early 1990s.
When the date is immovable, you only have two levers left:
And what we’ve seen is:
- Scope rarely shrinks (because marketing wants “new, big, and exciting”).
- So quality takes the hit.
Scope has at least two factors. The 'new and shiny' is the more user-facing one. There is another substantive factor, completely missed here, in the scope of the hardware the OS has to cover. A bigger, broader user base with a wider set of devices ... is scope creep. An OS kernel running on 2-3 different CPU instruction-set platforms ... scope creep. Random, unvetted app software being downloaded into the system through a web browser ... scope creep (security vector expansion).
The other issue is the size of the overall operating system. One that consists of 50 libraries and apps is an entirely different scope than one that consists of 500 libraries and apps. With one or two orders of magnitude more complexity, the likelihood that everything ends up maturing at the same rate is small.
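A back-of-the-envelope sketch of that compounding, in Swift (the 99% per-component odds and the component counts are illustrative assumptions, not anything Apple has published):

```swift
import Foundation

// If each independently developed component has a 99% chance of being
// "mature" by the freeze date, the odds that the WHOLE release is mature
// shrink fast as the component count grows. (99% is an assumed number,
// purely for illustration.)
let perComponentOdds = 0.99

for componentCount in [50, 500] {
    let allMature = pow(perComponentOdds, Double(componentCount))
    print("\(componentCount) components -> \(String(format: "%.1f", allMature * 100))% chance everything is ready")
}
// 50 components  -> ~60.5% chance everything is ready
// 500 components -> ~0.7% chance everything is ready
```

Even granting each piece very good individual odds, the product across hundreds of pieces collapses toward zero. That is the whole 'maturing at the same rate' problem in one multiplication.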
What Apple Needs to Do to Fix the “Haus of Apple OS”
If Apple truly wants to prevent their “Haus of operating systems” from crumbling, I think they need to:
1. Freeze earlier and harder
No more “stuff this huge feature in two weeks before .0.”
If it’s not ready, push it to 27.1 or 28.
A clean freeze buys QA and stability.
Some stuff may appear ready if the QA testing is not comprehensive. If apps A, B, and C depend upon library Y, then yanking Y collapses a lot more than just Y, and all of those folks have to be rescheduled (a toy sketch of the cascade is below). Some aspects of the OS are more critical than others. If Y ships, but bugs in apps A, B, and C can be cleaned up later, then 'freezing' A, B, and C gets questionable.
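A toy sketch of that cascade, in Swift (all component names are hypothetical, purely for illustration):

```swift
// Toy dependency graph: pulling library Y out of a frozen release doesn't
// just remove Y; everything that transitively depends on Y is affected too.
let dependsOn: [String: [String]] = [
    "AppA": ["LibY"],
    "AppB": ["LibY"],
    "AppC": ["LibZ"],   // only indirectly tied to LibY, via LibZ
    "LibZ": ["LibY"],
]

// Everything that directly or transitively depends on `target`,
// computed as a simple fixed point over the graph above.
func dependents(of target: String) -> Set<String> {
    var hit: Set<String> = []
    var changed = true
    while changed {
        changed = false
        for (component, deps) in dependsOn
        where !hit.contains(component)
           && deps.contains(where: { $0 == target || hit.contains($0) }) {
            hit.insert(component)
            changed = true
        }
    }
    return hit
}

print(dependents(of: "LibY").sorted())
// ["AppA", "AppB", "AppC", "LibZ"] -- yanking Y reschedules all four
```

The point: a 'freeze' decision on Y is never local to Y. The schedule hit propagates to everything downstream of it.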
3. Developer betas should go back behind a paywall.
Not because Apple wants to be stingy or gatekeep, but because
developer betas were never meant for hobbyists, clout-chasers, or the “I want the new emoji early” crowd.
They were intended for:
- people who actually build apps,
- who know how to file proper bug reports,
- who understand logs,
- who test on secondary devices,
- and who won’t scream on social media when a beta (shocker!) acts like a beta.
Sorry, but more than a few "clout chasers" have money. The paywall isn't going to stop them. If the ad revenue is $200 and the beta access costs $20 ... done deal.
Also, you are not going to get qualified beta testing from professionals if they are on a different schedule than Apple. If they have their own development cycle to contend with, then farming out cycles mainly to test Apple's stuff probably won't happen. IF there is no education to go along with the new OS/libraries/etc. (e.g., WWDC), how are they going to use the new stuff to test it? Apple is not really asking them to write more QA test inputs for a harness they have already been given. It is asking them to test new code on new libraries. They have to write something to test something that didn't exist before.
At best, Apple gets some regression testing over 'old' stuff (or at least 'old' APIs).