Historically, devices running a desktop OS have had longer support. This is why 5 years of support on a Mac is considered bad, but 5 years on a smartphone is considered good.
Phones start to fall apart in one way or another after year two, on average; the forced heavy usage is what does it. Five years of support for phones is fine. What's not good is having to pay $1000 every couple of years, but that has almost nothing to do with support.
I know people look at Windows 10, which easily supports 10-year-old devices, and say this should be the norm. But I'm not so sure. I've seen problematic laptops, and I personally have a Dell Latitude that runs the latest Windows 10 build, yet I can't install all the drivers: Dell's download page stops at Windows 8 and has no Windows 10 section at all, and I sometimes get bluescreens, especially when using standby. There's no assistance from Dell since that machine officially never got Windows 10, so I have to rely on the drivers installed by Windows Update.
If the laptop vendor's driver page for that laptop omits Windows 10, then that laptop doesn't truly support Windows 10. That said, any Ivy Bridge or newer laptop will have a drivers page for Windows 10. Generally, you'll get better mileage on Haswell and newer laptops with Windows 10, but Ivy Bridge is truly the oldest where you'll get support. A Sandy Bridge laptop will meet Microsoft's requirements for Windows 10 but not be truly supported by the vendor. And an Ivy Bridge laptop will still be able to run a supported version of Windows until 2025. Thirteen years for a computer is outrageously amazing. No one comes anywhere near that anymore.
So technically a 10-year-old Windows laptop will run an up-to-date Windows build, but 100% functionality is in no way guaranteed.
Actually, you're wrong about that. Again, 13-year-old laptops are supported (by the vendor, with drivers) for Windows 10.
The effort is about the same as installing Ventura on my no-longer-supported 2015 MacBook Pro, except that the MBP won't bluescreen, and sleep is reported to actually work just fine. And I think older (Retina) MacBook Pros that don't have Nvidia graphics also support Ventura just fine.
It's more work to get Ventura running on a 2015 Mac. Windows 10 will install on hardware whose manufacturer doesn't have drivers for it. Like you said, the experience isn't going to be great and stability issues are likely to arise, but you don't need to do any tweaking to get to that point. Just install Windows and go. The vanilla Ventura installer, on the other hand, will bark at you by default.
So if anything, I'm happy with how well macOS runs on older Macs, and I'll take my slow, hot-running dual-core 2015 MBP (still on Monterey) over a Windows laptop from that era any day. It looks brand new too, better than most Windows laptops today (top case replaced along with the battery in 2020).
You have clearly not run Windows 10 on a Haswell or Broadwell laptop then. I've NEVER had issues. Mind you, all of my PCs are business class and fetching drivers is a breeze. But the mileage is generally smoother, and unlike the rather inefficient Monterey, Windows 10 is smooth as butter: the fans don't roar every time I do something that wouldn't have caused the same strain on Big Sur or earlier.
The OP is asking for advice. The history supports 5-7 years of OS support. As we only have evidence of Apple supporting this, we can only base the future on this history. Thus, if the OP is asking whether his M1 will still be OS-supported in 4 years, the historical evidence suggests it will. Apple can change anything they want, but this is the imperial evidence to support a recommendation of at least 5 years.
Imperial evidence? Like evidence handed down from the emperor himself?
Yes, Apple generally supports things for AT LEAST that long. But basing it on that alone is unscientific at best. Even basing it on whether or not Apple still sells products with the M1 is unscientific at best. Apple makes the chips, and in all the cases where that's true (i.e., iPadOS, iOS, watchOS, tvOS), Apple keeps supporting the OS on a device until they deem that they can't anymore because the hardware is actually lacking. The A8 is still supported for tvOS and HomePod software. A8 support was dropped from iOS back in iOS 13 in 2019, and it was finally dropped from iPadOS in iPadOS 16. Clearly, Apple hasn't yet implemented anything in tvOS mandating that they finally nuke support for the Apple TV 4th Gen/HD.
Agreed. My point is that people who are using machines in the Pro world (CPU/GPU-intensive tasks) upgrade every 2-3 years. Again, this is not for a typical user but for the Pros.
It's not even typical for MOST Pros anymore! We're talking about people for whom a 28-core Mac Pro (2019) wasn't enough power in 2019! That's a really tiny percentage of even the community of high-end video/graphics/scientific professionals! VERY few people need an Mx Max SoC. Even fewer need an Mx Ultra. Even fewer need something beefier than that. And EVEN FEWER THAN THAT need to upgrade every three years because the software (which itself hasn't been optimized for those high-end configurations yet) is somehow demanding it. My point is that your take on how often high-end users should upgrade is either (a) skewed to the REALLY small number of absolute highest-end use cases or (b) several technology generations out of date.
I recommend replacing consumer-grade machines only when needed, typically 5-7 years.
To clarify, when I say "Pro" I mean people who are using their machines for pro-specific tasks. A secretary or office administrator does not need a pro-level machine. A graphic designer, video editor, animator, etc. likely does.
I know and work with graphic designers, video editors, and animators. None of them rocking an M1 Pro (let alone M1 Max) will need to upgrade when the M3 Pro and M3 Max inevitably become available. Hell, I'm sure they're still not even pushing the M1 Pro to its limits yet! Again, you're talking about extremely high-end use cases where 28 cores of Xeon or 20 cores of M1 Ultra is already not enough. That is such a small percentage of people working in those fields and those are workstations that are very unlikely to even be owned by individual users rather than businesses anyway.
Again, I structured my comments by stating that upgrading every 2-3 years is not meant for most. However, with rendering times, graphics, even application boot times, a two-chip-iteration upgrade often makes sense in the professional world, where time is money.
It doesn't make the same sense in 2023 that it would've made in 2018. The gains are not great enough to justify the spend; it would be a serious waste. No one with an M1 Max 16-inch MacBook Pro needs to get an M2 Max 16-inch MacBook Pro. And there won't be enough of a difference between the M1 Max and the M3 Max to really justify the expense, except for those who should've gotten an M1 Ultra Mac Studio anyway. But even then, performance gains are not projected to be that significant.
Your stance made way more sense back in the PowerPC G5 days, or even the early days of the Mac Pro, when generational processor boosts mattered over that short a time. Two processor generations is nothing nowadays. Even so, there's no serious jump coming down the road that compares to the jump from, say, 9th Generation H-series Intel processors to the M1 Pro and M1 Max. We're back to minimal jumps.
The OP asked for advice. I am providing advice based on my 20 years of experience as a video editor and IT admin for an office with nearly a dozen non-editing machines. You may disagree; that's OK. But, again, in my experience, non-editing machines get upgraded every 3-5 years and the editing machines get upgraded every other chip iteration.
Yes, but just how many post-production houses have you worked for? I don't doubt that one place upgrades that aggressively. But I will absolutely argue that is not the average at all, and I've worked for a bunch of them. IT shops will generally support a machine at least through the end of its warranty period; beyond that, they won't keep it past the point where it needs a paid repair (because that is wasted money). For Apple, up until recently, that was a hard three years. Now, if you are an enterprise, it's four. And regardless, they'll renew to support you for up to five.
Interestingly, as a side note, the organization I work with has offices around the country with hundreds of employees doing fairly basic office work (educational events and fundraising). The policy is that employees are entitled to a new machine every 3-5 years, budget dependent. The few of us doing video work have to apply for a special exception, since we don't use the standard machines offered (Dell, Lenovo, MBA, or MBP).
Again, that's standard IT practice at most places. 3-5 years unless your needs justify getting it sooner.
Wild take: M1 support will end together with M2.
M3 will be on another support timeline.
You're welcome.
Considering how similar M2 is to M1, this is not much of a stretch. There are only a very few obvious differences in the base M1 that could possibly make it get dropped earlier. But even in that case, I see no reason why M1 Pro, M1 Max, M1 Ultra, M2, M2 Pro, and M2 Max wouldn't then still lose support together. The base M1 has LPDDR4X RAM, while all other Apple Silicon Mac SoCs have LPDDR5. The base M1 doesn't have any ProRes encode/decode engines; M1 Pro, M2, and M2 Pro have one, M1 Max and M2 Max have two, and M1 Ultra has four. I have no clue what OS change Apple would introduce that would necessitate either of these things, but if that's where they draw the line, then I could see M1 getting dropped before the others. Otherwise, I totally buy that M1 Pro/Max, M2, M2 Pro/Max, and M1 Ultra will all lose support at the same time.
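To make that concrete, here's a toy Python sketch that encodes only the differences called out above (the RAM types and ProRes engine counts are this thread's claims, not an official spec sheet) and checks which SoCs a hypothetical OS requirement would leave behind:

```python
# Feature table as described above -- thread's claims, not an official spec sheet.
socs = {
    "M1":       {"ram": "LPDDR4X", "prores_engines": 0},
    "M1 Pro":   {"ram": "LPDDR5",  "prores_engines": 1},
    "M1 Max":   {"ram": "LPDDR5",  "prores_engines": 2},
    "M1 Ultra": {"ram": "LPDDR5",  "prores_engines": 4},
    "M2":       {"ram": "LPDDR5",  "prores_engines": 1},
    "M2 Pro":   {"ram": "LPDDR5",  "prores_engines": 1},
    "M2 Max":   {"ram": "LPDDR5",  "prores_engines": 2},
}

def dropped_by(requirement):
    """Return the SoCs a hypothetical OS requirement would leave behind."""
    return [name for name, feats in socs.items() if not requirement(feats)]

# Hypothetical cutoffs -- nothing Apple has announced:
print(dropped_by(lambda f: f["prores_engines"] >= 1))  # ['M1']
print(dropped_by(lambda f: f["ram"] == "LPDDR5"))      # ['M1']
```

Either hypothetical cutoff strands only the base M1, which is why the "M1 and M2 go together" take isn't that wild.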
The Intel Macs were supported on average for 5 years after Apple stopped selling them:
https://arstechnica.com/gadgets/202...es-than-they-used-to-heres-why-its-a-problem/
I believe that average is higher. A lot of them got 8 years of the latest macOS and 10 years of security updates.
If anything, Apple now has the means to support their devices for a longer time.
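For what it's worth, the "years after Apple stopped selling them" math is easy to sanity-check yourself. Here's a minimal Python sketch; the models and years in it are illustrative placeholders, not a verified dataset, so swap in the real discontinuation and last-security-update years for the Macs you care about:

```python
from statistics import mean

# Illustrative placeholders only -- NOT verified Apple support data.
# Each entry: model -> (year Apple stopped selling it, last year it got security updates)
macs = {
    "Example Mac A": (2016, 2023),
    "Example Mac B": (2017, 2024),
    "Example Mac C": (2018, 2025),
}

# Years of updates each model received after it was discontinued
spans = {model: last - discontinued for model, (discontinued, last) in macs.items()}

for model, years in spans.items():
    print(f"{model}: {years} years of updates after discontinuation")

print(f"Average: {mean(spans.values()):.1f} years")
```

Whether the average comes out at 5 years or higher depends entirely on which models and which dates you feed it, which is why the article and the replies here can both sound right.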
Apple GENERALLY drops support when at least one of three things is true:
- A component is no longer supported by the vendor and Apple needs to update the driver for it
- Apple changes the OS in a fundamental way such that a key hardware feature is required (for Ventura, that seems to be hardware-based HEVC decoders on the processors)
- It is more trouble than it is worth to keep supporting hardware and/or the user experience sucks.
This has been consistently true (possibly with the one exception of where Apple drew the line with macOS Monterey; allowing some Haswell systems, but not others). This has especially been true on the iOS and iPadOS side where there is only Apple Silicon and no third party CPU or SoC.
Apple is going to drop support for the M1 SoC when there is a hardware feature that it deems vital for a future macOS release that the M1 SoC doesn't have. What that feature is? Who knows. M1 isn't that old, and Apple still has to figure out when it's dropping Intel support first.
And, although they've been doing this at each release since Catalina, it's not always a given that Apple will change the minimum system requirements. Mountain Lion (10.8.x), Mavericks (10.9.x), Yosemite (10.10.x), and El Capitan (10.11.x) all had the exact same minimum system requirements. Similarly, Sierra (10.12.x) and High Sierra (10.13.x) both had the exact same minimum system requirements. Even Mojave (10.14.x) and Catalina (10.15.x) were virtually the same on minimum system requirements (with the only exception being Apple dropping support for the 2010 and 2012 Mac Pro towers, presumably due to the aftermarket video card requirement being unwieldy to support). From Catalina to Ventura, Apple has moved system requirements up. One can only presume that part of that is to eventually work towards macOS being Apple Silicon only (as supporting both architectures probably costs extra development resources as well as results in an OS that takes up more space on disk).
Then again, with the iPhone they only offer the latest iOS for 5-6 years from the time of release anyway, so Apple could actually reduce support times if they wanted to. Unless Tim Cook happens to stop by and clarify this, we won't know for sure.
Apple drops iPhone models from support for the same reasons. The iPhone 6S and 6S Plus held the record for longest supported iPhone and were dropped from support at the exact same time as the iPhone 7 and iPhone 7 Plus. And that's because features in A11 became a must, not because the phone reached a certain age.
In any case, expect your M1 Mac to run the latest apps until 2030, maybe a couple of years more. Personally, even though it's obviously important to provide new software for older hardware, I replace my Macs after 5-6 years, 7 years at the very latest, so I've never had any issue with this. My now very slow 2015 Mac with a dual-core Intel CPU is still on Monterey, and that poor CPU is really struggling. I don't see any point in upgrading to Ventura (which is possible with OCLP, though not supported by Apple); it already struggles with playing YouTube videos and makes excessive fan noise in summer.
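On the "until 2030" bit: if you apply the more optimistic figures from earlier in the thread (roughly 8 years of the latest macOS from a model's release, plus about 2 more years of security-only updates), the back-of-the-envelope math lands right around that date. The year counts below are assumptions drawn from this thread, not anything Apple has committed to:

```python
# Back-of-the-envelope projection for an M1 Mac.
# The year counts are assumptions taken from this thread, not official figures.
M1_RELEASE_YEAR = 2020        # first M1 Macs shipped in late 2020
YEARS_OF_LATEST_MACOS = 8     # assumed: years a model can run the newest macOS
YEARS_OF_SECURITY_ONLY = 2    # assumed: trailing years of security-only updates

last_major_macos = M1_RELEASE_YEAR + YEARS_OF_LATEST_MACOS        # ~2028
last_security_update = last_major_macos + YEARS_OF_SECURITY_ONLY  # ~2030

print(f"Newest macOS until roughly {last_major_macos}")
print(f"Security updates until roughly {last_security_update}")
```

Third-party apps usually keep supporting the last couple of macOS releases for a while after that, which is where the "maybe a couple of years more" comes from.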
Monterey was not the most efficient macOS release. Big Sur performed WAY better on hardware that tops out at Monterey. In fact, machines that top out at Big Sur run Big Sur way better than machines that top out at Monterey run Monterey. That's a byproduct of Monterey more than of the hardware.