Interestingly, it's a really easy fix, but locked-down corporate computers won't allow users to apply it themselves.
 
Crowdstrike has posted a preliminary explanation of what happened.

https://www.crowdstrike.com/blog/technical-details-on-todays-outage/

As I suspected earlier, this was NOT a sensor update; it was a bad definition update. These definitions are issued multiple times a day to all devices to address new security threats. So this would be similar to how Apple issues patches to XProtect, although way more frequently. That is why there was no phased rollout.

The commonly accepted practice is no public phased rollout for definition files. But they can, should, and probably do run an internal phased rollout, so this should have been caught on their internal machines first, unless a bug or misconfiguration in their distribution system shipped a test (or not-yet-tested) patch.

Crowdstrike and Microsoft both will need to look deeply at what happened. Why did a definition update bring down the whole system? What can be done to protect against that in the future?

As I explained earlier, Microsoft's attitude has been more of "if it worked 30 years ago, it should keep working now." They don't want to change APIs that are used by so many vendors, which would require rewriting millions of pieces of software.

Crowdstrike is the easy one to blame.
 
Interestingly, it's a really easy fix, but locked-down corporate computers won't allow users to apply it themselves.
The “problem” is that if the machines are well configured, that “easy fix” shouldn’t be something end users are able to apply. So it is a lot of work.
 
The commonly accepted practice is no public phased rollout for definition files. But they can, should, and probably do run an internal phased rollout, so this should have been caught on their internal machines first, unless a bug or misconfiguration in their distribution system shipped a test (or not-yet-tested) patch.
Yeah, there is still a lot to be determined. I am certain that Crowdstrike is going to be doing a top-down review of their release process. There has to be some form of automated testing before a release goes out to the public. Was there a breakdown in the process? Did something get changed between the test and the final release?

However, this is a double-edged sword. There is a reason that their definition updates are released multiple times a day: the security landscape is way too fluid these days. The idea that you can protect yourself by "just not clicking unknown URLs" is not realistic in a modern computing environment. If I were a bad actor, I would be targeting organizations that use Crowdstrike right now, as they are likely going to slow definition updates, leaving computers susceptible to new malware.
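To make the "automated testing before release" idea concrete, here is a minimal sketch of a ringed (phased) rollout gate. Everything here, the ring names and the crash-rate threshold, is invented for illustration and is not how CrowdStrike's actual pipeline works; it just shows the kind of automated gate that halts a definition push when an early ring starts failing.

```rust
// Hypothetical sketch of a ringed (phased) rollout gate for definition files.
// Real release pipelines track many more health signals than crash rate.

#[derive(Debug, PartialEq, Clone, Copy)]
enum Ring {
    Internal, // the vendor's own machines
    Canary,   // a small slice of customer machines
    Broad,    // everyone else
}

struct RolloutGate {
    ring: Ring,
    // Maximum tolerable fraction of machines in the current ring that
    // crashed after receiving the update.
    crash_rate_threshold: f64,
}

impl RolloutGate {
    fn new() -> Self {
        RolloutGate { ring: Ring::Internal, crash_rate_threshold: 0.001 }
    }

    // Widen the rollout only if the current ring stayed healthy.
    fn advance(&mut self, observed_crash_rate: f64) -> Result<Ring, String> {
        if observed_crash_rate > self.crash_rate_threshold {
            return Err(format!(
                "halting rollout at {:?}: crash rate {observed_crash_rate} over threshold",
                self.ring
            ));
        }
        self.ring = match self.ring {
            Ring::Internal => Ring::Canary,
            Ring::Canary => Ring::Broad,
            Ring::Broad => Ring::Broad, // already at the widest ring
        };
        Ok(self.ring)
    }
}

fn main() {
    let mut gate = RolloutGate::new();
    // Internal machines stayed up: promote to the canary ring.
    assert_eq!(gate.advance(0.0), Ok(Ring::Canary));
    // Canary machines start blue-screening: the push never goes broad.
    assert!(gate.advance(0.5).is_err());
    println!("bad update stopped before the broad ring");
}
```

The point of the design is that a definition file never reaches the broad ring unless every earlier ring stayed healthy after receiving that exact file.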

As I explained earlier, Microsoft's attitude has been more of "if it worked 30 years ago, it should keep working now." They don't want to change APIs that are used by so many vendors, which would require rewriting millions of pieces of software.

Crowdstrike is the easy one to blame.
Totally agree. Windows is definitely dragged down by technical debt, and Microsoft takes a lot of deserved blame for it. But vendors and customers are also part of this. The time and effort to adopt new APIs is not cheap, and customers don't like having to switch.

Let's take Apple's opposite approach. Apple is not afraid to deprecate (and sometimes abruptly remove) APIs in the name of security. That is, overall, a good thing. But it often leaves vendors having to make major changes and customers having to hold off on upgrades. The obvious example is the transition from KEXTs to SysExts. Unfortunately, many large vendors did not have SysExts ready for Ventura. As a result, many organizations, including mine, had to delay adoption. It is easy for Apple to say "find a new vendor," but that is often not realistic. Many organizations use tools that are cross-platform, and running two different security tools is not a realistic option. The funny thing is that Crowdstrike is generally considered one of the best EDR security solutions, especially for macOS. It was small, efficient, and generally stayed out of the way.

(P.S. Thanks for bringing an intelligent conversation to this thread.)
 
It was only a few months ago that untold thousands of people were suddenly locked out of their iCloud accounts and required resets. I was lucky and resolved it within an hour or so, but I heard from people on these forums who were locked out entirely for weeks of going back and forth with Apple. Not great!

... and still, to my knowledge, Apple never acknowledged the incident or said what happened. That still rubs me the wrong way as a 30-yr cybersecurity veteran.
 
Mac people here ridiculing Windows apparently don't realize it's not a Windows problem, it's a Crowdstrike problem.
The fact that Mac computers aren't affected is by the grace of Crowdstrike, not Apple.

Go on now! Get off your high horse.
Regardless of whether you want to say it’s a fault of Microsoft or Windows or not, at the end of the day it affected Windows and SOLELY Windows. People are gonna ridicule that, and rightly so. Why shouldn’t they? You think affected users feel any better hearing “it’s not Microsoft’s fault”? They do not. All they know is they are being affected, and their competitors are not. And you may say it’s not Microsoft’s problem, but when so many of their users are experiencing an issue I would say it BECOMES a Microsoft problem, whether it’s their fault or not. I’m pretty sure Microsoft would consider that their problem.
 
What I find insane is the number of backend systems running Windows when something like Linux would be better suited. Not entirely a Mac vs PC thing but a world where companies use Windows instead of working on better solutions.
CrowdStrike botched Linux about a month ago, so it's not like they are much better.
 
I think Microsoft definitely needs to be looking at what happened and why their OS failed. From what I have seen, this appears to have been a definitions update (which is automatic) and not an agent patch (which CS admins can control in their own environment). Crowdstrike and MS will have to work together on the root cause.

For those saying companies should never apply patches without testing first: if this, as I suspect, was a definitions update, then it makes sense that it was silent. XProtect on macOS is exactly the same; those updates are silent as well. (Granted, they should never take the whole system down.)
The CROWDSTRIKE update had a bad memory pointer in it that they failed to validate before they deployed it. This is 100% on CROWDSTRIKE. Microsoft did not update or push anything.

If CrowdStrike had tested their code, or used Rust instead of C++, this wouldn't have happened.
 
It’s actually insane how much the world relies on Microsoft. Some people will take a cheap kick at them (“buy a Mac”, “Apple for the win”), but Apple stands no chance of ever coming close to Microsoft’s dominance. Entire countries would grind to a halt without them. This is just a taster of what could happen.

I know for a fact my work would never switch to Mac. They are using the bare minimum specs to run Windows 10. So yeah we aren’t about to buy Macs for everyone lmao. Also, it would be such a headache. People freak out at the slightest change so switching to a completely different OS sounds like a nightmare. I’m dreading the day when we move to Windows 11 (I actually like W11 and use it as my main OS).

My PC at work was fine today thankfully! Glad it’s the weekend so hopefully no issues on Monday.
It's kinda weird that so many use Windows for everyday work PCs and laptops. But what's rather concerning is the servers and "embedded" computers that are running Windows. Linux is a better choice for servers, flat out. Barring any weird hardware, it's also likely a better choice for most embedded systems.
 
Crowdstrike has posted a preliminary explanation of what happened.

https://www.crowdstrike.com/blog/technical-details-on-todays-outage/

As I suspected earlier, this was NOT a sensor update; it was a bad definition update. These definitions are issued multiple times a day to all devices to address new security threats. So this would be similar to how Apple issues patches to XProtect, although way more frequently. That is why there was no phased rollout.

Crowdstrike and Microsoft both will need to look deeply at what happened. Why did a definition update bring down the whole system? What can be done to protect against that in the future?

And hats off to my Windows counterparts who had to deal with it. I am sure many of you will be working the weekend to bring systems back up.

At our org, we just moved off Crowdstrike earlier this year, so the damage was limited to the small percentage of our computers and servers that had not yet been migrated to Defender. And, as the Mac admin, I didn't have anything to worry about.
It's already known. The file tried to write to memory it isn't allowed to access; as a result, Windows shut it down, which was the BSOD. If CrowdStrike had validated their code, or used Rust instead of C++, there wouldn't have been an issue.
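On the "validated their code" point, the contrast with safe Rust can be sketched in a few lines. The real channel-file format is not public, so this hypothetical parser (the function name and the blob layout are invented) just illustrates the class of bug: in safe Rust, a read past the end of a buffer becomes a recoverable error instead of a dereference of memory the process, or a kernel driver, doesn't own.

```rust
// Hypothetical sketch of bounds-checked parsing of a content/definition blob.
// In safe Rust an out-of-range read yields a recoverable Err, never a wild
// pointer dereference.

fn read_u32_at(blob: &[u8], offset: usize) -> Result<u32, String> {
    let bytes = blob
        .get(offset..offset + 4) // returns None instead of reading out of bounds
        .ok_or_else(|| format!("offset {offset} out of range for {}-byte blob", blob.len()))?;
    Ok(u32::from_le_bytes(bytes.try_into().unwrap()))
}

fn main() {
    let blob = [0x2a, 0x00, 0x00, 0x00, 0xff];

    // Valid read: little-endian 42 at offset 0.
    assert_eq!(read_u32_at(&blob, 0), Ok(42));

    // Truncated or corrupt input: the parser reports an error the caller can
    // handle, rather than crashing the host.
    assert!(read_u32_at(&blob, 3).is_err());

    println!("corrupt input rejected cleanly");
}
```

The design point is that every untrusted offset goes through `get`, which makes "forgot to validate" a compile-time impossibility for this access pattern rather than a code-review item.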
 
It's kinda weird that so many use Windows for everyday work PCs and laptops. But what's rather concerning is the servers and "embedded" computers that are running Windows. Linux is a better choice for servers, flat out. Barring any weird hardware, it's also likely a better choice for most embedded systems.
But it's just as fallible. It's not like similar issues have not hit Linux servers in the past.
 
Regardless of whether you want to say it’s a fault of Microsoft or Windows or not, at the end of the day it affected Windows and SOLELY Windows. People are gonna ridicule that, and rightly so. Why shouldn’t they? You think affected users feel any better hearing “it’s not Microsoft’s fault”? They do not. All they know is they are being affected, and their competitors are not. And you may say it’s not Microsoft’s problem, but when so many of their users are experiencing an issue I would say it BECOMES a Microsoft problem, whether it’s their fault or not. I’m pretty sure Microsoft would consider that their problem.
Back in April, the same thing happened with Linux servers (Debian + Rocky) and CS. Is this now also only a Linux problem?
 
Both Intel and Microsoft have "always" spent A LOT of resources preventing competition, such as the Intel compiler scandal where the code limited AMD performance, the document-format war to prevent "office competition," the killing of competitor contracts, Mono, buying GitHub, and so on.
You talk like this is limited to MS and Intel. Apple spent a metric f-ton of resources preventing hackintoshes from ever becoming a viable market and they continue to do everything they can to restrict competition on software distribution on their platforms.

They did the same stuff with gaming back in the day.
Like what? The PC gaming market "back in the day" was thriving with a huge array of game studios and hardware manufacturers making all kinds of gaming peripherals. If anything, the gaming market of today has far less competition, and Microsoft is only one player among the dozen or so that still exist. They certainly aren't the largest.

Did I mention BIOS?
Not sure what you mean by BIOS. That was originally IBM's thing in hoping to keep the PC market to themselves. Unless you're talking about UEFI and Secureboot which, being that this is an Apple forum and given Apple's own locking down of their hardware functionality with EFI interlocks, makes that particular "lock-in mechanism complaint" a bit rich. As was expected, Secureboot turned out to be a nothingburger and every major operating system released today runs perfectly fine on systems secured with it.

What it did was slow down the competition, which gave them time to do their own stuff without changing much. For instance, making it very difficult to use Linux clients with Microsoft servers, which own e.g. the public sector. The only reason I have had to use Windows for the last 25 years is document formats and public-sector Microsoft servers.
Not sure what you're talking about here. It's never been difficult to use Linux clients with Microsoft servers for anyone not completely inept at using both Linux and Windows. Heck, Microsoft themselves have ported some of their most critical server products (ie, SQL Server) to Linux and are working at making most of their other offerings OS agnostic. Whatever point you're trying to make here is based on things that haven't been an issue for over 20 years now. Your own last paragraphs contradict your first one - obviously there is no problem using Linux and Windows systems together, because you obviously do so yourself.

One consequence is the vast numbers of professionals working with Microsoft software, and they would of course not want to reeducate themselves on open source, Linux, Apple stuff and so on, which pretty much serves as barrier against competition.
Yet Google docs has been making massive inroads into both the public and private sectors. Perhaps the problem has never been people not wanting to "re-educate" themselves on a new toolset. It has always been the lack of functionality in that software, particularly the absolutely abysmal spreadsheet offerings that exist in both the FOSS and Apple worlds. Perhaps when LibreOffice can come up with a featureset that actually does more than my 2003 copy of AppleWorks, they'll be taken seriously by the corporate world. Until then, it's Microsoft's market to lose and apparently Google's market to gain.


As for Intel, the number of lies they have used to sell their ovens (performance, heat, power consumption and what not) are nothing less than fantastic.
Yes, Intel's offerings have been abysmal as of late, but they still have something that nobody else, including Apple, has - their own in-house fab. People have been predicting the demise of Intel for about as long as they've been predicting the demise of Apple. Be it losing to RISC (PPC and the Apple/IBM/Motorola alliance, Alpha, MIPS) in the late '90s, or to AMD's x64/Athlon in the early '00s. Perhaps the M-Series and ARM will be the final nail in Intel's coffin. But I highly doubt it.

Both Microsoft and Intel have been too full of themselves, confident they could keep on playing their game to stifle competition. Intel and Microsoft remain irrelevant for pads/phones; the rumors about Apple developing their own hardware were going on for years, but Intel was still too full of themselves to realise what was going on.
Not sure why "rumors of Apple developing their own hardware" was anything Intel should have been worried about. Apple was never a large enough customer of Intel's for Intel to be all that worried about losing them as a customer. Even today, AMD and Qualcomm remain more serious threats to Intel than Apple does.

One can start anywhere, but Windows 3.1 was heavily criticised by its users (including yours truly). Then came Vista, which was so bad that the critics of Windows 3.1 started glorifying it. Windows still isn't good.
Windows 3.1 is a pretty weird place for you to start, actually. Prior to 3.1's release, Windows was as big a failure as every other PC-based GUI on the market (Geos, TopView, GEM, DVx). Windows 3.1 was the first version of Windows to see mass-adoption and basically solidified Windows as the PC desktop GUI standard. Yes, there was definitely a lot of (very, very valid) criticism thrown its way, but it was good enough that everyone who saw it wanted it.

As far as today goes - Windows is a perfectly fine operating system. It gets its share of dirt thrown its way primarily because it is so widely used that every flaw discovered in it affects 100x as many people as a flaw found in Linux or macOS. But that doesn't mean that there are more flaws in it than in other OSes - it only means that those flaws actually have consequences because they are affecting computers that people actually use.

 
Linux is a better choice for servers, flat out. Barring any weird hardware, it's also likely a better choice for most embedded systems.
As long as the software that companies want to run is available for Linux, this is 100% true. Unfortunately that isn't always a luxury that companies have. Most companies are not software companies. They don't develop systems in-house and they don't (and shouldn't) consider host platform before considering functionality and support. And the fact is that even today, even in the "everything is in the cloud and 100% hosted" world that so many people think we live in, hundreds of critical systems are built for Windows and only Windows. That isn't likely to change anytime soon.
 
Yes, Intel's offerings have been abysmal as of late, but they still have something that nobody else, including Apple, has - their own in-house fab. People have been predicting the demise of Intel for about as long as they've been predicting the demise of Apple. Be it losing to RISC (PPC and the Apple/IBM/Motorola alliance, Alpha, MIPS) in the late '90s, or to AMD's x64/Athlon in the early '00s. Perhaps the M-Series and ARM will be the final nail in Intel's coffin. But I highly doubt it.

We're getting a bit off topic here but I want to add that Intel's latest chips are also catching up. All else being equal I would probably choose an M4 over a Core/Ultra but the latter have models with NPU in the 7-9 W range and models with comparable performance at the high-end. They don't quite match performance/watt but the gap is not quite like it was when the M1 first came out.

Looking at year-over-year improvements, it appears Intel completely stalled from ~ 2013 - 2018 (release years so I guess the engineering stalled back ~ 2010). 5 years is a long time in the chip industry though and they didn't leap forward right after that so incremental gains on top of 5 years of almost flat performance was not good. The only way people got performance was turning up the clock/heat and so our laptops began to boil.

However, it was that stalling that gave Apple the opportunity. If Intel's 10th-generation chips had been 2x faster in 2019, Apple would have had a much harder time making the case for an architecture switch.

My points are 1) we need to refresh our view of Intel since 2020, and 2) as long as new entrants can break into the market, it will all work out. That's why I am averse to anything that resembles lock-in.

One can start anywhere, but Windows 3.1 was heavily criticised by its users (including yours truly). Then came Vista, which was so bad that the critics of Windows 3.1 started glorifying it. Windows still isn't good.

Not sure why you jumped from Windows 3.1 to Vista? Many bad and a few not-terrible OSes in the middle. Windows 3.1/Windows for Workgroups were okay. Then came Windows 95/98 and Windows NT. Windows NT was pretty solid, though of course rarely found in the home, and I want to say not great for laptops either.

Windows ME -- quite terrible. An exercise in subsuming engineering to marketing and monopoly instincts, in my opinion.

Windows 2K -- finally a solid OS for everyone. I used this until I switched to MacOS X circa 10.1 (which I liked because, as a UNIX person, it was a pretty version of UNIX that ran Office).

Windows XP -- fine once the cartoon interface was turned off

Windows Vista -- skipped over like many others did

Windows 7 -- also solid like 2K/XP and the UI was reasonable with a few tweaks

Windows 8 -- also skipped over

Windows 10 -- I think the UI is uglier than 7, and the home version is bloated with adware and the like. Plus they also seemed to be trying to foist their cloud and the like on me.

Windows 11 -- Haven't used it, but from what I've read it's only worse as far as trying to turn my computer into a node on their network (and otherwise a channel for their content/services).


Anyway, at any given moment since the MacOS X 10.1 days, I've liked the available MacOS better than the available Windows though I dislike both for trying to push me to being a node on their network. While Crowdstrike is an enterprise product, the same risks are there with any platform when you give up control. While the trade-offs may be better for many people (possibly most people) that node on a network/locked-in ecosystem is not for me.
 
Windows 10 -- I think the UI is uglier than 7, and the home version is bloated with adware and the like. Plus they also seemed to be trying to foist their cloud and the like on me.
See, and I find the Win 7 UI just dated compared to Win 10. I've only ever used Pro and Enterprise versions, so I'll take yours (and others) opinions on the adware, but I haven't really found too many areas where I'm seeing a ton of Microsoft ads (of course, I have turned off search and widgets, which I think were the biggest areas where you would see those ads).

Windows 11 -- Haven't used it, but from what I've read it's only worse as far as trying to turn my computer into a node on their network (and otherwise a channel for their content/services).
I moved to Win 11 from Win 10 for only one reason - there is an integration with Teams that made providing software training slightly easier on Win 11 (you have the "Share this Window" link for applications when sharing a window on Teams that makes it easy to switch which application you're sharing). Overall, I haven't seen too much of a difference between 10 and 11 outside of 11's terrible Start menu and the changes they made to Contextual menus. The Start menu thing I fixed using Start11 and the hidden items in the Context menus I've just started to get used to. Would be nice if MS would bring back the Windows 10 Start menu, but otherwise I have no real complaints with Windows 11.
 
world that so many people think we live in, hundreds of critical systems are built for Windows and only Windows.

I would argue otherwise.
The more critical the system/service is, the more likely it (more precisely, the core of the service, most often just a database/datacenter) runs on something rarely used: Linux, *BSD.
For security reasons, the exact details are always hidden. If someone asks, they will be lied to: for example, HospidalsOS ver. 2.0567 (all data bluffed).
The operating system that everyone has falls away, because the OS that everyone has is the one most massively attacked.

//I went to school when lecturers taught that M$ Windows is not for mission-critical systems.
 
I would argue otherwise.
The more critical the system/service is, the more likely it (more precisely, the core of the service, most often just a database/datacenter) runs on something rarely used: Linux, *BSD.
Again, that's great, if the system you want to run is available on that platform. In many cases it is, in some, it isn't.

I'm not arguing about how things "should" be. I do agree, Linux, BSD are far better server operating systems than Windows. Given the choice, I'll take Linux over Windows any day. The reality is, however, that the choice of platform is going to be down the list a bit, with overall system functionality and alignment with business goals being far above it.

For security reasons, the exact details are always hidden. If someone asks, they will be lied to: for example, HospidalsOS ver. 2.0567 (all data bluffed).
The operating system that everyone has falls away, because the OS that everyone has is the one most massively attacked.
I mean, I understand that, this being a Mac forum, security by obscurity is something that protects a lot of people who like less-common operating systems. But it only goes so far in actually offering any real protection against attacks, and as soon as the obscure becomes the common, it is no longer providing that security.

//I went to school when lecturers taught that M$ Windows is not for mission-critical systems.
I'm pretty sure most of us in this forum went to school at some point or another and heard that same lecture from those professors who get to teach students how life would be in an ideal world. Some of us then spent a few decades in the real world, where we learn very quickly that striving for that scholarly kind of "ideal" comes with some very real costs and provides diminishing returns.
 
Again, that's great, if the system you want to run is available on that platform. In many cases it is, in some, it isn't.

I'm not arguing about how things "should" be. I do agree, Linux, BSD are far better server operating systems than Windows. Given the choice, I'll take Linux over Windows any day. The reality is, however, that the choice of platform is going to be down the list a bit, with overall system functionality and alignment with business goals being far above it.

It doesn't matter; users will only see the client application anyway. Depending on the benevolence of the system developers, it may exist for absolutely any OS, any computer-like thing you find in the IT store.

But the service core only runs on strictly selected servers. These aren't Windows-based.

I have seen both iMacs and HP PCs side by side in companies. To the question of how you can adapt when one is a "bitten apple" and the other is Hewlett-Packard, the answer was that it doesn't matter. The main thing is that the necessary client application works.
 
I came here to make all sorts of snarky comments about Windows but then I thought of the very real pain this debacle has caused people around the world, including IT professionals in this thread, and then I decided to pull my head out of my ass and do better.

I do have to say how disappointing it is for the media to use the words “Microsoft“ and “Windows” in practically every headline I read yesterday. It was almost clickbaity, using the name of the OS more people knew to whip up more hysteria.

Bottom line: I want to give thanks for everyone who found themselves giving everything they had to try to rectify this situation. You all are the best.
 
The CROWDSTRIKE update had a bad memory pointer in it that they failed to validate before they deployed it. This is 100% on CROWDSTRIKE. Microsoft did not update or push anything.

If CrowdStrike had tested their code, or used Rust instead of C++, this wouldn't have happened.
People are pointing out that this should not take down the entire OS. There should be a protected mode to fall back to so the system can self-correct. And don't you find this a bit of an issue? I assume you probably have dozens of applications installed, maybe even an antivirus yourself. All it takes is one bad pointer and it's game over?

I mean, Windows can recover from some GPU driver crashes without a BSOD, so why can't it recover from something like this?
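In user space, that "fall back and self-correct" idea is straightforward to sketch; the hard part is that the faulting code in this incident ran in the kernel, where there is no equivalent safety net, so Windows bugchecks rather than keep running with possibly corrupted kernel state. Here is a hypothetical user-mode version, with a bad definition file modeled as a panic during parsing (the version strings and function names are invented for illustration):

```rust
use std::panic;

// Hypothetical sketch of a supervisor that tries new definitions and, if
// loading them fails, keeps running on the last-known-good set. catch_unwind
// catches Rust panics in user space; a fault inside a kernel driver has no
// such net, which is why Windows blue-screens instead.

fn load_definitions(version: &str) -> String {
    // Stand-in for parsing/validating a downloaded definition file.
    if version == "channel-291" {
        panic!("invalid data while parsing {version}");
    }
    version.to_string()
}

fn load_with_fallback(candidate: &str, last_known_good: &str) -> String {
    panic::catch_unwind(|| load_definitions(candidate))
        .unwrap_or_else(|_| last_known_good.to_string())
}

fn main() {
    // Good update: the new version is adopted.
    assert_eq!(load_with_fallback("channel-290", "channel-289"), "channel-290");
    // Bad update: the supervisor keeps the last-known-good version running.
    assert_eq!(load_with_fallback("channel-291", "channel-290"), "channel-290");
    println!("bad update contained; still on last-known-good definitions");
}
```

This is roughly what GPU driver timeout-detection-and-recovery does for the narrow, specially architected case of the graphics stack; generalizing it to arbitrary kernel components is the hard engineering question the post is asking about.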
 