You highlighted the wrong portion of this article.
If the CEO is making claims that the software is tested and certified, then the CEO should be able to prove that claim, no matter where the software lives. It is very reasonable to say, at face value, that the CrowdStrike testing pipeline was inadequate. There is a remote possibility that there were mitigating factors, e.g. some other common software update from another vendor, released right before, that contributed; given CrowdStrike's assurances, and understanding where it falls in most supply chains, I consider that to be bullshit. I personally haven't seen anything convincing that shows a strong and robust CI pipeline magically releasing this issue.
Now, shareholder lawsuits are bullshit in general, and, as someone constantly pushed to release without any fucking confidence, I think it's really fucking dumb to ever believe any software passes any inspection until you have actually looked at the CI/CD process in depth.
I mean, it was true. It's just that there was a bug in the automated testing software that let the bogus file go through.
They could have shown their testing/certification pipeline to investors, but it wouldn't have changed anything unless the investors had somehow been able to spot the bug in what they were shown.
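For illustration, here's a minimal sketch of what that kind of validator blind spot could look like; the file format, names, and checks are all invented, not CrowdStrike's actual code:

```python
# Hypothetical content validator with a blind spot (invented format and
# names, not CrowdStrike's actual code).

def validate_channel_file(data: bytes) -> bool:
    """Structural checks run on an update file before it ships."""
    if len(data) < 16:
        return False          # too short to hold a header
    if data[:4] != b"CHNL":
        return False          # wrong magic number
    declared_fields = int.from_bytes(data[4:8], "little")
    # BUG: the declared field count is read but never compared against the
    # payload that follows, so a file declaring more fields than it actually
    # carries passes validation and only fails later, when the component
    # that consumes it tries to read the missing field.
    return declared_fields > 0
```

Every input an auditor fed it would pass or fail exactly as designed; the gap only shows up on the one malformed file the checks never looked at.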
To add to that, I very much doubt any big company tests and verifies anything anymore.
Boeing ships planes with missing bolts and improper software, CrowdStrike pushes updates with no testing, and we've all seen Microsoft push updates that break stuff because there's no testing; and that's just what comes to mind.
That's how they maximize profits: get rid of testing environments, do minimal checks, and have the one guy doing 3 jobs at once just push it to production.
I've been in IT for the banking industry for over a decade and I promise you, we're all a missed cup of coffee or a comma away from another massive outage due to a program or network misconfig.
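To make the comma concrete, here's a toy example: one stray character in a config file and the service refuses to start. (The config shape is made up; the failure mode is not.)

```python
# One trailing comma is the difference between a running service and a page.
import json

good = '{"timeout": 30, "retries": 3}'
bad = '{"timeout": 30, "retries": 3,}'  # note the stray comma

json.loads(good)  # parses fine
try:
    json.loads(bad)
except json.JSONDecodeError as err:
    # In production, this is where the service fails to boot
    # and the on-call phone goes off.
    print(f"config rejected: {err}")
```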
As long as business culture is set on maximizing profits for the next quarter, I wouldn't trust a sales website about "verification" or "disaster recovery backups" any more than I trust a used car salesman.
That goes for CrowdStrike, but also for all of their competitors.
I’ve got friends at Boeing on DoD contracts. Not only is it waterfall, it gets tested hardcore. My experience in private industry is the exact opposite. A consultancy I know of just lost (pretty sure) a state contract after they opened shit up to the public because, surprise surprise, they didn’t test their infra changes.
Now I will say that when I have had to manage client SLAs and there is a cost to post-release defects and change requests, testing increases. Not to the level I’m super comfortable with (which is well below perfect, mind you; I like shipping more than once in a lifetime), but a bit more.
The CTO of a competitor, SentinelOne, was just on the security podcast Risky Business. He went deep into how his company does this.
Apparently, their client touches the kernel much less, so it is less likely to cause issues. They also have a large internal test bed that updates have to pass before they go out at all, and then, if they see a 2% failure rate during the wide deployment, the update is automatically stopped.
CrowdStrike does almost none of this. Their core client sits deep in the kernel, making it powerful and dangerous. They apparently do test on their local machines, but the company is an Apple shop, so none of the Windows updates were tested locally. The update that was pushed out started crashing computers immediately, but wasn't stopped for 78 minutes, and then only by manual intervention. That was long enough to crash 8 million computers across the world.
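For the curious, the staged-rollout guardrail described above boils down to something like this sketch; the cohort sizes and function names are illustrative, not SentinelOne's or CrowdStrike's actual implementation:

```python
# Sketch of a staged rollout with an automatic halt (illustrative only).

FAILURE_THRESHOLD = 0.02           # the 2% cutoff mentioned in the interview
COHORTS = [0.01, 0.05, 0.25, 1.0]  # fraction of the fleet per wave

def rollout(update, fleet, deploy, health_check):
    deployed = 0
    for fraction in COHORTS:
        # Expand to the next wave (always at least one new host).
        target = min(len(fleet), max(deployed + 1, int(len(fleet) * fraction)))
        for host in fleet[deployed:target]:
            deploy(host, update)
        deployed = target
        failures = sum(1 for host in fleet[:deployed] if not health_check(host))
        if failures / deployed > FAILURE_THRESHOLD:
            # Halt before the blast radius grows; a human decides whether
            # to roll back or fix forward.
            return f"halted at the {fraction:.0%} wave: {failures} unhealthy hosts"
    return "rollout complete"
```

Nothing here is exotic; the whole point of the comparison is that a guardrail this simple caps the damage at the first wave instead of 8 million machines.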