I did not mean to imply this, as there's a very long culpability chain. For this reason, I'm not sure it makes any sense to imprison individuals for this. A lot of people played a part in causing such chaos.
But it is something those of us who develop software that runs in e.g. hospitals and airlines should be very aware of, and it should receive more attention, instead of only the financial losses being brought up, which is what usually happens. I noticed the same with the big ransomware attacks.
Indeed, it's a pity that we need major failures like these for governments to finally start paying attention and hold software to the same kinds of laws as everything else, instead of careless EULAs and updates without field testing.
It's very bizarre to me how normalized we have made kernel-level software in critical systems. This software is inherently risky, but companies throw it around like it's nothing. And as the cherry on top, we let it auto-update too. I'm surprised critical failures like this don't happen more often.
I can't tell if you're serious or sarcastic, but there is such a thing as criminal negligence.
CrowdStrike knows that their software runs on computers that are in fricken hospitals and airports, and they know that a mistake can potentially cause a human death. They also know how to properly test software, and they know how to do staggered releases (the idea is sketched below).
Given what we know now, any reasonable person would conclude that the amount of risk they took when deploying changes to clients was in no way acceptable. People absolutely should go to jail for this.
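For anyone unfamiliar, a staggered (or canary) release just means pushing an update to a tiny fraction of the fleet first and only widening the cohort once that stage has run cleanly for a while. Here's a minimal sketch of the gating logic in Python; the names (host_id, update_id, rollout_pct) and the stage percentages are purely illustrative, not anything from CrowdStrike's actual pipeline:

    import hashlib

    def in_rollout(host_id: str, update_id: str, rollout_pct: float) -> bool:
        # Hash host_id together with update_id so each host gets a stable
        # position in [0, 100) per update. Raising rollout_pct only ever
        # adds hosts, so nobody flaps in and out of the cohort.
        digest = hashlib.sha256(f"{update_id}:{host_id}".encode()).digest()
        bucket = int.from_bytes(digest[:8], "big") % 10_000 / 100.0
        return bucket < rollout_pct

    # Widen the cohort stage by stage, e.g. 0.1% -> 1% -> 10% -> 50% -> 100%,
    # pausing between stages to watch crash/telemetry data.
    hosts = ["host-%04d" % i for i in range(10_000)]  # hypothetical fleet
    for pct in (0.1, 1.0, 10.0, 50.0, 100.0):
        cohort = [h for h in hosts if in_rollout(h, "update-42", pct)]
        print(f"{pct}% stage: {len(cohort)} of {len(hosts)} hosts")

The point is that a defect like this one would have bricked a handful of canary machines at the first stage, instead of hitting every hospital and airport simultaneously.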
This more or less originated with the unfortunately named MS Herald of Free Enterprise sinking (https://en.wikipedia.org/wiki/MS_Herald_of_Free_Enterprise) - after that incident, regulators decided that maybe they didn't want enterprise quite as free as all that, and cracked down significantly on shipping operators (though the attempt to prosecute its execs for corporate manslaughter did fail).