I know blaming everything on LLMs is in vogue right now, but this has much more to do with Microsoft very publicly firing the QA department[0][1] as a cost-savings measure and claiming developers would do their own QA (long before LLMs were on the scene). It started in 2014 and the trickle never stopped.
Microsoft has a cultural problem: it went from an "engineers" company to an MBA-directed one, trying to maximize short-term shareholder value at the cost of long-term company reputation/growth. It is very common and typical of US corporate culture today, and catastrophic in the long run.
The Ars Technica article was very good as a history of waterfall vs. sprint development using MS as a case study. However, the firing-the-QA-department narrative is not supported:
> Prior to these cuts, Testing/QA staff was in some parts of the company outnumbering developers by about two to one. Afterward, the ratio was closer to one to one. As a precursor to these layoffs and the shifting roles of development and testing, the OSG renamed its test team to "Quality."
Two QA per dev?? That seems ginormous to me. What am I missing about the narrative of evil corp sending all of QA packing? It seems unsupported here.
The second article, from Reuters, seems to be saying something different from the QA-firing narrative: it talks about the Nokia acquisition specifically and a smattering of layoffs.
I'm not defending layoffs or eliminating QA, and I'm deeply annoyed at Windows 11. I just don't see these sources as supporting the narrative that QA is kaput.
> Two QA per dev?? That seems ginormous to me. What am I missing about the narrative of evil corp sending all of QA packing? It seems unsupported here.
I think you're underestimating the QA burden for large parts of the company. When I worked in payments at MS, the ratio of QA to dev after the cuts was probably on the order of dozens to one, if not a hundred or more once you threw in Xbox/Windows/etc accessibility QA from across the organization and all the other people like lawyers involved in handling over a hundred jurisdictions. I was little more than a frontend line cook and even I had three QA people reporting directly to me; two of them helping write tests so they ostensibly should have been automating themselves out of a job.
There is a lot of manual testing when you have a complex system like that, where not everything can be properly stubbed out, emulated, or replaced with a test API key. They also have to be kept around to help with painful bursty periods (for us it was supporting PSD2, SCA, or 3DS2; I forget which). Payments is obviously an outlier because there is a lot of legal compliance, but the people I knew in Cloud/Windows also had lots of QA per dev.
I wouldn't be surprised if the degradation in feature parity of newer Windows software was a result of this loss of QA. Without the QA, the developers have to be less ambitious in what they implement in order to meet release schedules, and since they don't have experienced QA they can't modify the older codebases at all to extend them.
Remember also, they were doing an enormous amount of testing with third-party devices and software, which is what seems to keep blowing up most spectacularly.
Even if something works on 99.9% of computers, with a billion installs that's still a million dissatisfied customers.
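Back-of-the-envelope, the failure-rate math works out to:

    1,000,000,000 installs x (1 - 0.999) = 1,000,000 affected users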
In writing life-critical systems like the Space Shuttle's operating system, effectively 99.9% of all work is QA.
MS had the dominant operating system in the world, and keeping its userbase and its ~monopoly dividend would have been more profitable as a business than doing... everything it's done in the past twenty years. Selling software that all the people use all the time just has a lot less opportunity for growth than making new software, according to Investor Brain.
The Windows ecosystem is insanely complex. And they supported it, because of the focus on QA and testing the company adopted 20 years ago after the Blaster worm.
I have a few pretty awesome teams stuck managing Windows. They find bugs all of the time. The process of fixing them now practically requires a detachment of druids and Stonehenge to track where in the Windows/lunar/solar cycles we are and how to deal with the bullshit & roadblocks the support and product teams throw up. If you fall for their tricks, you'll miss the feature window… no fix for 18 months.
It used to be much easier as a customer in ye olden times, and I never felt that the counterparty at Microsoft was miserable or getting punished for doing their jobs. We feel that now as customers. You didn’t establish relationships with engineers like with other vendors, but there was a different vibe.
The company's focus moved to Azure, service ops, etc.
I worked in the Windows org around that time, and the dev/QA ratio there was closer to 1:1. QA did both manual testing and much of the automation and quality gates, and did regression testing against older versions of Windows. Given the complexity of the product, it is fairly easy for an inexpensive change to require an expensive test effort.
And honestly, that person deserves the same pay grade as a "normal" engineer. But sadly, most QA staff are underpaid and even treated as somewhat of an inferior class.
Instead, if the QA role were the dominant and better-paid title, you'd immediately see an improvement in that partnership. I don't think you need subordinate staff in the QA role at all.
And for what it's worth, I'm that guy. I am a strong technical software developer, but I would much rather test and poke at codebases, finding problems, working with a "lead" developer, and showing them all their quality mistakes. If I could have that role at my pay grade, I'd be there.
In the chip design world, 2:1 for design verification to design is on the low end of normal.
Some organizations have gone as low as 1:1 but that is considered an emergency that must be fixed. It’s so important that designers will be intentionally underworked if there are not enough validation engineers on staff.
When you can’t fix bugs in the field, quality is important.
QA is definitely one of those "you get what you pay for" things. A dev just bangs out code for the assumed "happy path", meaning the user uses it as the dev expects; let's face it, that's exactly what devs do when they "test": feed in clean data to produce expected outputs. QA has to somehow think of all the inane ways that a user will actually try using the thing, knowing that not all users are technically savvy at all. They are actively trying to break things, specifically trying to get unexpected outputs to see how things behave. At least, good QA teams are.
I worked with a QA person who, I actively told anyone who would listen, deserved a higher salary than I did as the dev. They caught some crazy situations, and the product was much better after fixing them.
> QA has to somehow think of all the inane ways that a user will actually try using the thing, knowing that not all users are technically savvy at all.
The classic joke (this variant from Brenan Keller[0]):
A QA engineer walks into a bar.
- Orders a beer.
- Orders 0 beers.
- Orders 99999999999 beers.
- Orders a lizard.
- Orders -1 beers.
- Orders a ueicbksjdhd.
First real customer walks in and asks where the bathroom is.
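The same visit, written out as runnable edge-case tests. This is a minimal Python sketch; order_beers and its limits are hypothetical, not from any real codebase:

    # Hypothetical order API, for illustration only.
    def order_beers(quantity):
        if not isinstance(quantity, int):
            raise TypeError("quantity must be an integer")
        if quantity < 0:
            raise ValueError("cannot order a negative number of beers")
        if quantity > 10_000:
            raise ValueError("order exceeds what the bar can stock")
        return f"{quantity} beer(s) coming up"

    # The QA engineer's bar visit, as tests:
    assert order_beers(1) == "1 beer(s) coming up"   # a beer
    assert order_beers(0) == "0 beer(s) coming up"   # 0 beers
    for bad in (99999999999, -1, "a lizard", "ueicbksjdhd"):
        try:
            order_beers(bad)
            raise AssertionError(f"{bad!r} should have been rejected")
        except (TypeError, ValueError):
            pass  # rejected cleanly, as it should be
    # None of which covers the real customer asking where the bathroom is.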
I feel that not only should QA staff outnumber developers, but QA staff should have access to development time to design and improve QA tooling.
If you're doing an OS right, the quality is the product. I think Mac OS X prior to the launch of the iPhone would be the gold standard for the kind of product design I'm talking about. At that time it was running circles around Windows XP/Vista in terms of new features. Apple was actually selling the new OSes, and folks were happy to pay for each roughly annual upgrade. Often the same hardware got faster with the newer OS.
Lately Microsoft and Apple are racing to the bottom, it seems.
I'm pretty sure the recent shitshow (at least in iOS land) stems from the failure to have tentpole Apple Intelligence features, so they're scraping the bottom of the barrel and shipping things that were in no way finished (e.g., the Liquid Glass UI/UX).
Microsoft used to have two distinct test roles:
1. SDETs (software design engineers in test) - same pay scale and hiring requirements as SDEs; they did mostly automated testing and wrote automated test harnesses.
2. STEs (software test engineers) - lower pay scale, manual testing, often vendors. MS used to have lots of STE FTEs, but most were fired in the early 2000s (before I joined in 2007).
An ideal ratio of SDETs to SDEs was 1 to 1, but then SDET teams would have STE vendors doing grunt work.
Having STEs as full-time employees benefited MS greatly. They knew products from the end-user and UI/UX perspective inside and out, in ways even the SDETs didn't.
UI/UX quality in MS products dipped noticeably after the STE role was eliminated.
Imho, there are two key values that I've seen QA bring to software companies.
1. Deep user/product expertise. QA (and support) almost always knows more about how users (including expert users) actually use the product than dev.
2. Isolation of quality from dev leadership politics. It should be unsurprising that asking an org to measure and report the quality of its own work is fraught with peril. Even assuming good intentions, having the same person who has been developing and staring at a feature for months test it risks incomplete testing: devs have no way to forget all the insider things they know about a feature.
The best places I've worked were places where QA reported up an entirely different leadership chain than engineering, and where they got their own VP with equal power as the engineering VP, and their own seat at the same decision-making table.
When QA is subordinate to engineering, they become a mere rubber stamp.
A good question to ask when joining a software company is "Does QA have the power to block releases over the objection of engineering?" I have found that companies that can answer YES to this put out much better products.
There was a real problem of QA becoming bloated and filled with less-than-qualified people. The really good engineers would transfer out to SDE orgs, and so the senior ranks of QA tended to be either true believers or people who weren't good enough to move to SDE orgs.
Especially with QA outside of Microsoft paying so much less at the time, it was a wise long-term career move to move to SDE as soon as possible.
2 people doing QA per dev seems insane even if it's a lot cheaper. M$ is hardly known for being obsessed with quality; they'd rather have 2 sales per dev (sales is even cheaper, it basically pays for itself).
That was in 2014; it doesn't explain the timing of these increasingly common broken patches. I had never gotten as many calls over Windows Update messes from my non-techie family as last year.
The lack of QA isn't felt right away. They are accumulating tech debt, which means problems become more frequent and harder to solve over time until they fix the fundamentals, and it doesn't feel like they intend to.
1. "isn't felt right away" then what's the correct timescale? Is it 2 years? Is it 5 years? We are looking at 10 years now. Do you have any studies on this that you can quote to prove that at Microsoft scale and for the product they develop, 10 years is the time when things go bad?
2. "becoming more frequent and harder to solve" how much more frequent and harder? Things works pretty fine during Windows 10, but these days I run into a bug in Windows 11 every other day myself.
It would be a surprise if this had more to do with the 2014 QA cuts than with vibe coding.
Updates breaking stuff already started when they moved from the security/bugfix-only updates to the add-new-features-into-the-mix model with Windows 10. That was roughly 10 years ago.
These are multipliers. First the QA left, but nothing major happened for years; automated tests sufficed. Then vibe coding happened, which, combined with the lack of QA, led to disaster.
I doubt "studies" exist and proving every little assumption takes too much effort as per Brandolini's law.
Windows is like a fractal of layered progressive enhancements. You can drill into esoteric Windows features and almost physically see the different decades Windows has existed in, not unlike a physical tree (one with leaves).
They won't fix the fundamentals, the next API layer will just be built over the broken one.
I'm waiting in morbid anticipation of the obvious next broken layer: They'll rename Windows to CopilotOS, and 90% of how you interact with the OS is through a LLM chat box. Of course, as is historically the case with Windows, there will be that 10% not brought into the new way, so you'll need to launch a traditional windows desktop+start menu to access that stuff. Just like 90% of the system today uses modern UI, but there's still that 10% using the legacy Windows look and feel, like the Run dialog and the Disk/Device manager.
Oh boy, in 2015 Windows 10 was released, and it was extremely broken, including endless reboot loops, a vanishing start menu and icons, system freezes, app crashes, file explorer crashes, broken hardware encryption, and many broken drivers; so really, it was about the same as now. Embracing LLMs and vibe-coding all around has made this even worse, of course.
Oh, yes. Windows 10 had big issues on arrival. But this is also selective amnesia. The Windows 8 UI was nearly unusable on release. Windows Vista was so legendarily broken on release that even after it became stable, the majority of technical users refused to give up Windows XP and went straight to Windows 7. And even Windows XP, which everybody fondly remembers, was quite a mess when it came out. Most home users migrated from the Windows 9x line, so they probably didn't notice the instability so much, but a lot of power users who were already on Windows 2000 held out until SP2 came out. And let's not even talk about Windows ME.
The only stable major Windows release this century that wasn't just a point upgrade was Windows 7, and even then some people would argue it was just a point upgrade of Windows Vista.
I'm sure that Microsoft greatly reducing their dedicated QA engineers in 2014 had at least some lasting impact on quality, but I don't think we can blame bad releases or bungled Patch Tuesdays on it without better evidence. Windows 10 is not good proof: consider that Vista had ten times as many issues with fully staffed QA teams in the building.
It also doesn't matter. It doesn't feel like it, but Win11 was released almost 5 years ago (October 5, 2021), and there are already rumors of a Win12 in the near future.
We're way past the "release issues" phase and into the "it's pure incompetence" phase.
Oh wow, I hadn't even paid any attention to that. To me, Windows 11 was released on October 1, 2024, when the LTSC version came out; that's roughly when I upgraded my gaming PC to said LTSC build from the previous Windows 10 LTSC build.
> Windows Vista was so legendarily broken on release that even after it became stable
Vista is different. Vista was _not_ bad. In fact, it was pretty good. The design decisions Microsoft made with Vista were the right thing to do.
Most of the brokenness that happened on Vista's release was broken/unsigned drivers (Vista required WHQL driver signing), and UAC issues. Vista also significantly changed the behavior of Session 0 (no interaction allowed), which broke a lot of older apps.
Vista SP2 and the launch version of 7 were nearly identical, except 7 got a facelift too.
Of course, the "Vista Capable" stickers on hardware that couldn't really run it didn't help either.
But all things considered - Vista was not bad. We remember it as bad for all the wrong reasons. But that was (mostly) not Microsoft's fault. Vista _did_ break a lot of software and drivers - but for very good reasons.
Vista was good by the time it was finished. It was terrible at launch. I bought some PCs with early versions of Vista pre-installed for an office. We ended up upgrading them to XP so that we could actually use them.
Yeah. I challenge the idea that Vista was terrible but 7 was peak: 7 was Vista with a caught-up ecosystem and a faded-away "I'm a Mac, I'm a PC" campaign.
I have this vague memory of people being shown a rebranded Vista and being told it was a preview of the next version of Windows, and the response was mostly positive about how much better than Vista it was. It was just Vista without bad reviews dragging it down.
> The only stable major Windows release this century that wasn't just a point upgrade was Windows 7, and even then some people would argue it was just a point upgrade of Windows Vista.
IIRC Windows 7 internally was 6.1, because drivers written for Vista were compatible with both.
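You can still see this from a Python prompt on an old box; a quick illustrative check (sys.getwindowsversion() only exists on Windows builds of Python):

    import sys

    # On Windows 7 this reports major=6, minor=1 (e.g. build 7601);
    # on Vista it reports 6.0.
    v = sys.getwindowsversion()
    print(f"Windows reports itself as {v.major}.{v.minor} (build {v.build})")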
Windows 8 was an insane product decision to force one platform's UI to be friendly to another (making the desktop more like a tablet). Mac is doing this now by unifying their UIs across platforms to be more AR-friendly.
Speaking of XP: SP2 is really when people started to like it. By the time SP2 and SP3 were common, hardware had caught up, drivers were mature, and the ecosystem had adapted. That retroactively smooths over how rough the early years actually were.
Same thing with Vista. By the time Windows 7 came out, Vista was finally mature and usable, but it had accumulated so much bad publicity from the early days that what was probably supposed to be Vista SP3 got rebranded as Windows 7.
It's a very superficial "truth", in the "I don't really understand the problem" kind of way. This is visible when you compare it to something like ME. Vista introduced a lot of things under the hood that radically changed Windows and were essential for follow-up versions, but it was perhaps too ambitious in one go. That came with a cost: teething issues and user-accommodation issues. ME introduced squat in the grand scheme of things; it was a coat of paint on a crappy dead-end framework, with nothing real to redeem it. If these are the same thing to you, then you're painting with a very wide brush.
Vista's real issue was that, while foundational for what came after, people don't just need a strong foundation or a good engine; most barely understand any of the innards of a computer. They need a whole package, and what they understand is "slow", "needs a faster computer", or "your old devices don't work anymore". But that's far from trash. The name Vista just didn't get to carry on the way almost every other "trash" launch edition of Windows did.
And something I need to point out to everyone who insists on walking down nostalgia lane: Windows XP was considered trash at launch, from UI to performance to stability to compatibility. And Windows 7 was Vista SP2 or 3. Windows 10 (or maybe Windows 8 SP2 or 3?) was also trash at launch, and now people hang on to it for dear life.
It delivered a terrible user experience. The interface was ugly, with a messy mix of old and new UI elements, ugly icons, and constant UAC interruptions. On top of that, the minimum RAM requirement was understated, so it was often sold on underpowered PCs, which made everything painfully slow.
Every version of Windows released was an unusable piece of garbage, back to the beginning. MS put it out, it was crap, but somehow managed to convince users that they needed to have it, patched it until it was marginally usable, then, when users were used to it, forced them to move on to the next.
There's a great talk that explains how code structure ends up looking like the org chart, with every subsequent organization chart layered on top producing spaghetti code. Windows is now old and full of spaghetti code. Then Microsoft laid off all the expensive seniors who knew the stack and replaced them with cheaper diverse and outsourced staff. Then the people who can't maintain the code use AI and just ship it without any testing.
> but this has much more to do with Microsoft very publicly firing the QA department[0][1] as a cost-savings measure and claiming developers would do their own QA (long before LLMs were on the scene). It started in 2014 and the trickle never stopped.
We know this was the correct move because Microsoft's stock price has gone up tremendously since 2014, those in the C-suite received massive bonuses, and the world's most efficient system for resource allocation has deemed it so.
It has been an MBA company for most of its life. If I had to draw the line, Windows 2000 seems to me the last engineer-driven product, and by then the company had already developed predatory habits.
> It’s an opportunity for other companies to take over imo.
This is a feeling commonly shared here.
I'd like to point out that IBM still dominates the large, billion-dollar mainframe market, almost 70 years after it invented it, despite continuous mismanagement for probably 40 of those years.
Microsoft dominates the PC market 40 years after taking it over with MS-DOS, and despite multiple debacles (Windows Millennium, Windows Vista, now Win 11, probably others I'm forgetting).
Microsoft dominates the office suite market 30+ years after taking it over with MS Office, despite some huge controversies (the Ribbon annoys nerds to this day). More than that, Microsoft has leveraged MS Office to become a close second cloud provider after AWS, despite starting far behind.
Google and Apple will probably dominate the smartphone and tablet markets for a long time, after taking over those markets 10+ years ago.
The market can stay irrational longer than you can stay solvent, and a company with a massive moat can outlive most of us. I'd actually turn this on its head: assuming a newcomer will topple the incumbent "any day now" is the irrational approach to a market.
> I'd like to point out that IBM still dominates the large, billion-dollars worth mainframe market
Companies continue to pay the IBM tax, but the way IBM writes support contracts incentivizes customers to work very hard at moving workloads to Windows/UNIX. IBM is choosing "Better to reign in [mainframe], than serve in [commodity compute]."
More competition is better. If you took that market share and revenue off the table and spread them around a competitive market, you'd be in a much more interesting spot with respect to technology advancements. Instead we continue to stagnate with bullshit like Windows 10 --> Windows 11. Windows 11 was never supposed to exist, but $$$$$. There's literally nothing worth paying for in that upgrade. But Microsoft knows it can milk businesses and schools out of ridiculous profits for essentially the same garbage, and also collude with hardware manufacturers to sell more PCs.
> There's literally nothing worth paying for in that upgrade.
Well, there is the violation of Fitts's law with the movement of the start button to the centre of the bar?
But it does make it look slightly more Mac! They should make sure the next upgrade moves the grab corner away from the actual corner, and that the cursor change for grabbing it doesn't always trigger, if they really want to rip it off.
I think Windows 11 in particular is a confluence of two other problems with respect to competition:
1. Subscriptions instead of discrete paid versions remove the incentive to put out a good product. In the past, if the new version was bad, it was a direct financial hit. Now there's no direct financial feedback loop, as long as it's not so terrible that you cancel the subscription entirely.
2. I think Windows 11 is the first time there's no other version of Windows still in support you can use to "ride it out"
I think all companies eventually mutate into an MBA company. At MSFT there was a culture from very early on that PMs, not engineers, should lead projects. I read in "Showstopper!" that Cutler was very much against the idea and pushed back. So even in the late '80s, MSFT was already an MBA-centered company. The only reason it hadn't degraded yet was that it hadn't achieved a monopoly position. Once it did, it started to coast on its success and quickly degraded into a quasi-feudal economic entity.
There seems to be a lot of internal factionalism that's showing up in the final product. I think this is a chronic disease that flares up every couple of years and is then clamped down on... but for whatever reason the lessons are never learned for long.
Some useful tech has come out of the development of VS Code that every other editor has been able to benefit from but I don’t rate it much as an editor any more.
It’s rare for MS to do just the embrace and extend part of EEE, unless Copilot is the latent implementation of ‘extinguish’.
Other than what they're doing to the whole open source ecosystem by buying GitHub, stealing all the code for their AI regardless of license, and renaming multiple adjacent things to "GitHub *".
> I know blaming everything on LLMs is in vogue right now, but this has much more to do with Microsoft very publicly firing the QA department.
Yes, yes, "agile" everything...
I remember clicking on a perfectly honest button in Azure DevOps (Production), and it told me that the button was completed but the actual functionality would probably be delivered in Sprint XY.
On a contrary note: if LLMs really are that helpful, why are QA teams needed? Wouldn't the LLM magically write the best code?
Since LLMs have been shoved into everyone's work schedule, we're seeing more frequent outages. In 2025, two Azure outages, then an AWS outage; last week, two Snowflake outages.
Either LLMs are not the panacea that they're marketed to be or something is deeply wrong in the industry
Yes, it is both. If something is forced top-down as a productivity boost, then it probably isn't one! I remember back in the day when I had to fight management to use Python for something; it gave us a productivity boost to write our tooling in Python. If LLMs were that great, we would be the ones fighting to use them.
> but this has much more to do with Microsoft very publicly firing the QA department[0][1] as a cost-savings measure and claiming developers would do their own QA (long before LLMs were on the scene).
I will never, ever understand this. Development and QA are two different mindsets. You _can_ do both, but you can't be great at both.
Microsoft fired their QA because at the end of the day, they are beholden to shareholders. And those shareholders want higher profits. And if you want higher profits, you cut costs.
It's not a culture problem. It's a 'being a business' problem, which unfortunately affects all publicly-traded companies.
Shareholders are, on average, not this activist. A CEO can in fact run a public company with a long-term outlook instead of pumping the numbers for just the next quarter.
That’s a cop-out though. Company boards are legally required to act in the best interests of shareholders, and plenty of shareholders would agree that running a business in a sustainable way that can deliver profits over the long term is more in their interests than a business trading its future for some short term profits.
It’s a cultural problem really, where too many people who study business and economics have been taught this idea that it’s a moral necessity that businesses maximise profit for shareholders (to the point where plenty of people even wrongly believe that’s a legal requirement!), but it’s an ideological position that has only caused once great companies to fail and huge damage to our economies.
I can't wait until we live in a better era where we look back with collective disgust at the blatant white-collar crime period ushered in by Friedman and Welch.
That, plus the current era, feels to me like a massive dog whistle for people who can't read satirical stories like A Modest Proposal without taking them as instructions.
> Microsoft has a cultural problem: it went from an "engineers" company to an MBA-directed one
Every simplistic analysis of a failing company X uses a hackneyed cliche like this, but in the case of MS it is completely ridiculous. MS has been renowned for shitty software since day one. Bill Gates won the '90s software battle based on monopoly, connections, and "first feature to market" tactics.
If anything, the heyday of MS quality was the mid 2000s, where it was occasionally lauded for producing good things. But it was never an engineers company (that's Boeing or whoever).
[0] https://arstechnica.com/information-technology/2014/08/how-m...
[1] https://www.reuters.com/article/business/microsoft-expected-...