81 Comments
Sharon Meyers:

I spent my entire career in software quality because that’s where the rubber met the road. My most gratifying stretch was when I had a manager who told developers, “If she says it doesn’t go out, then it doesn’t go out—you fix it.” Yes, we started working on some tools to automate some of the QA checking. But I maintained that software would not catch the bugs that are “hard to find,” the corner cases. That would take a creative human brain. I eventually quit over such issues. Completely quit—the company (a very large, very well known one), the discipline. I became a computer science teacher, and taught my “value beliefs” along with the details of coding and getting something to work right for a user. I have railed about testing problems for years, still do.

Denis Stetskov:

That deserves real respect, and it’s an example more people should remember.

You stood for something most teams forgot: that quality isn’t a phase, it’s a value system.

Udayan Patel:

Code coverage became a metric rather than a process. That propagated a culture where, instead of writing code to satisfy tests, tests are written to satisfy code. All too familiar.
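
(A minimal sketch of the difference, with hypothetical names: both snippets produce identical coverage numbers, but only the second one proves anything.)

```typescript
import { strictEqual } from "node:assert";

// Hypothetical function under test.
function applyDiscount(price: number, percent: number): number {
  return price - price * (percent / 100);
}

// Test written to satisfy the code: the line executes, coverage reads 100%,
// and nothing about the behavior is actually verified.
applyDiscount(100, 10);

// Test written to build confidence: the same call, but the behavior is
// pinned down, including a corner case.
strictEqual(applyDiscount(100, 10), 90);
strictEqual(applyDiscount(100, 0), 100); // zero-discount corner case
```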

I really appreciate that you are actually out there producing engineers who have some focus on quality at the beginning of their careers. Otherwise I usually see them come to work with an attitude of “who are you to tell me anything!” Or, when recommended books written by people who changed the way we write software, their response is, “I graduated and left books behind.”

Denis Stetskov:

That’s one of the best summaries of where we went wrong.

When coverage became the goal instead of confidence, we lost the essence of testing—to prove something works under pressure, not just under syntax.

The same happened with documentation, architecture, and even reviews—process became theater.

And you’re right about the new generation’s attitude. It’s not arrogance; it’s absence of lineage.

They inherited systems without inheriting the discipline that built them.

Jose:

A competent, motivated QA engineer calling BS on my broken crap is such a joy and a pleasure. I never understood engineering managers (and software engineers) who treat QA as the enemy. These people are trying really hard to help you. You should listen to them.

And it's not like any of this is new. Brooks extolled the virtues of QA in The Mythical Man-Month, if memory serves.

Denis Stetskov:

Agreed—but QA isn’t immune to the same problem.

There are brilliant testers who understand business logic better than PMs, and there are others who just file noise to stay busy.

Quality isn’t about role titles—it’s about shared intent.

throwaway:

The pipeline problem isn't just a problem in one niche industry. It's a problem everywhere, and the timeline is actually much shorter than people think.

There are many problems with AI, and you've discussed a number of them quite well. What you don't really touch on is the timeline, which comes down to economics. If you are not economically compensated, and there is no perception that you ever will be compensated for something, no one goes into the industry. No one invests at the individual level.

AI is a Ponzi that eliminates value. It does so in the same ways Mises writes about with regard to the inevitable failures of socialism (defined as a centrally organized economy where the means of production are owned or controlled by the state). Money-printed environments meet the definitional requirements for this, and I've seen nothing that refutes the core failure domains he identifies. While the route is circuitous and indirect, the incentive of negative-feedback systems being turned into positive-feedback systems drives this forward, and the trajectory is set; the outcome is just a matter of time.

If value is only ever based on potential human action, and only one party (producers) is enriched in an economy (sieving wealth into a few hands) while demand for work is reduced to zero, then no one gets hired and no one can pay for anything. Distortions warp perception itself, and economic calculation collapses into chaos.

Demand isn't need. It's the point where the supply and demand curves intersect, at a price at which money will actually change hands. Adam Smith covers this requirement in an economic context in The Wealth of Nations (1776). You can have great need and no demand if no one can, or is willing to, pay for something.
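
(A sketch of that requirement in textbook notation; the symbols are the standard ones, not the commenter's own.)

```latex
% Demand is realized only at a clearing price P^*, where the curves cross:
%   Q_s(P^*) = Q_d(P^*)
% Great need with no ability or willingness to pay means
%   Q_d(P) = 0 \quad \text{for all } P > 0,
% so the curves never cross and no money changes hands at any price.
```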

AI ensures there is no demand in factor markets, both in terms of cost per hour of throughput and in the imposition of additional costs caused by jamming communications networks with slop, tortiously interfering with legitimate workers and producers finding an equitable match-up. The same thing happens in cellular networks with RNA interference.

A good portion of the job market cycles out every 10 years in every industry. People retire, are injured, have life changes, or die. It's a pipeline that constantly empties.

If you can no longer find work because competition is high and interference suppresses wages, to the point where you have submitted well over 10,000 customized resumes, have a decade of direct experience, and at the two-year mark have gotten only 12 interviews (no offers) in return for that investment of time and effort, then the trend is set. (This has happened to multiple people I know since 2022.)

What happens? The most competent, most intelligent people retrain into something with economic benefit. They abandon their experience as a write-off because they have to. It's a sticky psychological decision, and while they won't stop looking or turn down a job, it's still a lost investment. This already started happening in IT in late 2022; the two-year mark was last year, and people are retraining.

Resilient systems break on a lag. Money-printing has decoupled the need to act. This is a cascade failure that's already been happening, and I've been talking about it, but all communications are jammed with AI sentiment slop because tech systems have isolated everyone from reality.

At year 5, you have no replacements. You can't really find qualified help. Brain drain is in full swing. By years 7-10, knowledge is lost. Your semi-competent, bottom-rung people burn out, and it all gravitates to the lowest common denominator, just like any other industry fueled by money-printing (e.g., education, government). No replacements. If you take an economic lens from Mises, you'll recognize this as the economic calculation problem (ECP) in factor markets. The other problems all stem from money-printing too. Cooperation based on ties to money prevents true economic calculation; GDP growth without datacenters stalls (0.1% for the year). Stagflation takes hold, and the only solution is helicopter money, but that's no real solution. The sieving also happens on the producer side: outsized concentration leads to corruption and large impacts from bad decision-making.

Money-printing leads directly to hyperinflation and then to deflation, or straight to deflation when it doesn't increase exponentially. Prices distort. Banking drives all of this. If you look at the changes made to banking in 2020, interest rates no longer control the bad actors. Fractional reserve was silently discarded in 2020 in favor of Basel III, a flawed system based on an objective theory of value (something that's been disproved). That is, it's called a capital reserve system, but there is no reserve: they set deposit reserve requirements to 0% in 2020.

The history of collapse and money-printing covers what we are seeing now. The effects are a problem domain where the cause runs far ahead of any indication of it, similar to avalanches or dam failures.

Stage-3 Ponzi is here, and these are just the symptoms. What happens to food production when you can't exchange anything for food because currency is worthless? Those who print money steal slave labor from those who hold it. People stop having children when they cannot support raising them. Lots of serious consequences. By the time the average person figures it out, it will be too late to act to change outcomes (hysteresis). Intelligent people should have caught this, but there's been a war against the intelligent for quite some time, to the point that very few exist compared to previous years. They used to raise everyone up around them, but that's been undermined by sophisticated techniques to fractionate and disunify the masses at scale, among other things (originating in torture, which destroys such people's minds through trauma).

The canaries have largely died silently, with little notice. It's a spectrum, sure, but the concentration has decreased precipitously. There are quite dark times ahead.

I've been thinking this series of events through for quite some time now, and it's probably the highest probability of Malthusian collapse we've had in documented history; though time will tell (and that's just my opinion).

Denis Stetskov:

Impressive depth, seriously. Most people wouldn’t spend this much time tracing the system-level incentives, and you actually did the math behind it.

You’re not wrong: the collapse won’t come as an explosion; it’ll come as exhaustion.

AI didn’t start it, but it’s accelerating every failing mechanism we already had: overproduction of noise, underproduction of value, and capital allocation detached from utility.

Money printing masked inefficiency for a decade.

Now energy limits, compute costs, and the hollowing of human expertise are removing that buffer, one industry at a time.

The scary part isn’t that collapse is coming.

It’s that it’s already been profitable for years.

throwaway:

Thanks. I'm a pragmatic systems engineer at heart, and I applied a lot of the same mental frameworks, methodologies, and skills I learned over the past decade of doing this work to tracking down the anomalies behind experiences conveyed by a number of good friends over the past few years (and some personal experiences too).

All in all, it is quite concerning. I would love to be wrong about this, but I cannot discount the objective support, as well as the anecdotal primary experiences.

In my experience, most systems today seek a level of control that snuffs itself out, to the exclusion of everything else. The incentives seem designed to leave the individual in a perpetual state of total compromise, tainted by corruption through dependency; which is madness, because it induces learned helplessness.

> It's that it's already been profitable for years.

Yes, but as with all Ponzis and deceptions, it may be profitable right up until the moment it's not, and that moment cannot be predicted; by some fairly basic systems properties, it is unknowable in advance. A small change in inputs leads to dramatic changes in outputs: definitionally, mathematical chaos.
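
(A toy illustration of that unknowability, not the commenter's own model: the logistic map at r = 4, where a one-in-a-billion difference in starting conditions makes the outcome unpredictable.)

```typescript
// Logistic map x' = r*x*(1-x) at r = 4, a textbook chaotic system.
const step = (x: number): number => 4 * x * (1 - x);

let a = 0.2;        // one starting condition
let b = 0.2 + 1e-9; // the "same" system, perturbed by one part in a billion

for (let i = 0; i < 50; i++) {
  a = step(a);
  b = step(b);
}

// After 50 iterations the trajectories have fully decorrelated:
// no realistic measurement precision on the inputs predicts the outputs.
console.log(a.toFixed(6), b.toFixed(6));
```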

As a general rule, I always look at systems taking the entirety of the lifecycle into account. I also don't think this cycle will complete for another 20 years, give or take, but it is now well past the point where we can influence the trajectory (i.e. in this case: loss of control, FMEA).

I intend to have children if I can find the right partner, and when I do, I want them to survive and thrive.

Kevin:

Laying off experienced devs to import cheaper, less qualified H-1Bs to save money, or offshoring to folks in a different timezone with no skin in the game, has also hit quality hard.

Denis Stetskov:

Exactly. You can’t outsource ownership.

When cost efficiency replaces accountability, quality becomes a rounding error.

Chris:

Software companies are locked in a race to the bottom that requires them to ship fast and check later. Anyone trying to follow the 'path forward' will, alas, be out-competed (reference: the entire history of Microsoft).

The solution is to make software makers liable for damage in the same way that car makers or toaster makers are. Once there are financial penalties for knowingly shipping broken products, the whole game changes, but currently they hide behind 100-page 'licences', etc.

Denis Stetskov:

Yeap. The lack of liability created a perverse incentive structure: “ship now, patch later” became the default business model.

In any other industry, externalities are priced in. In software, they’re written off as “technical debt.”

Until that changes, quality will remain optional and costless failure will stay the norm.

Natalie:

CrowdStrike “created” tens of billions of dollars by doubling in market cap in the year after blue screening the world.

Cloud capital is completely speculative. I tried for years to pretend that my job as an engineer was to build products that solve problems. It wasn’t. The job of tech engineers is to tinker over vaporware that provides a plausible cover for the hype cycle gambling.

The actual product of a cloud firm is the stock ticker.

Denis Stetskov:

That’s precisely why I’ve never worked for galley-style companies, where engineering turns into hype manufacturing.

If mine ever becomes one, I’ll be the first to leave.

AA:

The problem is deeper than this. "Accept that quality matters more than velocity" is not a solution, because the businesses that are succeeding do not value quality. We need to change the system of commerce to fix this. Consumers need transparency about quality issues and need to be taught to _care_ about quality issues. CrowdStrike's stock is doing great, and that's the problem. The incentives are broken because the system is corrupt and allows a company to put its customers at risk without consequences.

Denis Stetskov:

That could’ve been true, but sadly, it’s not.

Just look at today’s AWS outage. The reaction on social media? Memes and jokes.

No one’s asking serious questions anymore.

Philippe Cloutier:

Well, encyclopedias are a more promising place than social media to look for useful reactions: https://en.wikipedia.org/wiki/Amazon_Web_Services#Significant_service_outages

Philippe Cloutier:

Right, they need transparency and they need *to demand* transparency. Demand ITSs, and demand reliable reviews (no, not just a comparison of antivirus software by Consumer Reports).

Jose:

One thing worth mentioning is that we're all pretending Moore's Law still holds. It doesn't. Hardware isn't going to save us anymore.

https://cap.csail.mit.edu/death-moores-law-what-it-means-and-what-might-fill-gap-going-forward

Denis Stetskov:

Yep, the illusion that Moore’s Law still scales is the foundation of today’s software excess.

We kept writing heavier code, assuming hardware would bail us out.

Now both curves, compute and efficiency, are flattening at the same time. That’s when reality hits.

WhyNot?:

Consumption drives world peace, doesn't it? The more capacity you buy, the more must be manufactured, and the more workers get paid. Optimization? Less spending, fewer workers needed, more hungry people moving around and fighting for their future. Big Tech, by accepting and supporting unoptimized software, is the one paying for world peace.

Your observations are absolutely correct. As an IT veteran, I observe the same. A meltdown is coming, either from physics or from humans. Or maybe not. The Industrial Revolution of the 19th century may have looked similar to our ancestors. Did the world collapse? Well, not yet.

BTW, I wonder where Substack stores all these posts. How many abstraction layers are there between this website and the hardware it runs on? :-)

Denis Stetskov:

Yeah, we’ve pretty much already hit that wall.

For a decade everyone acted like compute, bandwidth, and electricity were infinite—the cloud made it feel that way. But AI broke that illusion. The models now consume resources faster than we can physically scale generation.

We’re not running out of innovation—we’re running out of power. That’s why new data centers are popping up in Africa: cheap land, cheap energy, and no one protesting environmental impact. The expansion isn’t about progress—it’s about finding the last places left where you can still plug in.

Missy Trumpler:

Best article I’ve read on this topic yet. It lays out the facts with real-world examples of the current consequences we endure and then ignore, and of the coming ones we won’t be able to.

AI hasn’t taken over yet…we are in a window of opportunity to do it right, to demand we do it better. I taught sustainability in business operations for nearly two decades.

As software is literally everywhere, the biggest threat to our environment, to our very ability to exist and function, is complacency.

Quality in software is not a luxury, it’s an existential imperative. As a collective we must do better, demand better from our suppliers, employers, and ourselves. AI isn’t going to guardrail itself. That’s our job. We’ve taken up that mantle at AgileAI Labs. Who’s with us?

Denis Stetskov:

That’s beautifully said — and I couldn’t agree more.

Complacency is exactly what turned “technical debt” into a business model.

Glad to see people like you pushing for sustainability in both systems and thinking.

Alan:

"Single most important factor is cost" to the developers (perhaps).

What about the cost to users running the software?

Personally I would rate the latter as more important.

Developers affect the cost to users, not the other way around.

And if the software is good and not throwaway, the usage time by all users will far exceed the development time. So developers should minimize the cost of use by minimizing bugs (test thoroughly?) and reducing resource use.

Denis Stetskov:

Totally agree. The real cost curve extends far beyond development.

When performance, reliability, or energy efficiency degrade, the “cheap to build” software becomes expensive to run — just not for the company that shipped it.

We externalized technical debt to users and infrastructure. That’s the hidden part of the quality collapse.

Andre Pierre Normand:

We are also seeing this in the current video game industry.

Haitch Five:

It's so much worse than that if you consider the various types of pointless integration, projects that fail and restart, the great rewrites, and so on... there's a whole lot more under the rug, and it's on the order of $1+ trillion globally. We burn the GDP of a mid-sized country every year on incompetence alone.

Denis Stetskov:

The real cost isn’t just bad code, it’s the endless rewrites, failed integrations, and systemic waste we treat as “business as usual.”

We burn trillions not on innovation, but on redoing what should’ve worked the first time.

TechFacts:

Great article, I really enjoyed reading it. I agree with many of your observations. It feels like a lot of software creators haven’t just stopped caring about performance optimization, but have even abandoned testing it. That’s how we end up with embarrassing issues like massive memory leaks in something as basic as a calculator app or in Spotify.
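
(For readers wondering how a calculator can leak at all, a minimal sketch of the usual pattern; everything here is hypothetical. A long-lived list holds a callback that captures each short-lived object, so nothing is ever garbage-collected.)

```typescript
// Hypothetical app state: a global listener list that lives forever.
const listeners: Array<(key: string) => void> = [];

interface Session {
  history: string[];
}

function openSession(): Session {
  const session: Session = { history: [] };
  // The closure captures `session`, and the listener is never removed,
  // so every session ever opened stays reachable: a steady leak.
  listeners.push((key) => session.history.push(key));
  return session;
}

// Each "open" grows the heap permanently, even after the UI discards it.
for (let i = 0; i < 100_000; i++) openSession();
console.log(`leaked sessions still reachable: ${listeners.length}`);
```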

FYI, I also referenced your article in one of the videos I recently made: https://www.youtube.com/watch?v=7Ox286TCjtU

Denis Stetskov:

I appreciate that and thank you for referencing it in your video. I'm glad to see this topic gaining traction beyond engineering circles; it really needs more public attention.

HelloItsMe:

Great article. One question - where is the 322% sourced from? The linked article may be incomplete or incorrect. Thanks!

Denis Stetskov:

hello, ty. https://apiiro.com/blog/4x-velocity-10x-vulnerabilities-ai-coding-assistants-are-shipping-more-risks/ Apiiro is a respected cybersecurity company founded by former Microsoft security director Idan Plotnik, known for its industry leadership and trusted research.

HelloItsMe:

Thank you!

Miles Fidelman:

Massively concurrent, agile, on steroids.

Billions of monkeys, pounding virtual keyboards, moving really fast & breaking lots of things, delivering continuously.

The results are inevitable.

Denis Stetskov:

AWS fired 40% of its DevOps team :)

Today's outage is a good example of what can go wrong :)

cbkwgl:

There is one more aspect to this. The office computer where development happens has at least 32 GB of RAM, but the typical laptop in the market is still suffering along at 8 GB. What works smoothly on your development machine and satisfies all the tick marks simply fails in the market because there isn't sufficient memory. Unless we assume hardware obsolescence through mandatory software updates is actually the goal, this disconnect between the office world and the market is another thing causing this problem.
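
(One cheap countermeasure, sketched under assumptions: a Node-based app and a made-up 512 MB budget. CI fails if the heap outgrows what a low-end machine can spare, instead of whatever the 32 GB dev box happens to tolerate.)

```typescript
// Hypothetical CI guard: assert the process stays inside a fixed memory
// budget that models a low-end user machine, not the developer's.
const BUDGET_MB = 512;

function assertWithinMemoryBudget(label: string): void {
  const heapMb = process.memoryUsage().heapUsed / (1024 * 1024);
  if (heapMb > BUDGET_MB) {
    throw new Error(
      `${label}: heap ${heapMb.toFixed(0)} MB exceeds the ${BUDGET_MB} MB budget`
    );
  }
  console.log(`${label}: ${heapMb.toFixed(1)} MB of ${BUDGET_MB} MB used`);
}

// Usage in an integration test: check after each memory-heavy step.
assertWithinMemoryBudget("after startup");
```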

Denis Stetskov:

Yeap, the classic "works locally" attitude :)
