AI code assistants will cause great harm to businesses over the next decade or so. Why? Because nobody will understand how their software works, what exactly it's doing, whether it's actually secure, in what ways it may fail to scale, and many more particulars. To some approximation AI could answer these questions, but good luck with the maintenance. Over years, widespread and deep use of AI coding will devastate the software development ecosystem companies rely on for healthy cyber-infrastructure – not just application code but operations and security.

What's the problem with AI coding? It's not only that the developers never wrote the code for a specific application, and therefore don't understand it. Younger competent developers may not even show up to make the attempt – overuse of AI coding assistance everywhere will stunt skill development in general, shrinking the pool of new hires. That's fine at first, of course, because AI is there to fill the gap. But it won't end well.

Theory Building

A while back I wrote a rather unfocused post “Software Development is Developing Knowledge.” In my post I discussed why Agile methods work – if they’re truly “agile” in the literal sense – and what that really tells us about the nature of software development: It’s what we learn while building, not before.

Later, I learned about Peter Naur's "Programming as Theory Building." It's a must-read, and I can't do it justice in a few sentences. From the author:

The present discussion is a contribution to the understanding of what programming is. It suggests that programming properly should be regarded as an activity by which the programmers form or achieve a certain kind of insight, a theory, of the matters at hand. This suggestion is in contrast to what appears to be a more common notion, that programming should be regarded as a production of a program and certain other texts. – Peter Naur, "Programming as Theory Building"

For a related take, see "Software Design is Knowledge Building." It frames its titular conclusion as a story of maintaining a fairly new, good piece of software and why it can fall apart: it was the knowledge built up during the course of building the application that made the original developer so much more effective than their replacements.

Large-scale use of AI to code will remove the part of the activity most important to the health of the software and the people who rely on it. It's not the writing itself, but the theories and tacit knowledge that get built along the way. Documentation can help, but it typically doesn't capture the right set of theories and tacit knowledge. The story in "Software Design is Knowledge Building" illustrates this point well. It's a pattern we all know, one that plays out year after year – a real problem to which documentation isn't a good solution, or we'd know by now how to prevent it.

It's like renting a robot to run on a treadmill for you. No lung capacity or muscle gets built. Now, this isn't a perfect analogy, since the result of AI coding is (apparently) working software, whereas the robot "exercise buddy" is ridiculous (unless you attach a pedometer or your iPhone to it to cheat on your health care discount by having it get steps for you!).

No, the AI-assisted software is much, much worse. With the robot, we'd all know we were deceiving the health insurance company, and we wouldn't be fooling ourselves. AI coding, on the other hand, produces real output, and that's what makes it so seductive – you're given what looks like a good partial solution, or even a full working program, in no time. But it's empty calories. Used judiciously, it's probably fine: nobody learns much from directly converting 100 two-line formulas from Scala to Rust if they already know both languages. It's simply tedious. (You'd better have tests, though.) But use AI too often and it's like never walking farther than required to get to your car.
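To make that concrete, here's a hypothetical instance of one of those "two-line formulas" – the function name and formula are my own illustration, not from any particular codebase. The mechanical port from Scala to Rust teaches nothing, but a cheap sanity check is what catches a silent slip like integer division or operator precedence:

```rust
/// Monthly payment on a fixed-rate loan (standard annuity formula).
/// A trivial port to Rust – the kind of conversion AI handles fine,
/// provided a test guards the result.
fn monthly_payment(principal: f64, annual_rate: f64, months: u32) -> f64 {
    let r = annual_rate / 12.0;
    principal * r / (1.0 - (1.0 + r).powi(-(months as i32)))
}

fn main() {
    // Cheap sanity test: at near-zero interest the payment should
    // approach principal / months.
    let p = monthly_payment(12_000.0, 0.0001, 12);
    assert!((p - 1_000.0).abs() < 1.0);
    println!("{:.2}", monthly_payment(250_000.0, 0.06, 360));
}
```

The test doesn't prove much, but it's enough to catch the kind of one-character translation error that mechanical conversion – by human or AI – tends to introduce.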

Like consuming empty calories (all white bread, let's say), you can coast along for a while before malnutrition sets in. By using AI wherever possible, companies will pile up a debt that will be very pricey to pay off. At some point they will hit a wall, and it will be as if they had fired all their developers and hired fresh ones. Anyone in the business knows this will not go well. By definition, no person wrote their code, so there's nobody to call up and bring back as an emergency consultant – "sorry we let you go, um, would you be interested in helping us out with a little problem…" There's some question of how soon the technical debt will register at the top executive level. I think it could take several years.

I have no good estimate of how many serious service outages AI rot will cause. How many life-threatening situations will it contribute to? I'm not sure. I am certain it will lose businesses a lot of money, cause a lot of headaches, and delay projects.

But those are just the first-order effects.

New Languages Can’t Take Hold

LLMs do best with really mainstream languages and frameworks, where there's lots of content to learn from. By definition, new languages and libraries don't have much content out there to instruct AI. What drives their adoption is the quality of what they offer over existing technology, but that is a very human kind of choice. The AI isn't going to go out of its way to try a new esoteric language, or to convince a human to use one.

In any case, people want good results and won’t ask AI to use a technology that they know the AI does poorly with.

Nobody New Learns

Popular AI chatbot applications have already disrupted education, especially writing-heavy subjects and code-focused computer science courses. Cheating rates are huge. Of course, instruction styles will adapt. But even after that, people will know they can rely on an AI assistant once they get out of school. Skills atrophy quickly. Use it or lose it.

On the job, how do new developers learn? If AI use is encouraged, they are probably expected to show a high level of output, meaning they're indirectly pressured into using AI whenever they can to meet deadlines and metrics, shortcutting any real learning.

In isolation, productivity tools are really attractive: think of how much easier this task will be. But what about when everyone you know has access to the same tools? Sure, the total output will be greater, but your level of daily grind won't go down (even though that's often the initial selling point – "you'll have more free time!"). It's a lie.

That was fine for new ways of laying out a newspaper editorial page – it was easier on a Mac than the old way with all the physical media, and losing the older skills wasn't a big liability. The difference with AI coding assistants is that, to keep the productivity gains up, developers are prevented from learning how to develop software the old way, meaning new developers won't understand the outputs of the AI assistants well. I think this is a bad thing, though it's debatable so long as AI is always with us from now on. Worse, AI prevents new developers from building their own theories of the software they "write" with AI – from really understanding it on a deep level. That understanding is a huge part of the value of the software, just a hidden one to many people.

That won't be true of every new developer, of course. Those who manage to learn development the old-fashioned way may (eventually) find themselves in high demand once the reality of the catastrophe hits.

The AI Has Nobody to Learn From

With fewer people producing human-made software, and without databases like Stack Overflow accumulating new content, stagnation sets in. AI learns from other mediocre AI content.

Senior developers may end up spending most of their time reviewing AI-generated code and gluing together AI output. That's not why we got into programming – for most of us, it's the most draining and dull part. At best, this style of working means those developers produce little new output to teach future AI coding assistants. At worst, developers may drop out of the field entirely. Maybe they'll come back to pick up the pieces.