In 1979, a Harvard MBA student named Dan Bricklin sat in a lecture hall watching his accounting professor erase and recalculate an entire financial model on a blackboard — column by column, row by row — because a single input had changed. Bricklin thought: what if there were a program that could do this automatically?
He and his friend Bob Frankston built VisiCalc over a summer. It shipped for the Apple II at $100. Within two years, VisiCalc was selling 12,000 copies a month and had essentially created the personal computer market. People weren’t buying Apple IIs for word processing or games. They were buying them to run spreadsheets.
The accounting profession was supposed to be finished.

Here’s what actually happened. Between 1980 and 2010, the United States lost roughly 400,000 bookkeeping and accounting clerk positions. Those jobs genuinely vanished. But over the same period, the country gained about 600,000 management accountant and financial analyst roles. The accounting profession didn’t shrink; by headcount, it roughly quadrupled. Today there are about 1.4 million accountants in the U.S., up from around 339,000 before the spreadsheet.
NPR’s Planet Money put it simply:
“Accounting basically became cheaper, and sometimes, when something gets cheaper, people buy a lot more of that thing.”
I touched on this pattern briefly in part 1 of this series. But this time I want to go deeper — because understanding the economics of what happens when a core input to production suddenly gets cheaper isn’t just academic. It’s the single most important question for every person writing software today.
The Victorian economist who predicted the AI debate
In 1865 — yes, 1865 — a British economist named William Stanley Jevons published a book called The Coal Question. His observation was counterintuitive enough to still confuse people 161 years later: when James Watt’s improved steam engine made coal usage more efficient, total coal consumption didn’t decrease. It skyrocketed.
Why? Because cheaper energy opened up entirely new applications. Factories that couldn’t afford to run on coal suddenly could. Industries that didn’t exist before the efficiency improvement sprang up around the newly affordable resource. The savings from efficiency were overwhelmed by the explosion in demand.
This became known as the Jevons Paradox, and it shows up everywhere. LED lightbulbs use a fraction of the energy of incandescents — and global lighting energy consumption has gone up, because we now put lights in places nobody would have bothered before. Bandwidth gets cheaper every year, and we use exponentially more of it.
Applied to software: if AI makes code dramatically cheaper to produce, the Jevons Paradox predicts that the world won’t buy less software. It’ll buy vastly more. And more software means more people needed to build, maintain, and evolve it.
That’s the optimistic case. But I’m not here to sell optimism. I’m here to figure out if it’s actually true.
The evidence for explosion
Let’s start with what the market data actually shows. Morgan Stanley projects the software development market will expand from roughly $24 billion to $61 billion by 2029 — about 20% annual growth. That’s not a market bracing for contraction.
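A quick sanity check on that number, since I lean on it: assuming the $24 billion figure is a 2024 base (the projection’s start year isn’t stated in the summary I read), the implied compound growth rate comes out right around the quoted 20%. A minimal sketch:

```python
# Sanity check on the Morgan Stanley projection quoted above.
# Assumption: $24B is the 2024 base, so $61B by 2029 is five years out.
start, end, years = 24e9, 61e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> Implied CAGR: 20.5%
```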
GitHub reported 180 million developers on its platform as of late 2025, with 36 million new accounts added that year alone. Even accounting for bots and inactive accounts, that’s extraordinary growth during a period when headlines screamed about developers being replaced.
The Bureau of Labor Statistics projects software developer roles will grow 17% through 2033, significantly faster than the average for all occupations. Stack Overflow’s analysis from February 2026 argues bluntly that “demand for code is infinite” — every reduction in the cost of building software reveals a new layer of unmet demand.
And then there’s a16z’s Steven Sinofsky, who frames it at a higher altitude:
“We need vastly more software, not less.”
Think about that for a second. How many small businesses still run on spreadsheets and email because custom software is too expensive? How many processes in education, healthcare, and government are still manual because no one can afford to automate them? How many internal tools never get built because the development team is already backlogged for eighteen months?
(I once worked with a mid-size logistics company that had a whiteboard — a literal, physical whiteboard — tracking their delivery routes. In 2023. Not because they didn’t want software. Because they couldn’t justify the $200K quote from a development shop. If AI cuts that cost to $40K, they won’t pocket the savings. They’ll finally build the thing.)
The demand backlog is staggering. Every business I’ve ever consulted for has a list of things they wish they could build but can’t afford. AI doesn’t eliminate that list. It makes more of it financially viable.
But then there’s the ATM
This is where I need to complicate my own argument, because the Jevons Paradox isn’t a law. It’s a tendency. And tendencies have limits.
ATMs are the most instructive case study I’ve found. When automated teller machines rolled out across the United States in the 1970s and 1980s, the initial effect was almost comically counterintuitive: teller employment doubled. From roughly 250,000 workers in 1970 to about 500,000 by 2010.
The mechanism was pure Jevons. ATMs reduced the number of tellers needed per branch from about 20 to about 13. That made each branch cheaper to operate. Cheaper branches meant banks could afford to open more of them. More branches meant more total tellers, even with fewer per location. The cost savings didn’t eliminate jobs — they funded expansion.
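It’s worth running that arithmetic yourself, because it shows how decisively the expansion effect won. A back-of-envelope sketch, treating the per-branch figures above as nationwide averages (a simplification for illustration, not Bessen’s actual model):

```python
# Back-of-envelope on the ATM story, using the figures quoted above.
# Simplifying assumption: the per-branch numbers are nationwide averages.
tellers_1970, per_branch_1970 = 250_000, 20
tellers_2010, per_branch_2010 = 500_000, 13

branches_1970 = tellers_1970 / per_branch_1970  # ~12,500 branches
branches_2010 = tellers_2010 / per_branch_2010  # ~38,500 branches

print(f"Implied branch growth: {branches_2010 / branches_1970:.1f}x")  # ~3.1x
# Tellers per branch fell 35%, but branch count roughly tripled.
# Total teller employment doubled anyway: efficiency funded expansion.
```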
James Bessen of Boston University documented this meticulously, and the American Enterprise Institute confirmed the pattern with federal employment data. It’s one of the cleanest real-world demonstrations of the Jevons Paradox in action.
But here’s the part of the story that optimists conveniently leave out.
After 2010, mobile banking arrived. And teller employment didn’t just plateau — it collapsed. Down over 30% and still falling. ATMs had shifted what tellers did (less cash handling, more relationship selling), but mobile banking eliminated the reason to visit a branch at all. The Jevons expansion ran for about 40 years. Then a second wave of technology addressed the remaining human tasks, and the paradox broke.
The Jevons Paradox isn’t permanent. It works when demand is elastic — when cheaper production reveals new uses. It breaks when demand saturates — when there’s nothing left to expand into.
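You can make “elastic” precise with the textbook constant-elasticity demand curve. This is my framing, not Jevons’s, and the numbers are purely illustrative, but it compresses the whole argument into one parameter:

```python
# Constant-elasticity demand: Q = A * P**(-eps), so total spend is
# E = P * Q = A * P**(1 - eps).
#   eps > 1 (elastic):   price falls -> total spend RISES  (Jevons holds)
#   eps < 1 (saturated): price falls -> total spend falls  (paradox breaks)
def total_spend(price: float, eps: float, scale: float = 100.0) -> float:
    return scale * price ** (1 - eps)

for eps in (2.0, 0.5):  # elastic vs. saturated demand
    before, after = total_spend(1.0, eps), total_spend(0.5, eps)
    print(f"eps={eps}: spend {before:.0f} -> {after:.0f}")
# eps=2.0: spend 100 -> 200   <- cheaper code, bigger market
# eps=0.5: spend 100 -> 71    <- the mobile-banking scenario
```

The entire optimist-versus-doomer debate reduces to a question neither side can answer directly: is demand for software above or below that threshold, and for how long?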
This is the crucial distinction nobody’s making clearly enough. The question isn’t “will the Jevons Paradox apply to software?” It almost certainly will, for a while. The question is: how long before the second wave hits?
The BLS split that tells the real story
Here’s a data point that most commentators either miss or deliberately blur. The Bureau of Labor Statistics tracks two separate categories: “computer programmers” and “software developers.” They are not the same thing.
“Programmer” roles — the category that historically covered people who primarily write and maintain code to specification — have declined 27.5% since their peak. That’s not a blip. That’s a structural contraction.
But “software developer” roles — the broader category that includes architecture, design, project coordination, and system thinking — declined only 0.3%. Essentially flat. And that same category is projected to grow 17% over the next decade.
ADP Research’s data tells a similar story: the developers who are primarily code producers are being squeezed. The developers who are primarily system thinkers are not.
This is the Jevons Paradox in granular action. The spreadsheet didn’t kill “people who work with numbers.” It killed bookkeepers and created financial analysts. The category survived and grew. The specific role within it that was most routine did die.
VisiCalc automated arithmetic, not judgment. It automated production, not accountability. AI is automating code generation, not system thinking. The pattern is consistent. But so is the uncomfortable corollary: the Jevons Paradox doesn’t mean YOUR job survives. It means the profession survives. The 400,000 bookkeepers who lost their jobs were not the same people who filled the 600,000 new accountant roles.
Let me say that again, because I think it matters: net job growth in a profession does not mean individual job security. The macro trend can be positive while your personal trajectory is negative, and telling a displaced programmer that “the field is growing” is about as helpful as telling a drowning person that ocean levels are stable on average.
What kind of work expands?
If cheaper code produces a Jevons-style demand explosion, what exactly does “more software” mean for the people building it?
In the VisiCalc era, the arithmetic went away and the analysis expanded. In the ATM era, the cash handling went away and the relationship selling expanded. The pattern isn’t just “some work disappears and other work grows.” The work that grows tends to be higher on the abstraction ladder — more context-dependent, more judgment-heavy, harder to standardize.
For software, I see the split forming along these lines:
What shrinks: boilerplate code, CRUD endpoints, configuration scaffolding, standard UI implementations, basic data transformations, straightforward API integrations. The stuff experienced developers already found tedious.
What grows: architecture decisions, security review, AI orchestration and evaluation, system integration at scale, performance optimization for unusual constraints, product thinking, verification and testing strategy. The stuff that requires understanding why, not just how.
There’s also a category I didn’t expect: AI wrangling itself. Someone has to decide which AI tools to use, how to integrate them into workflows, how to evaluate their output, how to handle the cases where they fail. That’s a new kind of work that didn’t exist three years ago, and it requires deep software knowledge to do well.
HackerRank’s research on the productivity paradox of AI finds that teams using AI tools often produce more code but not better outcomes — because the bottleneck shifts from writing to reviewing, testing, and integrating. The developers who thrive in that world aren’t the fastest typers. They’re the ones who can look at a wall of AI-generated code and immediately spot what’s wrong with it.
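There’s a simple way to quantify that shift: treat the delivery cycle like a pipeline and apply Amdahl’s law. The 40/60 split below is an assumption I’m making for illustration, not a figure from HackerRank’s research:

```python
# Amdahl's-law framing of the delivery cycle: if only the code-writing
# share accelerates, review/test/integration caps the overall speedup.
# Illustrative assumption: writing is 40% of the cycle (not HackerRank data).
def cycle_speedup(writing_share: float, writing_speedup: float) -> float:
    review_share = 1.0 - writing_share
    return 1.0 / (review_share + writing_share / writing_speedup)

print(f"{cycle_speedup(0.40, 3.0):.2f}x")  # 3x faster writing -> ~1.36x overall
print(f"{cycle_speedup(0.40, 1e9):.2f}x")  # infinitely fast -> ~1.67x ceiling
```

That ceiling is the whole story: once generation is cheap, the bottleneck, and therefore the value, lives with whoever does the reviewing.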
As Tim Harford wrote about the spreadsheet era:
“There are more accountants than ever; they are merely outsourcing the arithmetic to the machine.”
Replace “accountants” with “developers” and “arithmetic” with “boilerplate,” and you have a reasonable prediction for 2030.
The honest prediction
I’ve spent weeks with the economic literature, the employment data, and the historical case studies. Here’s where I actually land — which is, predictably, somewhere messier than either the doomers or the optimists want.
Short term (2025-2028): The Jevons Paradox is already visible. AI is making software cheaper to produce, and demand is expanding to meet the new capacity. Total developer employment will likely grow, driven by the massive backlog of unbuilt software. But the growth won’t be evenly distributed. “Programmer” roles will continue to contract while “developer” and “engineer” roles expand.
Medium term (2028-2032): The expansion continues, but the nature of the work shifts significantly. Most developers will spend less time writing code from scratch and more time orchestrating, reviewing, and integrating AI-generated code. Junior roles will evolve (as I discussed in part 3 of this series). The profession looks recognizable but the daily work doesn’t.
Long term (2032+): This is where I genuinely don’t know. If AI capabilities plateau (as some researchers expect), the new equilibrium holds: more developers doing different work, Jevons Paradox in full effect. If AI achieves genuine autonomy in software development (which I think is much further away than the hype suggests, but which I can’t rule out), we’re in the mobile banking scenario: the second wave that breaks the paradox.
I don’t know which it’ll be. Nobody does, despite what they claim on LinkedIn.
So what do you do with this?
If the Jevons Paradox holds — and I believe it will for at least the next five to seven years — the strategic move isn’t to panic about AI replacing developers. It’s to position yourself on the growth side of the split.
That means investing in the judgment layer: architecture, systems thinking, security, evaluation, the human-context skills that expand when production gets cheaper. It means getting fluent with AI tools, not because they’ll replace you, but because the developers who can leverage cheaper code production to deliver more value will be the ones who thrive.
It also means watching carefully for the second wave. The ATM tellers who assumed the good times would last forever got blindsided by mobile banking. The developers who assume the Jevons Paradox is permanent will get blindsided too — eventually. The paradox buys you time. It doesn’t buy you immunity.
The spreadsheet didn’t kill accounting. It killed one kind of accounting and supercharged another. AI won’t kill software development. But it will absolutely kill one kind of software development — and if that’s the kind you do, the macro-level job growth statistics won’t save you.
The bookkeepers who survived weren’t the ones who fought the spreadsheet. They were the ones who learned to use it and shifted their value proposition upward. That’s not a metaphor. It’s a data point.
Sources
- NPR Planet Money: “How Machines Destroy (And Create!) Jobs”
- Tim Harford: “What the birth of the spreadsheet teaches us about generative AI”
- TinyComputers: “What VisiCalc Teaches Us About AI”
- Morgan Stanley: AI Software Development Industry Growth
- Stack Overflow: “Why Demand for Code Is Infinite”
- GitHub Octoverse 2025
- James Bessen: “Toil and Technology” (IMF Finance & Development)
- BLS: Occupational Outlook — Software Developers
- ADP Research: “The Rise and Fall of the Software Developer”
- a16z: Big Ideas 2026, Part 2
- AEI: “What ATMs, Bank Tellers, and the Rise of Robots Tell Us About Jobs”
- HackerRank: “The Productivity Paradox of AI”