In 1949, a Black mathematician named Dorothy Vaughan was managing a segregated computing unit at what would become NASA’s Langley Research Center. Her job title was “computer” — not the machine, the person. She and her team performed calculations by hand, thousands of them, feeding numbers into the nation’s aeronautical research, and later its space program, one pencil stroke at a time.
Then IBM showed up with actual computers. Electronic ones.
Most of Vaughan’s colleagues saw the writing on the wall and panicked. She didn’t. Instead, she got her hands on a FORTRAN manual and taught herself to program — before management asked her to, before anyone created a training program, before the transition was official. Then she taught her entire team. By the time NASA formally shifted to electronic computing, Vaughan’s group wasn’t obsolete. They were the only ones who knew how to operate the new machines.

I think about Dorothy Vaughan a lot these days. Not because her story is heartwarming (though it is), but because the pattern she lived through keeps repeating — and we’re living through another iteration right now. Every generation of knowledge workers eventually faces a moment where the tools shift underneath them. Every generation believes its shift is unprecedented. History disagrees.
So before we talk about what AI means for software development — which is what this series is about — I want to ground us in what actually happened the last several times a technology ate a profession’s lunch. Not the myths. Not the LinkedIn takes. The data.
Here are the patterns I keep finding.
Pattern 1: Cheaper production creates more demand (the Jevons Paradox)
In 1865, economist William Stanley Jevons noticed something counterintuitive: as steam engines became more fuel-efficient, coal consumption increased rather than decreased. Cheaper energy didn’t mean less energy — it meant people found more uses for it.
The same thing happened with spreadsheets and accountants. When VisiCalc launched in 1979, followed by Lotus 1-2-3 and eventually Excel, the conventional wisdom was clear: accountants were finished. Why would you pay a human to do arithmetic when a $99 program could do it faster and without errors?
Here’s what actually happened. Between 1980 and 2010, the U.S. lost roughly 400,000 bookkeeping and accounting clerk positions. But it gained about 600,000 management accountant and financial analyst roles. The profession didn’t shrink — it roughly quadrupled in economic value. As Tim Harford put it:
“There are more accountants than ever; they are merely outsourcing the arithmetic to the machine.”
The spreadsheet didn’t replace the accountant. It replaced the arithmetic, which freed accountants to do more analysis, more advising, more of the judgment work that clients actually valued. And because analysis got cheaper and faster, companies that never would have hired an accountant before suddenly could afford one. Demand expanded to fill the new capacity.
Sound familiar? Right now, AI is making certain kinds of code dramatically cheaper to produce. The Jevons Paradox suggests that could mean more software gets built, not less — which would mean more developers needed, not fewer.
But don’t get comfortable. The Jevons Paradox has an expiration date.
Pattern 2: The paradox expires — ask a bank teller
ATMs are everyone’s favorite example of technology not replacing workers. And the first half of the story is genuinely encouraging. When ATMs rolled out in the 1970s and ’80s, the number of bank tellers in the U.S. actually doubled — from roughly 250,000 to 500,000 by 2010. The logic was pure Jevons: ATMs made it cheaper to operate a branch, so banks opened more branches, which required more tellers for sales and relationship work.
Then mobile banking arrived.
From 2010 to the present, teller employment has collapsed by over 30%. It turns out ATMs only shifted the job description; smartphones eliminated the need to visit a branch at all. The Jevons Paradox bought tellers about 40 years. Then a second wave of automation — one that addressed the remaining human tasks — finished the job.
For developers, the uncomfortable question is: are current AI tools our ATM (shifting what we do) or our mobile banking app (eliminating the visit entirely)? I think they’re closer to the ATM stage today. But I’m watching.
Pattern 3: Routine work dies, judgment work thrives (the Bifurcation)
This is the through-line in almost every historical case. What gets automated isn’t the profession — it’s the routine substrate of the profession. The parts that are predictable, repeatable, rule-bound. What remains (and often grows) is the judgment layer: the decisions, the taste, the context that machines can’t yet replicate.
The spreadsheet automated arithmetic, not accountability. CAD automated drafting, not design. ATMs automated cash dispensing, not financial advice.
“VisiCalc automated arithmetic, not judgment; it automated production, not accountability.” — TinyComputers.io
The pattern for software development seems clear enough: AI is automating code production — the boilerplate, the CRUD endpoints, the glue code, the stuff experienced developers already found tedious. What it’s not automating (yet) is the judgment layer: architecture decisions, trade-off analysis, understanding what to build and why, debugging the weird stuff that only happens in production at 3 AM.
If your job is mostly routine code production, you should be worried. If your job is mostly judgment and decision-making that happens to involve writing code, you should be paying attention but not panicking.
Pattern 4: Incumbents suffer, new entrants adapt
This one’s hard to talk about honestly, because it involves real people getting hurt. But the data is unambiguous.
In 2024, economists James Feigenbaum and Daniel Gross published a meticulous study in the Quarterly Journal of Economics tracking what happened to telephone operators after the dial phone was introduced. At its peak, AT&T alone employed over 350,000 operators. (The U.S. Senate was so alarmed by the dial phone that Senator Carter Glass submitted a resolution against it in 1930 — senators apparently couldn’t stand the idea of dialing their own calls.)
What Feigenbaum and Gross found was a clean split. Operators who were already working when automation arrived suffered permanently — lower earnings, fewer hours, diminished career trajectories. But people who entered the workforce after the transition had already adapted their skills and expectations. They didn’t suffer the same penalty because they never built their identity around a role that was disappearing.
The transition took about 60 years. That’s a lifetime if you’re the incumbent.
This is probably the most relevant pattern for working developers right now. If you’ve spent 15 years mastering a particular way of building software — shipping hand-crafted code as your primary deliverable — the shift will feel threatening, personal, existential. If you’re entering the field today and you’ve never known development without AI, it’ll just be how things work. (I’m somewhere in between, which is its own kind of disorienting.)
Pattern 5: Quality temporarily declines (the ransom note effect)
On January 23, 1985, Apple introduced the LaserWriter printer; Aldus PageMaker shipped later that year, and together they launched desktop publishing. It was, according to industry historians, “the day the typesetting industry died.” Within about ten years, professional typesetting went from a thriving trade employing hundreds of thousands to near-total annihilation.
A New York Times printer captured the feeling in 1978, already sensing what was coming:
“All the knowledge I’ve acquired over my 26 years is all locked up in a little box now called a computer.”
But here’s the part people forget: when desktop publishing first put professional tools in amateur hands, the output was terrible. Designers coined the term “ransom note effect” to describe what happened when people with no training got access to 500 fonts — they used all of them, on the same page, in the same paragraph. The average company newsletter in 1987 was a typographic war crime.
Quality recovered, eventually. A new generation of designers learned the tools. Best practices emerged. But there was a genuine valley of garbage between “professionals do it expensively” and “everyone does it well.”
We’re in that valley right now with AI-generated code. The tools can produce vast quantities of functional-looking code, and people with limited experience are shipping it. Some of it is fine. Some of it is the software equivalent of a ransom note — technically readable, structurally incoherent, maintained by nobody. If you’ve reviewed a pull request lately that was clearly AI-generated and clearly not reviewed by a human who understood the codebase… you know exactly what I mean.
Pattern 6: Transitions take decades, not years
This is the one the hype cycle always gets wrong. Every historical parallel I’ve studied took far longer to play out than contemporaries expected.
Spreadsheets took 30+ years to fully reshape accounting. ATMs took 40 years before the second wave hit. Telephone operators declined over 60 years. Even typesetting — the fastest, most brutal case — took about a decade from the LaserWriter to near-total displacement, and another decade for quality to recover.
The AI transition in software development started in earnest around 2022 with GitHub Copilot. We’re roughly three years in. If history is any guide, we’re barely at the beginning. The profession will look very different in 2040, but probably not in 2027.
That doesn’t mean you should be complacent. It means you have time to be strategic.
Pattern 7: Early adopters win (the Dorothy Vaughan strategy)
This brings us back to where we started. Across every case I’ve studied, the people who fared best shared a common trait: they adopted the new tools early and aggressively, not because they were forced to, but because they were curious.
Dorothy Vaughan didn’t wait for a corporate retraining program. She got the FORTRAN manual and taught herself. The accountants who thrived in the spreadsheet era were the ones who learned VisiCalc in 1980, not the ones who were dragged into Excel training in 1995. The designers who survived desktop publishing were the ones who embraced PageMaker and brought their existing taste and knowledge to the new tools.
The pattern is consistent: competence with the old skill plus early fluency with the new tool equals outsized career advantage. Neither alone is sufficient. The person who knows nothing about software architecture but can prompt an AI isn’t a developer. The person who’s an expert architect but refuses to use AI tools is increasingly inefficient. The sweet spot is both.
Pattern 8: Demand elasticity determines everything
Here’s the question that actually matters, the one that will determine whether software development goes the way of accounting (Jevons Paradox, net growth) or typesetting (near-total displacement).
Is the demand for software elastic or inelastic?
If it’s elastic — meaning cheaper software leads to proportionally more software being built — then AI will create more developer jobs, not fewer. If it’s inelastic — meaning the world only needs so much software regardless of cost — then we’re looking at the typesetting scenario.
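To make that distinction concrete, here’s a minimal back-of-the-envelope sketch in Python. It assumes a constant-elasticity demand curve and treats total spending on software production as a rough proxy for developer demand; both the elasticity values and that proxy are illustrative assumptions, not data.

```python
# Toy model: constant-elasticity demand, Q = A * P**elasticity.
# Question: if AI cuts the unit cost of software, does total spending
# on software production (a rough proxy for developer demand) grow or shrink?
# All numbers below are illustrative assumptions, not measurements.

def demand_after_cost_drop(cost_drop: float, elasticity: float) -> dict:
    """Relative change in quantity built and total spend after a cost cut.

    cost_drop:  fraction by which the unit cost of software falls (e.g. 0.5)
    elasticity: price elasticity of demand (negative; |e| > 1 means elastic)
    """
    price_ratio = 1.0 - cost_drop               # new cost / old cost
    quantity_ratio = price_ratio ** elasticity  # Q_new / Q_old under Q = A * P**e
    spend_ratio = price_ratio * quantity_ratio  # (P_new * Q_new) / (P_old * Q_old)
    return {"quantity_x": round(quantity_ratio, 2), "total_spend_x": round(spend_ratio, 2)}

# AI halves the cost of building software (cost_drop = 0.5):
print(demand_after_cost_drop(0.5, elasticity=-1.5))  # elastic   -> {'quantity_x': 2.83, 'total_spend_x': 1.41}
print(demand_after_cost_drop(0.5, elasticity=-0.5))  # inelastic -> {'quantity_x': 1.41, 'total_spend_x': 0.71}
```

Under those toy numbers, a 50% cost cut grows total software spend by about 40% when demand is elastic and shrinks it by almost 30% when it’s inelastic: the accounting-versus-typesetting fork in miniature.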
My bet, and it’s only a bet, is that demand for software is extraordinarily elastic. Every business I’ve worked with has a backlog of projects they can’t afford to build. Every non-profit, school, and small business has problems that software could solve but doesn’t because it’s too expensive. If AI cuts the cost of building software by 50%, I don’t think companies will fire half their developers. I think they’ll build twice as much.
But I could be wrong. And the demand elasticity will probably vary by sector, by company size, by the type of software being built. There won’t be one answer.
What this means for you
If I had to compress these eight patterns into a single piece of advice, it would be this: be Dorothy Vaughan, not her colleagues who waited.
Learn the AI tools now, while you still have the luxury of exploring them out of curiosity rather than desperation. Lean into the judgment work — the architecture, the trade-offs, the understanding of why — because that’s what survives automation in every historical parallel. Expect the transition to take longer than the hype cycle suggests, but start positioning yourself today.
And stay skeptical of anyone who tells you this time is completely different. The technology is new. The pattern is ancient.
In the next part of this series, I’ll dig into what’s actually happening to developer productivity right now — the real data, not the vendor benchmarks. Because the gap between the marketing and the reality is… interesting.
Sources
- NPR Planet Money Episode 606: “How Machines Destroy (And Create!) Jobs”
- Feigenbaum & Gross, “Automation and the Fate of Young Workers,” QJE 2024
- Tim Harford: “What the birth of the spreadsheet teaches us about generative AI”
- TinyComputers: “What VisiCalc Teaches Us About AI”
- James Bessen: “Toil and Technology” (IMF Finance & Development)
- HBR: “Kodak’s Downfall Wasn’t About Technology”
- Smithsonian: “The History of Human Computers”
- NASA: Dorothy Vaughan Biography
- WhatTheyThink: “The Day the Typesetting Industry Died”
- Wikipedia: Ransom Note Effect
- U.S. Senate: “Senators Balk at Dial Telephones”
- AEI: “What ATMs, Bank Tellers, and the Rise of Robots Tell Us About Jobs”
- BLS: Occupational Outlook Handbook — Tellers