Cory Doctorow is Not Even Wrong About the So-called "AI Bubble"
People also used to think the printing press was just going to bring people more Bibles
Doctorow just wrote an uncharacteristically shortsighted essay asking whether there would be anything left over after AI’s bubble bursts.
Here’s an excerpt:
I’ve had to make a rule for my events: The first person to mention AI owes everyone else a drink.
It’s a bubble.
Tech bubbles come in two varieties: The ones that leave something behind, and the ones that leave nothing behind. Sometimes, it can be hard to guess what kind of bubble you’re living through until it pops and you find out the hard way.
He makes some decent points about silly AI startups in the piece, and I, of course, agree that most of those will die off soon.
The scale of inefficiency
But if you agree with him that this entire AI boom is actually a bubble, and would like to know why I think differently, ask yourself this question:
What are the obstacles that stop the current economy from being 10x, 100x, or 1000x the size it is today?
In other words, what challenges currently limit the number, size, efficiency, and scale of startups and enterprises—especially those pertaining to creating and executing business ideas?
It’s hard to capture perfectly, but let’s make a quick list:
Most businesses do not have standardized internal processes
Even for companies that do, it’s almost impossible to implement those processes at scale across the company
Most companies cannot scale their sales operations
Most companies cannot scale their support organization
Most companies aren’t good at finding and keeping the right people
Most companies cannot scale their marketing efforts
Most companies cannot scale the vision/strategy leadership talent that comes up with innovative ways to deal with market conditions
So let’s further break this down as:
Strategic Cohesion
Keeping everyone updated on direction
Keeping everyone motivated on direction
Process Standardization
Policy creation
Policy following
Keeping everyone updated and synchronized
Ease of updates for new processes and policies
Hiring and Performance Management
Finding the right people
Vetting them properly
Keeping them incented to do quality work there
Getting rid of those who stop performing well
Sales Operations
Everyone knows what we sell
Everyone knows the customer they’re selling to
Everyone knows how our product can help them
Consistent testing and iteration process
Consistency, quality, persistence
Ease of updates for new products
Marketing
Everyone knows what we sell
Everyone knows the customer they’re selling to
Everyone knows how our product can help them
Creativity in marketing copy
Consistent testing and iteration process
Consistency, quality, persistence
Ease of updates for new products
Support
Enough people
Proper training
Consistent improvement process
Ease of updates
I’ve been consulting and advising for startups for over 15 years, and it’s always some combination of these things that either limits growth or outright destroys a company. And for larger companies, these factors limit the scale and quality of their output.
My argument is simple: Most of these elements, such as sales, operations, marketing, internal processes, etc., are operating with like 5-30% efficiency. It’s mostly waste. And the percentage of waste grows as the teams get bigger.
If you know, you know.
It’s virtually impossible to run a sales team well. Most companies aren’t doing it. Same with marketing. It’s voodoo magic: inconsistent, more art than science, and very few companies thrive there.
And internally, most companies are horribly run from a process and communication standpoint. Too much bureaucracy. Too many inefficiencies. Too many people doing useless things, not enough people doing the right things. And the broken processes and structure create broken incentives. Then the whole broken system starts working toward the wrong things.
Well, these are the engines that power companies, including all of our startups, medium-sized companies, and giant corporations.
If I had to guess, I’d say we’re getting 5-20% efficiency from most companies of any significant size. Let’s call it 20%. That’s 80% waste, added up across all those areas, producing friction.
And that’s just relative to what’s possible with the current number of workers at the company.
Enter AI
Now let’s add AI. The right way to think about AI is to ask:
To what degree does AI allow us to remove the waste in internal processes, hiring and firing, sales, marketing, and other core components of a company’s success?
Here’s my answer:
AI will massively improve not just the efficiency in all those areas, but also the scalability.
Combining efficiency with scale
Imagine you have a sales team that’s 7% efficient. If you don’t believe the numbers are that low, think about the amount of time they spend working (and being paid) vs. how many calls they have and how many deals they close.
7% might be generous.
Anyway, now imagine you have a 20-person sales team. Think about what happens when that efficiency goes from 7% to 45%—or 70%.
But instead of having 20 salespeople, now you have 1,500!
That’s what AI is going to bring us. It’s going to get more effectiveness out of every person we have, combined with the ability to add so many more people to the team.
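To see how those two effects compound, here’s a quick sketch using the post’s own hypothetical numbers (7% efficiency and 20 reps today, versus 45% efficiency and 1,500 AI-augmented agents). The figures are illustrative, not measurements:

```python
# Illustrative only: efficiency and headcount numbers are the
# hypothetical ones from the text above, not real measurements.

def effective_output(team_size: int, efficiency: float) -> float:
    """Output in 'fully productive worker' units: headcount x efficiency."""
    return team_size * efficiency

baseline = effective_output(20, 0.07)    # 20 reps at 7%  -> 1.4 units
with_ai = effective_output(1500, 0.45)   # 1,500 agents at 45% -> 675.0 units

multiplier = with_ai / baseline
print(f"Combined output multiplier: {multiplier:.0f}x")  # ~482x
```

The point of the arithmetic is that the gains multiply rather than add: a ~6x efficiency gain times a 75x headcount gain is a several-hundred-fold jump in effective output.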
And we’re not just doing this for sales operations. We’re doing this for marketing as well. And hiring and firing. And all the other areas we mentioned that are holding back companies.
This is what makes Doctorow’s essay so shortsighted. We’ve just seen the ability to improve the efficiency and scale of the core muscles of capitalism itself. And all he sees are gimmicky companies on billboards.
The real impact of AI on the economy
I said back in February of 2023 that AI would pull us out of a recession. Who knows how much of an effect it had, but I think it was likely significant. A couple of months later, I said AI would massively raise the US’s GDP within a few years.
I think I was too timid in those predictions. I think, given the analysis above, that we’re likely to see global productivity multiply by 2-5x in the next 10 years due to AI taking off, and much more than that in the decade after.
What I don’t know—and nobody can know—is how that will mix with the tens of millions of jobs that will also be lost at the same time. Like, who is buying all this new and better stuff if AI is also removing human jobs? Hard to say what the net effects will be.
Magnifying our total output using agents
Anyway. The point is that AI is not a thing. It’s a magnifier. And the things it magnifies are creativity and consistent, high-quality human work output.
Let’s say our total output for planet Earth is currently N.
When you very quickly add billions upon billions of AI-powered agents, and systems of agents, that are capable of producing creativity at some level, as well as producing extraordinarily consistent and high-quality output, you take that N value and multiply it by 2, 3, 5, 10, 100, or 1,000.
It’ll take a while to spin up, but the ceiling is extraordinarily high, and that’s not even accounting for AI getting smarter than GPT-4, which it inevitably will. And if it ever becomes even smarter than humans (not even ASI) by any significant margin, the multiples go even higher.
Focus on the right thing
Stop thinking about the silly applications of AI. The company that reads you walnut recipes in the voice of a walnut tree is not going to last. So what?
The impact of the printing press was not the manufacturing of religious books. The impact was introducing billions of people to entirely different worlds and different ways of thinking.
Similarly, the impact of augmenting humanity with artificial intelligence is not the micro-companies that do kitschy things with a side feature of AI.
The impact is dramatically multiplying the output of humanity.
So yes, Mr. Doctorow, there will indeed be something left after “the AI bubble” bursts.