The AI effort trap
Plus, why your washing machine explains the future of work
Whenever we invent a technology that makes a task easier, we face a fundamental choice: do we use the efficiency gain to do the same quality of work with less effort, or do we use it to do a higher quality of work with the same effort?
With AI tools, this choice is becoming incredibly stark. Let's say you're a decently competent professional. An AI tool can help you write a report or code a feature faster. You now have three options:
Maintain effort, increase quality: You spend the same amount of time you normally would, but use the AI to augment your work, explore more possibilities, and produce a much better final product.
Reduce effort, maintain quality: You use the AI to get to your usual standard of "good enough" in half the time and knock off early.
Reduce effort, reduce quality: You fall into what I'm calling the "effort trap." You use the tool to make the work feel easier, but because your own critical effort was a key ingredient in the quality, the final output is actually worse.
The key thing to understand about the effort trap is that the tool isn't the problem. The problem is that sometimes the effort itself — the process of struggling with the material, thinking through edge cases, and revising your own work — is a crucial ingredient in the final quality. Outsourcing that struggle can lead to a subtly worse outcome.
Your washing machine explains everything
This dynamic, by the way, isn't new. It's a version of a well-known economic concept called the Jevons Paradox. The paradox, first observed by the economist William Stanley Jevons with coal consumption in the 19th century, is that when technological progress increases the efficiency with which a resource is used, total consumption of that resource often rises rather than falls.
The writer Tim Harford has a great modern example involving the washing machine. The invention of automated laundry machines drastically reduced the labor required to wash a load of clothes. Did people take that saved time as leisure? Mostly, no. They just started washing their clothes much more frequently. The standard of what constituted "clean" went up. The efficiency gain was consumed by increased demand, not by reduced labor.
This is the optimistic path for AI. It doesn't lead to mass unemployment, but to a world where we expect a higher standard of work. The memo that used to be acceptable now looks sloppy. The app that used to be fine now seems buggy. The efficiency gains are reinvested into higher quality. But that only happens if we make a conscious choice to do so.
A U-shaped curve of benefits
So who actually benefits from these new tools? The early empirical evidence suggests a kind of U-shaped curve.
The people at the bottom of the skill distribution get a big boost. An AI tool can help a poor writer produce a basically competent email or help a novice coder write a functional script. It raises the floor, getting them to a level of "good enough" they couldn't reach on their own.
The people at the very top of the skill distribution also benefit. A great writer or an expert programmer can use AI as a high-powered assistant to augment their already-formidable skills, automating tedious parts of their work so they can focus on the highest-leverage tasks.
But the people in the middle? That's where it gets tricky. If you're a reasonably competent professional, AI offers a tempting path to coasting. You can use it to hit your existing targets with less work. The risk is that you fall into the effort trap, where your skills begin to atrophy and your work gets subtly worse because you're no longer engaged in the difficult parts of the process that produce real quality. For most of us in the broad middle of the professional world, this is the real danger.
The upshot is that the big question with AI isn't really about the technology itself. It's about the choices we make when we use it. The path to broad-based prosperity is one where we leverage these tools to demand more from ourselves and our work, not less.