Whether or not that forecast proves accurate isn’t really the point. What interests me more is what that kind of growth implies: abundance.
In that world, everything becomes abundant. Knowledge‑economy services. Financial advice. Legal services. Insights of every kind, delivered instantly by my little mates, Claude and GPT. On the surface, it sounds amazing… right?
But what if it isn’t?
What if it actually kind of sucks?
What if an abundance of free, AI‑generated “stuff” erodes the value of things altogether until we collectively stop trusting or engaging with it because we know it lacks authenticity? Because we know it’s just AI slop (just like the image at the top of my blog)?
That’s where the idea of value comes in. We value things because they help us: what economists call “utility.” But when those same things are endlessly abundant, their value erodes in another way, because we also value HOW they are produced. And too much information starts to distract us; it becomes noise, which is negative value (it turns out economists have a word for that too: “disutility”). We get bombarded with the same content from every direction, and suddenly abundance becomes overwhelming rather than liberating.
When everything costs almost nothing to produce, it becomes almost impossible to value anything. We know we’re being manipulated. The game shifts from one where quality matters most to one where quantity wins, because shouting the loudest is the only way to be heard when everyone is using the same AI to say the same thing.
If you want to see this dynamic play out in real time, look at the job market. Candidates use AI to write their CVs, then use AI to apply for hundreds of roles. Employers use AI to screen those CVs, rejecting them by the hundreds in return. Before long, everyone starts wondering whether automating CV screening was such a great idea after all.
There’s even a name for this phenomenon: the Red Queen hypothesis, where everyone has to run faster and faster just to stay in the same place. In the age of AI, it’s on steroids. Every idea, marketing strategy, content puff piece, LinkedIn article, or even product can now be produced at effectively zero marginal cost (technical term: SFA). The treadmill just keeps speeding up.
Honestly? Screw that world.
When we rebranded Evolved Thinking, we spent a lot of time thinking carefully about what we actually wanted to stand for. That process led us to this statement of our mission and values:
“As change accelerates, creating meaningful connections between brands and people has never been harder. True ingenuity is building authentic and memorable human experiences — this is where Evolved Thinking operates.”
I’m proud of that line, not least because we developed it more than a year ago by listening carefully to what people were saying and sitting with what it really meant.
Abundance sounds wonderful, but maybe it isn’t. Maybe what really matters is meaning. Authenticity. Human experiences and human-made content have a special value.
Maybe market research and insights aren’t about generating more data, but about finding edge cases, about ensuring we build a world that works for humans, not just for AI or the people who deploy it. Ironically, AI may still have a role to play in creating that world, but only if it’s used thoughtfully, in specific ways, to amplify human experience rather than replace it.
Maybe how things are produced still matters, precisely because it grounds them in real, lived experience.
And maybe the best use of AI isn’t to speak louder, but to help us listen better.