
We've amplified movement and computation. Now we're amplifying thought itself. What that means for how we work, build and decide what's worth doing.

TL;DR: AI does not replace thinking. It raises the stakes for it. The gap between people with something genuine to say and people producing noise is widening fast, and the machine is the one holding the magnifying glass.
The first great amplification was physical movement. Horses gave way to steam engines. Steam gave way to internal combustion. Internal combustion gave way to electric motors and high-speed rail. Each shift did not simply do the same thing faster. You did not take a steam locomotive to the same places you took a horse. You went to places that did not exist yet. New cities. New trade routes. New ways of organising labour and daily life. The technology did not just accelerate the world. It restructured it.
The second great amplification was cognitive throughput. Steve Jobs described the computer as a bicycle for the mind, and it remains one of the most accurate metaphors in the history of technology. A bicycle extends what your body can do without replacing the effort behind it. A computer did the same for thought. Calculations that took teams of people weeks could be done in seconds. The World Wide Web turned individual computers into a single collective brain. Each time, the technology changed the nature of what was worth doing at all.
The third great amplification is the amplification of thought itself. Not the processing of information we have already organised, but the transformation of low-signal intent into high-resolution output. You bring a rough shape, a half-formed idea, a question you are not quite sure how to ask, and the machine helps you find out what you actually meant. It extrapolates the signal. It returns something more complete than what you put in. A spreadsheet amplifies your ability to calculate. A search engine amplifies your ability to find. A thought-amplifying machine responds to the texture of your intent, not just the content of your instruction. The difference between those two things is significant.
AI has moved in two directions at once. It has raised the floor of what is acceptable and the ceiling of what is imaginable. Consumer expectations are already shifting, even while the technology is still finding its feet. Once someone has experienced AI-assisted drafting or search, their baseline shifts. They do not decide to want more from every product they touch. They just do, because they have seen what better feels like. The bare minimum is moving. Slowly, and then quickly. The ceiling shift is the more interesting one. More is expected of you. More is available to you. Both are true at the same time.
William Stanley Jevons was a nineteenth-century economist who noticed something counterintuitive about efficiency. Watt's improved steam engine burned far less coal per unit of work than the Newcomen design it replaced, so the expectation was that coal consumption would fall. Instead, as Jevons documented in 1865, it rose. Efficiency did not reduce demand. It made new use cases viable that had simply been too expensive to attempt before. The same logic is playing out in software right now. Only 0.04% of the world has tried AI coding tools. Only 0.3% has paid for any AI product at all. Around 16% have used a free tool at least once; 84% have never touched any of it. The number of developers and GitHub contributors is growing rapidly, but the demand for what software can do is growing faster still. We are at the very beginning of an entirely new demand curve. AI literacy is developing the way reading literacy did: a small group currently fluent, a much larger population still to come, and the skill improving meaningfully with practice. Knowing what to ask and how to build is learnable. The 84% is not a fixed number.
A hairdresser running their own salon is not going to build their own booking and client management system. Not because they lack the capacity to learn, but because they have a salon to run. They want something that works, something configurable and something that stays out of their way. The world where AI turns everyone into a software builder, where friction falls so low your local florist spins up a custom inventory system on a Tuesday afternoon, is still a long way off. Possible eventually. Not the story of now. The tools are getting much better at serving people who know what they want to achieve but not how to build the thing that achieves it. That is a large, underserved population. The companies paying attention to them, rather than chasing the builder-first narrative, are pointing at something real.
Removing a barrier to doing something is not the same as creating a desire to do it. A designer who can now build technical infrastructure, because the tools have made it accessible, does not suddenly want to spend their days thinking about infrastructure. A biologist who can now prototype a human-computer interface does not automatically find that work compelling. The capability was the barrier. The interest was never the problem.
This matters because a lot of the conversation around AI assumes that as friction falls, people will naturally expand into adjacent domains. Some will. But most people will use the newly available headspace to go deeper into what they already care about, not broader into what they do not. Agents will eventually help bridge some of this. You will be able to find and deploy specialised models to handle the parts of a problem that fall outside your interest, in the same way you currently hire people whose expertise you do not share. But that is a coordination model, not a transformation of human motivation. Someone directing agents to handle the parts they find uninteresting is still a person with interests, pointing technology at the gaps.
Our underlying human biology lags the technological shift, and that is not a flaw in the system. Interest is what gives a thought its signal in the first place. The machine can only amplify what you actually care about. Which also means the gap between curious people and incurious ones is widening. The tool does not make that gap. It reveals it.
Too many organisations are sprinkling AI at the feature level and calling it transformation. A smarter search bar. An autocomplete field. A chatbot bolted onto a help desk that was already not working. The more significant opportunity is rethinking what the business is actually trying to do. Martin Eriksson shared an example that has stuck with me. Intercom's founder returned to a plateaued business, saw what large language models would do to the help desk category and did not add a chatbot to the product he already had. He reinvented the product from the top down. Different model, different assumptions, different answer to the question of what the business was for. Harder to do. Requires breaking something that is working reasonably well in service of something that could work extraordinarily well. The organisations that manage it over the next few years will build structural advantages that are very difficult to close.
Amplification is only as valuable as the signal behind it. You can already see the split happening. The internet is filling with what you might call dead text: senseless, obviously generated content that has the shape of an idea without the substance of one. Alongside it, a smaller volume of alive text, writing that feels like it came from somewhere, from a person who actually thought something and used the tools to say it better. The machine did not create the difference between those two things. The person behind it did.
There is a small irony in how this article came to exist. I spoke these words out loud. A machine transcribed them. A different machine stretched those transcriptions into long-form prose. I then sat down, pulled it apart and rebuilt it by hand, using a third model to sharpen the language as I went. The point of view, the specific way I see this particular moment, comes from the combination of things I have read, built, failed at and cared about. No model generated that from scratch. So the question is not whether to use the tools. It is whether you have a thought worth amplifying. Most people do. The distance between having a thought and making it matter is shrinking very fast.
This article began as a voice note. It was transcribed by one model, drafted by another and edited by hand, with a third model helping sharpen the language throughout. That process is itself the argument.

Heldiney Pereira – Co-founder & CEO
I'm building Flynt because I believe in a human-centred approach to making professional connections. Our technology is designed to make people feel more connected, less isolated on their career journey and inspired by all the great people they meet.
Connect with inspiring professionals once a month. Apply below and we'll be in touch when a spot opens up.
