AI Is Making Us Faster. But Is It Making Us Better?
What gets rewarded gets amplified — and right now, we’re rewarding fear.
Real John here - I had a very busy week working on a lot of AI projects, and I want to talk about some of what I learned and some of the headlines I keep seeing. This week is a little different, but very important. I'm hoping you can pause, find some silence, and think about what is truly important to you and to the world, and how that affects what you are building with AI.
A little side story: I went to a team dinner this week and we locked up all of our phones. Not in our pockets, not face down on the table, but in a box. What happened? A great deal of connection, communication, and a little card magic. I didn't know what time it was or whether anything was happening outside of that dinner, and for a moment everyone was laughing, sharing, and having an amazing experience. I highly recommend making that a rule for every meal.
Oh, Ace of Hearts - if you know, you know. Thanks again for giving me a little of your time; it's a precious commodity, and I appreciate you sharing it with me. Let's get into it.
Block just laid off nearly half its staff. Not because the company was struggling. Not because revenue was down. Because they adopted AI tools and said, plainly, that they no longer needed those workers. The stock jumped 15%.
Let that sink in for a second. (No Elon reference, but funny if you know)
The market celebrated. Investors cheered. The message sent to every boardroom in America was loud and clear: cut humans, get rewarded.
That triggered something in me. Not panic — I’m not here to write doom-and-gloom. I’ve been building software for 30+ years and I’ve watched automation reshape the industry in waves. This isn’t new. What is new is the speed. And the part that’s keeping me up at night isn’t the layoffs themselves. It’s what we’re choosing to amplify.
What Gets Rewarded Gets Repeated
I was watching a Chase Hughes video recently and he said something that stopped me cold:
“Whatever gets rewarded gets repeated. Whatever gets ignored disappears.”
Block cut half its workforce and the stock surged 15%. That’s a reward signal. Every company watching just got the same memo. Not “use AI to build better products.” Not “use AI to serve your customers in ways you couldn’t before.” The memo was: use AI to cut, and Wall Street will cheer.
Now think about your content feed. Think about what goes viral. Think about what gets the most clicks, the most engagement, the most shares.
Fear. Outrage. Doom.
The same behavioral conditioning that drives corporate decisions drives your feed. What gets engagement gets amplified. What gets ignored disappears. So people simplify themselves, exaggerate themselves, perform urgency — because that’s what the system rewards.
And now AI is producing that content at industrial scale.
Information vs. Content
Hughes drew a distinction in that same video that I think is the most underrated idea in tech right now:
“Information answers questions and content stimulates responses.”
He went further: most of what fills your feed isn’t meant to be important. It’s designed to occupy your attention, trigger emotion, maintain engagement — and keep silence out of your life entirely. Because as he put it, “silence is where thinking happens.”
That one hit me. Because that’s exactly what fear-based AI headlines do. The Block layoff story, the “AI will destroy your job” posts, the doom forecasts — you feel engaged, you feel aware, you feel like you’re staying ahead of something. But as Hughes put it, nothing ever resolves. Because resolution was never the point. Engagement was.
AI is making it cheaper and faster to produce both information and content. The problem is fear-based content converts better. It gets more clicks, more shares, more followers. The algorithm rewards it. And we now have tools that can generate it at industrial scale, around the clock, forever.
So the question I keep asking myself is this: are we building a faster world, or a better one?
We Have a Choice in What We Amplify
Here’s where I’ll push back on the doom crowd: this is not inevitable.
I built Cash Critters — a financial literacy app for kids — for about $50 a month using AI tools. No VC funding, no team of 20, no press release. Just a real problem I wanted to solve and the tools to solve it. That’s what AI amplifying joy looks like. Amplifying creativity. Amplifying the scrappy builder who would have been priced out of the game five years ago.
I also just shipped a coding agent at work that takes a Jira ticket all the way to a pull request — GitHub Actions, Jira automation, a carefully crafted Claude prompt. Fully automated. Not to replace anyone. To free my team up to do the work that actually requires a human brain.
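To make that pipeline a little more concrete, here's a minimal sketch of its first step: turning a ticket into an agent prompt. Everything in it is an assumption for illustration, not the actual implementation: the `build_agent_prompt` helper, the ticket field names, and the prompt wording are all hypothetical. In a real setup, a GitHub Actions workflow would feed the result to the agent and open the pull request.

```python
# Hypothetical sketch: assemble a coding-agent prompt from a Jira-style ticket.
# Field names and prompt wording are illustrative assumptions only.
import textwrap

def build_agent_prompt(ticket: dict) -> str:
    """Turn a ticket's key, summary, and description into an agent prompt."""
    return textwrap.dedent(f"""\
        You are a coding agent. Implement the following ticket.
        Key: {ticket['key']}
        Summary: {ticket['summary']}
        Description: {ticket['description']}
        When the change is complete, open a pull request for human review.
        Do not merge it yourself.
        """)

# Example ticket payload (hypothetical).
ticket = {
    "key": "PROJ-123",
    "summary": "Add input validation",
    "description": "Reject empty form fields on the signup page.",
}
print(build_agent_prompt(ticket))
```

The important design choice is in the last two prompt lines: the agent stops at a pull request, so a human stays in the loop for every merge.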
That’s a choice. The same tools that let a company cut half its workforce can let a solo developer build something that genuinely helps people. The technology is neutral. We are not.
Every piece of content you publish, every decision you make about what to build, every time you choose fear over value — that’s a vote. What are you voting for?
So What Do We Actually Do?
Start with the question Chase Hughes doesn't ask explicitly but that his whole body of work points toward: what are you reinforcing?
When you share that fear-based AI headline, you’re reinforcing it. When you build a product designed to exploit anxiety instead of solve a problem, you’re reinforcing it. When you celebrate a company for cutting humans and calling it “efficiency,” you’re reinforcing it.
Here’s the counter-move:
Build things that help people, not just things that capture them. There’s a difference between a product that makes someone’s life better and a product designed to be addictive. Know which one you’re making.
Call out the distinction between information and content — in what you consume, what you share, and what you create. One builds people up. The other just burns their time.
Reward the builders who are doing it right. Subscribe to the newsletters that give you something real. Share the posts that make you think, not just react. The algorithm responds to what you amplify.
Ask “better” before “faster.” Before you deploy that AI feature, before you automate that workflow — is this making something genuinely better for a human being? Or is it just cheaper and faster?
AI isn’t going to decide whether the next decade is defined by fear or by genuine human progress. We are. The tools are in our hands. The question is what we’re going to build with them.
Faster is easy. Better takes intention.
Go build something amazing — but make it worth building.
John Mann is a software engineering executive, CTO, and founder of Startups and Code LLC. He writes weekly about AI, startups, and tech leadership — for builders who care about doing it right.



