When the System Found Its Believers
Brian Merchant and Jacob Silverman on the belief system powering modern tech
I’ve written before about my admiration for Blood in the Machine, Brian Merchant’s history of the Luddite movement and its eerie parallels with today’s tech culture. That book stayed with me, the way it treated resistance not as nostalgia but as a form of clarity. It led me to follow Merchant’s Substack of the same name, where he keeps tracing the human edges of automation and capital. Every so often he posts video podcasts, and it was through one of those that I found my way to Jacob Silverman’s Gilded Rage: Elon Musk and the Radicalization of Silicon Valley.
Silverman’s book reframes the usual story of Silicon Valley’s shift to the right into something deeper: a convergence of power, grievance, and utopian belief. It is not just about a few unruly billionaires; it is about a class that sees its own ideology as infrastructure. Merchant, with his mix of historian’s precision and quiet disbelief, teases out how that mindset formed and what it means when technology becomes a moral project.
The conversation sent me down a rabbit hole, through Silverman’s Substack and interviews and into the book itself. It made me look harder at the systems I write about and think about how the same billionaires shaping culture are also engineering the technical foundations beneath it, how their private faith in progress keeps rewriting the public machinery of media and AI.
The Gospel According to the Oligarchs
If you have time, go listen to the full conversation. It’s worth it. Merchant and Silverman cover a lot of ground, and it’s better heard than summarized. What struck me most was how little of it was about Elon Musk as a person. The focus was on the patterns (likely the exact reason this all resonated so much).
They talk about what Silverman calls the new tech oligarchy, a class of founders and financiers who move as one, politically, culturally, and financially. Men (and it’s really almost entirely men) who once sold disruption as liberation now talk about it as control. The idea of building the future has turned into a defense of the present they already own.
They trace how grievance became a kind of organizing principle. COVID lockdowns, criticism in the press, social movements around labor and identity, all of it feeding a shared sense of persecution among the ultra-rich. Musk, Thiel, Andreessen, Lonsdale, the A16Z orbit. The same names and the same posture, power wrapped in the cloak of victimhood.
They then move into the deals behind it. Pentagon contracts for AI research. SpaceX as both private company and national infrastructure. Saudi and Emirati sovereign wealth funds quietly underwriting the next wave of compute. Silicon Valley as a hub for autocratic money, building the very systems those regimes use to surveil their citizens.
Merchant keeps looping back to the psychology of it all, the midlife disillusionment of men who already have everything. “They had the world at their fingertips and couldn’t stand the touch.” Silverman circles the same idea, that the utopian story of tech progress collapsed under the weight of its own expectations. What is left is the faith that AI, or some new system of their own making, can redeem it all. They also point out that beyond AI, and specifically AGI, there is no real plan for what comes next, only the hope that a machine can carry their ambition further than they ever could.
That is the shape of their talk: wealth as ideology, control as salvation, belief as business model. It is not a debate about politics so much as a portrait of a class inventing a religion to explain its power. Well before the podcast was done, I had the growing sense I needed to connect this to my own writing and examine how that same ideology was shaping the tools that now define our media infrastructure.
The Machinery of Belief
What Silverman and Merchant are really describing isn’t a political shift. It’s the moment when infrastructure becomes ideology. In Musk’s case, the tools themselves have become the weapon. Satellites, platforms, and data centers are all repurposed to enforce his own worldview. The same pattern runs through the rest of the oligarch class, with systems built not to serve the public but to secure belief.
These founders no longer see technology as an extension of human creativity. They see it as a mechanism of enforcement, belief engines that turn ideology into automation. In their hands, innovation has become a form of governance that decides what we see, how we work, and who gets to participate at all.
Media now runs on these same circuits of control. Recommendation systems tuned for outrage, AI models trained on stolen work, content pipelines optimized for performance over meaning. The optimism of early AI demos and the corporate talk of creative empowerment both fold into the same story. The machine will save us, even as it quietly teaches us what to believe.
The conversation stayed with me because it stripped away the pretense of neutrality. I work around these systems every day—the tools, the infrastructure, the contracts—and it is easy to start treating them as inevitable. Listening to Merchant and Silverman reminded me that inevitability is a design choice.
When founders talk about AGI, they are not predicting the future, they are performing belief. When platforms treat attention as proof of truth, they are codifying theology. The radicalization that Silverman traces is not confined to politics; it runs through product roadmaps, content pipelines, and AI research labs. The faith is not only in Musk or Thiel but in the idea that bigger models mean better worlds.
That is what stuck. The conversation put language to an undercurrent that has been shaping the media and AI feedback loop for a while now. The same conviction that built the rockets and the data centers can now script the stories, select the thumbnails, and tune the feeds. What looks like innovation is often ritual, a system repeating its own beliefs until they sound like facts.
The Next Layer: Inside the Engines of Change
Merchant and Silverman’s discussion diagnosed a cultural drift and gave voice to some of the unease I’ve felt about the tone across both the tech industry and the culture around it. But they also mapped the fault lines of a new operating system. Listening to them made it clear that the story of AI and media may no longer be about tools or trends. It is about belief, how ideology turns into infrastructure, and how the political radicalization of Silicon Valley is now shaping the systems we use to tell our stories.
That conversation became the spark for a new run of work in Engines of Change. Over the next two months, I’ll be publishing a four-part Deep Cut series that looks at how faith, power, and ambition fused into the architecture of technology itself.
The first piece, The Faith Layer, will be published next week. It explores how AI has become a form of absolution, with belief written into every product roadmap. From there, The Power Layer will dig into the money, defense contracts, and sovereign wealth shaping the physical layer of control. The Control Layer will follow the rise of “founder logic,” the idea that a single vision should govern both product and people. The series will close with The Collapse Layer, an examination of what happens when the faith that built these systems begins to fail.
If this side of tech culture interests you, subscribe and read along. These four essays will close out the first year of Engines of Change by tracing how AI’s promise of progress became a kind of faith, how that faith fuels power, and what happens when it finally breaks.
I started Brian’s episode out of curiosity, but by the time it ended, it felt less like commentary and more like a postmortem of a system that already knows what it has become.