A User’s Guide for Engines of Change
What this became. How to read it. Where it’s going.
Why this exists
Engines of Change started as a way for me to keep writing after leaving Microsoft. At the time, it was less about publishing and more about staying in motion. I needed a place to think in public while recalibrating what came next.
Over the past year, it evolved. This Substack shifted from personal expression to an attempt to make sense of what is playing out in real time, and how we got here. In many ways, it was a selfish gift to myself. A place to put my thoughts without PR or legal telling me I had no business saying them out loud.
As generative AI and media technologies moved out of discovery and demos and into systems people actually have to live with, that need only intensified.
I lean heavily on past technology epochs for this work. Not to predict outcomes or draw neat historical parallels, but to recognize patterns. How societies react to new tools. Where power concentrates. How incentives distort behavior. Where responsibility gets blurred or quietly reassigned. Those patterns show up again and again. The names change. The dynamics rarely do.
The name Engines of Change did not start as a Substack title. It began as the name of a book I was writing in my head in 2024, and talking through with my friend Kreig DuBose, who is now also writing on Substack. I had a working theory that the way we were collectively reacting to AI was not that different from how we’ve reacted to previous technology shifts. The same anxieties. The same overconfidence. The same rush to declare inevitability.
That thinking shaped some of the earliest writing here, even before I called this Substack by that name. For a while, I kept the title reserved for “the book,” as if this work needed to stay separate until it was finished. About ninety days in, I realized the obvious thing. This Substack had become the book. The work was already happening in public, and it deserved the name.
Engines of Change is now where I track those dynamics as they emerge. I look at AI and media tech not as features or products, but as systems that shape judgment, labor, and culture. Systems where choices are still being made, even when they are presented as inevitable.
It’s also worth saying plainly. AI doesn’t make those choices. People do. We rush to blame “AI” as if a robot caused someone to lose a job, when it’s usually an executive decision wearing a technical mask. The technology matters, but the real source of anxiety is how it gets weaponized and justified without accountability. I still believe in cultural determinism. We will shape the tech we want and need. We just shouldn’t allow it to be dictated to us.
This piece is not a recap of the year, and it is not a manifesto or a roadmap. Think of it as an orientation. A guide to how this work is structured, how to read it, and how to make use of it as these systems continue to take shape.
From commentary to pattern tracking
Over the last year, I’ve tried to move away from reacting to individual stories and toward tracking what repeats across them. Less emphasis on having a hot take in the moment, more emphasis on understanding where things are actually heading. I’ve found that direction often matters more than arrival.
That shift came from watching AI and media technologies leave the demo phase and settle into real production systems. Once that happens, novelty stops being the interesting question. Infrastructure takes over. Decisions get embedded. Workflows harden. And the cost of getting it wrong becomes much harder to undo.
As a result, the writing moved away from capability claims and toward coordination problems. Not “what can this do?” but “where does it belong?” Less excitement, more accountability. Who owns the outcome. Who reviews it. What happens when it fails.
At the same time, the pace of innovation has outstripped governance, shared language, and trust. Defaults began shipping faster than institutions could keep up. That gap became impossible to ignore. Many of the most consequential decisions were being made quickly and wrapped in the language of efficiency or inevitability, long before anyone agreed on what responsible use even meant.
The pace of the writing slowed on purpose as the systems themselves sped up. Individual posts became less important than what accumulated across them. Engines of Change now functions like a working notebook I return to over time, rather than a feed to keep up with.
How this way of seeing took shape
The through line behind Engines of Change is pattern recognition, not novelty. I’m less interested in what is new than in what is familiar, recurring, and reasserting itself under new names.
That requires treating technology as a social and institutional force, not a neutral tool. AI, like earlier technological shifts, doesn’t just add capability. It reshapes incentives, redistributes power, and changes what kinds of decisions feel normal or inevitable.
Several thinkers have helped shape how I approach that work.
Lewis Mumford
Mumford is the foundational anchor. He rejected the idea that technology is neutral or self-directing. Machines, in his framing, are systems that reorganize values, labor, time, and authority. They don’t just enable behavior. They encourage certain kinds of people, institutions, and outcomes. This perspective underpins how I write about AI as infrastructure rather than product.
Tim Wu
Wu sharpened how I think about power and cycles. His work makes clear that centralization and consolidation are not aberrations. They are recurring phases. Platforms expand, harden into chokepoints, extract value, and eventually provoke resistance. The actors change. The dynamics persist. That influence shows up whenever Engines of Change treats AI platforms and media companies as control layers, not isolated innovations.
Amanda D. Lotz
Lotz reframed media for me as structure, not just content. Her work clarified why distribution models matter more than format debates, and why economic arrangements shape culture long before audiences or executives articulate what has changed. Media is not just what we watch. It’s how value moves. That lens runs through my analysis of streaming fragmentation, FAST channels, and the slow unraveling of legacy bundles.
Brian Merchant
Merchant keeps the human cost visible. His reporting on automation and creative labor is a reminder that efficiency always lands somewhere specific. In jobs, in time, in precarity, in who absorbs friction when systems are “optimized.” This influence shows up whenever the writing pulls back from abstraction and asks who is being asked to adapt, who bears the risk, and who gets to opt out. By the way, if you have never read his Substack, please give it a go.
Jacob Silverman
Silverman sharpened my skepticism toward narratives. His work on tech ideology and incentive structures makes it easier to see when language is doing more work than reality. Especially around inevitability, progress, and moral cover for extraction. That perspective informs pieces where I interrogate how AI is framed, not just how it functions.
Taken together, these influences shape how Engines of Change approaches AI and media technology. Systems over stories. Coordination over capability. Accountability, boundaries, and boring progress as signals of maturity.
The goal isn’t prediction. It’s orientation. Paying attention while choices are still being made, before they harden into defaults.
The User’s Guide
Early on, I tried to impose some structure on how this work shows up. Not everything serves the same purpose, and not everything should be read the same way. Engines of Change appears in different formats because it serves different kinds of attention and intent.
Deep Cuts
These are long-form essays. They step back from the news cycle and are written to age. This is where I slow the pace and focus on structure rather than immediacy. Power, incentives, institutional behavior, and long-term shifts surface more clearly when you stop reacting and start comparing. And every Deep Cut ends with a Cocktail Endnote. If you know me, you know why. It’s a reminder that some of the best thinking about where this industry is headed happens in conversation, not in slides, and if you are lucky, it’s over a good drink.
System Alerts
System Alerts are regular snapshots of what is shipping right now. They began as something closer to news commentary, but quickly shifted toward pattern recognition. The goal isn’t to keep up. It’s to notice repetition. Which themes keep reappearing. Which decisions start to look like defaults. The “Why it matters” sections are where the connective tissue lives.
Side Bars
Side Bars are shorter, looser, and intentionally less formal (By the way, this piece is in fact a Side Bar). They started as a sandbox. A place to throw an idea onto the table without turning it into a fully articulated essay. Sometimes that’s a quick lens on a story. Sometimes it’s a field note about what I’m working on next, or a more personal reaction.
They’ve also become the home for guest writers. Pattern recognition improves with more vantage points, and Engines of Change shouldn’t reflect only my thinking. Many of the “direction, not arrival” ideas surface here first because they’re grounded in use, not aspiration. If you’re a builder or operator, these often have the highest immediate utility.
You’re not expected to read everything. This Substack is cumulative by design. The signal comes from noticing what repeats across formats, not from consuming every post as it lands.
If you want to follow along between posts, Notes are where early signals and half-formed ideas show up. If you subscribe, Chat is the quickest way to ask questions or compare notes. Comments and email replies are always welcome when they add context or sharpen the model.
Looking toward 2026 without a roadmap
Writing has been the primary interface for Engines of Change, and that will remain true. Over the past few months, I’ve started experimenting with audio and video, and I plan to do more of that in 2026.
Some ideas land better out loud. Some arguments need tone. Some explanations benefit from cadence, from a pause, from the ability to work through a concept in conversation rather than compress it into a paragraph.
I’m treating audio and video as additional ways to make the thinking legible, especially when the subject matter is messy. The goal is explanation and inclusion, not performance. The work isn’t to appear busy. It’s to be useful.
The medium may change. The lens does not.
The people who make this sharper
This work may be written under one name, but it isn’t shaped alone.
Stephanie Haycock has become the content manager for Engines of Change and an essential partner in making it readable. She’s the first sounding board for new ideas and a reliable gut check on whether what I’ve written actually makes sense outside my own head. Her role isn’t to soften the work. It’s to make sure the edge is legible.
Josh Schwerin, my Chief of Staff at Alchemy Creations, plays a different but equally important role. While he isn’t directly publishing Engines of Change, he helps amplify it and keep it connected to the broader conversation. He shows up to our weekly conversations armed with stories, signals, and questions that often shape what gets written next.
Tom Merritt deserves a special callout. He’s been encouraging this work from early on, inviting me to contribute to DTNS, pulling me into Cordkillers conversations, and helping me find my footing as an independent voice in this space. For more than a decade, he’s modeled how to do thoughtful, durable media work in public without chasing noise.
Finally, the guest writers. I began accepting guest Side Bars in October, and it’s been one of the most rewarding evolutions of Engines of Change. New voices bring new vantage points, and pattern recognition improves with more perspectives in the room. It matters to me that this work reflects more than my own thinking.
Where this leaves the reader
If you’ve been reading along this year, thank you. And if you trusted me enough to pay for it, I owe you a drink, a hug, or something better.
More than anything, I owe you the effort. The time to slow down, check my own certainty, and keep trying to make this work legible. Your attention turns this from shouting into the void into building a shared map.
If you’re arriving now, you’re not late. This work isn’t chronological, and it isn’t meant to be consumed end to end. It’s designed to be entered sideways, revisited, and re-read as new stories surface and older ones click into place.
Engines of Change isn’t a feed to keep up with. It’s a cumulative orientation. The value isn’t in staying current. It’s in noticing what keeps showing up, even as the language around it changes.
The focus here has never been arrival. It’s direction. How systems move from possibility to practice. How defaults get set. How accountability either appears or disappears along the way.
The window that matters most is the one before things feel settled. Before choices are framed as inevitable. That’s where judgment still has room to operate.
That’s the work this Substack is meant to support.