Andy Beach's Engines of Change

HUMAN-ENHANCED IS THE NEW AUTOMATION

The Human Layer, Part I

Andy Beach
Apr 26, 2026

A small creative team finishes a piece of work in an AI platform. The output is clean. The rendering is fast. Everyone in the room agrees it is good. Then someone asks a question that nobody can quite answer. Whose decisions are in this?

They can reconstruct some of it: the prompt that started the session, a few of the iterations that got discarded, the moment somebody said to try it again with a different composition. They can also remember the part of the project where they bailed out of one platform halfway through, ran the next pass through a different system to get a look the first one couldn't produce, then brought the result back into the original workflow to finish. The decision to switch was creative. It happened in real time. It made the final piece better. It also broke whatever continuous record either platform was capable of keeping, because neither one was designed to know about the other. The handoff lived in the team's heads, in a Slack thread, and in nobody's metadata.

The output exists. The authorship does not, at least not in any form you could put in front of a rights lawyer, an awards jury, or a distribution partner and have it hold up.

This is the shape of the current AI adoption conversation in the creative industries, and it is almost never discussed at this level. The tools work. The outputs are credible. The economics look, at first glance, extraordinary. What keeps slipping is the one thing that makes the output a work in the first place. Under current copyright law in the United States and most comparable jurisdictions, the human contribution is the only element that makes generative output legally protectable. The model produced the frames; a person made it a work. When the workflow obscures the person, it doesn't just create a documentation problem. It creates an asset that cannot be owned, defended, licensed, or compounded.

The Inheritance Shift described how machine context has to survive transformation. The human layer has its own version of the same problem. Agency has to survive too, and right now most of the systems we have built for it are designed to erase it.

The resistance is not about the tool

For two years, the dominant industry read on creative resistance to AI has been that it is some combination of protectionism, fear of technological change, and failure to understand what the tools can actually do. All three readings assume the resistance is about the tool itself. It is not. The tool is the least interesting variable in the room.

What is actually being resisted is a system deploying the tool without answering four questions. Who benefits from the productivity gain? Who is replaced and who is retained? Who decides how the tool gets used? Who owns the output, creatively, legally, and commercially? Those four questions have very little to do with whether a diffusion model can render a convincing ocean, and almost everything to do with how the industry surrounding the model chooses to operate. When the questions go unanswered, resistance isn’t irrational. It is the only rational response available.

This is not the first time a workforce has been told that its concerns about a new machine were really concerns about progress. The pattern is older than cinema, older than broadcast, older than the union contracts that shape the industry this essay is about. The pattern holds because the useful version of it has always been true: the machine is rarely the problem. The arrangement around the machine is.
