So why are talent agents circling an AI ‘actress’? Follow the money.

TLDR: When talent agents started circling Tilly Norwood, a fully AI-generated “actress,” Hollywood erupted. But beneath the outrage lies a quiet business revolution nobody’s explaining. The 2023 SAG-AFTRA contract locks down strict rules for “digital replicas” of real actors—consent required, equal pay mandated. But “synthetic performers” like Tilly? They exist in a legal gray zone with far looser constraints. This opens the door to perpetual licensing deals and one-time usage fees that sidestep the day rates and residuals sustaining human careers. Here’s what’s explicitly allowed, what’s dangerously murky, and who’s actually doing the invisible work behind these so-called “AI stars.”

What happened with Tilly—and why the money matters more than the memes

In September 2025, at the Zurich Film Festival’s industry summit, Dutch actor-turned-technologist Eline Van der Velden introduced the world to Tilly Norwood—a perky, London-based AI “actress” with shoulder-length brown hair, a British accent, and an Instagram following (Fortune). The real bombshell wasn’t the tech. It was Van der Velden’s announcement that multiple talent agencies were interested in signing her.

Hollywood’s response was immediate and visceral. SAG-AFTRA condemned the move, declaring that “creativity is, and should remain, human-centered” and that Tilly is “not an actor” but “a character generated by a computer program that was trained on the work of countless professional performers—without permission or compensation” (Variety). Emily Blunt, shown Tilly’s image mid-interview, exclaimed: “Good Lord, we’re screwed. That is really, really scary” (Variety). Whoopi Goldberg told viewers on The View that audiences can always tell the difference between humans and synthetics.

The internet lit up with hot takes. What it mostly missed: the contract math. The interest in Tilly isn’t a weird tech flex—it’s a flashing neon sign pointing to a fundamental shift in how performance gets valued and paid for. To understand why an agent would represent a piece of code, you need to stop reading the tweets and start reading the fine print.

The rules, decoded: What 2023 SAG-AFTRA actually says about AI

The 2023 actors’ strike, which lasted 118 days, was fought largely over artificial intelligence. The resulting contract created a critical legal distinction between two types of non-human performers—and the difference determines who gets paid, how much, and when.

Digital Replicas: When AI copies a real person

A digital replica recreates a specific, identifiable human performer’s likeness, voice, or performance. The 2023 contract divides these into two subcategories: employment-based replicas (created with the actor’s participation during a paid gig) and independently created replicas (built from existing footage or recordings). Either way, the rules are strict (CBS News, The Hollywood Reporter).

Producers must obtain clear, informed consent with a “reasonably specific description” of how the replica will be used. More crucially, they must pay the human actor as if they’d physically done the work—day rates for the time saved, applicable residuals for reuse, pension and health contributions included. If a digital version of you works three days instead of the five you would have worked, you get paid for five days (The Hollywood Reporter). This structure essentially kills the financial incentive to replace an actor with their digital twin just to cut costs.

Synthetic Performers: The legal gray zone where Tilly lives

Tilly Norwood falls into a different, far murkier category. A “synthetic performer” is fully AI-generated and “not recognizable as any identifiable natural performer,” according to the contract’s legal language (The Hollywood Reporter, Backstage). Because no single actor’s identity is being replicated, the consent and compensation rules are dramatically looser.

Producers must notify SAG-AFTRA before using a synthetic performer and provide the union “an opportunity to bargain in good faith” if they’re replacing a human role. But there’s no preset payment formula. No day-rate equivalent. No mandated residuals. This is the loophole—and it’s wide enough to drive a fleet of digital ingénues through.

Follow the money: Perpetual licenses vs. residuals

The business case for synthetic performers becomes clear when you compare the economics.

The traditional human model is built on recurring revenue streams. An actor earns a day rate for time worked on set, then receives residuals—smaller payments triggered every time the content is rerun, sold to new platforms, or hits streaming success thresholds (Axios, CBS News). These payments sustain careers between gigs and ensure performers share in long-term success. Pension and health contributions flow from every paycheck. Union minimums set wage floors.

The synthetic model flips this entirely. AI studios like Xicoia, Tilly’s creator, aren’t collecting day rates. They’re monetizing intellectual property through:

  • Perpetual usage licenses: A studio pays a flat fee upfront for the right to use Tilly’s performance in perpetuity, across any media, with no additional payments required for reruns or platform expansions.
  • Usage-based billing: Some AI systems charge per use, similar to cloud computing fees—pay for what you generate, when you generate it (legal analyses from DLA Piper, Perkins Coie).

The incentive structure is obvious: infinite reuse without ongoing residual obligations, scheduling conflicts, or union minimums. While SAG-AFTRA’s contract ensures a digital replica of Emily Blunt must be compensated like Emily Blunt, a synthetic performer like Tilly faces no such requirement. The negotiation becomes a straightforward IP licensing deal, not a labor agreement.
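To make that contrast concrete, here is a toy calculation of how the two models diverge as a piece of content gets reused. Every figure is hypothetical, chosen only for illustration; no actual day rate, residual, or license fee from any real deal is implied.

```python
# Toy comparison of the two payment models described above.
# All dollar figures are hypothetical and for illustration only.

def human_actor_payout(days_worked: int, day_rate: int,
                       reruns: int, residual_per_rerun: int) -> int:
    """Traditional model: day rate for time worked, plus residuals
    that recur every time the content is reused."""
    return days_worked * day_rate + reruns * residual_per_rerun

def synthetic_license_payout(flat_fee: int, reruns: int) -> int:
    """Perpetual-license model: one upfront fee; reuse costs nothing extra."""
    return flat_fee  # reruns never trigger an additional payment

# Hypothetical numbers: 5 days at $1,200/day, $400 per rerun,
# versus a $20,000 one-time license.
for reruns in (0, 10, 50):
    human = human_actor_payout(5, 1200, reruns, 400)
    synthetic = synthetic_license_payout(20000, reruns)
    print(f"{reruns:>3} reruns: human model pays ${human:,}; "
          f"license model still ${synthetic:,}")
```

Under these made-up numbers the human model overtakes the flat fee somewhere past 35 reruns and keeps climbing, while the license cost stays frozen. That widening gap, not the upfront sticker price, is the economic pull toward synthetic performers.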

Van der Velden told the Zurich Summit that when she first pitched Tilly, studio executives said, “That’s not going to happen.” By May 2025, they were saying, “We need to do something with you guys” (Deadline). That shift wasn’t about the technology improving—it was about the business case clicking into focus.

The gray zones and the invisible labor

Beneath the contract language sit thorny questions the mainstream coverage ignores.

Training data remains the Wild West. SAG-AFTRA argues synthetics like Tilly are “trained on the work of countless professional performers—without permission or compensation.” The 2023 contract requires producers to meet with the union regularly “to discuss appropriate remuneration, if any” for footage used to train AI systems, but as of now, no payments are mandated (The Hollywood Reporter). That “if any” is doing heavy lifting. Studios could theoretically build entire libraries of synthetic performances by training AI on union-covered work, paying nobody for the source material.

Background actors occupy another murky zone. The contract says digital replicas can’t be used to “circumvent the use or engagement of background actors,” but enforcement gets tricky when a synthetic character blends features from multiple people into an unrecognizable composite—what one SAG-AFTRA member described as “smashing together six or seven different actors to create one uber-actor” (Prism Reports). State laws are starting to fill the gaps—California’s AB 2602 and New York’s Digital Replica Law both took effect in 2025, requiring stricter consent protocols and voiding contracts with vague AI provisions—but the patchwork creates compliance headaches.

Then there’s the labor nobody’s crediting. An “AI star” doesn’t spring fully formed from a server. Behind every synthetic performance is a crew of human artists and technicians. Someone writes Tilly’s lines. Someone rigs her facial expressions and body movements. Someone directs her “performance,” adjusts the lighting in her generated scenes, and manages her Instagram persona (Variety reported publicists are already navigating the bizarre task of maintaining backstories for synthetic characters). Voice actors, animators, VFX artists, continuity supervisors—all the craft that makes performance believable still requires human expertise.

Yet current business models offer no clear path for these creators to receive residuals or even screen credit. When asked about Tilly’s workflow, a Particle6 spokesperson told Variety she wasn’t “available to speak” but confirmed the company manages her as IP. The humans animating that IP? Their compensation structure remains opaque.

So, what now?

The Tilly Norwood controversy isn’t the death of human acting. But it is a preview of the legal and financial battles ahead—and a stress test for whether the protections won in 2023 can hold.

For producers, the path forward demands meticulous documentation. Using synthetics requires notifying SAG-AFTRA, bargaining in good faith if you’re replacing human roles, and tracking state-by-state compliance as laws like California’s AB 2602 create stricter guardrails. Cutting corners invites both contract violations and reputational damage.

For talent agencies, the math gets uncomfortable. Representing synthetic IP might generate fees in the short term, but it undermines the talent pool that sustains the entire business. Gersh Agency president Leslie Siebert told Variety her firm won’t sign Tilly, though she expects others will. That split—between agencies protecting human clients and those chasing new revenue streams—will define the industry’s next chapter.

For performers, the stakes are existential. Developing a personal strategy around digital replicas matters now, not later. That means negotiating for limited consent scopes, revocation rights, posthumous control, and explicit exclusions from training datasets. The 2023 contract won baseline protections, but individual deals still vary wildly.

The core tension remains unresolved. Synthetic performers are being pitched as scalable, one-and-done assets—IP you license once and exploit forever. But acting has never been just an asset. It’s a craft sustained by residuals, health coverage, and the assumption that performance creates ongoing value for the performer, not just the studio.

The most important question isn’t whether AI can act. It’s this: If nobody’s on set collecting a day rate, who’s getting residuals? And when the answer is “nobody,” what happens to the infrastructure—financial, creative, and ethical—that keeps the industry functioning?

Follow the money, and you’ll find those answers sooner than you’d like.