Hollywood Writers Quietly Turn to AI Training Work as Industry Jobs Dry Up

Screenwriters and creatives describe a hidden economy where entertainment professionals help build the AI tools displacing them

By Zotpaper
Read time: 3 min
As Hollywood continues to grapple with the fallout from strikes, streaming contraction, and widespread industry layoffs, a growing number of screenwriters and television professionals have turned to a disquieting new source of income: training the artificial intelligence systems many believe are accelerating their displacement from the industry.

Screenwriter Ruth Fowler, writing in WIRED, describes completing 20 AI data-labelling and content-generation contracts across five different platforms over eight months — work she characterises as "soul-crushing" but financially necessary in a market where traditional commissions have become scarce.

Fowler's account points to a broader, largely unspoken trend in the entertainment industry: experienced writers, story editors, and other television professionals are quietly selling their creative expertise to AI companies, helping train large language models on narrative structure, dialogue, character development, and screenplay formatting.

The phenomenon draws a pointed parallel to an earlier era of Hollywood precarity. Just as aspiring writers once waited tables between gigs, AI training work has become the stop-gap employment of choice for those with creative credentials but diminishing conventional opportunities.

"Everyone who used to make TV is now secretly training AI," Fowler writes, suggesting the practice is far more widespread than the industry publicly acknowledges.

The work typically involves tasks such as rating AI-generated story ideas, rewriting weak machine-produced dialogue, or producing original creative content to expand training datasets — labour that draws directly on the skills writers spent years developing for studio and network projects.

For AI companies, the arrangement provides access to high-quality domain expertise at contractor rates, without the costs associated with full-time employment. For the writers, it offers flexible income, though accounts like Fowler's suggest the pay is modest and the work emotionally taxing.

The situation raises thorny questions that remain unresolved in ongoing negotiations between writers' guilds and studios over AI use in production. Critics argue that writers training AI systems on their own creative knowledge are, in effect, helping to automate their own profession — a dynamic with few historical precedents in creative industries.

The Writers Guild of America secured some provisions around AI use in its 2023 contract with major studios, but those rules govern AI's role in credited productions, not the separate market for AI training labour, where individual contractors operate largely outside collective bargaining protections.

The scale of the trend remains difficult to quantify. AI data-labelling platforms typically prohibit contractors from disclosing their work, which may explain why, as Fowler notes, it is happening "secretly." The opacity suits both parties in the short term, but leaves the broader industry without a clear picture of how extensively creative expertise is being fed into AI systems.

§

Analysis

Why This Matters

  • The trend creates a paradox at the heart of the AI-versus-labour debate: workers most threatened by automation are directly subsidising its development, raising urgent questions about long-term industry viability and the adequacy of existing guild protections.
  • Non-disclosure requirements mean the true scale of creative professionals' involvement in AI training is hidden, making it difficult for unions to negotiate meaningful protections or for the public to assess how much human creative labour underpins AI-generated content.
  • If AI systems are trained extensively on the work of experienced screenwriters, the resulting tools may reach competency thresholds faster, further compressing the window in which human writers retain a clear market advantage.

Background

Hollywood entered a prolonged period of disruption well before AI became a mainstream concern. The 2023 Writers Guild of America strike — the longest in decades — centred partly on fears about AI's role in production, alongside grievances over streaming residuals and minimum staffing requirements. The strike ended with some AI guardrails in place, but the contracts did not address the emerging market for AI training labour.

The broader entertainment industry has since faced additional pressure from streaming services cutting content budgets, a contraction in the overall number of scripted series being produced, and ongoing uncertainty about advertising and subscription revenues. Many writers who were regularly employed during the peak streaming boom of the late 2010s have found consistent work harder to secure.

AI data-labelling as a form of gig work has existed for years, but has expanded rapidly since the release of large language models capable of generating plausible creative content. Platforms specialising in "reinforcement learning from human feedback" (RLHF) and creative content evaluation have actively recruited domain experts, including writers, to improve model outputs.

Key Perspectives

Working writers: Describe AI training contracts as a financial lifeline in a difficult market, but express deep ambivalence about contributing to technology that may further erode traditional employment. The secrecy surrounding the work adds to a sense of stigma and isolation.

AI companies and platforms: Benefit from access to specialised creative expertise that meaningfully improves model quality. The contractor model keeps costs flexible and avoids the obligations of direct employment, though companies rarely comment publicly on the composition of their training workforces.

Critics and guild advocates: Argue that when individual writers accept AI training work, however understandable the financial motivation, they collectively undermine the leverage unions need to negotiate meaningful limits on AI use. They warn that current guild agreements leave a significant regulatory gap around training labour.

What to Watch

  • Whether the WGA or other creative unions move to extend collective bargaining coverage to AI training contracts, or negotiate revenue-sharing arrangements with AI companies that use members' expertise.
  • Upcoming contract renewal cycles between guilds and studios, which could provide an opportunity to address the AI training labour gap identified in the 2023 agreements.
  • Any regulatory or legislative movement — particularly in California — on disclosure requirements for AI training data sourcing, which could force greater transparency about the role of creative professionals in building these systems.

Sources

Zotpaper

Articles published under the Zotpaper byline are synthesized from multiple source publications by our AI editor and reviewed by our editorial process. Each story combines reporting from credible outlets to give readers a balanced, comprehensive view.