Fairness Isn’t A Metric
What creatives in tech should learn from the WGA strikes
Beneath the deluge of model releases and AI product innovations in the spring of 2025 flows a quiet current of consolidation, the top names in tech and design swallowing each other whole to cut the form of the next creative economy. "Jonathan Ive Joins Forces with OpenAI"-type articles led the tech news cycle, the watery black-and-white photo of the two men's soft smiles perched above the fold on most tech news sites for days. More subtly, legacy giants like Ogilvy are building bespoke AI pipelines that will soon pump out the hyper-targeted ad creatives destined to become the norm. And adrift in this news, I’ve found myself obsessed—not with a headline, but with the collapse of a video game reboot and the smell of burning oil on a summer day.
I stood across the street from a picket line in 2008. I was twenty-three and mostly invisible, working inside the Burbank lot but not quite in it, moderating comment threads for NBC.com. The WGA writers' strike was at its height, so every day I watched the people I wanted to become trudge around near my office complex in compelled protest until their guild-required hours were met and they were allowed to drive off to a home with the air conditioning and reliable plumbing I dreamt of. That day, Jay Leno had pulled up to support the writers; then, as I gathered, the ridiculous car he drove there caught fire. I remember him calmly spraying a fire extinguisher, which he must have had at the ready, over the glowing hood while the striking writers looked on with total apathy.
2008 turned out to be the year that changed my life, though it didn't feel like it at the time. While most of the entertainment industry was bleeding out—development deals evaporating, assistants getting laid off mid-email—NBC.com was quietly expanding. I was working 39-hour weeks—one hour short of what you'd need to get health insurance—out of a windowless cubicle, doing whatever passed for digital content moderation back then, which meant combing through message boards, formatting recaps, and at one point, verifying message board claims that one of the new American Gladiators had done gay porn before donning a whole different, prime-time-appropriate cock sock. It was absurd work, but it was work. And while people I went to college with were sending out résumés into the void or pouring drinks in Silver Lake, I had a badge, a job title, and a seat on the inside. Sort of.
One of the designers in my cubicle collective—a soft-spoken guy named George Yoon—once told me about how, on Court TV, a senior manager used to oversee the freelance contractors from an actual raised platform deck, like a lifeguard at a public pool. “But he was a really cool guy,” George added. “It was a good gig.” At the time, it didn’t seem strange. What struck me more was how normal it had all started to feel. I’d graduated with a screenwriting degree six months earlier, imagining writers’ rooms and late-night brainstorming sessions, but the strange truth was that tech—at least in those early NBC.com days—felt much closer to the zany office culture I’d daydreamed about.
My sympathy for the TV writers wore thin pretty quickly. Sure, they were fighting for their future, but I was making twelve dollars an hour moderating message boards full of My Name Is Earl flame wars and making sure dweebs weren’t using JavaScript to hack our “The Office”-themed in-browser Flash game in a way anyone could notice. Meanwhile, some of these guys were pulling in doctor money to make sure every page of a Two and a Half Men script hit its quota of two dick jokes and a fart. It was hard to feel too heartbroken about the end of an era I’d never been invited into.
As the strike dragged to a close, rumors of a deal were everywhere, and when my floor got the mandatory all-hands meeting invite with a half hour's notice, we knew it was here. That meeting was the first time I got a real look behind the curtain—my first experience watching a lawyer address a room full of tech people. The stage was aggressively stale, a fluorescent-lit boardroom with a speakerphone in the middle, but the actors were remarkable: techies packed the room beyond standing room only, the svelte middle managers in their late forties, wearing collared shirts and face moisturizer, trying to stay dignified as they angled their chests to let the girth of fanny-packed QA engineers squeeze past them. A room with a milieu of roles and a potpourri of odors that would define the next decade of how entertainment industry types got paid. This seemed like the most important meeting I'd ever witness—and to this day, it is.
A tale of two strikes
Everyone who paid rent by making film and television in 2008—whatever side of the camera or desk they were on—was in a meeting just like this one, one that would define the business for over a decade. We were informed a deal had been struck, and it really came down to this: for the first time ever, union members would get paid residuals on streams, but that residual would be infinitesimal. So small that the magnification needed to perceive it on the number line crossed the barrier from the Newtonian to the quantum, where attribution fractured, value scattered, and certainty vanished.
That's the deal we have today: if someone makes money from media you help create, you should get paid—but how much and how often is a quagmire. Drawing the line between what’s fair and what’s exploitative often comes down to whether enough people agree something is in the spirit of the agreement, not just the letter.
The producers and guild executives were right: streaming was the future—just not in the gradual, manageable way anyone had hoped. It came faster, hit harder, and rewrote the rules more brutally than most of them could have imagined. And that shift needed infrastructure. Streaming didn’t just need content; it needed interfaces, flows, buttons—digital products designed from scratch. Less than four years after I was auditing the chastity of American Gladiators for poverty wages, I was wireframing the very platforms that would drip out fractional pennies to their descendants for generations.
The 2008 agreement had worked—technically. The real profits of streaming flowed to the platforms, the tools, the algorithms. But the creators—the people who once knew how to get millions to tune in at 8 p.m. sharp—turned out to be the same people who knew how to get someone to press play on a thumbnail calibrated to fire dopamine at the user level. They got paid. What they got paid was, legally speaking, appropriate. Fairness was more elusive. Debates about what was fair always seemed to settle once enough people agreed it felt fair. And that—feels fair—was the invisible force that lifted micro-fractions of a cent per stream from quantum uncertainty into the Newtonian realm of dollars and cents, printed cleanly on a residual check.
I remember visiting the Paramount lot for the first time years later, genuinely excited to walk into the “Gene Roddenberry” conference room—expecting something futuristic, almost sacred. Instead, it was beat up and strangely beige, like a mid-tier airport lounge with a plaque. The executives I met with, though, were warm, grounded, and deeply human. They listened. They asked smart questions. One of them was visibly excited about taking their kids to a family-only screening of Wonder Park that weekend. It was the opposite of everything I’d been told to expect back when I was a grunt at NBC—when “executive” meant someone whose job seemed to involve being chauffeured between panic and lunch.
That contrast stuck with me: the rooms might be falling apart, but the people inside them weren’t villains. They were just trying to keep the lights on in a business that doesn’t always make sense.
There’s a popular fantasy that everyone in Big Tech or entertainment is rich. They’re not. Most people I’ve worked with are just trying to make rent, keep their benefits, and maybe hold onto their job through the next reorg. But there’s a tiny subset—maybe a few hundred people across all of FAANG—who are. You don’t encounter them often, but when you do, it’s disorienting. I’ve found myself somehow a part of conversations, with thoroughly lovely men, about the pros and cons of racing slicks on one’s race car, or how fulfilling it is to watch one’s child succeed at their high school’s helicopter piloting instruction. Most of the time, though, the entertainment and tech industries aren’t made of people like that. They’re made of people trying to get by—people who still comparison-shop health insurance plans and split dessert on business trips.
I was around for the 2008 writers' strike, but only just—tucked away inside NBC’s digital arm, still figuring out what a CMS was, watching the picket lines from a distance. I wasn’t one of them. I didn’t know anyone with a WGA card. The strike felt abstract—loud, yes, but not personal.
The 2022–23 strike was different. I was still working in entertainment tech, but now I lived in a fashionable Silver Lake neighborhood where a lot of the people who made television lived too. My kid went to school with their kids. Suddenly the strike wasn’t a line of signs at a studio gate—it was a marriage hanging on by a thread, the dad who didn’t know if his writers’ room would ever reopen, the mom scrounging for college essay editing gigs just to cover groceries. These weren’t people striking over abstract residual formulas. They were trying to figure out how to hold onto their healthcare, their homes, their reliable plumbing.
The 2008 agreement had technically been followed, but that agreement didn’t feel right in the stream-driven, post-pandemic entertainment boom. The numbers were there, but the sense of fairness wasn’t. Eventually, an inflection point was hit—and it was pencils down. Because creative work, especially the kind built by hundreds of people and leveraged into billions of dollars, doesn’t run on compliance alone. It doesn’t run on clock-ins or clean deliverables. When your work is intellectual property—subjective, collaborative, and ever-evolving—what matters more than policy is perception. Fairness isn’t a fixed number. It’s a feeling. And the real work of organized labor isn’t just negotiating legal frameworks—it’s generating enough shared understanding of what feels fair that people are willing to keep building together. That’s what sustains creative economies: not rules, but consensus that holds.
Entertainment’s Advantage: Shared Language, Shared Stakes
The industry I work in now—digital product design—feels like it’s teetering on the edge of its own 2008 WGA moment. Only ours might be more severe.
The Marathon plagiarism scandal, where Bungie lifted artwork from a small independent creator, Fern Hook of Anti-Real, didn’t just raise eyebrows—it carried the unmistakable feeling of enough is enough. The theft was so blatant, so easily traceable, that the standard practice of “moodboarding” stopped looking like inspiration. A line wasn’t crossed when some low-level contractor pulled a screenshot from an Anti-Real Instagram post and slipped it into a texture pack. It was crossed by the system that allowed Hermana Creatives and Joseph Cross to take creative credit for Marathon’s visual language—“graphic realism”—without ever acknowledging the fine art of Fern Hook or the aesthetic groundwork laid by Anti-Real.
But what makes this moment truly telling isn’t the theft—it’s the choreography of the fallout. Who gets blamed, who gets protected, and who quietly disappears from the story. There are no union heads to arm-twist into a definition of what is fair enough—just gobs of outrage on one side and lawyers and press releases on the other. Despite Bungie and Hermana Creatives’ best efforts to tacitly blame some laid-off subcontractor—the kind of overextended, under-caffeinated freelancer just trying to make it to Friday with enough energy to microwave dinner—the online rancor wasn’t directed at them. It was directed at the system that obfuscates and skates past the work of the hundreds of contributors who make these kaleidoscopically intricate, oil-slick, billion-dollar intellectual properties like AAA games and tent-pole movie and TV franchises possible.
The demand for these kinds of individual contributors—those whose creative instincts can shape billion-dollar products—is about to explode because of generative AI. And I don’t think most people fully grasp the scale. Targeted ads like those served by Google and Facebook—demand-side platforms—aren’t just more effective than traditional spray-and-pray television advertising; they’re orders of magnitude more precise, more personal, and more profitable. And the strange truth is, the more “generative” these tools become, the more original, deeply human creative work they quietly consume—far more than the traditional workflows they claim to replace.
The Machine Forgets Who Fed It
Profit always comes from somewhere. Let me take you to the rooms that feed the models powering this next wave of advertising. Picture it: a windowless office tucked behind a loading dock, fluorescent lights buzzing overhead. A stylus taps a screen. A mouse hand scrolls. A retoucher leans in too close to a smudged monitor. The cursor flickers. The work is numbing. Logo lockups, pitch decks, casting selects, VFX breakdowns—version after version. Script treatments rewritten in someone else’s tone. Styleframes for title sequences that never get greenlit. It pays a little more than pouring lattes. No one planned to end up here, but here they are—folded into the production pipeline. Whether the job’s for a streaming series, a sports brand, or a crypto startup no one really believes in, the work gets done. It's signed away with NDAs, buried under version numbers, and backed up on servers no one will ever touch again.
Then—years later, on another continent entirely—another machine wakes up. The data center isn’t a room you can walk into. It’s a warehouse-sized hive somewhere in Iowa, or Taiwan, or an abandoned submarine hangar in Finland. A GPU hums as it begins to train. It ingests decades of work, millions of files from a thousand artists we will never know. No names. No links. Just the cumulative residue of commercial imagination: LUTs, rigged models, casting selects, graphic passes, flattened PSDs, motion studies, animatics. All stripped, diced, and emulsified into vectors the machine needs to chew.
If the data isn’t fresh—if it contains too much recycled work from other AI—it begins to spoil. The vectors collapse inward. Patterns fray. Noise begets noise. It is called model collapse—an elegant phrase for something profoundly stupid. Models driving a machine so hungry it eats its own tail. Then its teeth. Then its stomach. The machine, trained on the rancid slurry, generates copies stacked on copies stacked on guesses. Skin mapped to skin mapped to shimmer mapped to fog. It generates horrors. Eyelashes that blink in the wrong direction. Fingers growing from palms. And the ad it’s paid to generate is nothing but a fever dream set to the sound of breathing.
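The dynamic above can be sketched in a few lines of Python. This is a cartoon, not any real training pipeline: the "model" here learns only the mean and spread of its data, and each generation it retrains on its own slightly mode-seeking output. Every number is made up for illustration.

```python
# Toy illustration of model collapse: a "model" that only learns the
# mean and spread of its data, retrained each round on its own output.
# Purely illustrative; no real training pipeline works this simply.
import random
import statistics

random.seed(0)

# Generation 0: "fresh" human-made data, wide and varied.
data = [random.gauss(0.0, 1.0) for _ in range(2000)]

history = []
for generation in range(10):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    history.append(sigma)
    # The model prefers "typical" samples: it quietly drops the tails...
    kept = [x for x in data if abs(x - mu) < 2 * sigma]
    mu = statistics.fmean(kept)
    sigma = statistics.stdev(kept)
    # ...then the next training set is drawn from the model itself.
    data = [random.gauss(mu, sigma) for _ in range(2000)]

# The spread decays generation over generation: noise begets noise,
# and the variety of the original data is gone.
final_sigma = statistics.stdev(data)
```

Run a few more generations and the spread keeps shrinking toward zero; feed in a fresh draw of the original data instead, and it stabilizes. That is the essay's point about "pure" human-made inputs, in miniature.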
The pure mixture that keeps the nightmares at bay requires the forgotten labor of people who worked in rooms with bad lighting and cheap office chairs. People who made things that looked like nothing, until they looked like everything. What is the value of the artist who feeds machines not yet invented, whose work is crushed into unrecognizable powder and used to power worlds years in the future?
What is fairness for them?
Conclusion
As I burned down the first draft of this essay—researching the strikes, combing my memory for every detail—what stood out to me, embarrassingly, was how little I thought back then about what was fair. Whether there was such a thing as a fair system. Whether fairness, even if beautifully constructed, could ever be practically true. No one asked me for my thoughts at the time. But now, years later, mid-career, in an industry that feels like it’s tilting underfoot, I wish I’d thought more seriously about it. I feel unprepared to give answers. But I feel I must try.
For all the ambiguity around fairness—legal, cultural, emotional—the one quality I keep returning to is honesty. Not transparency, not attribution, but something more elemental: the refusal to pretend. And when I try to picture that idea, I see Jeff Koons’s Rabbit. Chrome and gleaming, it stands polished to the point of erasure—so smooth it seems less sculpted than conjured. Its surface reflects everything around it: gallery walls, skylights, spectators. And when you step close, it reflects you too—only bent, stretched, made strange. It’s a mirror that doesn’t show you as you are, but as the space makes you appear.
That’s what it feels like to examine the instruments we work in day after day. The tools, the platforms, the pipelines. When we hold up a question as broad as fairness, we do so inside systems that warp the very act of asking. Rabbit became the most expensive work by a living artist ever sold at auction. And there are three of them. Koons didn’t cast them, didn’t weld or polish the steel. It was his idea, and that was enough. People argue about the value of that gesture, and they should. But Koons never lied about his role. He said, this is what it is. In a landscape where authorship blurs and fairness dissolves into press releases and settlements, that kind of honesty—even if it’s hollow—is a shape I recognize. He doesn’t pretend it’s something else. And maybe, for now, that’s the clearest form of fairness I can name.