
I’ve been playing video games for as long as I can remember. From the days of the Game Boy Color and Super Nintendo, my hands have gripped dozens of controllers across dozens of consoles, and I’ve played more games than I can count. This industry means a lot to me—as I’m sure it does for many of you reading this.
It means so much, in fact, that countless people decided to make careers out of creating games. They rolled up their sleeves and poured their time, creativity, and energy into the experiences we’ve come to love. Whether it’s a big-budget blockbuster or a scrappy indie gem, the games we cherish were built by people who cared—people with passion, talent, and heart.
But in return, this industry has met that passion with exploitation, burnout, dehumanization—and the creeping hand of unchecked greed. Slowly, and often quietly, the games industry has started to rot from within. And yet, when people look for someone to blame, they rarely point the finger where it belongs. It’s not “wokeness” or diversity that’s bleeding this industry dry—it’s corporate greed, executive bloat, and a business model built on disposable labor. Most people don’t even see it happening. The average player only sees the final product, not the toll it takes to get there. The human cost is easy to ignore—but we shouldn’t.
So as you read this, and as you think about the games you love, I ask one thing: No matter how good the game is. No matter how excited you are to play it. Remember the human cost.
The First Domino
If you asked me where things started to go wrong, I’d point to the rise of downloadable content. Not the good kind that expanded on a complete game—The Sims 2 expansions, Halo 2 map packs. Those were fine. Those were optional. They added more to an already full package.
But once consoles started integrating online connectivity out of the box, something changed. Developers and publishers realized they didn’t need to sell you a whole game anymore. They could sell you parts of one. And then they asked the question that would haunt the industry for years to come: How little could they get away with charging for?
Gone were the big expansions and chunky map packs. In came the trickle—bite-sized DLC, cosmetic add-ons, paid cheats. Most of it was harmless at first. It felt like a novelty, not a threat. But it was a signal. A quiet test. One that would explode on April 3rd, 2006, when Bethesda dropped what I still consider the first real domino: the infamous, mocked, dumb-as-hell Horse Armor for The Elder Scrolls IV: Oblivion.
It cost 200 Microsoft Points—which didn’t seem like much… until you realized you couldn’t buy exactly 200. You had to buy 400 for $5. Meaning that dumb cosmetic armor, which did nothing to help you in-game, effectively cost you five bucks. It was instantly roasted. Forums lit up. Magazines dunked on it. Even people who didn’t own Oblivion knew about the Horse Armor.
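For anyone who wants the points math spelled out, here is a quick sketch. The 400-point-for-$5 block comes from the story above; the larger denominations are illustrative assumptions, not a verified list.

```python
# Back-of-the-envelope sketch of the Horse Armor math.
# 400 points for $5 (as described above) implies 80 points per dollar.
# Denominations beyond 400 are illustrative assumptions.
POINTS_PER_DOLLAR = 80
DENOMINATIONS = [400, 800, 1600, 4000]  # purchasable blocks, in points

def effective_cost(item_points: int) -> float:
    """Dollars actually spent: the smallest block that covers the item."""
    block = next(d for d in DENOMINATIONS if d >= item_points)
    return block / POINTS_PER_DOLLAR

# Nominally 200 points is $2.50 -- but you can't buy 200 points.
print(effective_cost(200))  # 5.0: a "200-point" item costs $5 out of pocket
```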
But here’s the twist: some people still bought it. Curiosity, maybe. Support, maybe. And when the outrage faded, the publishers took notes. We thought we’d laughed it off.
They saw a new revenue stream.
Note: Microtransactions Made Up 58% of PC Game Revenue in 2024, Research Shows
According to financial analysis by Digital River, microtransactions have fundamentally transformed gaming economics, with free-to-play games generating 80% of digital game revenue despite representing only 15% of total releases. The model has proven so effective that the average revenue per paying user (ARPPU) in free-to-play mobile games ranges from $15 to $25 monthly—significantly higher than traditional one-time purchase models could achieve. (source).
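For readers unfamiliar with the acronym, ARPPU is a simple ratio: revenue from paying players divided by the number of paying players. A minimal sketch with made-up numbers:

```python
# ARPPU = revenue from paying users / number of paying users.
# All figures here are hypothetical, purely to illustrate the metric.
monthly_revenue_from_payers = 500_000.00  # USD, hypothetical
paying_users = 25_000                     # players who spent money this month

arppu = monthly_revenue_from_payers / paying_users
print(f"ARPPU: ${arppu:.2f}/month")  # $20.00/month, within the cited $15-25 range
```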
The Joke Became Reality
If Horse Armor was a joke, 2012 was the punchline. By this point, DLC wasn’t just accepted—it was expected. Extra characters, costumes, new levels—it all became part of the launch conversation. But this was also the year when big publishers showed us just how far they were willing to twist a good idea for profit.
Asura’s Wrath dropped in February. Great game. Wild game. Essentially a playable anime, with sky-punching gods and over-the-top boss fights. But when April rolled around, Capcom released the game’s “true ending” as paid DLC. The actual ending. Behind a paywall.
Then Mass Effect 3 landed in March. Solid game. Controversial ending. But what made it worse was the Day One DLC, From Ashes, which locked away a critical lore character. The content was on the disc. You could see it—but you couldn’t access it unless you paid extra.
Oh, and Street Fighter X Tekken? Also released that same day. Capcom again. Twelve full characters were locked on the disc at launch, waiting to be sold back to us months later. Not future content. Not bonus content. Content that was already finished.
And while it didn’t invent the concept, Borderlands 2 doubled down on the now-standard Season Pass model—pay us now, get content later, maybe. If you’re lucky. And yet, despite all this, 2012 is still remembered as a great year for games. Even the ones caught up in these controversies were (and are) beloved. Which just proves the point: publishers bet that players would eventually accept the bare minimum if the games were good enough.
Over time, outrage turned into eye-rolls. “Don’t like it? Don’t buy it.” “How else are they supposed to make money?” People stopped pushing back. And once the industry realized they could nickel-and-dime us without consequence, things only got worse. Microtransactions stopped being cosmetic. “Time savers” replaced skill progression. Premium features started showing up in single-player games. And the idea of getting a full experience for $60? Ancient history.
That bet paid off.
Note: In 2022, DLC sales accounted for 13% of PC revenue and 7% of console revenue in the US. DLC boosted monthly active users (MAU) by 11% overall across PC and console games. Some genres benefited more than others: strategy games saw the highest MAU growth from DLC releases, followed by role-playing and simulation titles. On average, 30% of MAUs for Dead Cells (one of the games covered in the report) in DLC-release months were new players. (source)
The Rise of Time Savers
Over the years, the industry kept pushing. And little by little, the outrage faded. Players who once mocked Horse Armor were now paying for boosters, cosmetics, currencies, and convenience. Publishers had successfully rebranded greed as player choice.
We started hearing phrases like “time savers” and “optional content,” usually from executives trying to explain why their $60 game now had a microtransaction menu baked into the pause screen. Why grind for upgrades when you could just buy a shortcut?
Take Devil May Cry 4: Special Edition, for example. Capcom sold red orb packs—used to unlock moves and abilities—for real money. You didn’t need to buy them, sure. But that’s the trick: if they’re so unnecessary, why sell them at all? That’s where “time savers” come in. Popularized as a term by Ubisoft, they marked the beginning of a quiet shift in how games were designed. Progression got slower. XP gains got stingier. Suddenly, buying a multiplier didn’t just save time—it felt necessary to avoid tedium.
Assassin’s Creed: Odyssey was the flashpoint. It introduced a permanent XP boost you could purchase to level up faster. Players noticed. Some even ran tests, comparing the grind with and without the booster. The results were clear: the game was built to be slower without it. And that wasn’t by accident. At this point, “optional” started to mean “optimized.” If you didn’t pay up, you weren’t getting the best experience—you were getting the longest one.
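To make that incentive concrete, here is a minimal sketch of the multiplier math, using hypothetical numbers rather than any game’s actual tuning:

```python
# Hypothetical illustration of how a paid XP multiplier reshapes the grind.
# None of these numbers come from a real game's tuning.
xp_to_next_tier = 60_000    # XP needed for the next milestone (hypothetical)
base_xp_per_hour = 2_000    # un-boosted earn rate (hypothetical)
boost = 1.5                 # a "permanent XP boost" style multiplier

hours_without = xp_to_next_tier / base_xp_per_hour           # 30.0 hours
hours_with = xp_to_next_tier / (base_xp_per_hour * boost)    # 20.0 hours
print(f"Without boost: {hours_without:.0f}h; with boost: {hours_with:.0f}h")
# Tune base_xp_per_hour down and the "optional" boost becomes the intended pace.
```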
And here’s the part that sealed the deal: time itself became a commodity. According to the ESA, the average gamer in 2004 was 29. By 2024, that number had jumped to 36. Gamers are older now. Busier. Time-strapped. And when you’re juggling work, family, and life, ten bucks to speed things up doesn’t sound so bad.
That’s how they got us.
The Illusion of Infinite Wealth
If you asked me who’s doing the most damage to this industry, I’d point to the CEOs and the investors. It’s hard to choose between them. CEOs make the calls, but investors set the rules. One exploits for profit, the other demands it. Either way, their obsession with infinite growth is bleeding the games industry dry. And while they chase those profit margins, the people actually making the games—the artists, the writers, the animators, the engineers—are the ones who suffer.
Let’s be clear: this isn’t about “wokeness,” or DEI initiatives, or whatever culture war talking point is trending this week. The mass layoffs, the studio closures, the waves of burnout and abuse—it all stems from corporate greed, not diversity. From shareholders who want more, faster, and cheaper. From executives who get bonuses for cutting corners and slashing teams. And while players were getting nickel-and-dimed, developers were getting crushed.
It’s hard to pinpoint when things started unraveling inside the studios, but once social media cracked the door open, the stories came pouring out. Twitter became a confession booth. Forums turned into therapy circles. We started hearing about what development really looked like behind the scenes—and it was brutal.
The biggest red flag? Crunch. Defined by the IGDA as “employees working overtime in order to meet a deadline,” crunch sounds tame on paper—until you realize just how extreme it gets. Take Red Dead Redemption 2. Rockstar co-founder Dan Houser once boasted that the senior writing team worked 100-hour weeks “several times” during development. That’s not passion. That’s exploitation. And Rockstar was hardly alone.
Crunch became a silent expectation across the AAA space—normalized, even praised. Naughty Dog’s Neil Druckmann once said people chose to work longer hours out of passion. But as developer Carrie Patel from Obsidian pointed out, one person’s “passion project” becomes everyone else’s emergency. Even if you don’t want to crunch, the team still has to meet the deadline. Voluntary or not, the pressure is real. And crunch isn’t even the worst of it.
Note: They do not get paid extra for working overtime.
If employees are classified as salaried and meet certain criteria (such as being considered “exempt” computer professionals), they are not legally entitled to extra pay for working additional hours, including during periods of crunch. This is due to exemptions in federal and state labor laws that allow companies to avoid paying overtime to employees who meet particular salary thresholds and job duties, which is common for software developers, engineers, and many other technical roles in game development. In practice, this means that even though crunch often involves working well beyond a standard 40-hour week, most salaried game developers do not receive additional compensation for those extra hours unless their company voluntarily offers bonuses, paid time off, or other incentives.
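To put the exemption in concrete terms, here is a small sketch of what a fixed salary means during crunch weeks; the salary figure is hypothetical:

```python
# Why "no overtime pay" matters: a salaried exempt developer's effective
# hourly rate falls as crunch hours rise. Salary figure is hypothetical.
annual_salary = 100_000           # USD, hypothetical exempt salaried developer
weekly_pay = annual_salary / 52   # ~$1,923 regardless of hours worked

for hours in (40, 60, 100):
    print(f"{hours}h week -> ${weekly_pay / hours:,.2f}/hour effective")
# 40h week -> $48.08/hour effective
# 60h week -> $32.05/hour effective
# 100h week -> $19.23/hour effective
```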
Crunch was only the beginning
Behind the scenes, a much uglier pattern was forming. Harassment, discrimination, retaliation—especially against women, queer developers, and marginalized voices—became regular headlines. The workplace wasn’t just grueling. In many studios, it was hostile.
Blizzard was one of the biggest implosions. In 2021, the California Department of Fair Employment and Housing sued Activision Blizzard for fostering what they called a “frat boy” workplace culture. The stories that came out—about sexual harassment, gender discrimination, and leadership turning a blind eye—were stomach-churning. It wasn’t just a toxic work environment. It was systemic abuse.
Ubisoft followed with its own wave of allegations: executives accused of predatory behavior, HR departments covering it up, and a revolving door of survivors who left the company while their abusers stayed. For years, people at the top promised reform. Most of them are still there.
And these are just the companies we heard about. Smaller studios? Outsourced QA teams? Contract workers? They face the same issues with even less protection and none of the headlines.
And when developers do try to push back—through walkouts, organizing, or speaking publicly—they’re hit with layoffs, blacklisting, or legal pressure. Unionization efforts have grown, but so has resistance. In 2024 alone, dozens of studios saw record profits… followed by mass layoffs. Entire departments gutted. Entire teams axed. Projects canceled mid-development, not because they failed—but because they didn’t hit some arbitrary forecast set by executives who likely never touched a dev kit.
All of this has made one thing painfully clear: To the people in charge, developers are disposable.
You can pour your life into a game, work nights and weekends to make it shine, and still be on the chopping block the moment the stock dips. Because to shareholders, you’re not a person—you’re a number. And when you cost too much, they replace you with a cheaper one.
The Pay Gap
Average Compensation: The average total compensation for a gaming CEO rose from about $6.1 million in 2019 to $8.6 million in 2024.
Top Earners: In 2020, the highest-paid video game CEOs received over $100 million in total compensation, with some (like Robert Antokol of Playtika) earning $372 million and Bobby Kotick of Activision Blizzard earning $154.6 million, mainly due to stock awards and bonuses. (source)
Pay Gap: The gap between CEO and median employee pay has widened, with some companies showing a CEO-to-median-employee pay ratio as high as 1:1,560.
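The ratio arithmetic is easy to check yourself. A short sketch with hypothetical figures chosen to land near the cited 1:1,560:

```python
# CEO-to-median pay ratio = CEO total compensation / median employee pay.
# The median salary here is hypothetical, picked to reproduce the cited ratio.
ceo_total_comp = 154_600_000  # a stock-award-heavy package, in USD
median_employee_pay = 99_000  # hypothetical median salary, in USD

ratio = ceo_total_comp / median_employee_pay
print(f"Pay ratio: 1:{ratio:,.0f}")  # Pay ratio: 1:1,562
```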

The Price Increase Fallacy
📝 Editor’s Note
You may have heard the argument that rising game prices are necessary to ensure developers get paid fairly. But the numbers tell a different story. While some premium titles have crept up to $70 or more, that extra revenue hasn’t gone to the people making the games—it’s gone to executives and shareholders. Entry- and mid-level developer salaries have grown modestly in the past decade (mostly due to inflation and demand), but C-suite compensation has skyrocketed, often increasing by hundreds of percent through bonuses and stock payouts. So no, higher game prices aren’t trickling down to dev paychecks. They’re padding profit margins. And that’s not a sustainable or ethical model.
The Blame Game is Undefeated
So where does that leave us? Some will tell you that gaming is in decline because of “wokeness.” That developers are too focused on diversity, too distracted by inclusion. But that’s not just wrong—it’s a deliberate misdirection. A smokescreen. Something to keep you angry at the wrong people.
The truth is simple: this industry isn’t being killed by politics. It’s being gutted by profit.
It’s being dragged down by bloated executive salaries, bad faith investors, and CEOs who treat workers like spare parts and players like wallets. It’s being undone by an endless hunger for growth that no creative industry—no human-driven industry—can sustainably meet.
And yet, despite all this, people still make games. Beautiful, creative, heartbreaking games. Developers still pour themselves into projects knowing they could be laid off the second it ships. That’s not just passion. That’s resilience. That’s love. But love shouldn’t be exploited. And passion shouldn’t be a license for abuse. As players, we have a role in this. We can speak up. We can support developers. We can stop accepting bad business practices just because the game is fun. We can stop pretending that “optional” means harmless. And we can stop falling for the lie that representation is the problem, when the real villain is sitting in a boardroom.
The next time you hear someone blame diversity for the state of the industry, ask them how many women and queer devs were in the room when Activision laid off 1,900 employees after posting record profits. Ask them who really benefits when teams are slashed, games are rushed, and content is carved up for sale. Ask them where all the money goes. We deserve better. The people who make our games deserve better. And the industry won’t get there until we stop taking bait—and start holding power accountable.