I really like Extra Credits. I think their analysis of Metroid: Other M is probably the most level-headed and constructive take on the game. Their Open Letter to EA Marketing is the most damning analysis of EA’s marketing behavior, and is greatly bolstered by the fact that they’re not trying to create some rage-filled rant to appeal to angry fans but are honestly trying to show how harmful these practices are to the industry in general and even to EA itself. Their analysis of what went wrong with the animations in Mass Effect: Andromeda might involve a bit of speculation, but along the way you’ll get a great education in just how complex modern animation systems are.
I often agree with the show, and it’s pretty hard to make a weekly column out of “Yeah! What that guy said!” There are tricks you can use to pad something like that out to a 3,000 word essay, but those tricks are very, very, very, very, very, very, very, very, very, very, very, very, very, very, obviously obvious. So despite my admiration for their content, their videos just don’t work as conversation starters for me.
But last month they posted Games Should Not Cost $60 Anymore – Inflation, Microtransactions, and Publishing. It makes a lot of points I disagree with, so now we’ve got something to talk about. Yes, I’m aware this behavior is one of the reasons I get a reputation for being overly negative and nitpicky. But look, I’m only criticizing the show because I’m a fan. There are lots of popular YouTube channels out there that I don’t like and don’t care about, and I don’t waste time arguing with them. On this site, we criticize because we love (or sometimes because we’re angry).
Finding The Right Price
The argument is that prices need to go up because publishers need more money. Except, that’s not how prices work. Do you really think Starbucks spends five times as much making their $5 coffee as McDonald’s spends making their $1 coffee? The prices aren’t really based on how much it costs to make the coffee, but how much the consumer is willing to pay. If everyone was suddenly willing to spend more money on coffee then coffee prices would go up, even if the cost of producing it didn’t. A really big company (like a global brand or publisher) can experiment with prices and get a feel for what consumers are willing to pay. They will put prices as high as they can possibly go without hurting sales. If costs go up then it might hurt the bottom line, but unless there’s an economy-wide disruption going on then it’s not going to change how much the consumer is willing to pay.
Of course, this is an imperfect science. It’s not like everyone in the world has a fixed limit where they’ll spend no more than $5 on a coffee. Some people are willing to go up to $6. Some will even pay $10. For a lot of us, even $5 is too much. Some are okay with $5, but will stop buying if the price goes above that. Or they’ll switch to cheaper alternatives. Or they’ll just buy less of it.
If you charge $0.10 you’ll lose money and if you charge $100 you’ll never sell any. Between these two you’ve got a curve that ramps up and then back down, with the peak of the curve representing the price where you’ll make the most money. Where is that peak and how do you find it? That’s why you went to business school, sparky. You’ll need to do the market research to find out. Mess around with prices, gather up the results, and then fire up that spreadsheet and see what the numbers say.
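If you want to see the shape of that curve, here’s a toy sketch in Python. Every number in it is invented – the demand() function is a stand-in for the real-world sales data you’d actually have to gather – but it shows the ramp-up-and-back-down shape and how you’d hunt for the peak:

```python
import math

# Toy model: nobody buys at $100, you lose money at ten cents, and
# somewhere in between is a peak. Every number here is invented; real
# market research would replace demand() with actual sales figures.

def demand(price, audience=1_000_000, tolerance=30.0):
    """Hypothetical demand curve: buyers drop off exponentially as the
    price climbs past what they'll tolerate."""
    return audience * math.exp(-price / tolerance)

def revenue(price):
    return price * demand(price)

# Sweep a range of prices and find the peak of the revenue curve.
best = max(range(1, 101), key=revenue)
print(f"In this toy model, revenue peaks around ${best}.")  # $30 here
```

The real version of this is messier, of course, but the logic is the same: the optimal price falls out of consumer behavior, not out of your production costs.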
Let’s try a thought experiment: Let’s imagine that EA manages to produce a high-quality, full-featured, modern-day AAA title with all the bells and whistles. And somehow they manage to accomplish this for only one dollar. I don’t know how they did it. They paid a single U.S. dollar and got a market-ready AAA game. Maybe a genie was involved.
The game cost one hundred-millionth of what games normally cost to produce. So the question is: What will they charge for it? It doesn’t matter what you would charge. What would Electronic Arts Incorporated charge for this game?
Go ahead. Think about it.
You Already Know The Answer
Obviously they would charge sixty dollars, because that’s what consumers are willing to pay. Sure, they could charge five bucks for the game if they wanted and they’d still make loads of money, but then they would be making less money for no reason. If $60 is the optimal price for a game that cost a hundred million to produce (for the sake of argument, we’re assuming this is the case) then it’s also the optimal price for a game that cost nothing to produce, and also the optimal price for a game that cost a billion to produce, because the consumer doesn’t actually care how much you spent (assuming the games are of roughly equal quality). The average consumer doesn’t look at a price tag and think, “I wouldn’t normally pay this much, but I heard this game went way over budget and they probably need the money.”
Yes, there are indeed companies that deliberately keep their prices low and forego profits because of principles. I can understand that. As an independent creator my goal is to get as many people to see my work as possible while still covering my expenses. I’d rather have 50,000 people pay me $1 each for my book than have one guy pay me $100,000 for it, even though the second option makes me more money overall. But if you’ve been a gamer for more than three minutes then you know that EA, Activision, Ubisoft, and Microsoft are not those kinds of companies. The big-publisher mandate is to maximize profits.
Extra Credits makes the case that the $60 price tag was set back in 2005 or so. Since then U.S. inflation has pushed the buying power of the dollar down by roughly 2% per year. Using these figures, they conclude games “ought to” cost about $70 right now. As I pointed out above, this is looking at it the wrong way. A better angle to take might be to say that “consumers ought to be willing to pay $70”.
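For what it’s worth, the math here is just compound growth, and the exact figure depends on which starting year and which rate you plug in:

```python
# Compound inflation: price_now = price_then * (1 + rate) ** years.
# Using the rough figures from the video -- a $60 price set around 2005
# and ~2% inflation per year -- you land in the $70-and-up range.

def adjust_for_inflation(price, rate, years):
    return price * (1 + rate) ** years

print(f"${adjust_for_inflation(60, 0.02, 13):.2f}")  # 2005 -> 2018: $77.62
```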
Now, some people make this argument in a condescending tone: “You should be willing to pay $70 for games and the reason you’re not is because you’re ignorant and entitled.” Extra Credits doesn’t take this tone, but I’ve seen the argument made. Again, this ignores how consumers behave and assumes we all do inflation calculations when we shop. If you’re buying from a faceless entity then you don’t look at the price and think, “Do they deserve this money?” You look at the price and think, “Is the product worth this much to me?”
In a practical sense, consumers are always “entitled” and sellers are always “greedy”. The buyer always wants the price as low as possible and the seller wants to charge as much as they can, and this tug-of-war ends up setting the price. (This is obviously ignoring edge cases like Humble Bundle, Kickstarter, or meta-game purchasing decisions like buying a full-price game you don’t really love from (say) Obsidian or CD Projekt RED because you want to support their overall work.) If EA gave me the option of paying $30 or $60, I’d take the $30 every time. If I gave them the option of charging me $60 or $100, I’m sure they’d pick $160 because that’s how they roll.
So rather than lecturing consumers that they’re being unreasonable, a more productive question ought to be, “Given the rise of prices due to inflation, why aren’t consumers willing to pay more than $60?”
We’ll answer that question in a bit, but first let’s talk about…
The High Cost of High Fidelity
Next the EC team makes the argument that given the rising cost of game development, games ought to cost somewhere in the $90 range. For the sake of argument, I’m going to accept all of their assertions at face value: The cost of making games has increased dramatically, perhaps even as much as 3× or 4×. The availability of engines and middleware has created some savings, but not enough to offset the sheer cost of having dozens of artists working for years to fill these games with cutting-edge visual content. Fine. I might have some questions about how this works, but let’s just go with the Extra Credits numbers for now.
But like I said above, the price can’t go above what the consumer is willing to pay. If Honda Motor Company took their popular $20k Honda Civic and upped the price to $100k because the new models are gold-plated, you’d be correct to say that they need to charge more money to pay for the gold. The problem from my side is that I still have no reason to go above $20k. I don’t need the gold. I just want a machine that will take me to work without killing me.
Publishers are free to spend as much as they want on developing a game, but their spending more doesn’t automatically translate into me wanting it more. If I don’t feel like the game offers good value for the money, then I’ll just buy something else.
But Shamus, what if they can’t make their money back at $60?
Then that game loses money and people stop making those sorts of games. If the cost of producing a game requires you to charge more than what the consumer is willing to pay for it, then that’s a game the market can’t support. Make something else. Or better yet, find a way to make it for less.
According to EC, making a game with graphics up to the current standards of fidelity costs four times as much here in 2018 as it did in 2005. Even if that’s true, are you really getting a return on that budget increase? Or to put it more directly: What if you dialed back the graphics to 2012 levels? Let’s assume for the sake of argument that this would only make the game cost twice what it did in 2005. How much would that drop in “realism” harm sales?
We don’t know the answer, and neither do the big studios. We can tell they don’t know, because they haven’t done the homework. They haven’t experimented with making a game with current gameplay, current marketing, but last-gen visuals. Sure, the indies have been messing around with this stuff, but the extreme differences in marketing, brand recognition, gameplay, and scope make it impossible to get a worthwhile comparison. Knowing that Avant-Garde Tearjerk Walking Simulator sold 100k copies with 2012 graphics doesn’t tell us anything about how (say) Call of Duty would fare with the same technology. Publishers are willing to experiment with all sorts of techniques to wring extra money out of the consumer, but they haven’t even done the most basic homework to figure out how much the consumer values extra visual fidelity, much less figuring out where the optimal spot on the cost / benefit curve might be.
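Just to show what that homework would even look like, here’s the back-of-the-envelope version. Every figure below is invented – the whole point is that the one number that matters, the fraction of sales you lose by dialing back the visuals, is exactly the number nobody has measured:

```python
# Hypothetical cost/benefit comparison between cutting-edge and last-gen
# visuals, using the EC multipliers: say a 2005-style budget was $25M,
# so 4x = $100M for current fidelity and 2x = $50M for 2012-level
# fidelity. The sales figures are made up for illustration.

PRICE = 60                # assumed price ceiling
BASE_SALES = 5_000_000    # invented sales figure

def profit(dev_cost, units_sold):
    return PRICE * units_sold - dev_cost

cutting_edge = profit(dev_cost=100_000_000, units_sold=BASE_SALES)

# sales_penalty: the unknown fraction of sales lost to cheaper graphics.
for sales_penalty in (0.0, 0.1, 0.2, 0.3):
    last_gen = profit(dev_cost=50_000_000,
                      units_sold=BASE_SALES * (1 - sales_penalty))
    verdict = "wins" if last_gen > cutting_edge else "loses"
    print(f"penalty {sales_penalty:.0%}: last-gen {verdict} "
          f"by ${abs(last_gen - cutting_edge):,.0f}")
```

In this made-up scenario the cheaper game comes out ahead until the graphics downgrade costs you somewhere around a sixth of your sales. Nobody knows what the real penalty is, and that’s my point.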
Like I keep saying, the people at the top of these companies are not informed about the consumer’s needs. They’re selling products they don’t understand. They’re focusing on graphics because that’s the most obvious and superficial attribute of a game to an outside observer.
(For the sake of brevity, I’m lumping all the big publishers together. Obviously this is an oversimplification. Blizzard has a really good handle on how to make good visuals, Valve was really good about playtesting back when they made games, and Sony has been really good about letting Naughty Dog leverage their cutting-edge visuals effectively. I think EA is the worst offender when it comes to overspending on visuals, but they’re not the only offender. Every company is a little different and every franchise has different needs in terms of visuals and budget, but I don’t have the patience to diagnose the problems of every individual publisher in this space.)
And before you reductio ad absurdum me, I’ll point out that I’m not saying that graphics don’t matter at all. Games were niche in the 90s, and it’s true that the audience has grown along with fidelity. I love 1999’s System Shock 2, but I can understand why lots of people might find the blocky graphics a little hard to deal with. You really do need to use your imagination a bit when playing those old titles. And I understand that if you’re young and you grew up on modern graphics, then those pellet-headed, mitten-handed character models might look unintentionally comical, which would clearly hurt your experience with a game designed to be tense and scary.
On the other hand, have a look at the visuals of Mass Effect 3 (2012), Deus Ex: Human Revolution (2011), Spec Ops: The Line (2012), and Dead Space 2 (2011). Sure, if you compare them side-by-side you can see they’re not quite as sharp and detailed as the games of today. But… does the consumer really care? Is the difference so extreme that a player is going to look at those visuals and say, “No, I just can’t get into this because the visuals don’t look realistic enough for me”? I won’t say that people like that don’t exist, but I will say they must be a very small minority.
Moreover, the pursuit of cutting-edge graphics carries tremendous risk. Assassin’s Creed Unity, Arkham Knight, and Mass Effect: Andromeda were all seriously hurt when their development teams tried to make the jump to “next generation” graphics. These games generated bad press, product returns, lost sales, and support headaches for the publisher. And how many extra copies did the enhanced visuals in Wolfenstein II actually sell? Moving to a new engine is hugely disruptive to a development team. Everyone has to learn new tools and master a new system. It will slow down development, which means you can either allow the game to take longer or you can hold the release date and put out a buggy, broken, unfinished mess.
These publishers are supposedly afraid of rising costs and risk, and yet they keep pushing the one thing that makes both problems substantially worse!
And even if we insist that Call of Duty and Battlefront really do need cutting-edge graphics to please the fans, that still doesn’t mean the same is true for all genres. Can anyone seriously argue that Mass Effect: Andromeda would have been worse off if the team had stuck with the familiar (to them) 2012 version of Unreal Engine 3 rather than migrating to the latest Frostbite engine? How much does the average Tomb Raider or Hitman player really think about graphics when making purchasing decisions?
I think developers are chasing after better graphics because it’s the one thing guys like Andrew Wilson can understand. Wilson can’t tell if the new Shoot Guy game “feels good” to play. He can’t tell if the balance is right, or if the pacing is a bit off. But he can look at the game and see that the cinematics are “more cinematic” and the visuals are “more realistic”. Andrew can’t intuit just how much it would ruin a multiplayer game to have pay-to-gamble-to-win mechanics in place, but he can tell that these new screenshots look better than the screenshots from two years ago.
Keeping Prices Down
Next, Extra Credits makes a point that I emphatically agree with: Competition is keeping prices where they are.
If we were still buying games at retail stores, if the indie revolution hadn’t happened, if GoG wasn’t giving us access to two decades of cheap retro games, and if the mobile gaming craze had never taken place, then it’s possible game prices would have crept up to $70 or even $90 like the Extra Credits team says. In fact, I find that very likely.
Back in the 90s, games were really “expensive” for me in terms of justifying their cost in my budget. I only shopped at retail so I only had access to recent AAA titles. So I got one new game every few months. Sometimes I’d only get two or three new games a year. If retail was still the only way to get games and they cost $90 now, then I’d probably still be following those very conservative buying habits.
But now the indies, the mobiles, and the retro games are all competing for those same gaming dollars. If AAA games jumped up to $90 then I could easily get by on the other stuff. AAA publishers don’t have the leverage over me the way they used to. Now the only time I’m obliged to pay $60 for something is if it’s part of a series I fell in love with in decades past. And given how many compromises those games have made in their transition to the modern age, it’s easier than ever to walk away.
Second Stage Monetization
Publishers are engaging in aggressive and annoying new methods of monetizing their games. Pay-to-win microtransactions. Pay-to-skip-the-grind microtransactions. Lootboxes. DLC that feels like it should have been part of the core product. Let’s call all of these techniques “second-stage monetization”. It’s money you pay for the game after you’ve supposedly already bought it.
The excuse is that inflation and rising expenses have “forced” them to adopt these practices, but like I said above: Spending more doesn’t automatically make the consumer willing to pay more. These companies want to make as much money as possible. If these monetization techniques are a good idea now, then they would also be a good idea if budgets hadn’t changed at all in the last ten years.
When World of Warcraft began making more money every month than what Blizzard originally spent developing it (I don’t know if this actually happened, but it’s a very plausible scenario given their peak subscriber base and the game budgets of the time period), the people at the top didn’t say, “I guess we have enough money now. Let’s slash prices to be nice.” They let the money roll in. And I don’t begrudge them that. Customers were happy, they were happy. Everybody won.
If these second-stage monetization practices had worked flawlessly, then they would have been added to every game – even ones that didn’t have trouble making money. Publishers would have let the money roll in, just like Blizzard let the money roll in back in 2010.
Publishers have concluded that $60 is the price ceiling for games. (It’s possible that the various “Collector’s Edition” packages are, among other things, attempts to test how open the market is to price increases. If the $80 or $100 version of a game sells really well, then it’s safe to assume lots of people are ready to spend more than $60. If nobody buys them, then the market isn’t ready for a price bump.) So these various experiments with second-stage monetization are their attempt to raise prices without raising prices.
That’s actually a legitimate goal. If you can charge some people $60 and other people $70 and other people $80, then that’s great. People with lots of disposable income will think nothing of dropping a few extra dollars, and you can get that extra money without driving off the people who can’t (or won’t) spend more than $60.
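Here’s that idea as a toy model, with completely invented numbers: three buyer segments with different spending limits, and the gap between charging one flat price and letting the big spenders pay more:

```python
# Toy price-discrimination model. Three invented buyer segments with
# different spending limits. A flat $60 captures everyone but leaves
# money on the table; tiers let the big spenders spend.

segments = {60: 1_000_000, 70: 300_000, 80: 200_000}  # max price -> buyers

# One flat price: everyone willing to pay at least $60 pays exactly $60.
flat = 60 * sum(segments.values())

# Tiered pricing (base game, deluxe edition, super-deluxe, etc.):
# each segment pays up to its own limit.
tiered = sum(price * buyers for price, buyers in segments.items())

print(f"flat $60: ${flat:,}")    # $90,000,000
print(f"tiered:   ${tiered:,}")  # $97,000,000
```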
The problem is that most of these second-stage techniques harm the overall quality of the product in the process. Allow me to repeat an analogy I used a few months ago:
Back in 1991, McDonald’s rolled out their “Value Meals”. They noticed a lot of people just bought a drink and a burger, and didn’t bother getting the french fries. So they introduced a combo that included drink, fries, and a burger for one “low price”. This sped up ordering, since someone could simply order a “Number 1” without having to list all of the items individually. It also created the false impression that customers were saving money by ordering the meal. (If you added up the cost of the items individually, they were within a few cents of the cost of the equivalent meal.) And most importantly, it got people to buy more food.
It sped up order times, it got people to order more food, and it made the customer think they were saving money! You might not like this sort of behavior, but it was effective and clever.
Now imagine a version of McDonald’s run like EA. The company leadership doesn’t really eat fast food except to sample their own products, and going to a fast food restaurant with the family isn’t something they would ever do.
So when they decide they want more money, the idea of a value meal doesn’t occur to them. Instead they just charge more. Charge for ketchup packets. Charge for napkins. Charge for bags.
There’s an outcry, and the leadership doesn’t understand why. They did the math and they figured these new policies should only add a few cents onto the usual order. What they didn’t foresee was just how much this change would hurt the overall dining experience. Now a lone mother with three kids has to stop and calculate how many napkins her children might need before they place the order. A guy who runs out of ketchup has to go and stand in line to get one more packet, while his food gets cold back at the table. Cheapskates try to save money by going without lids and straws, which results in more spills for the staff to clean up.
Pay-to-win microtransactions, lootboxes, and turning key parts of the experience into DLC are the equivalent of charging for napkins and straws. Yes, it might squeeze a little more money out of some customers. But what you can’t measure is how much it damages the experience of using your product. If you understood your customers it would be obvious, and if you don’t understand or care about your games then you’ll never see it. If Andrew Wilson had been personally looking forward to playing Star Wars: Battlefront II then he would immediately have sensed how upsetting the lootbox policy would be. He’d be able to say, “Wow. 40 hours of grind just to unlock Luke Skywalker? That sounds like a complete killjoy. That might even ruin the game for me.” But instead he only understands games as products for “other people”.
The publishers are trying everything they can to get more money out of consumers, right at the moment when consumers are spoiled for choice and awash in cheap games. The one thing publishers aren’t doing is working to keep costs down, even though runaway costs are the source of all their alleged financial difficulties. But to fix the problem of skyrocketing budgets, you’d need a deeper understanding of videogames than “Does it look pretty in a trailer?” Instead they’re doubling down on the visuals and then trying to squeeze the customer for more money, thus creating a whole new public relations problem.
They’re working very hard, but there is no substitute for knowing what you’re doing.