Chris’ Survival Horror Quest has a brilliant post that examines the sales performance of PS2 games against their Metacritic scores. He’s looking to see how much quality affects sales. He charted 1,281 games and shows us the breakdown in a number of very interesting graphs.
The only nitpick I have is that I’ve never thought scores were all that useful for determining quality. The way the review system works, a critic usually sits down and pushes through a game in less than a week and then hammers out a review. (And the whole system is a sham in the PC realm, where the reviewer is likely using a top-end PC and a review copy that might not have the DRM found in the retail version.) The process suffers from the same problem that movie reviews do, which is that the reviewers are voracious consumers of games, to the point where they make “hardcore” gamers seem “casual”. Add in the marketing “tilt” exerted by big-name publishers (which we caught a glimpse of in the firing of Jeff Gerstmann) and you have a system where scores don’t have a lot to do with quality. I trust scores to filter out the really horrible stuff, but beyond that I rely on demos and word of mouth. I’ve seen many big-name, top-rated games that turned out to be “meh”, and I’ve seen some real gems that were given modest scores by critics.
The disparity between scores and quality might account for some of the seeming randomness in Chris’ charts, but absent a way to quantify subjective things for that many games, there just isn’t any good way to sort that out.
I will add that I wish I had access to the data he’s using. There are two things I’d like to do with it:
- Color-code the dots by year, perhaps using red for the first year of the PS2 and going through the color spectrum from there. It would be interesting to see if patterns emerged from the noise, if certain years trended higher than others, and so on.
- Size the dots by the relative size of the publisher. EA would have great big dots, and little operations would be tiny specks. While big-name publishers don’t always throw heaps of money at a game, I’ll bet we’d see some patterns emerge that would tell us about the value of large marketing campaigns.
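For anyone curious what those two ideas would look like in practice, here’s a minimal sketch using matplotlib. Since the real dataset isn’t available to me, the games, scores, sales figures, years, and publisher-size ranks below are all made-up placeholders; only the charting technique is the point.

```python
# Sketch of the two chart ideas: color each dot by release year
# (red for the PS2's first year, moving through the spectrum) and
# size each dot by the relative size of the publisher.
# NOTE: all data here is hypothetical, for illustration only.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# (metacritic_score, sales_millions, release_year, publisher_size_rank)
games = [
    (91, 4.2, 2000, 10),  # big publisher, PS2 launch year
    (74, 0.3, 2004, 2),   # small operation
    (83, 1.1, 2006, 6),
    (65, 0.1, 2003, 1),
    (88, 2.5, 2002, 9),
]

scores = [g[0] for g in games]
sales  = [g[1] for g in games]
years  = [g[2] for g in games]
sizes  = [g[3] * 30 for g in games]  # dot area grows with publisher size

fig, ax = plt.subplots()
# "rainbow" starts at red, so the earliest years come out red and
# later years walk through the spectrum, as described above.
sc = ax.scatter(scores, sales, c=years, s=sizes, cmap="rainbow")
fig.colorbar(sc, label="Release year")
ax.set_xlabel("Metacritic score")
ax.set_ylabel("Sales (millions)")
fig.savefig("ps2_scores_vs_sales.png")
```

With the real 1,281-game dataset you’d just swap the placeholder list for the actual rows; everything else stays the same.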
Only 8 diggs on that article? That’s a crime. It was a great read.