Chris’ Survival Horror Quest has a brilliant post that examines the sales performance of PS2 games against their Metacritic scores. He’s looking to see how much quality affects sales. He charted 1,281 games and shows us the breakdown in a number of very interesting graphs.
The only nitpick I have is that I’ve never thought scores were all that useful for determining quality. The way the review system works, a critic usually sits down and pushes through a game in less than a week and then hammers out a review. (And the whole system is a sham in the PC realm, where the reviewer is likely using a top-end PC and a review copy that might not have the DRM found in the retail version.) The process suffers from the same problem that movie reviews do, which is that the reviewers are voracious consumers of games, to the point where they make “hardcore” gamers seem “casual”. Add in the marketing “tilt” exerted by big-name publishers (which we caught a glimpse of in the firing of Jeff Gerstmann) and you have a system where scores don’t have a lot to do with quality. I trust scores to filter out the really horrible stuff, but beyond that I rely on demos and word of mouth. I’ve seen many big-name, top-rated games that turned out to be “meh”, and I’ve seen some real gems that were given modest scores by critics.
The disparity between scores and quality might account for some of the seeming randomness in Chris’ charts, but absent a way to quantify subjective things for that many games, there just isn’t any good way to sort that out.
I will add that I wish I had access to the data he’s using. There are two things I’d like to do with it:
- Color-code the dots by year, perhaps using red for the first year of the PS2 and going through the color spectrum from there. It would be interesting to see if patterns emerged from the noise, if certain years trended higher than others, and so on.
- Size the dots by the relative size of the publisher. EA would have great big dots, and little operations would be tiny specks. While big-name publishers don’t always throw heaps of money at a game, I’ll bet we’d see some patterns emerge that would tell us about the value of large marketing campaigns.
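For anyone who does get their hands on the data, both ideas above are a few lines of matplotlib. Here’s a rough sketch with made-up placeholder numbers (the real dataset would have 1,281 rows of score, sales, year, and some publisher-size stat — all of those column values below are invented for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # render to a file, no display needed
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical stand-in data; the real chart would use Chris' 1,281 games.
scores = np.array([91, 74, 62, 85, 55])           # Metacritic scores
sales = np.array([3.2, 0.8, 0.3, 1.5, 0.1])       # sales, millions of units
years = np.array([2000, 2002, 2004, 2005, 2007])  # PS2-era release years
pub_size = np.array([900, 40, 10, 300, 5])        # rough publisher size

fig, ax = plt.subplots()
# Idea 1: color by year — the first PS2 year maps to red and later
# years walk through the spectrum via the colormap.
# Idea 2: dot area scales with publisher size, so EA-sized outfits
# get big dots and tiny operations get specks.
sc = ax.scatter(scores, sales,
                c=years, cmap="rainbow",
                s=pub_size,
                alpha=0.6, edgecolors="k")
fig.colorbar(sc, label="Release year")
ax.set_xlabel("Metacritic score")
ax.set_ylabel("Sales (millions of units)")
fig.savefig("ps2_scores_vs_sales.png")
```

With the full dataset, year-by-year bands or a cluster of big-publisher dots in the high-sales region would jump right out of a plot like this.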
Only 8 diggs on that article? That’s a crime. It was a great read.