Chris’ Survival Horror Quest has a brilliant post that examines the sales performance of PS2 games against their Metacritic scores. He’s looking to see how much quality affects sales. He charted 1,281 games and shows us the breakdown in a number of very interesting graphs.
The only nitpick I have is that I’ve never thought scores were all that useful for determining quality. The way the review system works, a critic usually sits down and pushes through a game in less than a week, then hammers out a review. (And the whole system is a sham in the PC realm, where the reviewer is likely using a top-end PC and a review copy that might not have the DRM found in the retail version.) The process suffers from the same problem that movie reviews do: the reviewers are voracious consumers of games, to the point where they make “hardcore” gamers seem “casual”. Add in the marketing “tilt” effected by big-name publishers (which we caught a glimpse of in the firing of Jeff Gerstmann) and you have a system where scores don’t have a lot to do with quality. I trust scores to filter out the really horrible stuff, but beyond that I rely on demos and word of mouth. I’ve seen many big-name, top-rated games that turned out to be “meh”, and I’ve seen some real gems that were given modest scores by critics.
The disparity between scores and quality might account for some of the seeming randomness in Chris’ charts, but absent a way to quantify subjective things for that many games, there just isn’t any good way to sort that out.
I will add that I wish I had access to the data he’s using. There are two things I’d like to do with it:
- Color-code the dots by year, perhaps using red for the first year of the PS2 and going through the color spectrum from there. It would be interesting to see if patterns emerged from the noise, if certain years trended higher than others, and so on.
- Size the dots by the relative size of the publisher. EA would have great big dots, and little operations would be tiny specks. While big-name publishers don’t always throw heaps of money at a game, I’ll bet we’d see some patterns emerge that would tell us about the value of large marketing campaigns.
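To make those two ideas concrete, here’s a rough Python sketch of the mappings I have in mind. Everything in it is made up for illustration, since I don’t have Chris’ dataset: the sample games are invented, and “publisher size” is a stand-in metric (I used headcount, but revenue or marketing budget would work as well).

```python
import colorsys

# Hypothetical data: (title, metacritic_score, units_sold, year, publisher_size).
# Publisher size here is a made-up headcount figure, not real data.
games = [
    ("Launch Title",  78, 1_200_000, 2000,  3_000),
    ("Mid-Cycle Hit", 91, 3_500_000, 2003,  9_000),
    ("Late Gem",      84,   400_000, 2006,     40),
]

FIRST_YEAR, LAST_YEAR = 2000, 2006  # span of the PS2 years being charted

def year_to_color(year):
    """Map a release year onto the spectrum: red for the PS2's first
    year, walking through the hues toward violet for the last year."""
    t = (year - FIRST_YEAR) / (LAST_YEAR - FIRST_YEAR)  # 0.0 .. 1.0
    r, g, b = colorsys.hsv_to_rgb(0.83 * t, 1.0, 1.0)   # hue 0.0=red .. 0.83=violet
    return (round(r, 2), round(g, 2), round(b, 2))

def publisher_to_dot_area(size, max_size):
    """Scale dot area by relative publisher size: the biggest publisher
    gets a 400-point dot, tiny operations bottom out near 20."""
    return 20 + 380 * (size / max_size)

max_size = max(g[4] for g in games)
for title, score, sold, year, pub in games:
    print(title, year_to_color(year), round(publisher_to_dot_area(pub, max_size)))
```

With the real data, these color and size arrays would feed straight into a scatter plot (score on one axis, sales on the other), and any year-by-year or big-publisher clustering should jump out visually.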
Only 8 diggs on that article? That’s a crime. It was a great read.