The Cost of Spectacle

By Shamus | Posted Wednesday Jul 30, 2008

Filed under: Game Design | 139 comments

This post is for developers, investors, and publishers of A-list games. (Or is that AAA titles? Whatever. “Games on the shelf at Wal-Mart for $50” is a more accurate descriptor but it’s kind of verbose. Sort of like me.) I know I’ve got at least two or three of you in my audience, hiding amongst the crowd of regular gamers, indie developers, and my fellow curmudgeons. I’m going to have another go at talking you out of your obsessive pursuit of graphics, which at this point makes Gollum’s pursuit of The One Ring look lackadaisical. I realize this is a hopeless task, but it’s no less hopeless and unfulfilling than trying to keep my humble hardware up to date in the face of your skyrocketing system requirements. And since I’m not having a good time I might as well drag you along with me.

If you’re one of those people who is unable to tell the difference between spending money and making fun games then you’re excused. Go back to developing your juvenile plotless cookie-cutter tech demo and don’t trouble yourself with this business.

In the 1980s, while I was still fantasizing about becoming a programmer and trying to figure out how to kiss girls (or maybe the other way ’round), your average videogame development team was A Guy. Sometimes larger teams might include His Buddy. As we entered the ’90s and the age of Doom we saw team sizes swell to numbers that occasionally made it feasible to play a little doubles tennis. Not that anyone had time for that sort of thing. A few years later and development teams could, if they ever went outside, possibly fill the positions on a baseball diamond. A few years after that and we had teams of 20 or more people, and something strange started to happen. Games started getting shorter. When Max Payne blasted his way onto the scene everyone pointed at his short little ten-hour game and giggled. Ten hours? Who is going to pay $40 for a ten-hour game?

Well, all of us, eventually. If we’re lucky. But now we can look back fondly at those days. Teams are still getting bigger and games are still getting shorter. It now takes a hundred people to produce content that offers even less gameplay than Max Payne did. Oh yeah: And development time has increased as well, so not only are you paying a lot more people but you’re also paying them for a lot longer before you actually get your game.

This is not a good trend, and it should fill reasonable developers with apprehension, because it can’t keep going like this. The number of PC gamers hasn’t really gone up all that much. You’re still aiming for the same four-million-unit target you were aiming for a decade ago. To put it another way, you’re now funding teams five times as large for twice as long to sell shorter games to an audience that is roughly the same size. You can quibble with these numbers if you like, but the trend is there, and it’s visible. If I’ve overstated or understated the problem by some margin, it doesn’t change the fact that PC gaming is seeing an unsustainable escalation of development costs.
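If you want to see how ugly that arithmetic is, here’s a back-of-the-envelope sketch. Every number in it is an illustrative round figure, invented to match the “5x team, 2x time, same audience” trend above, not taken from anyone’s real books:

    # Back-of-the-envelope dev-cost arithmetic. All inputs are invented
    # round numbers, chosen only to illustrate the trend described above.
    team_then, team_now = 20, 100      # people: teams ~5x larger
    years_then, years_now = 1.5, 3.0   # development time: ~2x longer
    salary = 60000                     # dollars per person-year, illustrative
    units = 4000000                    # the same ~4 million unit audience

    cost_then = team_then * years_then * salary   # $1.8 million
    cost_now = team_now * years_now * salary      # $18 million
    print(cost_now / cost_then)   # 10.0 -- ten times the money per game
    print(cost_then / units)      # 0.45 -- 45 cents of dev cost per unit sold
    print(cost_now / units)       # 4.5  -- $4.50 per unit, out of the same $50 box

Ten times the development cost, spread over the same number of boxes. That’s the whole problem in three lines of arithmetic.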

Another example might be useful.

In Wolfenstein 3D, you can fire up the level editor and make a single room in less than a minute. In Doom you might spend ten minutes making a room. Around the turn of the century you might spend a couple of hours on it. Last I heard it now takes several people (usually a mapper, a 3D modeler, a texture artist, and possibly some sort of scripting person) working together for a couple of days to make that room. Yes, the room is very realistic and cool looking, but it takes 72 man-hours to make the damn thing.
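(Where does that 72 come from? It’s an assumption, not a measurement: take the hypothetical four-person room crew above and give them a couple of long days.)

    # Spelling out the 72 man-hours figure. These inputs are assumed,
    # not measured.
    people = 4         # mapper, 3D modeler, texture artist, scripting person
    days = 2           # "a couple of days"
    hours_per_day = 9  # a long-ish day, which may even be optimistic here
    print(people * days * hours_per_day)  # 72 man-hours for one pretty room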

Each new graphics generation requires more development hours to exploit it. The jump from 2D sprites to moving 3D characters was a pretty compelling one. The leap forward in lighting technology that gave us flashlights actually added something to the gameplay. But adding another lighting pass to make sure a doorknob can support multiple specular reflections? Improving the shading and density of foliage by 25%? Are these really worth the development time and the upgrade cost for the user? How many of those customers would you lose if you didn’t take the next step? What if you just stayed where you are right now from a technology standpoint, instead of adding another 50 people or another six months of development time to your game? Sure, you wouldn’t have the latest graphics, but how many sales is that going to cost you? I hear people claim that they can’t attract gamers without next-gen graphics, but… when was the last time anyone honestly tried?

I don’t know what the hardware breakdown is out there. How many PC owners have which graphics cards? Now, your first answer might be to jump over and have a look at the Valve Hardware Survey. “Oh look! An overwhelming majority of users have NVidia 6000 series or better!” It really gets on my nerves when you do that, and if I could I would whap you on the head with this copy of Game Informer I keep handy for when I need a good laugh. That survey covers Steam users, and therefore mostly people who already own Half-Life 2. Which is exactly the problem you keep running into: you keep aiming your game at the same fragment of the potential audience.

Like I said, I don’t know the breakdown, but if it’s not a bell curve I’ll eat my keyboard. I’ve heard that nothing makes a presentation seem professional like a good chart, so with that in mind I give you this:

[Image: graphics_cards.jpg]

Maybe that wasn’t as professional as it could have been. At any rate, you keep aiming your games so that users need the female Terminator to run the game well, and a T-1000 to run the game poorly. This leaves out all those people in the middle, who are happy to spend a million billion dollars on Peggle and re-skinned Bejeweled clones. I can hear you arguing, “Oh Shamus, those people are all boring and stuffy and wouldn’t want our videogames. Also I need to be hit in the face with that magazine again.” How would you know they don’t want your games? You’ve never made anything that they can run. They haven’t rejected you, because you’ve never given them the chance. By the time Joe Average has hardware that can run your fancy-pants game, it’s long gone from stores, replaced by newer games he can’t run. Keep in mind that most of these Peggle players have no clue how to use torrents. You keep aiming your game at this tiny, pirate-infested group and wondering why sales are so small.

(Also note that a lot of people in the middle are former PC gamers, who did the math and realized they could take the graphics-card money and put it into a console instead of mucking about inside of their computer every eighteen months. Those people aren’t against PC games, they just don’t have any available. With a few exceptions.)

Case in point: Note how World of Warcraft is at least two graphics generations out of date, and yet Blizzard had to buy avalanche insurance just in case their pile of money falls over. Their game looks like Lego Middle Earth and they are kicking your ass. Part of the reason is that their game will run on almost every battered laptop and second-hand computer on the planet. Your customers, on the other hand, need to blow a couple hundred dollars every other year just to run your games poorly.

As far as making a game fun goes, graphics spectacle is the most foolhardy and inefficient way to spend your money. The fancy visuals are exciting for the first few minutes, but then the user becomes acclimated and desensitized to your razzle-dazzle, and they’re left with just the gameplay to entertain them. And gameplay is the one thing you keep cutting to pay for the graphics. That’s like cutting off the top of your head because your high heels make you too tall to fit through the door.

If you would just take two steps back from that accursed bleeding edge and aim for the middle of the bell curve you would discover that:

  1. You could work with a much smaller team, paying fewer salaries.
  2. Since you’re not re-writing your tools and changing your art pipeline every time you start a new game, development time will be shorter.
  3. Your artists will be more productive since you won’t be snatching away the tools they understand for newer, more complex tools. You might find it a little easier to make longer, deeper games.
  4. You’ll have a far larger potential audience.
  5. You’ll have fewer support / QA issues because you’ll be building on established technology instead of working the gremlins out of the new stuff.
  6. Better framerates, faster load times, quicker installs.

Even if none of those Peggle-playing goofs embraced your game, you’d still be better off, because you’d have spent a lot less money to sell to the same group of gamers you’ve been dealing with for years. You risk so much money on your new pixel shaders, but you’re not willing to risk spending less to see if you can still get the usual suspects to pony up?

I realize I just summed up using a numbered list, but let me sum up again, just to make absolutely sure I’ve driven my point home: You can spend far less to make a game with more value, one that offers a better play experience to a larger audience with fewer pirates.

And now let me sum up my summary, for the benefit of those in marketing: You can spend less money and make more money.

Me? I’m saving up for an XBox 360. You guys are driving me nuts with this business.

 



139 thoughts on “The Cost of Spectacle”

  1. Sitte says:

    Nitpick, sorry:

    Was shocked to see the author of DMotR talking about someone named “Golem”. Who’s that?

  2. Shamus says:

    He’s someone my spellchecker was unable to help me with.

    Also: Fixed. Maybe.

    At least I didn’t write about Boreomere.

  3. Paladin109 says:

    *takes a bite…chews thoughtfully…and immediately reaches for {insert beverage here}*

    Hmm…hint of cayenne…touch of onion…very bold statement…what, this isn’t the chili cook-off?

    I still like your ‘extra-spicy’ flavor of late…

  4. Dan Bruno says:

    “Sure, you wouldn't have the latest graphics, but how many sales is that going to cost you? I hear people claim that they can't attract gamers without the next-gen graphics, but… when was the last time anyone honestly tried?”

    Well, there’s World of Warcraft, as you point out, but I think the more dramatic answer is “in the console industry.” As you’re probably aware, the Wii is beating the pants off of the competition with hardware that is significantly underpowered when compared to the PS3 and Xbox 360.

    Granted, there are other factors at play besides graphical fidelity, but there’s clearly a market for games that don’t push hardware to the limits. A developer that’s earned its customers’ trust with good games — as Nintendo has, or as a PC dev like Blizzard has — will make money hand over fist.

  5. Adam says:

    Shamus.. tell me how you really feel. Better yet, thanks for stating what I have been thinking for the last few years.

  6. Sitte says:

    This was fantastic.

    While reading it, I kept seeing the Zero Punctuation animation of it in my mind and hearing Yahtzee read it in his ultra-quick & ultra-condescending voice.

    This is great on its own – don’t take the ZP comparison negatively.

    Also: 2 Ls, I think?

  7. Shamus says:

    You’re trying to murder me with my own shame.

    Fixed. Maybe.

  8. MRL says:

    Hm. Thanks for (inadvertently) reminding me of another reason I liked WoW – simple, unrealistic (cartoony, even) but fundamentally well-made graphics (except for the appearances of pretty much any male who’s not an elf, dwarf or goblin, for some strange reason).

  9. swcrusader says:

    I think some games’ graphics have really added to the game itself. I will never forget getting out of that boat at the beginning of Morrowind and going slackjawed at how beautiful it was. That said, nobody seems to optimise games any more, and you’re right about consoles. I’m one of those former PC gamers that traded it all in for a console. Still, the release of Diablo 3 and StarCraft is getting me itchy to plonk down some real money for a new laptop. If they released D3 on console I would forget about it.

  10. Chris says:

    Someone mentioned the Wii, but a perfect example on the 360 also exists. I purchased Earth Defense Force 2017 the day it was released for the Xbox 360, which was $40 new (as opposed to the standard $60 price). The game’s graphics are pretty poor. Environments and terrain are covered with bland textures, the animations on some of the models look awkward, and the only reason the robotic enemies look so good is because you can see the cheap shortcuts they took. They applied a reflective setting on their 3-D modeler and used the bloom effects built into the 360 dev. kit to the max.

    Overall, the game looks like shit, and it’s certainly lacking in the A.I. and physics department.

    The game has enjoyed a reasonable amount of success, and I have yet to meet someone that has played it that disliked it. Why? Because sometimes all you need is a game where the premise is “you fight off giant ants, spiders and robots with weapons like a rapid fire rocket launcher. That’s right, a rapid fire rocket launcher”.

    I love games like CoD4 and Gears of War, but not every game has the budget or ability to achieve such awesomeness. Sometimes, you just need a game that is stupid, stupid fun.

  11. Viktor says:

    Me? I'm saving up for an XBox 360. You guys are driving me nuts with this business.

    Is that true? You’ve finally come over to the dark side? Dang. Pretty soon, the only people playing any PC games will be the pirates.

  12. Ian says:

    Nice, and I agree wholeheartedly.

    You might also want to include games “journalists” who almost without exception mark down titles with “out of date” graphics.

    FUN! IT’S SUPPOSED TO BE FUN!! THAT’S IT! THAT’S THE ONLY CRITERION THAT ACTUALLY MATTERS!

    .. and breathe ..

  13. Jeremiah says:

    There was a point in the past when I was the type of gamer riding the right side of that graph, and I spent an inordinate amount of money on a desktop. To give you an idea: I bought just about the best money could buy 3 years ago and I can still play most games on their highest settings (most recent examples being BioShock & Gears of War 2); and that’s with no upgrades.

    But, I’ve seen the error of my ways. I much prefer to stay in the middle ground from now on. I’ve always liked story and gameplay over “OMG uber-pixels”, which makes me wonder why I spent so much on the beast (at the time) of a computer.

    Anyhow, great as usual. Hopefully developers will realize they don’t have to ride the bleeding edge to keep customers.

    ‘Course, I’d say some of the blame lies with the video card makers, too. One version to the next is so different that games almost require being written with specific hardware in mind, or at least patches to make certain cards work. If cards worked more similarly from version to version, developers could aim at the middle of the road and the hardcore gamer folks could get their shiny-new-pixel-bling-mapping-EXTREME cards and ramp up the quality as high as they want without worrying about hardware compatibility.

  14. Cineris says:

    World of Warcraft wasn’t two graphics generations out of date when it was released. Or even one generation, really. But they went with a stylized look that has aged pretty well. Team Fortress 2 is also pretty liable to age well, in my opinion, just because they set out with a visual agenda other than photorealism.

    Counter-Strike 1.6 is the true game that will run on every laptop. I think Counter-Strike: Source fairly recently overtook it in player numbers, but I’m not 100% sure. XFire regularly lists World of Warcraft as the “top played” game (as ranked by XFire users’ time spent logged in to the game), with Call of Duty 4 in second place. GameSpy doesn’t list World of Warcraft, but Half-Life (including CS 1.6 and TF1 and a billion other mods) and Half-Life 2 are the #1 and #2 games on their ranking, respectively, followed by COD4.

    So, far as I can tell, the market isn’t really determined so much by graphics requirements as by what’s good, and what everyone else is playing. COD4’s graphics requirements aren’t trivial, but it’s one of the most popular PC games out there.

    Anyway, today’s level editors can make Wolf3D or Doom-level visuals just as quickly as before (if not more so). You COULD limit yourself to cubical rooms with flat textures and a few sprite decoration objects… But, well, I think your game would have to have something truly amazing to convince someone to buy it. People’s expectations are higher, and even budget titles have to go up against games like F.E.A.R. and Prey selling for peanuts these days. Even if your budget title isn’t going to try and compete on the level of graphics – are you really going to get voice actors and script complex cutscenes for your game? ‘Cause even though you can say a game can tell a really good story and be compelling… things like voice actors, complex cutscenes, and good facial animation are great storytelling devices that won’t be available to you.

  15. SolkaTruesilver says:

    Sitte, I thought EXACTLY the same thing. I discovered ZP in the last few days, and it seems that Shamus has… (how can I say it?) grown a style of his own. Shamus, however, goes to the core idea, while ZP simply reviews games (and SOMETIMES goes to the core).

    Anyway, while I enjoy that tone, Shamus, I hope we will be able to find the man again (sometime) who was less… spicy? :)

  16. Leslee Beldotti says:

    I am reminded of when I first played Doom 3.

    Initially I was uttering all kinds of “ohhhs” and “aaahhhs” as I admired the pretty colors and the nifty lighting.

    After a few hours I stopped noticing the pretty shadows as a niggling sense of “sameness” slowly overcame me.

    After about 5 hours of play I was so damn tired of the monster closet and the predictability of the game that I completely gave up and never completed it.

    It’s like dating an attractive but vacuous woman. Yes, she may be nice to look at, but eventually you’re going to want to have a conversation with her. After a while you’re going to get tired of her staring at you blankly when you ask for her opinion on Kierkegaard.

  17. Illiterate says:

    Shhh.. Shamus, if they stop the upgrade treadmill, I won’t be able to buy last year’s hardware dirt-cheap and play Pharaoh on it.

    Funny the Wii was mentioned; I was thinking about Z:TP while I was reading this.

    Here’s hoping D3 with its effectively 2D gameplay runs on old hardware.

  18. Dave says:

    The HW industry also drives game development. Once SW developers can’t justify a game even after taking NVidia’s, Microsoft’s (perhaps), and Intel’s marketing dollars, that’s the end of the bleeding-edge PC gaming industry.

    Then the Peggle crowd can have the whole segment, and that will presumably be a boost for the indie devs who can afford to put out these kind of games, and have a good infrastructure in place (BigFish, etc.) to distribute it.

    It’ll be good news for the evolution of consoles, I think. They’ll presumably add more of the PC features that gamers need – quick alt-tab(esque) web access, more storage/removable storage, moddability.

  19. Illiterate says:

    Ah, it appears putting in a homepage sets the “moderate me” flag. Apologies for the re-post, Shamus. I’ll stop putting in my stupid blog address if that’s the cost of participating.

  20. Shuggah says:

    Amen! Keep on preaching that sweet truth, brother! My 2004 relic of a laptop with its integrated Radeon 9600-level graphics card runs WoW at a playable rate, but pretty much everything beyond UT2k4 lags, stutters or plain refuses to even run. The market is full of games I would find enticing if only they didn’t require me to drop another fifteen hundred euros on new hardware every year. It’s sad to see that my former favourite pastime has been pretty much priced out of my reach.

  21. Kleedrac says:

    I couldn’t agree more, Shamus! One of the WORST trends in PC gaming (in my humble opinion anyhow) has always been the “pretty” shooters. Right now it’s Crysis, a game with a higher graphics setting they haven’t released because a computer powerful enough to play on that setting doesn’t bloody exist yet :P There is nothing about that I think is good for the industry. And there are far more examples than WoW, but you are correct in using it, as most in the gaming industry think WoW is some backwards acronym for “Money Generating Machine.” Well done Shamus … well done.

  22. Illiterate says:

    Seriously, do I need to start logging in to stop being moderated?

    Shamus, I’m not trying to be a jerk, please feel free to delete this, hit me at “the dot illiterate at gmail dot com”, let me know what is causing me to be moderated.

  23. JohnW says:

    Note how World of Warcraft is at least two graphics generations out of date, and yet Blizzard had to buy avalanche insurance just in case their pile of money falls over.

    LOL!

  24. Drew says:

    You know how when you fire up games nowadays, there are 35 splash screens before the game starts, indicating every layer of publisher and developer and producer of the game? Regardless of how annoying that is, I’ve noticed that the last few games I’ve fired up have included nvidia as one of the splash-screen companies. That means the graphics card companies are paying for game development, specifically in order to sell new hardware. If someone else started funding this stuff in their stead, we might see some games that don’t require the latest-and-greatest, but I’d imagine anything nvidia sponsors is going to be required to push the envelope, so they can sell some more cards. This has got to be part of the problem.

  25. Nixorbo says:

    This is one of the things I prefer about console gaming as opposed to PC gaming. For console devs there are set boundaries they have to work within. The only boundaries for PC devs seem to be whatever they decide to set for themselves.

    Why is this a problem, you say? PC devs strike me as Tim Taylors. “What do we need to do to make our game better? Add more power, ar ar ar!” They move from The Next Big Thing to The Next Big Thing, pushing the boundaries without ever stopping to maximize the abilities of The Average Joe, what’s already within the boundaries.

    Meanwhile, console devs have hard limits. The Next Big Thing only comes around once every 7-10 years. To make their games better they must work better, figure out how to do things better. Eventually they begin to maximize the potential of the existing hardware (see the difference between Halo and Halo 2 for an example. Like night and day).

  26. Maddy says:

    Bleeding-edge hardware requirements are just part one of why I don’t buy games for my computer. My other computer activities don’t require fancy graphics or a fast processor, so I am loath to upgrade.

    Part two: I used to buy games for my computer. Thanks to OS and hardware upgrades, I can’t play those any more. I know some people say “but once you finish the game, who cares?” but I’m the kind of player who takes forever to finish a game and then likes to revisit parts of it, explore it, try weird stuff, etc. When I buy a game, I want it for a LONG time.

    Meanwhile, there are old Atari 2600s that still work. Not that I want to play that version of Centipede again, but the point is, if you buy a console game you can play it now and you can play it forever (or for a lot longer than you can play a game you bought for your PC).

    Part three: I work on a computer all day and well into the evening. Last thing I want to do is spend a couple more hours playing on it too. I might kill time with a game of Spider between meetings or something, but that’s as much as I’m willing to do. Granted, playing on the TV isn’t a lot better, but at least it’s a change of venue.

  27. K says:

    Well spoken, I could not agree more. Yeah, I recently played through Mass Effect. In 15 hours. That seems kinda short to me for the most hyped “epic” RPG right now. Sure, I can run around and do a hundred boring Fed-Ex quests on the same boring, empty planets (with beautiful skylines) to produce another 30 hours of “entertainment”. Or I could play a game like Zelda: Link’s Awakening, which took me at least as long on the first run-through AND IT WAS ON THE GAME BOY, which has less RAM than I have L1 cache on my graphics card alone.

    Yes, Mass Effect looked brilliant. I even cared for that for like twenty minutes. And then I turned down all details (which made the game look about as pretty as Half Life ONE), so I could play it at a decent framerate.

    I also like the dating analogy. Games are like women. Those that are extremely pretty spend all their time and effort on becoming pretty and staying like that, or else they wouldn’t be. And those are horribly boring. Hell, they are not even better in bed. :P

  28. Rick C says:

    You can spend less money and make more money.

    What’s going on, here? Are you trying to apply economics and the Laffer Curve?

  29. Mike says:

    Quite frankly, while Bioshock is beautiful, I can’t play it for more than an hour at a time or I get nauseous. If there were an over-the-shoulder mode it would be MUCH better.

    And I’m one of the guys that dropped $7k on an ass-kicking system back in 2005. Dual 6800 Ultras in SLI! FX55! TWO GIG RAM! Whoa.

    Nowadays, I can buy a better system off the shelf at WalMart for under a grand. *sigh*

    But it still plays most games just fine. City of Heroes, everything enabled, 1600×1200, no problems. Even Bioshock at that res is playable if I drop the realistic shadows and reflections. Except for the dizziness and nausea, of course.

    They need to stop making things so realistic and start working on ways to make it less sickening. :)

  30. Avaz says:

    Interesting.

    Firstly, interesting because I wholeheartedly agree with your rant, Shamus.

    Second, interesting because this rant, for the first time, seems less “spicy,” as some are prone to calling it. I actually thought this rant was calmer.

    That said, however, I reread it and laughed out loud because I, too, heard Yahtzee’s voice rambling it. Funny stuff. :)

  31. Spider Dave says:

    I think it’s about freaking time PC games started pushing the limits of their imagination instead of their hardware. The only game recently that has caught my eye as something I’m looking forward to is Spore. Most of everything else is more “meh, it’s been done.”

    To any game developers who might read this, here are some games I would like to see:
    A Sandbox RPG. The game that gave me the best taste of that was Gothic, but I think it can be done better. And by god, it doesn’t have to be medieval fantasy; you could do pirates or sci-fi. Go crazy, do something new for a change.
    Or better yet, an online D&D-esque game (though not necessarily d20 by any means.) Multiplayer in the four guys and a DM sense. It would be wonderful. I love MUDs but I wish they had objectives and plot and all that.
    ADVENTURE GAMES! Point and click. No need to push graphics at all. Think Monkey Island, the White Chamber, stuff like that.

    As Leslee Beldotti pointed out, games are like women. Pretty ones are nice to look at, but at the end of the day the one you really want is interesting, fun, and with a lot more to her than “Look at my nicely rendered curves!”

    Excuse me while I go play Earthbound.

  32. Derek K says:

    “You might also want to include games “journalists” who almost without exception mark down titles with “out of date” graphics.”

    No freakin’ kidding.

    “Cons: Looks like it was made in 2006.”

    Uh, yeah, that’s actually okay. We haven’t discovered a new dimension since then, or anything….

    Re: Consoles:

    Yup. My cycle is generally to get the new console about a year after it comes out, play it for some time, and let PC games build up. Then I buy my PC used/at the Dell outlet – it’s about a half to a full generation behind, but it can play all the games that came out during the year or three it took to get there just fine, with the benefit of knowing which are actually good, and having a thriving mod community if applicable. There’s no point in getting cutting edge, especially when 3 years old is still new to you….

  33. Jeff says:

    @Spider Dave:
    Have you heard of Neverwinter Nights?

    The thing about graphics is very true, too. You only notice graphics in the beginning, after which it’s all gameplay.

  34. Cineris says:

    Side note: I think one big hope for the whole two-or-three-generations-behind game development is mobile platforms like the DS or PSP. I think the agility you can see with these is part of why you’ve got some interesting non-mainstream ideas (Phoenix Wright?) cropping up on these platforms but not on the bigger consoles.

  35. Freaky Dug says:

    Even though these games with extreme graphics do look good, games with less demanding art styles can look better. If you look at Zelda: The Wind Waker on the GameCube, you’ll see a game with an art style that its hardware could do perfectly, so it looked great and it still looks great. TF2 is, I’d say, the best looking game in the Orange Box, because it isn’t going for realism, which we can’t reach, but a cartoony style, which we can.

    So not only can you spend less to make more, you can have worse graphics and make a better-looking game.

  36. WWWebb says:

    Good points overall. But while bemoaning the shortness of games, remember that it’s possible to go too far in the “add content” direction.

    When I installed DOSBox last year, I pulled out some of my Wizardry/Ultima/Magic Candle/Wasteland games (yes, I have a floppy drive) just to see if they would run. They did, but I learned two things.

    #1- Looking at EGA graphics on an XYZGA (or whatever) screen is PAINFUL. It is either squeezed into a tiny 3-inch square on my screen or each pixel is 3 inches across.

    #2- I played the first hour and had a little fun with the character creation and early game. I briefly thought, “hey, maybe I should play through this again for old times’ sake.” Then I remembered that they all took almost 100 hours to finish.

    That was fine when I was a kid and could plow through that in a month, but these days that would take me 6 months. As a grown-up, I just don’t have that time to burn, and 20 hours is just about right. Any longer than that and I’ll have forgotten the exposition before I get to the climax. To put it another way, I know you like open-ended gameplay and exploration, but would you really have the patience to play World of Warcraft all the way through if you were the only person on the server?

    Going back to the development discussion, think about how long it would take to remake a classic like Wasteland with a modern game engine. Morrowind was a tiny, tiny island compared to a lot of those games, and filling them out would take forever. It would be like making a single-player World of Warcraft from scratch. The developers could never hope to recoup their investment.

    There’s a sweet spot somewhere in the middle. There just aren’t many games that hit it.

  37. Factoid says:

    I would like to point everyone to Twilight Princess:

    That game is:

    a) Quite long, especially by today’s standards
    b) Incredibly awesome
    c) Has innovative ideas about gameplay
    d) Uses last-gen graphics
    e) A massive hit both critically and in sales
    f) One of my favorite games of all time

  38. David V.S. says:

    I enjoy Thief 2 more than Thief 3. The graphics are not as pretty. But the missions are huge without mid-mission load screens, the maps better encourage and allow exploration, and the objectives and obstacles better encourage a problem-solving approach.

    This is even true of many of the over 700 fan missions, including a very professional fan-made expansion pack.

    To me, the main point Shamus made is about how complicated level editing has become. I expect a company that designed a robust engine (for shooting, sneaking, and shopping) whose level editor was easy to use (tools that make common tasks quick) would earn a whole lot of money. And we gamers would see a lot of games in which developers could focus on creating a fun experience.

    Look at how many games are built on another company’s game engine even when that engine is not specifically appropriate!

    Does the Linux open-source community have this kind of a generic engine/editor?

  39. Andre says:

    Very, VERY well put, Shamus.

    Also, I really like this analogy: “That's like cutting off the top of your head because your high heels make you too tall to fit through the door.”

    Also also, pretty apropos of you to use a picture of a Cylon that says “replaced by CGI”.

  40. wererogue says:

    Game dev reporting in:

    The problem is that unless you get some nasty bugs or bad optimisation, supporting the latest *graphics* isn’t really the biggest part of the workload. It’s pretty trivial from a code point of view, certainly compared to supporting whatever new feature the game designer wants *this* week.

    The technologies that *do* take a lot of time are the brand new, game-specific ones – flexible AI/animation AI especially. Also data mining, adaptable difficulty (which I hate with the intensity of a thousand suns, but that’s a whole blog post on its own), allowing your RTS to manage 3.2 billion units at once. Usually implementing a new shader or reflection generator is pretty much “write code into renderer” and go, especially if it’s hardware-supported.

    With that in mind, it can seem like a waste of time and resources to write the menus and repetitive implementations of older rendering tech as well, when it’s pretty quick to just use the latest one and stick with it.

    The place that pretties do take their toll is on the art team. They suddenly have to put more effort into each part of your assets, and if you don’t have enough artists, your project’s going to slip. Supporting older tech probably means that your artists have to spend more time on the models, or even do two versions – a lot of models now are pretty low-poly, but with bags of bump-mapping (dot3), so do you do a high-poly one as well for the old renderer? Then you have two versions of the assets, taking up space, both needing to be changed when the game design does.

    Most artists in the games industry WANT to make good-looking assets, so they’re pushing for you to support the new technology too. “Latest graphics” sounds good to management and publishers, so of course when you can’t give a reason why your game ought to look like crud, they’re going to want you to use newer techniques too.

    I think part of the reason that development time is longer for bigger teams is that when you had one guy in his basement, he knew what game he wanted to make. He knew what wasn’t worth doing, and didn’t do it. Now, if you even think about dropping a feature, publishers think about dropping your game. At the same time, the designer is sitting in his office with nothing better to do than think “This game needs more *features*”, and suddenly the feature set starts expanding. Once something’s suggested, everybody wants it in, and it gets wedged into the schedule and things start slipping. If you’re a development team working for a publisher, they’ll have their own opinions and desired features too, and you can guarantee that they’re adamant that the game has to have them.

    I don’t think that graphics are the biggest culprit for short games or long development times (unless you count “animation” as “graphics” – then they’re a bigger hit). But I do think you make a good point about target audience – the focus shouldn’t be entirely on high-end gamers.

  41. Maryam says:

    In addition to what Shamus has said, what I wish is that game developers would stop striving for realism. My interest in a game takes a big hit if the characters are an attempt to look like real people. I much prefer some sort of interesting stylized look. The uncanny valley has something to do with it, but it’s also because I play games to experience something different from reality.

    I’m no game designer, but I’d also assume that if you aren’t spending scads of time trying to make everything look realistic, you have more time to spend on designing the gameplay.

  42. Spider Dave says:

    @Jeff:
    Neverwinter Nights couldn’t do it for me. It was too clunky and too restrictive. What I love about RPGs is freedom, and NWN lacks that. I think a game more like Oblivion could come closer to the mark, if it were done properly. As Shamus has said, d20 doesn’t really work in computerised gaming. I just want that D&D feel.
    Check out this awesome post that expresses how I feel.
    http://www.shamusyoung.com/twentysidedtale/?p=945

  43. Oleyo says:

    This made me think back to all the games that I was totally sucked in by. None were what you would call graphical powerhouses, but they actually used the most advanced graphics engine on the planet: our brains.

    Our brains are too adaptive to be wowed by graphics for more than a few minutes. However, they are so powerful that I can sit at a table with two or three buddies and nothing more than some sheets of paper and dice and take part in an amazing adventure.

    Immersive gameplay allows our brains to conjure up the gameworld, while broken gameplay or a weak story rips our brains out of our willful suspension of disbelief, regardless of the graphical refinement of the gameworld.

  44. Alex says:

    “Me? I'm saving up for an XBox 360.”

    Might wanna do what my friend did: Save up for two. Just in case the worst happens. Apparently a Microsoft product is prone to killing itself for no reason. Who’da thunk it? <_<

    -Factoid:

    Other than (b) and (c), I think you bring up a good example. Although (e) is only (e) because IT’S A ZELDA GAME. ;)

  45. Benjamin O says:

    I was just talking to a friend about the game I would make if I had the time/money/ability. I have the idea. Here’s the problem – it’s now a LOT harder to get into it. When you are just learning how to make a simple game for a simple environment, it’s easy. Try to make a game for a 3D world, and suddenly it takes a lot of work.

    I don’t have the time/money/ability.

  46. MRL says:

    @Alex:
    I’m going to take the quicker route and just save up for a Wii.

    Hey, when XBox comes out with a game that doubles as a fitness coach, let me know – but Wii Fit will do me a heck of a lot more good right now than Halo.

  47. Matt` says:

    “Immersive game play allows our brain to conjure up the gameworld, while broken gameplay, or weak story rip our brains out of our wilful suspension of disbelief, regardless of graphical refinement of the gameworld.”

    *Agrees with that guy*

    No matter how pretty the game looks, if it does stupid things because it’s poorly designed, then it’ll seem crappy.

    Conversely, a well designed game with graphics that are just “good enough” (read: can be produced by old hardware) will be fun to play and seem like a great game, even without epic graphics or realism.

  48. Gbyron says:

    One word: Crimsonland.

    Me and my 64MB graphics card salute you and head off to blast some .

  49. McNutcase says:

    The thing that always gets me is how much less useful these ultrashinies are. I quit the treadmill back in 2005, by the way, but even back then, I was noticing that you can’t freakin’ SEE anything. It’s all shades of brown, or grey, or bluish-grey, and the bloom is turned up so high that you just can’t tell what the crud anything is meant to be. Yes, bloom is nifty, but it should be kept for where it’s realistic: coming out of low-light areas into brighter ones.

    Or are publishers convinced that we all go to the optometrist’s office daily for those dilating drops?

  50. Dys says:

    In some cases, graphical quality adds to a game. It’s largely to do with the immersion factor, and it’s not related to technology so much as design. A good piece of art won’t ever look dated.

    Another factor to consider is that, given the prior existence of games like Half Life, a developer working on a new game can either try to use the same tech and beat the design (think about that for a moment), or go for the advantage of new technology.

    Simply put, in the games market you are competing with everything ever released. You can either try to fight the best games ever made, on their own ground, or you can take the one advantage they cannot have. The new tech.

    As for the development time vs game length argument, I strongly suspect there needs to come, soon, a revolution in games development. Some form of extra level whereby the devs do not design the game, but instead design the tools which design the game. You have made posts before on procedural content, so I expect you know what I mean by this.

  51. Tejlgaard says:

    I agree that cutting off old computers is silly if you don’t need to do it for your game; but then, _you actually need to_, contrary to what you imply here.

    Let me illustrate what it’ll be like if you set the system reqs too low:

    Let’s say we want to make a new game with low reqs, based upon how old games did it.

    First there are the CPUs you will support, if you want reasonable AI, for example. Hitman 2 would be a perfectly good benchmark for reasonable AI. That had a minimum of 450MHz, so let’s be generous and go for 750 for our test platform, where we will guarantee that the game will work.

    Secondly, you need to decide on some level of graphics. Do you want 3D graphics, or can you make do with sprites?

    Worms 2 had sprites, was reasonably pretty, and had reasonable audio, so deciding on that is perfectly fine. That’s a DirectX 5.0 card, if memory serves. That’s a Riva 128 card, so again, let’s be generous and go for a Riva TNT.

    Now that we have a Riva TNT, a Pentium III at 750MHz, and, oh, let’s say 128 megs of RAM, we can start working on a game that everybody who’s bought a machine in the last 7-8 years will be able to run.

    Only… oh. The machine doesn’t have the guts to run our AI scripts, because we’ve gone ahead and written them in Python. Well, there’s nothing for it; let’s crack open the manual for ANSI C and convert to that… sure, the code will take up many more man-hours to write, require considerably more experienced employees, and be harder to maintain, and it will be like chaining a rock to our leg when we want to develop our next title, but there are bright sides too.

    Bob is an Australian customer who ventured into Sydney one day, after finding an 8-year-old computer at the scrapheap, to visit an EB store and purchase a game. Why, he will certainly be able to play our game, because we invested that extra time, money and manpower. (This is not a dig at Australians, but examples become more vivid if I use stereotypes, so sorry if anyone takes offense.)

    But then, none of the games at EB appear to have low enough system requirements, and the salesperson doesn’t know what a Riva TNT is, and she starts laughing when Bob says “Pentium.” The bright side is, there is a slim possibility she points Bob to our game anyway, and he picks it up. The more probable outcome?

    Bob visits a friend who goes to the Underdogs, downloads 50 games, burns them to a DVD, and gives them to Bob, who now has more games available for his platform than he’ll be able to complete before the system gives up on him. He won’t ever get around to our game.

    Catering to Bob’s demographic with new games is nonsensical.

    Unless you’d like to cater to people who specifically enjoy casual games (where the older languages and APIs will arguably be less of a problem) and are stuck on old platforms that can hardly even run YouTube, there is no positive aspect to developing for systems which are more than 5 years old. None.

    I can do the same math with a GeForce 3 card from ’01, or a GeForce 4 from ’02, and processors from the same time, and illustrate how developing for the old technology is more expensive and turns very little extra profit for a _huge_ extra investment, if you want to do things as well as the big games from that time.

    You’d be stuck with old, outdated APIs which are harder to develop for, no tech support, and you’ll need to use archaic programming paradigms for managing CPU and RAM capacity if you want to fully exploit technology from ’02. And ’03? That’s 5 years old. That’s 10 months from GeForce 6 country. According to this blog post, that clearly doesn’t count, and thinking such thoughts deserves a whap from a magazine.

    My point is, you should not decide to use old technology; you should decide to simply _invest less_ in exploiting technology from any day or age, and instead rely on cheaper technical solutions (…usually those will not be the older ones…).

    Going for compatibility alone is very expensive if you want any sort of modern feature.

    But that doesn’t mean you shouldn’t exploit new technology if it’s cheap to do so. Since everybody and his mom is switching to dual core, the majority of the market would be able to run Hitman with its AI rewritten in Python without a hitch.

    So invest in writing good AI in Python, and do so for about 1/10th the cost IO invested back in 2002 in the same feature.

    Sure, your game won’t be able to run on a machine from 2001, and IO’s game is able to do that, but hey… that really is not that big of a loss. Bob can do without; he already has access to a huge, cheap catalogue.

    Don’t try to improve the AI using the new, faster processors. Simply try to develop the same quality of AI for less money, and allow the new, faster processors to give you a leg up.

    Plus, Hitman 2 is 10 bucks on Steam. You’d have to sell your game for the same price to be able to compete if it didn’t have better features, and chances are it wouldn’t, if developed with the same system reqs in mind.

    But yeah, in conclusion, I’m discussing “real, bought in a brick-and-mortar store” games here, not the casual stuff you can find on Steam, at Neopets, etc. For the casual stuff I’m sure you could probably come up with something that doesn’t require new technology.

    But the rest of the time, developing features that are as good as those found in other modern games is _harder_ on older technology, not easier.

  52. Fieari says:

    Shamus, what are the chances that your overlords and masters at The Escapist would publish this article as a feature? I think it’s worthy of a larger audience…

    Funny how a few months ago, The Escapist wouldn’t have been a larger audience at all. But thanks to ZP and guys like you, it’s a runaway success these days.

  53. Ian B. says:

    I think I’m going to change my name to Ian B., since some smart alec above had the nerve to be given the same name as me. ;) I guess I could also be known as “that guy with the Shadow avatar” but whatever.

    Anyway, this rant was definitely necessary. Some people think I have something wrong with me when I start up things like Kroz, ZZT, and MegaZeux on my computer. The fact that those games have held my attention off-and-on since they came out (1987, 1991, and 1994, respectively) says a lot, I think. Anyone ever play Pyro 2? That game is bloody amazing and it will happily run on a system with an MDA. I like having a Crysis-ready computer for the rare occasion that a good game is released for it, but the only reason I really upgraded in the first place was because my old Pentium 4 just wasn’t cutting it for what I wanted to do anymore.

    It probably wouldn’t be so bad if the PC weren’t regarded as a “secondary platform” for most developers nowadays. The only real PC exclusives nowadays are indie titles and things like strategy games and simulation games (though those seem to be gradually making the move to consoles). I’ve definitely had my eye on more 360 games than PC games as of late.

    What doesn’t help is when companies like Crytek whine, bitch, and moan about piracy. Crytek recently announced that after the new Crysis game is released they are no longer going to be making PC exclusive titles due to rampant piracy. The fact that they made a game that will only run on 5% of the systems in the market and only run well on half of those systems pretty much means that their sales are going to be low as it is. Hell, I’ll bet even most pirates can’t run that damn game. Sure, it looks pretty (ridiculously pretty, even – screenshots don’t do it justice), but to expect a game that doesn’t run on anything to sell is just ludicrous, especially when that’s coupled with Crytek’s idiotic design decisions (like making games into stupidly difficult DIAS-fests even when played on easy).

    I’ve been paying particular attention to the indie and “casual” game developers as of late. Their titles are just better (and cheaper, too, which is a definite bonus).

    @MRL: “Hey, when XBox comes out with a game that doubles as a fitness coach, let me know”

    Dance Dance Revolution.

    No, but seriously, when I first started playing DDR I lost around 40 pounds in a month.

    If you want to invest a little more time into setting it up, you can always grab a soft pad, a PS2/Xbox -> USB adapter, and use StepMania (or In The Groove for PC…).

  54. John says:

    I believe this falls into a classic tech business category: companies who build what their developers want to build, rather than what the market wants to buy. Normally the boring business managers aim for the dollars, the exciting developers aim for the latest flash and sparkle, but the business managers pay the techies’ salaries and thus win the argument. However, in a subset of companies either the managers are techies promoted in defiance of the Peter Principle, or the owners/managers have been convinced that they can’t retain good talent without giving in to their cutting-edge fetish.

    As supporting evidence, look at the failure rate of said companies.

    (full disclosure: software developer by education and early training, now manager seeking MBA)

  55. Factoid says:

    Sorry to double post…but Crysis gets kind of a bad reputation.

    It’s true that the story sucks balls, and no earthly computer can play it at full rez… but it does scale down reasonably well to play on most systems in that “GeForce 6000 or better” category.

    It has an incredible UI and a few innovative gameplay features. The first half of the game is actually really fun…until they get to the story portion…at which point it gets linear and boring.

  56. Blackbird71 says:

    Well said, Shamus – once again a well-placed jab at the state of the industry. Now if only we could draw the attention of those responsible a bit more effectively.

    Personally, my desktop is currently using an ATI X700 Pro graphics card. My wife’s computer has a GeForce FX 5200. Yeah, they’re a bit dated, but they work for most of what we play (I’ve even got an old machine in the corner with a Diamond Stealth II for those times I crave a revisit of 7th Guest or Daggerfall). But this week I’ve ordered a new card (ATI 4850) and other hardware for a major upgrade. Not because I enjoy shelling out the money every 3 years for this, nor because I want those really cool graphics, but because I’m at the point where it’s getting difficult to buy even games a couple of years old that will run on my system. Chalk up a win for the graphics card companies! I think Drew (#24) hit the nail on the head there. If games don’t require bleeding-edge graphics, Nvidia and ATI won’t have anyone to sell their pricey, top-of-the-line hardware to, so it makes sense for them to promote resource-intensive games. It’s creating a market for their product.

    Alas for me, consoles are not an option as an alternative. I don’t know why, but I have never been able to manipulate console controllers with any reasonable degree of proficiency. There’s just something about the way they’re rigged that runs counter to my instinctive motions I guess. That, and I find gaming from a couch looking at a distant TV screen makes me feel a bit detached from the game, as if it’s something I’m watching rather than playing.

  57. yd says:

    You complain about the HL2 numbers, but HL2 runs on one of the more forgiving graphics engines I’ve played with. Sure, Valve is still trying to target the pretty colors crowd, but they do it without /forgetting/ the less upgrade-happy market.

  58. Shamus says:

    I wasn’t complaining about the HL2 numbers – I was just saying that thinking that HL2 players = all PC Gamers is a rampant sort of myopia.

    HL2 is indeed about as flexible as they come. I can’t think of another game that scales as well. I grumble about Steam, but you just can’t fault them on game design.

  59. Eltanin says:

    In a way I dislike these posts. I mean, sure they’re funny, and that graph was simply awesome, but Shamus you’re just so damn right that it hurts. I mean it really causes me emotional pain. Why can’t game companies see it? What exactly is their problem? They clearly need a liberal dose of that magazine upside the head.

    Gah! It’s so frustrating.

    Thanks for being a voice of reason amidst all the sound and fury Shamus.

  60. Sitte says:

    In response to the “spiciness”:

    1) Is that usage common around the webernet? This is the first place I’ve ever seen it, and only in the past few days.

    2) In my experience, Shamus goes up and down in level of spice, just like most other writers (unlike Yahtzee, who stays at a 5-alarm NSFW level of spicy always). Check out old posts on Steam and Bioshock.

    3) Looking purely at the number of posts in the Rants category, I see that the end of last year was a slow time for the fury, so the last couple of months are definitely an upturn from there, but are about average compared to the first half of last year.

    4) Shamus has impressed me multiple times with his class. He has closed comments when things get too heated, edited his posts to tone down potential offensiveness, and (at least once) deleted a heated response he made when the offender apologized.

    5) http://www.shamusyoung.com/twentysidedtale/?p=1300

  61. Anders says:

    Should there be some sort of revolution that overthrows this evil union that claims graphics are everything, I’d join up for sure. I’ve seen myself fall further and further from that view with time.

    I’m blessed with a dead boring job that pays way too much, so my system is way, way to the right of the graphics curve, and I mostly use it to play two-to-three-year-old games that might be able to run at high resolutions but almost never even challenge my system.

    And when I see that the local PC Gamer has a full 8-page story on how AWESOME the new Crysis MP game will look, I’m not even interested (oh yeah, it will come with a new Trans-dimensional Lightning thingie-bob that you will notice a whole 0 times, seeing as you probably will not stop to look at how pretty your enemy is being rendered while he is shooting you). To me the actual worth of a game is all in the fun, and almost none of the fun comes from the graphics (as long as the graphics/artwork reach high enough to be nice looking, like WoW).

    I think I’ll continue to spend my money on games that promise lots of fun, usually indie games, without the graphics. But I’m afraid I do not really think the big companies will stop equating “graphics” with “good idea”, seeing as it is a downward spiral: the magazines write about how cool the graphics are, so the companies make them, and those that don’t get bad scores.

  62. Laura F says:

    Hee hee, avalanche insurance…

    Your wordplay makes me giggle.

  63. Tejlgaard catches a lot of truth.

    Wing Commander vs. Lightspeed.

    Wing Commander barely played on a top of the line computer. Lightspeed played on anything.

    Or, the AI in Age of Empires vs. the, err, AI in Warcraft II.

    Though Ensemble makes a real effort to sell products that run on most computers.

    Game magazines hammered Daikatana as much over its graphics as its gameplay. That’s a lesson too.

    Though I agree, WoW has worked well. It’s amazing what you can do when people aren’t able to pirate your game, and where that puts you in terms of the market you can sell to.

  64. Derek K says:

    “Some form of extra level whereby the devs do not design the game, but instead design the tools which design the game.”

    See, I know procedural design is all the rage now.

    I’m still highly unconvinced it will produce a better game than one designed by a human. If you’re just rendering random mountain ranges to trek across, sure. If it’s a place with a real purpose, a human is needed.

    Compare the dungeons of Zelda vs Diablo, for instance.
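
    To make “tools which design the game” concrete, here’s the kind of thing I mean by “random mountain ranges”: a toy midpoint-displacement sketch in Python, purely illustrative and not taken from any real engine.

        import random

        def midpoint_displace(levels=6, roughness=0.5, seed=None):
            """Generate a 1D ridge line by repeatedly displacing midpoints."""
            rng = random.Random(seed)
            heights = [0.0, 0.0]   # a flat line between two anchor points
            amplitude = 1.0
            for _ in range(levels):
                # displace the midpoint of every segment by a random offset
                mids = [(a + b) / 2 + rng.uniform(-amplitude, amplitude)
                        for a, b in zip(heights, heights[1:])]
                # interleave the old points with the new midpoints
                interleaved = []
                for h, m in zip(heights, mids):
                    interleaved += [h, m]
                interleaved.append(heights[-1])
                heights = interleaved
                amplitude *= roughness   # finer levels get smaller bumps
            return heights

        ridge = midpoint_displace(seed=42)   # 2**6 + 1 = 65 height samples
        print(len(ridge), round(min(ridge), 2), round(max(ridge), 2))

    Every seed spits out a new, plausible ridge line, and that’s exactly the problem: none of them is a place, just noise shaped like one.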

    @ Tejlgaard, Re: 5 year old hardware – I’m pretty sure that you misrepresent people with 5 year old hardware. It’s not Bob who found his computer in the scrapheap and wants to play X-Com. It’s people who upgrade their computers piecemeal, because they have 4 in the house that all need bumps, so computer A gets a new card, and then the cards all get an upgrade.

    I have 4 functioning computers right now – mine, my wife’s, my daughter’s, and a file server. I upgraded mine about 4 months ago, when I got a new job. I upgraded my wife’s about a year ago, when hers died. I upgraded my daughter’s about 4 months ago, when I got my new one.

    So right now, I have my machine (dual core, 3 gigs of RAM, fair vid card), my wife’s machine (dual Xeon workstation bought off Craigslist) and my daughter’s machine (a hodge-podge that was originally a Dell). We all three would like to play games together. We can all play WoW. We can all play Second Life (although I try not to). We can all play Diablo 2, or even Team Fortress 2, for the most part (my daughter can kind of play it, and doesn’t much like it). I would love to be able to go to the store and get a game that all three of us can play when it comes out. But that’s gonna be rare.

    Also, you suggest that it is nigh unto impossible to find someone that can code for computers 5 years old. Have you looked at the indie game community? Do you really think everyone has erased all knowledge of C++ from their brains at this point? There’s a huge supply of modules, even.

    Plus, you’re making Shamus’ argument: You can only find people who know the new tech, because that’s all that people want. If we weren’t pushing for the newest and greatest every single time a new game came out, it wouldn’t be hard to find someone that can code on a 3 year old engine, because they just finished a game on it last month….

  65. Mechman says:

    I’m probably going to get flamed for this, but here goes: Graphics do matter.
    Not to the degree that most companies emphasize them, but to a great degree nonetheless. For example, Gearbox has been working on a new Aliens shooter. Look at the screenshots and ask yourself how scary it would be with last-gen shadow tech and shaders. Would Clive Barker’s Jericho have been scary without those same technologies?
    The problem is that graphics must be used to enhance gameplay, not replace it. In Crysis, they essentially cut out big portions of potential gameplay to focus on graphics, but in a game like Sins of a Solar Empire, the graphics were made to support the game itself. Which was more fun?
    Of course, cutting-edge graphics don’t mean a game can’t scale. World in Conflict has some amazing high-end graphics, and I’ve run it on computers made 4 years apart with no real problems. Red Alert 3 is an amazingly pretty game, but it can still run on several-year-old tech and still be fun and smooth, because they focused on gameplay and compatibility as priorities.

  66. MadTinkerer says:

    One of the neat things about Source-engine games is that although HL2 required a bleeding-edge PC when it came out, TF2 and Portal need only that same, now-years-old hardware.

    Unreal Engine games, on the other hand, are tragic examples of exactly what you’re writing about. Bioshock runs at a decent framerate only if I reduce the resolution to 800 x 600 and turn off almost all of the graphics options. UT3 has modes that function, but not a single configuration compromise that doesn’t look like crap.

    Furthermore, I’d like to point out that the Source engine absolutely beats the crap out of the Unreal Engine and Doom 3 and rubs their faces in the mud on one very important factor: people in the Source engine look like people. The friendly characters in Prey and Bioshock are usually obscured by darkness or just keep their distance. The one character in Prey that was supposed to be remotely attractive wasn’t. The Little Sisters, even the “cured” ones, look creepy if you examine them too closely. But even the ordinary people of HL2 and Sin Episodes: Emergence look like decent human beings without straying into the Uncanny Valley.

    It’s no wonder Source is the engine of choice for Machinima makers who want to make a film with real drama. So what if the characters in the Source engine look “blockier” than characters in other engines? They’re still a hundred times more expressive and never creepy unless they’re supposed to be.

    HL3 will probably require a bleeding edge computer when it comes out. But 5-10 years after it does, I’ll still be able to play HL3 engine games on the same hardware and I bet the ordinary folks will still look better than in any other game.

  67. Miako says:

    @ Mike
    You think they haven’t tried??? Valve would give tons of money to the first person to figure out why its games make people nauseous when other engines don’t.

    Also, try adjusting the view angle, particularly if it’s really low or really high.

  68. Dev Null says:

    “I used to buy games for my computer. Thanks to OS and hardware upgrades, I can’t play those any more. I know some people say ‘but once you finish the game, who cares?’ but I’m the kind of player who takes forever to finish a game and then likes to revisit parts of it, explore it, try weird stuff, etc. When I buy a game, I want it for a LONG time.

    “Meanwhile, there are old Atari 2600s that still work. Not that I want to play that version of Centipede again, but the point is, if you buy a console game you can play it now and you can play it forever (or for a lot longer than you can play a game you bought for your PC).”

    That’s kind of a weird point, Maddy. Are you claiming you can play Atari 2600 games on your Xbox? Because if you mean you can still play them by trotting out the old console, well, I could still play the floppy-disk version of Zork I used to love… if I dug up my 20-year-old IBM PC and plugged it in. Or I could play it on my current machine in an emulator, if I could track down a 5.25″ floppy drive. I still have and play a copy of the first Space Hulk game (1993) – it’s a bastard to install these days, but it still works.

    I’m not sure the fact that PC games sometimes have difficulty running on later machines is much of a selling point for consoles, given that console games are most often physically tied to the hardware they came out on.

  69. Drew says:

    Mechman, in my experience, things you see don’t scare you. They can horrify you, but they’re not going to scare you. Resident Evil 2 was a great game, and the scary bits all came from just KNOWING that something was going to come through that window, not from looking at the shambling zombie walking down the hall. Even if that zombie was totally photorealistic, it wouldn’t be scary, it would just be disgusting.

    Now, can improved graphics give you a more immersive environment to create suspense? Sure, to some degree. But people were spooked by games like RE and Silent Hill a long time ago, and they sure didn’t have the latest and greatest graphics. This isn’t to say that improvements in visuals aren’t helpful, just that they aren’t strictly necessary.

    Do I hate on great graphics? Not at all. But it’s also worth noting that artistic design goes a lot farther than new advances in graphics hardware.

  70. David V.S. says:

    In response to wererogue #40…

    One thing Blizzard does well with WoW is to make a corporate distinction between “art” and “graphics”. The WoW team has very few people doing mapping, models, textures, and scripting. But it has hundreds of employees in the artistic design department.

    The result is that locations lacking what modern graphics has to offer are still a pleasure to adventure in. There is a lot of variety in flora and fauna from region to region. There are awe-inspiring moments when you encounter a vast sense of scale. Color palettes vary widely between regions but are always well done.

    The finished product is not “modern” and does not have great animations, but it still looks very “professional”. The visuals are obviously aiming for a certain type of cartoony style, and they hit the nail on the head. (Fan art that tries to mimic it usually fails, so the target this style aims for is evidently not a large, easy one.)

    Like most kinds of quality art, it strongly appeals to some people and doesn’t click with others — but everyone likes some of it a lot (notice how some folks think Ashenvale Forest is ugly but delight in Westfall, whereas others are exactly the opposite).

    From what wererogue wrote, it appears that other game companies need to acquire this appreciation of art that is well done apart from trying to be photo-realistic and well-animated.

  71. Chris says:

    wererogue – I just wanted to note that I found your thoughts intriguing, and it’s easy to forget that not everything is backwards/forwards compatible in the world of technology. I imagine this is a reason we’re beginning to see a lot of cross-platform games as well. Once upon a time a game could be made more easily on PC and still sell well, but in this day and age console development is less time-consuming, hits a larger market and costs less (unless licensing costs equalize it a bit).

    With that said, which is less difficult: to build with new technology and then find ways to downgrade it, or to build with older technology and then try to upgrade it (assuming you can get away with either without stacking new assets in)?

    Tejlgaard – You make similar points, and similarly good ones, as well.

    It’s always nice to see people with evidence as to why the current system exists as opposed to just complaining about it. Now the question is, what other options are there? What potential solutions? This is usually more difficult, as Shamus’ simple concept of a solution has been proven flawed.

  72. Lain says:

    The first two consequences of your well-written text that run through my mind are:

    1) The game developers you want to address get a comic-style thought bubble over their heads, and in it is the same content as in any Homer Simpson thought bubble ever shown on television.

    2) Some eager McKinsey yuppies “pirate” your text and sell it for several tens of thousands of dollars to the above-mentioned group of people. Those people react as in 1).

    Result: you feel better, but they stick with that successfully implemented DRM-and-Steam philosophy and hire some extra lawyers to sue the evil pirates.

    My tip: play consultant and earn money with your wisdom yourself. Then you can afford the newest PC technologies, and the games for them, paid for out of their own stupidity.

  73. MadTinkerer says:

    Oh by the way, here’s my amazingly sensible approach that would address everyone’s concerns simultaneously:

    Don’t develop a game for hardware that will be cutting-edge when your game comes out; develop for hardware that’s cutting-edge now. In two years, when your game ships, that hardware will be closer to the middle of the curve, the game won’t look terrible, and you’ll have saved on development costs. As a benchmark, let’s call cutting-edge “a PC that can run Crysis”; “a PC that can run Crysis well” is going a little too far.

    It’s a principle that “old school” developers like Peter Molyneux and Richard Garriott and many others aimed for back in the day, when processing speed and memory were the only factors, before 3D cards changed everything. Even Doom and Quake and the first Tomb Raider were designed with this principle in mind.

    But then everyone got impatient to get their games looking better sooner and lost sight of what made games fun in the first place. And then everyone falls into the trap of making nicer-looking sequels instead of anything original. And then they don’t even make the right sequels, like Dungeon Keeper 3 or Ultima Underworld 3 (though Bioshock isn’t a terrible substitute), but instead make another even-nicer-looking fantasy RPG, sci-fi RTS, or horror FPS, as if all those areas weren’t overcrowded already.

    On the other hand you have games like the Half-Life 2 Episodes, where the developers say “We really don’t have to sacrifice characters and story and action to make the graphics look better, so let’s not. Oh, and let’s really go for the cinematic angle with the Episodes and really make the player feel like they’re actually in an action movie.” HL2 wasn’t the greatest FPS ever made, but The Orange Box is.

    EDIT: Hmmmm… sorry about the double-post. It won’t let me edit the first one now. If you want to delete the first one, that’d be fine.

  74. Cineris says:

    @MadTinkerer:

    I’ll agree that idTech 4 couldn’t do characters right, because the whole time they looked like they were covered in plastic. But it seems a little silly to say that idTech 5 or the Unreal Engine (presumably 3) “can’t” do human characters just because the titles you’ve seen may not have done them well.

    I do think the default human characters in Half Life 2 are a lot better than you’ll see in Quake or Unreal/GoW, but that’s art direction. Presumably if you wanted to tell a story involving Strogg or pro-wrestlers in space you’d have more luck with something other than Half Life 2 as a starting point. And anyway, the most popular machinima ever being Red vs. Blue … Yeah.

  75. Mechman says:

    The new Unreal engine is actually amazingly great at creating realistic human characters. Using BioShock as an example doesn’t work, because the characters are all modified in some way. Instead, look at the faces from UT3, Gears of War, Splinter Cell, Mass Effect, Rainbow Six, etc.

  76. MadTinkerer says:

    “I'll agree that idTech 4 couldn't do characters right just because the whole time they looked like they were covered in plastic. But it seems a little silly to say that, well idTech 5 or Unreal Engine (presumably 3) “can't” do human characters just because the titles you've seen may not have done them well.”

    Well, that’s strange, because in the HL2 Episodes & related games you have dozens and dozens of characters that don’t look like plastic; in Prey you have just two non-hostile characters, and they look really weird (the grandfather looks less weird because he’s supposed to be a wrinkly old guy); and in Bioshock you have the Little Sisters, who are supposed to look creepy before you rescue them and continue to look creepy when they’re “cured” if you pay too close attention. All the other sympathetic characters in those games stay far away from you.

    How hard is it to make two important able-to-be-closely-scrutinized characters look not-creepy? Valve does it quite a lot. So I’m pretty darn sure it’s a technical issue.

    On a related note, I’m not sure which engine Overlord uses (its own engine, possibly), but it has a few not-creepy characters. It also has atrocious lip-synch, so you only get to see a couple of characters’ faces while they talk, although there’s a ton of dialogue that you hear. But nevertheless: a couple of not-creepy characters in a game which also has plenty of creepy characters. So it’s not necessarily a matter of the developer prioritizing making monsters look creepy over sympathetic characters not looking creepy.

    Hence my suspicion that it is a technical issue, or there’s just something very worrying about the Bioshock and Prey teams. But frankly, if UT3 and Gears of War are any indication, making realistically proportioned humans isn’t remotely a priority for the UT tech team anyway.

    @Mechman: “The new unreal engine is actually amazingly great at creating realistic human characters. Using bioshock as an example doesn't work, because the characters are all modified in some way. Instead, look at the faces from UT3, gears of war, splinter cell, mass effect, rainbow six, etc.”

    I posted before I read your post. Faces aside, my snarky comment about character proportions still stands. ;)

  77. Cineris says:

    @MadTinkerer:

    Regardless of the absurdity of Epic’s designs for male characters, their female characters (at least in UT3) are fairly reasonable – Athletic females who aren’t overly… curvy.

    And the in-engine cinematics are definitely proof that they can do really good machinima. I just wouldn’t expect anyone to be able to accomplish that in-game.

  78. TheRailwayMan says:

    An Xbox 360? Finally!

    Now if we could just get you to like Halo 3…

    Maybe I’m being too hopeful…

  79. Deltaway says:

    Usually when I guess how problems will evolve in the future, I find it helpful to view them from an economic standpoint. It’s rare that a trend moves in a direction away from the money. To use a rather risky example (i.e. a vastly oversimplified ad hoc example): I’m less worried about energy crises and carbon emissions than I am about the destruction of biodiversity hotspots. There is a foreseeable departure from fossil fuels at the point where ever-cheaper alternative energy becomes less expensive than ever-costlier hydrocarbons, and money to be made from pursuing new sources; the preservation of rainforests has no such monetary backing, because the value of beauty, environmental stability, and biodiversity is more abstract and not as profitable.

    Thus I appreciate seeing a discussion that shows how developers can better make the connection between fun games and profit, a connection that might seem self-explanatory but is sadly missing from the development world today.

    Personally, any grievances I have against new games do not include the fact that most will not run on my system. If developers make games that I cannot play at the moment, that’s fine. If Moore’s law is anything to go on, I will be able to easily afford the hardware in two years. I play games based on how fun they are, not on how new the box is. Indeed, older games often benefit from patches and a more developed community. Of course, I wish this was always the case, and that I didn’t have to worry about a game’s servers suddenly dying on me. (Sigh…Homeworld…)

    For this reason I don’t see this article so much as a plea that games be compatible with older systems as a call for better philosophies in game design, helping developers release better games faster. I think that Dys’ perspective on design becoming more “meta” and creating the rules instead of the facts might become very important in the future. Procedural animation and content design are a way to create detailed worlds while still being sustainable enough for the world to be art, and a way to move away from the current picture of over-worked artists mass-producing mediocrity. I’m not saying graphics are unimportant, but artistry should be the ultimate goal, not the side effect. Actually, fun should be the ultimate goal, and artistry one means toward achieving it. Priorities, priorities. Many styles of artistry do not require the latest in light and texture, but if the artistry is of a kind that requires high-level graphics, then I’m prepared to wait for something masterful, as long as my future upgrade will be worth the game. However, it seems that many new games convey the idea that graphics are the primary objective, and that artistry and fun will work themselves out. Much of the problem seems to be one of prioritization.

    I hope that new techniques will be able to give people who make games the ability to work from enjoyability down. The techniques and ideas will be there; there are still creative and inventive people in this world. Maybe some will be creative enough to use those techniques and ideas in a way that speaks to gamers’ minds, not just their eyes.

  80. Sludgebuster says:

    Hmm- didn’t see anyone bring up the latest graphics hog: Age of Conan… so I’ll do it! :) (If someone did and I haven’t had enough coffee to notice it, my apologies)

    Let’s face it: AoC is a gem, graphics-wise. I have gone online many times just to run around in the more scenic areas taking “tourist shots” of the beautiful scenery in the highest mode my computer can stand… And let’s not forget the killing moves: the majority of my guildies’ chats over Vent had to do with impressive new ways of watching an NPC meet its well-deserved fate, described in loving detail.

    Once those are stated, you’ve stated the best of the game. My personal feeling is that, instead of working on content, the devs had week-long “who can make the coolest killing animation” contests. They’re cool, don’t get me wrong. But after killing the 500th NPC by embedding your axe in his chest and lighting him on fire, you get kinda bored and start wanting the animation to move along. This also gets quite annoying if your healer is doing one of those moves while you’re desperately waiting on a heal. It’s hard to shout at somebody for letting you die when they retaliate with “The game had me kill someone.” (Note: I believe they’ve now put in an option to turn this feature off while playing. I haven’t checked it out.)

    And then, of course, there’s the framerate. AoC is advertised to run on anything with an Nvidia GeForce 6600 or better; I personally have a 6800. At first, before tweaking for a week or so, I was happy to get 10-15 fps. (And I now know that I can play a game as long as the frame rate is at least 5 fps; slower than that is too slow. This is an experience I would just as soon have lived without.) After tweaking, I managed to raise my average rates into the 20-30 range, namely by starting with the “Low” settings and setting them manually lower. (There was supposed to be a bug that made the “High” settings actually work better. I tried it but never managed to make it work the way it was advertised.) To add insult to injury, the settings would reset every time you zoned (back to the “Grandma Moses is looking really fast” settings).

    Anyway- bleeding edge graphics indeed (I WISH those devs had done a little more bleeding!). Content sux, but boy is it pretty: whenever you can get the frame to change.

  81. Luke Maciak says:

    ABSOLUTELY FUCKING BRILLIANT POST!

    I’m going to frame it and put it on my wall maybe. Especially the chart! And the avalanche insurance line – that one totally killed me.

  82. Mari says:

    So right now I’m in the process of building a new computer. I started slapping together the first shipment of parts last night. Which included the new video card. For me, it’s OMGZORS powerful which basically means it’s a mid-level card that’s only one gen out of date (yes, I stepped off the treadmill nearly a decade ago).

    What I noticed immediately is that the thing is HUGE. I mean, seriously, it’s just huge. I haven’t seen video cards this big since I was slapping together P1s back in the day. But, as I pointed out to my darling husband when he commented on the ginormosity of the card, “It has to be big to support OMGzorz pixel pipelines and stuff. I guess. I mean, I think ‘pipelines’ means it has to be pretty huge, right?”

    As I slapped it into the slot and started grappling to fit it in and get it connected up, I realized that because this thing is so tremendously large, my ATX mid-tower and my ATX mobo line up in such a way that I can EITHER have my video card in place OR I can put my hard drive in the second of four 3.5″ bays, but NOT BOTH. That’s right, my video card is so huge it prevents the use of a hard drive bay.

    Being the forward thinking individual I am, I got a mobo that supports SLI and a card capable of the same. I didn’t get the second card, planning it into the future upgrade path. But as I studied this huge card and my pretty standard mobo and case, I realized that if I were to SLI this thing, I would lose TWO hard drive bays. Out of four. How am I supposed to have a proper RAID array then, huh?

    Please, for the love of god, ye publishers, don’t ever make me make a decision like this again. And you know what first spurred my realization that I desperately needed a new computer? My kid wanted to play “Chocolatier.” Yes, an arcade/business-strategy sim needs super-powerful graphics to run, apparently, and my computer wasn’t up to the task. Before I know it, the Bejeweled clones won’t even run without top-of-the-line cards anymore. Something is BROKEN and it’s not just my PC.

  83. RPharazon says:

    I run Team Fortress 2 on all low settings using a measly 1.6GHz processor and a dinky integrated Mobility Radeon 9600. I am just barely over the minimum specs.

    But it is more enjoyable than the Xbox 360 version, which I also have. Why?

    The gameplay is better. It’s more frequently updated, there’s new achievements, there’s custom maps, there’s the ability to play lagless games, there’s the ability to play games with more than 15 other people.

    Not only that, but the Team Fortress 2 feel, the very art direction, isn’t lost. There’s no difference in the feel of the game between all-low settings and the highest settings possible.

    The scalability of the Source engine is so great that I bought the Source Premier Pack a few days ago. I can play all the games in there, from HL: Source to Team Fortress 2 smoothly and perfectly. That’s what attracts me to Valve’s creations, because I know that this computer can run their games until their next engine upgrade, which should be in 2010.

    Scalability is key. HL2 has the ability to be played smoothly by computers from 2001 all the way to 2008.
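
    There’s no magic to that kind of scaling, conceptually anyway. Under the hood it amounts to a ladder of quality presets checked against whatever hardware is detected, something like this toy Python sketch (the thresholds and settings are invented for illustration; this is nothing like Valve’s actual code):

        # A ladder of quality presets, best rung first. The detection
        # thresholds and the settings themselves are made up.
        PRESETS = [
            (512, 2400, {"texture_detail": "high",   "shadows": "full", "hdr": True}),
            (256, 1800, {"texture_detail": "medium", "shadows": "blob", "hdr": False}),
            (64,  1000, {"texture_detail": "low",    "shadows": "off",  "hdr": False}),
        ]
        FALLBACK = {"texture_detail": "minimum", "shadows": "off", "hdr": False}

        def pick_preset(vram_mb, cpu_mhz):
            """Take the first rung of the ladder this machine clears."""
            for min_vram, min_cpu, settings in PRESETS:
                if vram_mb >= min_vram and cpu_mhz >= min_cpu:
                    return settings
            return FALLBACK

        # A 1.6GHz machine with a 128MB card still gets a playable game.
        print(pick_preset(vram_mb=128, cpu_mhz=1600))

    My laptop lands on the bottom rung, but it lands somewhere; it isn’t simply turned away at the door.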

    You may have noticed that I am infatuated with Valve.

  84. Dave says:

    Ah, but if the games take 10 hours to play rather than 40 hours, you’ll buy four times as many games, right?

  85. Sitte says:

    If they all have Portal in their title, I’ll buy 15 times as many. (And not just because that was the shortest FPS ever.)

  86. qrter says:

    I’d just like to say I never read any of your posts as if it’s Zero Punctuation – you’re a much better writer than that, Shamus.

  87. MRL says:

    @Ian B (from around the 50-comment mark):

    I was actually VP of my college DDR club for a while. We had soft pads, with those reinforcing foam-rubber frames to put ’em in, but when they see a lot of use they really tend to fall apart.

    If I had the cash to spend on a hard pad, I’d love to get back into DDR…but a Wii just seems like a better investment right now. DDR is officially awesome, though.

  88. Sheer_FALACY says:

    So the obvious problem with writing games for 5 year old hardware is people already HAVE games for 5 year old hardware. They bought them 5 years ago, or in a bargain bin, or whatever. You may save money by making a game for old hardware, but are you saving enough to compete with the bargain bin?

  89. Scourge says:

    Hmm… funny thing is, I got Mass Effect, my PC dates back to 2005, and… I can run the game on mediocre settings fluently, after tweaking the INI a bit and giving it more memory.

    But then I compare it to Dark Messiah of Might and Magic, which had the same pretty graphics, the same varied faces to show how someone felt, and also offered me a bit of realism. Running out into the sun after crawling through a dark tunnel? Prepare to be blinded… for about 4 seconds, before you can see fine again.

    And this leads to another point…

    I play all of my games at 1024×768… However…

    Most people I know play their games at 1600×1200 or whatever, in incredibly high resolution. Alright, it makes things look better, but it also takes its toll on the hardware; in other words, you need to upgrade so the machine can run that high. Also, lower resolutions, like 800×600, aren’t supported anymore.

    Perhaps I’ll upgrade my standard resolution once I get my new PC in a year or so, but so far I’m quite happy.

  90. Mark says:

    The funny thing is that even console developers are starting to realize this. Look at all the stuff on XBLA, PSN, and Wiiware. Or, god help us all, retail games for the Wii.

    They’re realizing that it’s not just a cost-cutting measure, but it’s also a visual aesthetic that can be worth pursuing. Retail PC developers seem to be behind the curve at the moment but they can go back to their rightful place as the real movers and shakers in gaming.

  91. Chris Arndt says:

    You know how when you fire up games nowadays, there are 35 splash screens before the game starts, indicating every layer of publisher and developer and producer of the game? Regardless of how annoying that is, I've noticed that the last few games I've fired up have included nvidia as one of the splash-screen companies. That means the graphics card companies are paying for game development, specifically in order to sell new hardware. If someone else started funding this stuff in their stead, we might see some games that don't require the latest-and-greatest, but I'd imagine anything nvidia sponsors is going to be required to push the envelope, so they can sell some more cards. This has got to be part of the problem.

    I would not doubt that that is why the System Requirements for Warhammer 40,000 Dawn of War listed a Nvidia texture mapper in its 2006 release.

    I can’t remember what that texture mapper is called.

    UPDATE: Forget it. I am wrong. Here are the system requirements for the game:

    32 MB DirectX(R) 9.0b compatible AGP video card with Hardware Transform and Lighting

    I am thinking of “Hardware Transform and Lighting”.

    Sorry. Sorry. Carry on.

  92. Derek K says:

    @Sludgebuster: AoC had boobies. So it had to have the highest possible graphic options available. Otherwise, someone else might get more realistic boobies, and there would be a boobie gap.

    Also, succubi not having nipples was a reason to fire the developers. I read it on the forums.

    I kinda gave up on wanting to play AoC.

  93. Deoxy says:

    “boobie gap” – Heh. Dr. Strangelove reference FTW.

    “And the avalanche insurance line – that one totally killed me.”

    Ditto. The high heels line was also amazingly awesome.

    “Before I know it, the Bejeweled clones won’t even run without top-of-the-line cards anymore. Something is BROKEN and it’s not just my PC.”

    This is a serious problem with the programming industry as a whole (even outside of gaming) – few people put much (if any) effort into optimization anymore. Whatever the default toolset wants to use/do is what they go with.

    So, when the toolset you are using wants 2GB of RAM on the video card, well, that’s what the game requires. It doesn’t matter if it’s a no-frills remake of TETRIS, which ran amazingly well on the original Game Boy (which made the original NES look powerful); it now needs 2GB of RAM on the video card…

    Microsoft pioneered this ridiculous bloat problem, but others have picked up the mantle quite enthusiastically. It’s pretty annoying, really.
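
    To show what I mean by “effort into optimization”, here’s the sort of thinking that let Tetris run on a Game Boy, sketched in Python for readability (an illustrative toy, obviously not actual Game Boy code): store the playfield as one small bitmask per row instead of a grid of heavyweight cell objects, and clearing lines becomes cheap bit arithmetic.

        WIDTH = 10
        FULL_ROW = (1 << WIDTH) - 1   # 0b1111111111: every cell occupied

        def clear_full_rows(rows):
            """Drop every completed row; rows[0] is the top of the board."""
            survivors = [r for r in rows if r != FULL_ROW]
            cleared = len(rows) - len(survivors)
            # pad the top with empty rows so the board keeps its height
            return [0] * cleared + survivors, cleared

        board = [0] * 18 + [FULL_ROW, 0b1111101111]   # one full row, one gapped
        board, lines = clear_full_rows(board)
        print(lines, bin(board[-1]))   # -> 1 0b1111101111

    The whole 20-row board lives in 20 small integers. Nobody writes Tetris this way anymore, because nobody has to; and that’s the attitude that metastasizes into 2GB video card requirements.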

  94. Rob says:

    Well I was going to post the following…
    But Shamus hit the nail on the head even better!

    AN ARGUMENT AGAINST THE GROWTH OF ‘REALISM’ IN GAMES

    I’ve been a gamer since computers first had games. (I can remember when we oohed and ahhed over CGA graphics!) I’ve played just about every FPS, RTS, RPG and any other kind of game you care to name.
    I recently bought into console technology, only because I don’t have the time and money to maintain a bleeding edge PC and I'm a fan of some console games which haven't been ported to PC.

    Recently, much has been said about the increase in graphics technology and the demands on that technology to improve the ‘realism’ of games. Until ‘smellovision’ and ‘touchovision’ are invented, your experience with a game will always be purely aural and visual. Thus, realism is never fully achieved.

    Why do we want our games to push ever more towards real life? The whole point of games is to be able to escape the grind of real life for a short time.
    I use games to rest my mind from the demands of real life. For a short time, I can let my imagination run wild and become a draenei mage flying about on a golden griffon casting arcane spells at orcs and trolls.

    Let’s use a generic FPS scenario as an example…
    You are a lone super soldier; you follow nobody and nobody follows you. You are carrying a dozen different weapons which you can utilise immediately and expertly. You are battling evil monstrosities which are nothing like you. Each enemy takes many dozens of rounds to fell. You are fighting intense battles through a futuristic space station. You can go without rest or food or water for days at a time. If you are wounded, a first aid kit will miraculously cure you to ‘100%’. This is rare because you have strong yet lightweight armour and it takes many hits to do you any damage.
    I’ll bet you’re thinking: that sounds like a fun game!

    Now let’s look at something more ‘realistic’…
    You are a member of a team tasked to achieve a specific goal; you have to do whatever your team leader orders you to do, and you have to supervise your juniors. Communications equipment is cumbersome and unreliable. You are carrying one, maybe two weapons which you have been trained exhaustively to utilise. Your ammunition is limited and heavy. Your enemy is difficult to distinguish from those you are tasked to protect. This zero gravity is playing havoc with your ability to move and get your bearings. You have to stop to eat every 6-8 hours or so and you have to stop to sleep every 12-24 hours or so, or risk dropping to zero stamina. If you are wounded, you can look forward to weeks, maybe months, of rehab. It’s a case of one hit, one kill.
    This time around it sounds like… not so much fun.

    Real life, in short, is not as much fun as a game. This sounds obvious, but then why the obsession with making games as ‘realistic’ as possible? I see games like ‘Call of Duty’ http://www.callofduty.com/ and I wonder, why not just join up in real life? There are also commercially available combat simulators, such as ‘Virtual Battle Space’ http://www.virtualbattlespace.com/, that do a far better job.
    Also, as games get more realistic, more questions are raised when the game is set in an unrealistic environment or uses unrealistic items, monsters, etc.

    Think of the games of which you have the fondest memories. Do they have ‘realistic’ graphics or was a lot of it left up to your imagination?

    For me it’s ‘Commander Keen 4’ http://www.commander-keen.com . The graphics were VGA (hawt!) but each level had its own atmosphere, the game was fun, and I actually finished with a sense of satisfaction and closure rather than feeling frustrated and/or ripped off. I still hold the Commander Keen series and many other titles of the same generation in special regard. (Maybe this is due to the games working first time, every time, without pushing the computer to its limits and requiring an expensive hardware upgrade.)

    Hands up, who has played ‘Quake’? http://www.idsoftware.com/games/quake/quake/ Yes, a lot of you. The graphics were gritty with lots of polygon goodness. You got to fight weird mutants with weapons like the lightning gun and grenade launcher. Multiplayer was even more awesome.

    Who remembers ‘Worms’? http://en.wikipedia.org/wiki/Worms_(computer_game) It was a game about little pink worms that exploded each other with various destructive implements of war. It wasn’t in the least realistic, but it was extremely fun (and at times funny!)

    Let’s look at ‘Starcraft’ http://www.blizzard.com/starcraft/ . You can choose to play as futuristic humans, monstrous alien hordes or technologically advanced psychic aliens. Each side has its strengths and weaknesses and a character that sets it apart.

    Lastly, ‘Warcraft’ http://www.wowwiki.com/Portal:Main, which has ‘cartoonish’ graphics and some unrealistic aspects. It allows your imagination to fill in the gaps; the graphics generate an atmosphere and ‘feel’ more than attempting to be photorealistic. Artistic license is taken with encumbrance and abilities to keep the gameplay going. Also, it’s hard to decide what’s realistic in a world with orc shamans riding wolves through magic portals to pick a made-up herb in order to distil a potion of mana.

    Now for a conspiracy theory. If the software and hardware companies are in cahoots, then it makes sense that the latest game should provoke customers into also investing in the latest hardware to support it.
    Think of ‘Halo 3’ http://en.wikipedia.org/wiki/Halo_(video_game_series), which only works on the Xbox 360. This is blatant cross-selling. ‘Halo 2’ worked on a normal Xbox. If we imagine a ‘Halo 3’ that was developed with the same engine, graphics, etc. as ‘Halo 2’, then it stands to reason that that version of ‘Halo 3’ would have worked on the normal Xbox. Of course, this would have saved all those who bought the game from buying an Xbox 360 too, and deprived Microsoft of more millions.

    I believe that as the graphics and ‘realism’ get better, our imaginations are used less and less. Rather than keep pushing the envelope to satisfy some sort of deluded ideal of realism, I say let’s get back to making games that are fun with less realistic graphics (or graphics left as good as they currently are) and let our imaginations fill in the gaps.

  95. Patrick the Malcontent says:

    In business this phenomenon is actually quite common. I think this is also an almost uniquely American problem, not just in games but in all American-born business ideology. Consider:
    Having been to several other countries I can attest that, non-American football (soccer) aside, the rest of the world does not share our love of competition. We compete at EVERYTHING. As children and adults we compete for attention, status, money, blah blah blah. This is such an ingrained part of our culture that we don’t even notice it until we are submersed in a culture that doesn’t value it as much as we do. Example: while in Australia, which is so similar to our own country that if you woke up in it you wouldn’t be able to tell you were in a foreign country if not for the accent, as a sailor I did what all sailors do and I drank a lot. And of all the bars I came across (about 20?) I encountered ONE pool table, no dart boards and no pinball machines. Nothing but chairs, tables and beer (lots of beer). Only one game. In 20 bars. I dare you to find an American bar without at least one of the above; most would have all 3. Simple things like this may seem trivial, but I think it’s indicative of the rest of the world’s apathy towards constant pissing contests, even if they are trivial. Perhaps because they are trivial.
    I honestly feel most game developers fall prey to the same vice that controls Hollywood, auto manufacturing, and corporate America in general.
    Vanity.

    You can talk all you want about how “…they can’t attract gamers without the next-gen graphics…”; to me that is viewing the numbers how you want to see them. Shamus’ chart, however comical, is correct. But numbers are convenient in that they can be rearranged however one wants them to be, or however they need to be to justify a budget with 12 zeroes, so that you can put “lead developer/producer/designer/whatever of the prettiest, most talked-about game of 200X” on top of your résumé.
    Because we like having budgets with 12 zeroes as opposed to 5, even if it means making the same profit.
    Because we like being the best of the best, if only for 3 months until a rival outdoes you.
    Because corporate lackeys who can’t program like words such as “cutting edge”, “state-of-the-art”, “next-gen” and any number of catchy acronyms that just sound neato.
    Because we like having our name in bold print in some half-assed magazine like Game Informer next to the above buzzwords. Preferably above other words like “bold”, “mesmerizing”, “addictive” and other such lavish comments that the studio pays them for. The fact that they pay these puppets to talk about them in such a way, and that rarely are they true, is inconsequential. True or not, people like having their asses kissed.
    Making good games isn’t the point.
    In some ways making money isn’t even the point.
    Polishing our vanity is the point. Competing is the point. Even if no one else cares.

  96. Chris Arndt says:

    As for the growth of “realism”: when I first played the demo of Mario 64 in that store (by “demo” I mean you got the full game to play, but only for a fixed amount of time), I noted that it was “so 3D!” and then I realized… with the bright colors, it really is meant to be “more 3D than real life”.

    I don’t think it’s good overall. I just want to play a good game. I don’t care that much about immersion, just a certain sort of mental synch with the game.

    There’s a bit too much emphasis on immersion. I do remember, as a small child in the mid-to-late eighties, playing Berzerk and Missile Command on my 1981-edition Atari 2600 and getting far enough into them that I felt claustrophobic during Berzerk and downright jolted when MC made those endgame End of the World It’s Exploding It’s All Going to HELL sounds.

    So it doesn’t really matter how powerful the system or the game is. When you are into the game, you are into the game.

  97. Chris Arndt says:

    “Now for a conspiracy theory. If the software and hardware companies are in cahoots, then it makes sense that the latest game should provoke customers into also investing in the latest hardware to support it.
    Think of ‘Halo 3’ http://en.wikipedia.org/wiki/Halo_(video_game_series), which only works on the Xbox 360. This is blatant cross-selling. ‘Halo 2’ worked on a normal Xbox. If we imagine a ‘Halo 3’ that was developed with the same engine, graphics, etc. as ‘Halo 2’, then it stands to reason that that version of ‘Halo 3’ would have worked on the normal Xbox. Of course, this would have saved all those who bought the game from buying an Xbox 360 too, and deprived Microsoft of more millions.”

    That’s not a conspiracy theory. That’s how the exclusive games work. Soulcalibur IV and Metal Gear Solid 4 are made for Playstation 3. Metal Gear Solid 2 is a Playstation 2 game.

    There are versions of Resident Evil, Street Fighter and even Metal Gear franchise games that are console-specific, and not for Playstation 2, but each for a different console.

    The exclusive games are the real advertisements for the respective systems. It’s not merely the processing power or some weird thing like that. A lot of people would not buy Xbox 360 except for Xbox Live or Halo 2 or Halo 3 on Xbox Live. Some people really want…. ah…. I don’t know.

    I’ve made my point.

  98. Gaping_MAW says:

    Shamus, I don’t think you’ve kept your finger on the pulse this time.

    More and more games are being released (and developed) with large scalability, and therefore a larger user base, in mind. The whole ‘upgrade or perish’ mentality is disappearing from the PC sector (outside of a few tech ‘concept’ games, or devs who are basing themselves in low-cost countries). I think this is mainly because of the ‘PC gaming is dead’ hysteria that struck earlier this year and the rise of the console market.

    Your rant is probably 6 months to a year too late :)

  99. DocTwisted says:

    I think the TRUE challenge that the PC gaming market is afraid to face is handing their programming teams over to truly creative game designers… the next Will Wright or Sid Meier, or maybe even the next Richard Garfield or, God willing, The Next Steve Jackson.

    What I want to see is something like, oh… Cheapass Games expanding into the video game industry. They could just start with the board and card games they’ve already designed and developed (like, say, the whole Friedey’s set, starting with “Give Me The Brain!”) and make a relatively low-graphics (yeah, like C-3PO level in Shamus’ chart) version for the PCs and Macs that can be played online with other owners of the game and/or some computer AIs of varying difficulties from “ZOMG N00B” to “Cheating Bastard.” Each game would be a $6 to $10 download, just like their current paper games are priced.

    I think they just might double their income overnight, if they busted out with this.

  100. Zerotime says:

    What’s with all the anti-Australian sentiment in here lately?

  101. Shamus says:

    Zerotime: Anti-Australian? To what are you referring?

  102. Telas says:

    DITTO!!

    And: A lot of those games that were sold back in the Dark Days Before Doom had a longer lifespan in the market than the games sold today.

    WoW excepted, of course.

  103. Ian B. says:

    @MRL: Hey, no kidding! Nice.

    Yeah, metal pads are pretty expensive. I was able to get my hands on a fairly inexpensive ($150) Cobalt Flux, thankfully. Before that I was using quite possibly the worst home pads that money can buy (you guessed it — the Mad Catz Beat Pad). :P

    I generally play in arcades, though. I’ve met quite a few people through DDR/ITG/PIU.

  104. Maddy says:

    It’s true that I could still play my old computer games if I had kept the computers they ran on. But those computers were freakin’ huge compared to most game consoles (and don’t forget your old computer will also want an old monitor). That’s why I don’t consider it to be comparable. I think I could have fit my old Atari (which still worked when I unloaded it on eBay a few years ago), as well as my PS1, PS2, and a few game controllers in a box the size of one of my old desktop computers.

    But I will admit that a lot of my disgruntlement over unplayable software comes from my using a Mac in those days. When they released System 7, a lot of the old stuff wasn’t compatible. I don’t know if the same thing happened to people when they went from 9 to 10. Nor if it happens as often for Windows games.

  105. Justin says:

    Amidst all the Yahtzee-bashing and dreams of our host enjoying teh Haloz, I have noticed a couple of things. First, it’s not just me that thinks that a well rendered pile of excrement is exactly that. Pretty doesn’t automatically mean fun. There are a ton of good ideas out there already, so I’ll leave that where it lies. Second, who cares if Shamus doesn’t like Halo? Most games for the Xbox support online multiplayer. I’d be downright proud to add Shamus to my friends list if he decided to “Jump in.”

    Although… the 100 person limit is likely to be filled REAL fast…

  106. Ian B. says:

    @Maddy:

    I don't know if the same thing happened to people when they went from [Mac OS] 9 to 10.

    I bought a PowerMac G4 in 2002 and, despite my limited experiences with Mac OS 9, I did run into a fair number of games that refused to run under Classic on OS X. Now that Classic isn’t supported at all in 10.5 (and given that the PowerPC Mac’s days are very much numbered), I think it’s safe to say that you’re not going to have much luck natively running old Mac applications.

    That’s one thing that kind of impresses me with Windows. If you have a system with 32-bit Vista, you can technically run programs that are older than me. As far as Windows itself is concerned, Windows 1.0 programs will run on Vista with virtually no modifications. 64-bit users don’t have the luxury of running 16-bit applications without virtualization (though I think it is possible to do so in Linux using dosemu or wine), but oh well. 15 years of native backwards compatibility (from the release of Windows NT 3.1) is still quite good.

  107. MaxEd says:

    I’ve been working in game development for the past two years and I completely agree with Shamus. Teams got too big => development costs too much => need to reach wider audience => dumb games with shiny gfx.

    I think that model is starting to fail. Some major companies (read: Atari) report LOSSES instead of profits, but the dinosaurs (big game companies who own dozens of subsidiaries/internal studios) have yet to feel the pain. I think they will get smaller – or die and get replaced by leaner, more agile teams who will know their audience and make games for them, not for everyone. If you allow me a little analogy: did Chuck Berry ever record a disco number when disco was popular? Nope. He’s still playing his rock’n’roll and gets money from everyone who likes rock’n’roll. Any band that betrays its audience (not by experimenting, mind you, but by trend-following) usually loses everything.

  108. Arelion says:

    Awesome Shamus! I totally agree and have been thinking about this for a while. I used to buy games that I couldn’t play yet, knowing that by the time I could afford to upgrade my computer enough to play them they wouldn’t be on the shelves anymore.

    Seriously though, I think one of the main problems is that any game that doesn’t have bleeding-edge graphics usually gets marked down by game reviewers as “out-of-date” and receives demerits. I still think it’s funny, though, that games like Diablo 2 or Baldur’s Gate 2: Shadows of Amn run better and look better than Oblivion or NWN2.

  109. ShadowDragon8685 says:

    I think the only true middle ground is Half-Life 2, and games made on the Source engine.

    Source came out yonks and yonks ago, but, guess what? It still looks *good*. I don’t need absolute photorealism to shoot a Combine soldier’s face off, I don’t need absolute photorealism to decapitate a zombie with a grenade. I just want to have fun *doing* so.

    And, more importantly, thanks to the fast pace of Source’s upgrades due to the Episodic content of the HL2 Episodes, the engine itself gets *upgrades*. Upgrades which are more or less free, look good, and will *still run* on old hardware.

    Granted, if your rig was whining and groaning to play HL2 on lowest settings back in ’04, you’re not going to be able to run HL2:Ep2, but if you purred along well back then, you’re still going to be able to play Episode 2 and have fun now. You may have to turn the HDR off, but guess what – you’re probably not going to even notice it’s gone. Valve themselves commented on this: when they’re doing their jobs right, you won’t even notice; it just looks cool.

    So, how about some more and varied content, instead of taking a tiny slice of content, polishing it to the point it *hurts* to view it, and then plastering a tiny-ass game with it, hmm? Just look at Half-Life 2: I once boasted that I could still go through it all in one sitting. I was wrong, but then, I was never right. That game is easily 12 hours long if you’re a speedrunner, and can stretch out to 20, easily.

    The Episodes, too, are good for a while; if you take the time to look around and smell the roses, you won’t even notice that you’re running on less-than-bleeding-edge settings, because it still looks *good*. I just wish, and I’m sure most of you agree, that we could have more games produced to the visual level of HL2: Ep2, or even HL2, games that were long, had nicely varied scenery, and were great fun to play. We’d gladly throw them our money.

    Hell, a friend bought me Deus Ex over Steam a while back, after Shamus mentioned it, and I love that game to death. Sure it looks like it was made in 2000! So what? Yeah, the graphics are a bit jarringly dated if you come to it from HL2, and even more so if you come to it from Crysis, but they’re not overwhelmingly suck-tastic, and the game, more importantly, has such a compelling story for you to live.

    It’s better to be Gordon Freeman or J.C. Denton, not looking the best you could but having depth, than to have every shader and damn graphics technique known to man applied to you and be as shallow as a wading pool.

  110. Kevin says:

    Heh heh, great article!

    I have also considered that marketing to a bigger audience might make more money. (Seems absurdly simple, doesn’t it?)

    I wonder though… if the “bleeding edge” guys are the trend-setters, dragging the rest along at roughly the same pace, perhaps the model isn’t for immediately huge sales, but SOME sales right off the bat, followed by increased sales over the next few years. Just a thought.

  111. DaveMc says:

    Sitte (comment 60): “In response to the “spiciness”: Is that usage common around the webernet? This is the first place I've ever seen it, and only in the past few days.”

    I think it started when someone left a comment (sorry, I’m not going back to dig it up!) about how Shamus recently seemed to be using ‘spicier’ language than usual. Shamus picked it up in a post about his recent poor health, which he wasn’t mentioning except to say that he wasn’t mentioning it, but which he pointed out might be causing his baseline level of crankiness to increase. Since then, spiciness has been picked up as a local catch-phrase.

    So you’re not just imagining that it’s not common usage. :) But I agree with you, I hadn’t really noticed any significant increase in either crankiness or spiciness.

  112. mephane says:

    “This was fantastic.

    “While reading it, I kept seeing the Zero Punctuation animation of it in my mind and hearing Yahtzee read it in his ultra-quick & ultra-condescending voice.”

    Haha, as soon as I read these lines, I realized how true it is. Maybe Shamus and Yahtzee should consider some cooperation, that would be awesome. :D

  113. RadioDave says:

    Shamus… long time lurker on the site… just wanted to let you know that you couldn’t be more right. I am a former PC gamer, and the word “former” is there for exactly the reasons you laid out. I have my XBOX 360 and I know without a shadow of a doubt that every game I buy for the system will work.

    No upgrades, no updates… and no online activation… for the same price (or less) I’d pay for the PC version.

  114. Avilan the Grey says:

    Great Article (had to say it). I have not read all the comments, so forgive me if I repeat stuff.

    A large team might be justified if they do other stuff than the bling. Note Spore, for example. I am particularly interested in the fact that they went back to basics when it comes to optimizing the code, and the way it is coded (not the same thing), to make sure it can run on what are now really old machines. They hired people from the demo scene, because those people know how to actually make a machine do more with less, unlike most modern game engines that really do less with more (which means you have to use so MUCH more to actually achieve an actual More).

    Blizzard might not go that far, but as far as I can tell the demands on computers for the upcoming D3 have been promised to be very low, simply because they make more money if they sell more games. Which is a Good Thing.

    Sidenote: when Russian games started coming into the mainstream market in the ’90s (not counting Tetris, obviously), it was often noted in reviews how efficiently they were coded; a flight simulator made by former East Bloc programmers could run on a low-end 486 PC as well as a Pentium, simply because those programmers had learned the hard way how to cram as many instructions as possible into a small amount of memory…
    End Sidenote.

    As for creativity, as someone else pointed out: Spore, again. I will buy it, copy protection be damned, because quite frankly it is the game I have dreamed about for 25 years. Yes I am a God-Game fanatic. I still remember my immense love of Populous on the Amiga 500, the Dungeon Keeper on my PII and yes, I really enjoyed Sim Ant, Sim Life, Sim Island, Sim City, Sims…

  115. Zaxares says:

    Take heart, Shamus. Eventually they’ll create games with graphics so good they’re indistinguishable from reality. Then they’ll HAVE to start devoting more attention to other aspects of the game, simply because graphics can’t be improved any further. :P

  116. beno says:

    hey, don’t knock “fancy pants” games! they rock!

    http://armorgames.com/play/553/the-fancy-pants-adventure-world-2

    btw, I think that all the one-man C64 game writers from 20 years ago are now writing flash games like the one above. I love it! It’s so much easier to get character and charm into simpler games – and if the game turns out to be crap, at least we’ve made nowhere near the investment of time or money compared to other topline-wow-amazing-graphics games (either the game maker’s investment or mine).

    hear hear Shamus. and you know that the more you rant and are sarcastic, the more Australians will like you because you’re just like us! there’s nothing wrong with cynicism and jadedness, you just make it into an art form and it becomes fun! (whingeing also gels with the Brits and they will buy you beer)

  117. Eric says:

    And the French will buy you cheese.

  118. Patrick the Malcontent says:

    @zerotime

    If my comment came across as impolite or “Australia bashing”, it was certainly not meant to be. I spent a week in Perth and it was BY FAR the best shore leave I ever took. It is the one country I look back on and wish I had been more sober, that I might remember more of my time there. RED BACK BEER RULEZ!

  119. Derek K says:

    “I see games like ‘Call of Duty’ and I wonder, why not just join up in real life?”

    Because the respawn in real life kinda sucks? And it takes a bit longer to do the install and setup on the game? And because exiting is a bit more involved in real life?

    I’ve never quite understood that argument – people used to make it when we’d talk about gritty roleplaying campaigns – “Why would you want to play a game that’s like life?” Because it’s still very different, in terms of environment, situation, reality, and investment.

    I think I may be the only person that didn’t worship Doom when it came out – I played it, and dismissed it as pretty simplistic, and not very engaging, then went back to replaying my Gold Box games, or spending way too much time on X-Com, or MoO, or Civ…

  120. Daath says:

    #94,

    I agree that singleminded pursuit of realism can lead developers astray. First and foremost, the games should be enjoyable, and without that, everything else is rather irrelevant. I think you’re overlooking a couple of points, though, probably because they don’t matter to you.

    First, the immersion and suspension of disbelief. Yes, it can be fun to play a kick-ass supersoldier who shoots .45 bullets out of his Trouser Titan, but the sheer unreality of the situation screams “I’m just a game!” at you. The realistic games don’t. It’s a complex subject, but that’s the gist of it. Doesn’t have that much to do with graphics – doesn’t matter how many über-shaders the game puts out as long as the style isn’t cartoonish or too stylized – but rather with gameplay.

    Second, the ruthless, often unfair nature of realistic games does appeal to me and many others. Combined with deeper immersion, it feels as if you’re really overcoming greater obstacles than just killing 1001 monsters because you’re so awesome. It’s not tedious either. In real life, I do boring stuff at work, study some considerably less boring things, struggle with petty social conflicts and so on. I don’t want that from a game too, so I don’t play Sims. Maybe if I actually was a special forces soldier with hundreds of hours of combat experience, I’d be less interested in these games, but I most definitely am not.

  121. Derek K says:

    Random comment:

    Given the volume you’re getting now, over 100 comments isn’t quite the watershed moment it used to be. ;) Currently, 3 of the 5 entries that can take comments are over 100. So the “HOLY COW OVER 100!” comment strikes me as amusing each time I see it.

  122. Alexis says:

    “The fancy visuals are exciting for the first few minutes, but then the user becomes acclimated and desensitized to your razzle-dazzle and they're left with just the gameplay to entertain them”

    QFT. Further, people who are good at these games tend to actively tune out most of the prettiness. Most FPS play consists, Terminator-like, of eliminating every part of the scene that cannot be shot at and then shooting at what’s left. I hear top players turn the graphics down to minimum; textures just blur the edges.

    Even at my level (bloodsmear) it takes a conscious effort to look at the scenery instead of constantly scanning and filtering.

    I also snickered at the avalanche insurance.

  123. Mari says:

    @Derek, I dunno it still rings true to me. Yes, Shamus’s viewership has increased but he seems like pretty much the same guy that I’ve been reading for quite a while now so it seems fitting that “holy cow over 100 comments” would still elicit pretty much the same astonishment. And now a wretched old country music song that goes something like “Oh Lord it’s Hard to be Humble When You’re Perfect in Every Way” is playing in my head.

  124. Chalicier says:

    Another dev checking in:

    The fundamental problem is the continued dominance of publishers. I realise this isn’t obvious from the user end of things, but it really is the core of the issue. Quite simply, without a publisher a game doesn’t get made, and publishers tend to be very… *specific* about what they will and won’t publish.
    So what makes the difference? Well, publishers want to get good reviews (for something at least), so you have to please the dipshit parasites who downmark anything that doesn’t look like Crysis by 50%. But worse than that, they scour the universe for new technological buzzwords, and then demand to see them in their games to be sure of getting a return on their “investment”.

    From a coder’s point of view, writing a game is more an issue of survival than anything else. We don’t optimise properly because we don’t have time to. We don’t spend enough time in QA because we don’t have time to. After your 10th consecutive 75 hour week, you start to lose patience for these things. And why is it that compressed? Because publishers are always trying to get more game out of less money and less time. “Good business sense.” Fuck that, we’re supposed to be making games here, not saving money so they can spend it on coke and whores.

    Worse, as wererogue mentioned earlier, designers have a habit of saying bloody stupid things like “wouldn’t it be cool if…” a month before we’re due to publish. And annoyingly they’re generally right, so it’s hard to say “NO! Just fucking NO! We’re busy, OK? Come up with the good ideas sometime sooner and we’ll talk!”

    Feature creep, publisher buzzword lust, and the damn journalists, those are the bastards you want to watch out for – on PC or any other platform. Want proof? Those games that are so successful – did WoW:Burning Crusade get marked down for graphics? Do Blizzard have an incredibly tight feature control approach? (Quick clue: Yes, they do.) Do PopCap even have a publisher? Success comes from either playing the game better than anyone else (see Rockstar) or from ignoring the “industry” altogether and making your own rules. As a business, not as a dev team. Unfortunately everyone’s happy to blame “lazy devs” and ignore the giant monolithic leeches that have been busy draining the life from the games industry for decades. Ah well.

  125. Alleyoop says:

    @Chalicier: more end users get that than you may know. These days any time EA, for example, ingests yet another studio, a great many people groan and start grieving, because they know what’s in store for themselves and the developers who got eaten.

  126. MadTinkerer says:

    As someone who remembers the once legendary Origin Systems and Bullfrog getting eaten by EA and what happened afterward (the death of every franchise by those companies), I’ve since regarded publishers as, at best, a necessary evil. I’ve heard virtually nothing to dissuade me from this point of view.

    I know there are good people working at EA. It’d be impossible to staff a company of that size exclusively with heartless bastards. Nevertheless, I still think that the company overall does more harm than good, and that’s me being charitable.

  127. Nabeshin says:

    Good gravy.
    In all of this, I don’t think anyone’s mentioned the one game that STILL looks good after all these years: Myst.
    With some exceptions, I’ve yet to see a game that even comes close in visual quality. And when I played it… I was using an integrated GFX chip. The only thing that comes close (that I can think of off the top of my head) is Bioshock. And I’m using an NVidia 5200.

    Right now I’m running an old *derisive snort* P4 3G hyperthreading processor. It’s only within the past year/year and a half that I’ve really had to start looking at the minimum requirements.

    Let’s compare BF 2142 vs. BF2. BF2 I can play almost immediately, not much in the way of lag, some jumpiness in the framerate (always when I’m lining up a head shot, dammit) but very playable.
    When 2142 came out I clotheslined and cock-punched little old men in shorts, sandals and black socks to get to the gaming section of my local store and grab it. Bliss and glee! I hold in my hands the ULTIMATE…..what the…?
    Loading screen? Okay…I’ll go grab a lager.
    Still loading? Well dammit. I’ll go make a sammich with cheese and bacon…
    Still loading?! Okay, lemme go have a red… 15 minutes later, I’m in. I take one shot, and the map is OVER.
    I’m sure you can imagine my rage at that point.

    Pretty shiny is all fine and good, and I CAN afford to buy the latest and greatest in hardware. But who says I want to? So your game has better physics and graphics. All this tells me is that you spent more time making the female character’s milk mounds more ludicrously large and floppy.
    Yeah….lemme go out and buy that new GFX 9000 card in SLI so I can REALLY get a good view.
    Pass, thanks.

  128. Deltaway says:

    Ah, Chalicier: You give a good perspective, and I’d like to apologize for my free-handed use of the word “developer.” I realize that it isn’t always the developers who get to make the bad decisions, and I don’t want to attach blame to a facet of a system about whose workings I know very little. Thanks for offering your view on this trend. I’ll try to be more precise in the future.

  129. Chalicier says:

    Sorry if I sounded a little aggressive, I was writing from work while ill (never a brilliant idea at the best of times).

    Ultimately the problem is the opposite of what John said earlier. The issue isn’t that companies make the games that developers want to make, because developers just want to make games; we want to be part of the creation process of something we can be proud of. The problem is, perversely, business people and marketers, who all too often don’t understand the first thing about the process of game creation and generally don’t really understand their markets either, only the rather twisted caricature of such provided by the press.

    Fundamentally this problem comes down to money vs art, which is a perennial problem of all artforms that have ever existed. TV and Hollywood have been homogenised to produce bland CGI-propped pap, the stage produces nothing but “shocking” sex romps for the most part, even opera goes for the big names rather than obscure-but-beautiful productions. Games are just maturing enough to make it obvious that this is happening for the first time.

  130. wererogue says:

    For the reasons Chalicier gives above, I have high hopes for the games coming up from GameCock, and intend to support as many of them as I can. They’re a publisher who fund independent studios and developers to make their own games – the ones the *developer* wants to make.

    I was also really interested in a recent piece of news about Naughty Dog, where they bash the whole concept of having producers in a dev studio. There’s a piece on it here: http://www.eurogamer.net/article.php?article_id=203260

  131. Cybron says:

    Maybe I’m just strange, but I don’t really feel the ZP/Shamus connection. They have very different styles – most of the comparisons just seem forced.

    In any case, I’ll have to jump on the photorealism-hating bandwagon. Give me games with bright colors and stylized graphics any day – I greatly preferred the cartoony appearance of Windwaker to Twilight Princess’s BROWN EVERYWHERE look.

  132. Damian says:

    I’ve always been a proponent of the stop-caring-about-graphics argument. I literally pound my steering wheel (not a euphemism) in anger whenever I listen to a podcast in which a game developer is complaining that they have to reinvent the camera each time they release a game. Are they actually insane? Can they really not understand the concept of re-using existing code and writing another set of control code and rules for a new, fun experience?
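
    To be concrete about what reuse could mean here, consider a minimal, hypothetical sketch (not any real engine’s code; every name in it is made up): a follow camera written against nothing but its own little math type has no reason to be reinvented per game.

        // Hypothetical sketch: a follow camera with no engine-specific
        // dependencies, so the same file could ship in the next project.
        #include <cmath>
        #include <cstdio>

        struct Vec3 {
            float x, y, z;
            Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
            Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
            Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
        };

        // Trails a target smoothly; all per-game tuning lives in two numbers,
        // so a "new" camera for the next title is a data change, not a rewrite.
        class FollowCamera {
        public:
            FollowCamera(Vec3 offset, float stiffness)
                : offset_(offset), stiffness_(stiffness), pos_{0, 0, 0} {}

            // Each frame, move a fraction of the way toward the ideal position.
            void Update(const Vec3& target, float dt) {
                Vec3 ideal = target + offset_;
                pos_ = pos_ + (ideal - pos_) * std::fmin(stiffness_ * dt, 1.0f);
            }

            const Vec3& Position() const { return pos_; }

        private:
            Vec3 offset_;
            float stiffness_;
            Vec3 pos_;
        };

        int main() {
            FollowCamera cam({0, 3, -6}, 5.0f);   // behind and above the player
            Vec3 player{0, 0, 0};
            for (int frame = 0; frame < 60; ++frame) {
                player.x += 0.1f;                 // player walks along +x
                cam.Update(player, 1.0f / 60.0f); // camera eases after them
            }
            std::printf("camera at (%.2f, %.2f, %.2f)\n",
                        cam.Position().x, cam.Position().y, cam.Position().z);
        }

    Tune the offset and stiffness per game and the same file ships again. That’s the whole argument for not reinventing it.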

    I tried the demo of Depths of Peril on the weekend. It looks like it was ‘made in 2006’. Hell, it looks like it was made in 2002. It’s not triple-A. It’s barely B. But you know what? It’s damn fun. I bought it immediately.

  133. FlameKiller says:

    Blizzard seems to be following the trend in StarCraft 2.

    The video trailers look to be top-of-the-line, high-end graphics.

    I guess after the out-of-date graphics in WoW, the head of sales wanted the best visuals they could get. So they took all the money saved by cutting the graphics on WoW and put it, and then some, into SC2.

    The saying that WoW works on almost all computers is false. I have a Vista computer (Acer) with 1 GB of memory and an AMD Athlon 64 processor, and I can play many of my games on it with better details. It can barely run WoW, and at one point I got stuck in a loop: I got on a zeppelin, and by the time the thing had almost loaded it had started the trip back. Yet I have a Dell with Win XP that yells at you when you start up a game that needs a graphics card with more than 64 MB, but WoW runs extremely well on it.

    I just fear the time when StarCraft 2 comes out. The hype will be cranked to max. And I hope they have a simple security system. If they don’t, the game will die like Spore.

  134. Shoku says:

    The fancy lighting and such don’t seem to have that much to do with the growing development times; it’s that the artists want to do more with the prettier tools. I’m sure you could produce a game utilizing ultra-pixel-shader magic and so forth with barren square rooms in no time flat, or better yet with a one-man team.

    Interior decorators and architects are where things go awry, and this is where the higher graphics really cause the issue. Instead of saying “I’m tired of placing desks and picking which ones to put a potted plant on” and building a tool that generates random desks to use in all those places where you just need some decoration (something like the sketch below), they go placing nicer-looking desks with more detailed junk on them. Smart people make tools to automate things like this, but if you want that, you have to start over when you reach those new graphics cards.

    For anyone about to jump down my neck: I’m being a bit metaphorical here.
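
    To put that desk-generator idea in code, here is a minimal, hypothetical sketch (every name and number in it is made up; it’s nobody’s actual pipeline tool): seed a random generator and scatter pre-built props.

        // Hypothetical sketch of the kind of tool described above: scatter
        // pre-made desk variants through a room instead of placing each by hand.
        #include <cstdio>
        #include <random>
        #include <vector>

        struct Prop {
            const char* model;  // which desk mesh to use
            float x, y;         // position inside the room
            bool hasPlant;      // optional potted plant on top
        };

        // Fill a w-by-h room with n randomly chosen, randomly placed desks.
        std::vector<Prop> DecorateRoom(float w, float h, int n, unsigned seed) {
            static const char* kDesks[] = {"desk_plain", "desk_metal", "desk_corner"};
            std::mt19937 rng(seed);  // fixed seed: the level rebuilds identically every time
            std::uniform_real_distribution<float> px(0.0f, w), py(0.0f, h);
            std::uniform_int_distribution<int> mesh(0, 2), plant(0, 3);

            std::vector<Prop> props;
            for (int i = 0; i < n; ++i)
                props.push_back({kDesks[mesh(rng)], px(rng), py(rng), plant(rng) == 0});
            return props;
        }

        int main() {
            for (const Prop& p : DecorateRoom(20.0f, 12.0f, 6, 42))
                std::printf("%-11s at (%5.2f, %5.2f)%s\n",
                            p.model, p.x, p.y, p.hasPlant ? " + plant" : "");
        }

    The tool itself is a couple dozen lines. Hand-placing the nicer desks with more detailed junk on them is what eats the hours, and that’s exactly the trade being made.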

    Anyway, there is sort of some truth to the need for high-end graphics. Walking through an aisle of games you can see a variety of titles that look like trash and genuinely play like trash. Gameplay and appearance are obviously not really connected like that, but the teams that didn’t have the budget to make the game look good probably didn’t have the budget to make it play well either.

    But I’ve cheated. These titles don’t “look like they were made in 2006.” They aren’t made to fit the middle of the bell curve, but developers probably look at them and see it that way.

  135. ghost4 says:

    The notion that Crysis is a tech demo is nothing but a meme that was never grounded in reality. Judging by how often you’ve complained of struggling with system requirements, I’m assuming you never got around to actually playing it.

    Crysis really is a good game, just like Far Cry was. The environments are more or less open, giving you many ways of approaching the enemy or bypassing them entirely, and the nanosuit’s abilities give you additional tactical options as well. The graphics engine isn’t just there to look pretty, since it enables the vast environments and lush jungles (which can conceal both you and your enemies).

    The game also isn’t “juvenile” or “plotless.” There’s nothing juvenile about it, and there quite obviously is a plot.

    1. Shamus says:

      I played both Far Cry and Crysis. I thought they were both stupid.

      Although Crysis was much better than Far Cry.

  136. Sarah Miller says:

    Come on… let’s boast about our ‘super hardcore gaming machines’…

    My computer is about ten years old.
    I played Crysis on it.
    800×600, minimum settings.
    It didn’t run smoothly.

    Still, compared to the cliffhanger ending, the way-too-linear levels and the all-knowing AI (why was there a stealth ability at all?), the stuttering was bearable.

  138. pranav says:

    I think the biggest example is Minecraft, with over 100 million copies sold despite graphics worse than the PS1’s.
    Also, put this comment box on top of the comments; right now you have to scroll all the way down to post one.
    Awesome website, by the way.
