Experienced Points: What Happened to Gaming Hardware?

By Shamus Posted Wednesday Nov 28, 2018

Filed under: Column

My original plan this week was to write about the evolution of graphics and gameplay over the last decade. While writing that article, I realized I needed to discuss gaming hardware so I had some sort of context for all the other stuff. But then that aside on hardware grew into my column for this week. I guess you could think of this entire column as a footnote or parenthetical statement for next week’s column.

While writing this I was thinking about what a disaster the last gaming generation was. The Wii was a waggle controller in search of a game mechanic. The Xbox 360 had a bad habit of dropping dead the moment it went out of warranty. The PlayStation 3 was painfully expensive and ran on mutant hardware that made it hard to develop for. The PC was afflicted with bad ports, DRM, and Games for Windows LIVE.

This was also the generation where BioShock was hailed as this great moment in gaming, despite the fact that it was pretentious and incredibly shallow compared to its forebears Thief and System Shock. It’s not an awful game or anything (and I still think the scene with Andrew Ryan is one of the great moments in gaming) but it’s overshadowed by the things that came before it[1] and the things that followed[2]. This was also the low point of both Grand Theft Auto and Elder Scrolls. What I’m getting at is that being one of the greatest games of the previous generation is kind of like being recognized as the world’s best-smelling hobo or the world’s tallest midget.

I’m not suggesting there weren’t good games. There were lots of great games in this time period. But there were also a lot of annoyances, expenses, and hassles. Regardless of what platform you favor, I think we’re in a much better position now than we were 5 years ago. I certainly like my PlayStation 4 more than I cared for either the PS3[3] or the Xbox 360[4]. I don’t think we’re in a second PC Golden Age right now, but I think things are better now than they’ve been since I started this site. Keep in mind that I’m an incurable cynic who complains about everything, so positive appraisals like this one are a rarity for me.



[1] Thief, Deus Ex, System Shock.

[2] Prey 2017, Deus Ex Human Revolution.

[3] Huge. Heavy. Tons of heat. Rounded top so I can’t put things on it. Mushy triggers that feel awful. Dumb tilt gimmick.

[4] Dead in less than a year of very light use.


106 thoughts on “Experienced Points: What Happened to Gaming Hardware?”

  1. Infinitron says:

    Yes, that generation of gaming was pretty bad, especially the early period (~2006-2010). See also: Oblivion.

    I think a lot of new gamers entered the hobby during those years and they just didn’t know any better. The latest generation of consoles hasn’t had that degree of demographic turnover, so things are different now.

    1. Shamus says:

      Oblivion! I sat there for five minutes, trying to think of the other “low point” game. I’m going to add that to the post.

      Now that I’m thinking about it, Fallout 3 could be considered the low point of Fallout. It wasn’t smart enough to keep up with the 2D games and it can’t keep up with Fallout 4 in terms of fun. Then again, the title of “Low point for Fallout” probably belongs to one of the spinoff games.

      1. Corsair says:

        I gotta disagree with this here. I think Oblivion gets a bad rap. I’d say it’s actually better than Skyrim in a lot of ways. Skyrim didn’t really meaningfully improve on the combat system from Oblivion, it’s still the same basic thing, and while the level scaling is significantly worse in Oblivion no doubt, the actually reasonably competent quest design makes up for it. I mean consider this: In the Oblivion Thieves’ Guild questline you primarily -steal things-.

        1. William Beasley says:

          I would argue that Oblivion was definitely a low point in terms of technological utilization as it was clunky and struggled to use its engine for even the simplest of things. However, it was definitely good in terms of writing for most of it. Stealing the Elder Scroll is always an awesome moment. On the other hand, its dungeons/caves/catacombs were really bad.

          The market was definitely weird at that time. Halo 2 was definitely better than Halo 3. I feel like the industry was overreaching at that point and in recent years they finally have the tools at their disposal to deliver what they were trying to do. However, nowadays they are getting a bit aggressive with questionably ethical business practices.

          1. Joshua says:

            And NWN was definitely better than NWN 2, which came out during this time.

            1. John says:

              I dunno. The original campaign for NWN got a lot of fairly justified scorn for being way too long and almost completely uninspired. NWN is fondly remembered today for the toolset and user-created content and to a lesser extent the expansions. I’m much less familiar with NWN2–I know that Shamus hates the plot doors in the original campaign and that everyone else seems to love, love, love the Mask of the Betrayer expansion–but it wouldn’t take much for the NWN2 campaign to outdo the NWN campaign. Given that there are far fewer user-created modules for NWN2, I suppose that the toolset for NWN2 isn’t as good though.

              1. Joshua says:

                The Toolset is the *point* of the game. The OC for NWN is long and uninspired, but it makes a lot more sense when you view it as a (very long) demo model for what the toolset can allow the user at home to build for themselves or their friends.

                NWN hit the sweet spot for robustness vs. user-friendliness for budding game designers. NWN 2, unfortunately, became too complicated for the average person at home to design their own modules. I’m not sure NWN 2 could have done anything about it, because the effort/time came about largely due to more complicated 3D terrain meshes. It’s one thing to slap a flat terrain tilepiece down that will auto-randomize and you can customize as desired, but it becomes a lot more effort to design full sloping topographic maps.

            2. tmtvl says:

              Did you play the NWN OC? SoU and HotU were pretty decent, but MotB blows both of them out of the water. The NWN2 OC definitely had problems, but at least it wasn’t as empty as the NWN OC.

              1. Hector says:

I am very fond of NWN2, but the big thing is that NWN1 was incredibly clunky to play. It has an interface that was sheer agony to use – probably the worst interface in any serious D&D game. And the games/campaigns in NWN1 were awful compared to 2.

                I fully appreciate that modders loved nwn1. But 2 wasn’t trying to do, or be, the same thing.

          2. Hal says:

            I think the technological argument is the key on Oblivion. I enjoyed it, despite not appreciating certain mechanical choices by the developers.

            Still, there’s no defending the potato faces.

            1. Geebs says:

              If Bethesda were to rerelease Oblivion for VR (and allow modding so that I could fix the inverse levelling problem) I would buy it in a second. Oblivion had much better writing and quests than Skyrim – with the notable exception of the terrible main quest.

              I’d even put up with potato faces.

              1. Hal says:

                Yeah, the leveling (and level scaling) were not great. I still remember how I had to ditch my first character because he was too far behind the combat curve to progress in the world any longer.

                I’m going to defend the main quest, though. It wasn’t perfect, to be sure, but there were a lot of good elements there. Closing the Oblivion gates, especially wherein you were defending the cities from assault, had a very heroic feel to it. I also really enjoyed the subversion of the “Chosen One” trope you get in video games; usually the player is “The Chosen One” who is destined by fate and the gods to save the world. This time, it’s that guy over there, and he’s pretty unimpressive, so somebody has to save him from himself. It was a different spin on the usual role of the player, and I found it pretty refreshing.

                1. Geebs says:

                  To be fair both Oblivion and Skyrim spawn a bunch of repetitive, irritating busy-work as a necessary requirement of progression in the main quest, which detracts from the “go anywhere, do anything, at your own pace” vibe of, say, Morrowind. Oblivion gates have a real “hang on, didn’t I just clear up in here?” vibe after a while.

                  That being the case, I’d take fighting big scary dragons (despite their derpy animations, VR makes them look seriously impressive) over repetitive sojourns in the Todd Macfarlane Dimension any day of the week.

                  1. Matt Downie says:

                    I don’t think it detracts too much from the “go anywhere, do anything, at your own pace” vibe; ignoring the main quest is half the fun.

                  2. Hal says:

                    That’s fair. There were a limited number of Oblivion gates, so it’s possible to close all of them, but there were so many it’s understandable that few people would actually do so. Especially because there was a limited number of “dungeons” in the gates.

                    The aesthetic problem of Oblivion (both the plane and the rest of the game) was certainly an issue, though some of it is directly due to repetition. The Oblivion plane was pretty exciting the first time you showed up, but kind of boring and samey every time thereafter. Making the plane more interesting and varied would help; given what they did with the Shivering Isles we know they had it in them, but ultimately if you’re going through to Oblivion a dozen times that problem of repetition will still arise.

                    It’s not as if the dragons didn’t have similar issues, fun as they were. Late in the game they became HP sponges, and many of the fights were less epic than frustrating as you waited impatiently for the dragon to stop faffing about and fight you. And, of course, there was always that feeling of interruption when dragons would show up unannounced at the most inconvenient of times. (Oblivion gates, unmodded, would only spawn a few monsters in their immediate vicinity, so they didn’t cause much chaos.)

                    Edit: As for the “go anywhere, do anything” part, maybe the writing is part of that. IIRC, Morrowind’s main quest had no real sense of urgency; Dagoth Ur was just sitting in his dungeon waiting for someone to stop him. Oblivion, and Skyrim, had main quests where the message was, “This has to be done or the world will end!” Mechanically, it waits for you while you faff about, but narratively it carries a high sense of priority.

                    1. Bubble181 says:

                      Whereas Daggerfall had a main quest that didn’t really *seem* super-urgent, but you could randomly get a note saying “hurry up or the world will end” and, well, if you didn’t hurry up with the main quest, the world ended. Whoopsie!

                  3. Jabberwok says:

                    The reason I didn’t hate the Oblivion gates is because Oblivion let me enchant a full set of armor that made me completely invisible while sneaking all the time. So I just waltzed right past all of the tedium and got some cool artifacts out of the deal. It was craziness like that that made me like Oblivion more than Skyrim. Skyrim didn’t have much in the way of meaningful gameplay improvements (except for archery), but they did manage to design away a huge chunk of the fun from previous games.

                    Also, I kind of liked just leaving the gates open, to watch the local NPCs try to fight off demons.

                2. Jabberwok says:

                  I will also defend it as being better than Skyrim’s main quest. The Blades temple was a cool location, and Martin’s continued presence added a little more character than usual. Seems like ES stories barely even have characters, most of the time. It wasn’t great, but it wasn’t bad, and didn’t seem to get in the way of the rest of the game.

                  The level scaling was kind of hilarious. I remember one-hitting the second to last boss, but the hardest enemies in the game turned out to be just random goblins in caves.

            2. Urthman says:

              I waited 3-4 years to play both Oblivion and Skyrim and really, really did not regret that choice in the slightest. The games were cheap and so was the hardware to run them at high settings. And there were mods to fix almost every problem. Level scaling? Fixed. Interface? Fixed. Potato faces? Fixed. Bugs? Fixed. Lack of variety in quests, dungeons, spells, monsters, equipment, etc? Fixed. Almost any little thing that annoys you or you wish you could tweak? Fixed. Graphics? Better.

          3. shoeboxjeddy says:

            How do you figure Halo 2 is better than Halo 3? Halo 3 has the more complete campaign, all kinds of fun bells and whistles (4 player online co-op campaign with optional score chasing), the introduction of Forge, way more maps and quality DLC maps, a much more balanced weapon set, more secrets to chase (terminals, more skulls, etc), achievements, etc.

            1. Jabberwok says:

              I mostly prefer 2 because I think the writing is worse in 3. 3 made a lot of improvements in gameplay, but every game in the series did that. To me, the improvements from the first to the second were greater than from the second to the third. The second game brought online multiplayer, dual wielding (though removing that in Reach was also a good idea, but it was an interesting experiment), vehicle boarding, and probably the largest enemy variety in the series. I thought the arbiter levels were an interesting change of pace. And it ended with a strange boss fight, instead of just a rehash of the ‘drive the warthog out of the explodey thing’ like 3 did.

              Really, I liked all of them, but 3 would be at or near the bottom for me if I had to rank them.

        2. Hector says:

Oblivion eventually, with ample modding, became a very good game and arguably better than even Skyrim with mods today. But that was 2-3 years after launch, and the mods had to aggressively re-write or discard fundamental systems that worked horribly.

That said, there are things the mods didn’t or couldn’t fix, such as the bland setting. The overall concept and writing were good (not great, but solid) but the artistic design was epically, hilariously dull. Most people only seem to remember the bad facial animations – but those were good for the time (and almost unprecedented at the scale of Oblivion). The textures, world design, and dungeon layouts were almost universally awful. They also cut out many of the game’s more interesting systems, though they at least kept many of the core concepts that Skyrim later dropped.

          Still a good game in its own right, but Bethesda should have really listened to the feedback.

      2. Agammamon says:

Well, ‘low point for Fallout’ seems to have shifted to FO76 now. Really, ‘low point for Bethesda Game Studios’.

      3. Mephane says:

        Now that I’m thinking about it, Fallout 3 could be considered the low point of Fallout.

I will henceforth believe that Bethesda possesses a fully functional time machine, read this sentence, thought “hold my beer” and went back in time to make Fallout 76 release just around the time this column would be posted…

      4. Sartharina says:

The problem with Oblivion (And all Elder Scrolls games) is that it fell into the Siren Song of “A living, breathing entirely new ‘real’ world inside your computer” that hit a lot of games at the time (And then horrifically mutated into the UbiSandbox). And, like its face generation system, it had all the ‘technically correct’ options (All those options really do play into realistic facial structure and variations), but with fundamental flaws and shortcomings that turn the whole thing into a comical, uncanny farce.

        It tracked civilians travelling across the world (Meaning sometimes, someone you need to talk to for a minor quest was Eaten By Wolves months ago)! And gave them all their own unique schedules for day-to-day activity (Going to places, and standing around for a few minutes!)! And talked to each other (Where to begin…)!

        … that said, the dungeons were absolutely phenomenal for just random crawling and exploration, dragged down by the limited tilesets, sheer number of them (Diluting the experiences), and limited cell size (Making it impossible to get ‘lost’ in them).

        Everything was Technically Correct Given the Limitations of the Time, but those limitations rendered it Just A Little Bit Off.

        1. Hal says:

          I saw a mudcrab the other day.

          1. Geebs says:

            Horrible creatures!

      5. Fists says:

        Oblivion is the ONLY good Elder Scrolls game. Morrowind is unplayable because all you do is wii-waggle your stick at a scrib until you die or if you leg it to the first town you can take on a job as a delivery boy for the mages’ guild until you realise you could be doing something fun. Skyrim just isn’t worth playing because jack all happens and you might as well be playing Doom or COD rather than watching Draugr walk into their own traps.

        Oblivion has fun and funny quests, interesting characters and a colour palette.

    2. Redrock says:

      I mean, 2010 was the year of Red Dead Redemption, New Vegas and Mass Effect 2. 2009 had Arkham Asylum and Dragon Age. We also got two Uncharted games in that period, the best Hitman game (at least before the reboot), Okami and The Orange Box. So I dunno. The thing is, with the period you’re referring to, there was no way it wasn’t going to be painful. It was the time for PC and console games to mix together like never before. Above everything else, that’s what Bioshock was – a Thief successor you could play on a console, with a controller, sitting on your couch, with all the added slickness and, alas, simplification that entails. Ditto for Fallout 3 and Oblivion. This is obviously a painful era for long-time PC gamers, true, when instead of a linear evolution we were seeing this weird detour into console country that from the perspective of the PC crowd was mostly a step back on all counts. But, like any painful transitional period, it needed to happen. In the end, a sort of barrier between PC and console was mostly broken, “PC Master Race” memes notwithstanding. And now we have Yakuza games on Steam.

      1. ccesarano says:

        Being a console gamer that has tried to go back to play System Shock 2, I think there are some benefits in that “simplification”. I remember so many PC Gamers viewing the keyboard as being superior to a controller because of how many more commands you could have, for example. Even before this generation of systems I was never able to remember which key was a shortcut to what weapon in Aliens vs. Predator 2, so I just used the scroll wheel. Fast forward to shooters on console, and I think it was Valve that was first to try something different. Their Half-Life 2 port on the original Xbox used the D-Pad to map types of guns to different directions, speeding up weapon swapping without having to memorize which was what number key. Now we have radial menus. Is this “an improvement”? I know for me it is, even if the counter-argument would be that the old/traditional method “is quicker”, but it comes with the caveat that it is only quicker if you remember how the weapons are mapped.

So jumping back to System Shock 2, I’m looking at the menu for the controls and *every. stupid. thing. possible.* has its own mapped key. That’s not a deeper, richer gaming experience, that’s clutter. While there are drawbacks to the context-sensitive action button (don’t need to figure out which tool to use), it at least doesn’t require a reference sheet, which I recall my brother’s copy of MechWarrior 2 coming with.

        Which is not to say I cannot see the other ways in which PC Gamers might have been let down. Bioshock was my first Levine game, so it was a pretty big milestone for me to play. Nevertheless, as Western developers have become more accustomed to improved usability and interface design, elements of Bioshock have certainly grown some rust at a rapid pace.

Perhaps, however, that’s what marks that decade not only as a transitional one for developers, but for gamers as well. PC gamers that have pretty much exclusively played Western games with a “Look at how many keys we have!” design are now feeling abandoned for console, and console gamers are invaded by those Western developers that have no clue how to develop for a controller and… hey, wait, what’s happening to Resident Evil? Why is Bionic Commando all grimdark? Where are my JRPGs?! Why does everything want to be Grand Theft Auto?!?!

        Right now I feel like we’re hitting a slight equilibrium, at least in regards to genre variety, but it also helps that the Japanese gaming industry is reviving somewhat, there’s plenty of room for mid-tier games, and it is easier to port between PC and console. Nevertheless, we still have brand new growing pains and unwelcome trends, and just as we’re stabilizing everyone’s talking 4K gaming, as if they forgot how disastrous it was to increase resolution last generation.

        1. Agammamon says:

          There is certainly something to be said for the discipline designing for a controller puts on you – limited buttons means you can’t just toss in slight variations of capabilities and let the player sort it out. The Force Unleashed did this well and then translated that over to the keyboard. A handful of keys *and using key combos* for a few distinct powers and ways to combine them. Sadly they borked the PC port in so many other ways.

The major complaint now, though, is that developers are getting lazy and won’t even attempt a PC UI any more. A lot of them won’t even let you rebind actions – especially combined action keys (like FO4’s ‘throw grenade’ and ‘quick melee’) when there are legitimate reasons to combine them for a controller but the PC has the extra keys.

Side rant – we’ve got a pretty stable KB&M keybind paradigm and no developer seems to know it any more. It’s seriously like the people designing games nowadays have never played on a PC before.

          1. Redrock says:

All true. To be honest, I don’t do much KB&M gaming these days. My PC is a miniITX machine hooked up to my TV. I think I only use the mouse for classic CRPGs.

            1. ccesarano says:

              Yeah, to that end I definitely feel for PC users. Heck, I don’t even know why video games don’t allow that degree of customization sometimes. Street Fighter and Mega Man X allowed custom bindings on the SNES, why not on modern console games? Hopefully that’s something that’s gradually improving, though, even as Japanese companies start getting into the PC porting business.

        2. Echo Tango says:

Although there are definitely people who want to use most of the keys on their keyboard for faster inputs, keyboards are a lot worse than joysticks/gamepads for most other situations: completely digital inputs instead of any analog ones, and so many keys that you can accidentally push the wrong button or get your hands mis-aligned. A big loss moving from computers to consoles was (as Agammamon points out) the loss of re-mappable controls. Some console games let you re-bind keys, but most do not. Also, the mouse is a vastly better device than analog sticks for many tasks; it’s got your whole arm for large or fast movements, and your wrist, fingers and the friction of a mouse-pad for precision control. It’s such a good device that computer gamers are willing to give up analog controls for key-tapping, and it’s why most FPSs don’t let computer gamers play against console gamers; even with aim-assist, the mouse usually wins.

          1. John says:

            Oh my, yes. Mouse aiming is almost always superior, and not just in first person shooters. I’ve found that it’s true for most games where moving and shooting are independent actions. Good Robot, for example. Moving with a thumbstick is fine, but aiming is so much more precise with a mouse. Also Bastion. There’s a challenge level in that game where the objective is to eliminate a certain number of targets in as few bow shots as possible. It’s very, very easy when you use the mouse and much more difficult, or at least much more tedious, when using a controller.

            1. ccesarano says:

              A point I always have to concede, though I have never found a mouse comfortable to use. I tend to prefer my thumbsticks, though I’m learning there are ways to do aim assist well and aim assist poorly. There’ve been times I shut it off in console shooters just because it interferes with lining up a shot. Happens in Destiny all the time. I’ll be aiming down my scope at a sniping Vandal or Hobgoblin and next thing you know a Dreg darts in front of me and drags my scope to the left. Thanks game!

              I’m wondering if we’ll find a sweet spot with gyro-aiming in controls. I know the Switch pro-controller has a gyroscope, and while I’m not fond of its implementation in Splatoon 1 or 2 I feel like there’d be some benefit to using the thumbsticks and making adjustments with slight tilts of the controller. Did that a lot for aiming in the Zelda games on 3DS and WiiU and I think even Breath of the Wild. A nice fusion that doesn’t require a return to Wiimote and Nunchuck.

              1. Rack says:

VR casually solved the motion control issue a couple of years ago. Forcing the player’s view behind a screen will inevitably detract from that somewhat, but the technical issues are all but solved.

              2. Echo Tango says:

Something I’d like is more custom-purpose controllers. Now that bluetooth is relatively cheap, and light-guns are very cheap (compared to modern hardware), I think we could do well to have some custom controllers of various shapes, rather than just those plastic things that bolt onto existing nunchucks, VR-joysticks, etc.[0] It always ends up being a bulky, crude experience, compared to what you could do if you actually moved the buttons around on your plastic doo-dad. Consider the following: a gun-shaped controller could be built so that the reload-button is physically placed where the mag-dump button is on a real gun. The trigger would be actually located in the proper place. The menu-button could be the safety-latch on the physical gun.

                Heck, if manufacturers want to make bolt-on gadgets, and take advantage of existing supply-chains, I feel like they’ve got it backwards. Instead of making toy plastic gun-stocks that bolt onto VR controllers, they should make controller-electronics that bolt into real gun-stocks. I’ve seen pistols painted like NES guns – why not make it a real gaming gun? Rocksmith has already done something similar to rhythm games[1] by letting people plug their real guitars[2] into computers. I think the world of input devices can be explored much more fully than what we have now. :)

[0] This of course assumes that the console(s) wouldn’t have vendor lock-in, and would let you hook up arbitrary controllers that conform to a standard. Last time I checked, you still couldn’t swap an Xbox controller for a Playstation controller, even though they’ve both been fairly standard Batarang-shaped things for a decade. There’s 3rd-party adapters that…sort of work? I don’t own consoles, and I can’t remember what my friend said last time I asked.

                [1] It’s a blurry line between “rhythm game” and “music-training tool” in my opinion. Rocksmith definitely has quite a few game-y elements, like a points system (hidden?) that keeps you on track, and training at an appropriate skill-level, and its UI is very colorful for something that’s ostensibly a serious tool.

                [2] And sound-pickups for drumkits? I can’t remember.

                1. Agammamon says:

Jesus man, can you imagine the number of blown-up TVs and neighbors looking through holes in the drywall because someone forgot to ensure the gun was unloaded?

                  1. Echo Tango says:

                    Naturally, you’d want to not have any actual firearm hardware in your controllers. Just have the stocks, and maybe barrels from real guns, and all the rest be electronic doodads and buttons. Even if somebody wanted to put game-controller electronics inside of a full firearm, there wouldn’t really be any room to do so. Gun stocks can be hollow plastic, but there’s no room to spare to put anything inside of a firing mechanism, unless you want a single-button “controller” which is activated by the firing pin hitting an electronic sensor.

        3. John says:

          Sometimes I miss reference sheets.

Despite the fact that giant military robots have never existed, do not exist, and are unlikely ever to exist, the MechWarrior games are intended to be simulations, or at least simulation-like. The relative complexity of the controls helps sell the fantasy. You could in principle make a mech sim that worked on a controller, but with only eight to ten buttons to work with I don’t see how you could offer nearly the same level of control over the mech. You might make a good controller-based game, but it would by necessity have to be a different kind of game.

          1. Echo Tango says:

            I think you could get pretty far, with using buttons for quick mini-menu actions, which could control various aspects of the mech. Just like social-button + direction lets you choose different emotes in Overcooked 2 and other games, have a button + direction choose which weapons are active. Are you switching to only your lasers? Missile barrage? Lasers plus miniguns? Alpha-strike? Slower actions, like adjusting individual guns, or dumping overheated ammo, could be actions in a different, larger menu. Also, consider this: Having a gigantic everything-controller (or keyboard) means that different mech-pilots (players) with different preferences all have to memorize the same buttons, and ignore some percentage of them. I think a real mech-pilot would be customizing their interface, just like the physical equipment on their mech. If I only need hot-keys for switching major weapon-groups, I could set that up. Or maybe I only have dedicated buttons for firing and movement, and a mini-menu for switching weapons or other active equipment. Another player might want buttons for individual weapons, but use coarser-grained controls (or mini-menus) on their throttle / jumpjets / other movement options. One player might have a button for chaff, and another put it in a mini-menu.

            1. Sartharina says:

              And then, when you’re dealing with mini-menu spam, you’ve ended up with a different mess than what you started with.

Something else to keep in mind – when MechWarrior was big, PC Gaming Peripherals were also all the rage – Flight sticks, custom keyboards… they could really sell the Simulation aspect of piloting military technology (And, there were a LOT of other ‘vehicle simulation’ games back then. I remember having a cool missile sub corridor shooter sim, a Comanche Helicopter sim, several Mech sims (Earthsiege and MechWarrior being the biggest), and an M1A2 MBT simulator.)

              Interestingly enough, the original Xbox had the MechAssault games (consolized MechWarrior games), which demonstrated the difference in feel between the simulation-like mess of keyboard commands (setting navpoints, managing speeds and internal systems) and the streamlined, arcade-like feel of a controller.

              1. Echo Tango says:

                Real Mechwarrior players build controls out of real switches, levers, and buttons and have custom cockpits in their living rooms.

    3. Lame Duck says:

      I think my one overriding memory of that generation will be textures popping in all the god-damn time.

  2. Chris says:

    A bit off-topic, but I notice you often use “wrapping up” for your final paragraph instead of “in conclusion” or just “conclusion”. Is there a reason for that, or is it just a thing?

    1. Redrock says:

      I can’t speak for Shamus, but “In Conclusion” seems a bit more academic in style, whereas columns usually go for a lighter, more conversational tone? Just my guess.

    2. Syal says:

      It ends with wrapping up, because the article is a present for the viewers.

  3. Moridin says:

    I realize that you were just simplifying things for a general audience when you were talking about clock speeds, but I still have to point out that when comparing 2009 and current desktop processors side by side, even if you run them at the same clocks and with the same number of cores/threads, the current processor will be significantly faster. On Intel’s side, IPC (instructions per cycle) only stopped increasing significantly when they released the Haswell (4000) series in 2013. AMD is still advancing, but that’s because they actually went backwards when releasing the FX series, and now they have a brand new architecture with plenty of improvements to be made, while Intel has only been iterating on the same architecture for a long time.

    Also, thanks to advanced turbo features, a modern desktop CPU that’s sold as a 3.5GHz processor will spend most of its time at frequencies closer to 4GHz, or even significantly above in Intel’s case.
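The point above boils down to performance ≈ IPC × clock, which a couple of lines make concrete (the IPC figures here are invented purely for illustration, not measured values for any real chip):

```python
# Illustrative arithmetic only: the IPC numbers are made up, to show why two
# CPUs at the same clock speed can differ significantly in performance.
def relative_performance(ipc, clock_ghz):
    return ipc * clock_ghz

old_cpu = relative_performance(ipc=1.0, clock_ghz=3.0)  # hypothetical 2009 chip
new_cpu = relative_performance(ipc=1.5, clock_ghz=3.0)  # same clock, higher IPC
print(new_cpu / old_cpu)  # 1.5 -> 50% faster at identical clocks
```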

  4. Hal says:

    Also, the phrase “Captain America of horses” is leaving me in giggles, as are the very satisfying results of running that image search.

  5. DangerNorm says:

    Moore’s Law is about the number of transistors in an integrated circuit, not speed. Even as progress in serial speed came to a halt, as you note, Moore’s Law has in fact continued to hold through to today. Hence how we’ve been able to fit in all those extra cores you mentioned, among other things.

    1. Moridin says:

      Actually, it hasn’t. Transistor density has continued increasing (certainly faster than CPU performance), but it hasn’t kept pace with predictions based on Moore’s Law for years now. Intel has been on the 14nm node since 2014 (they’ve made improvements to it since then, but the transistor density is still basically identical) – oh, and the 14nm node itself was late to begin with. TSMC, Samsung and GlobalFoundries (among others, but those three are basically it for cutting-edge manufacturing) have continued advancing, but they were behind Intel to begin with, and their pace for the past 6 years has been more like doubling density every 2 years rather than every 18 months. Oh, and GloFo just recently announced that they won’t be manufacturing on the 7nm node in the near future after all.

      1. Echo Tango says:

        Do you have a graph / recent data to back this up? This chart from Wikipedia still looks like a fairly straight line (log scale on the Y-axis) to me, but it ends at 2016.

        1. Moridin says:

          The top right of the chart you yourself posted seems comparatively flat to me apart from the SPARC M7, but as you yourself say, the chart only goes up to 2016 and, for Intel, incidentally ends with Broadwell – the first server part manufactured on their current 14nm process. The process itself launched in 2014, but server and HEDT CPUs traditionally launch after the desktop and laptop CPUs, because the larger die size means the manufacturing process needs to be ironed out before they can get reasonable yields (and as I mentioned, Intel had trouble with 14nm to begin with, which is why they initially launched laptop CPUs only, and Broadwell desktop processors were only produced in limited numbers). It’s now 2018, the current desktop and laptop CPUs are STILL being manufactured on the 14nm process, and the 10nm process is only supposed to enter mass production for small dies in 2019 (see https://www.tomshardware.com/news/intel-cpu-10nm-earnings-amd,36967.html ).

          The first 14nm products from Samsung and TSMC launched in 2014 ( https://wccftech.com/samsung-globalfoundries-tsmc-finfet-production/ ). It’s now 2018, and Samsung and TSMC only recently (H2, in other words) started producing the first products on their 7nm process (and 7nm will only be used at large scale in 2019). That’s 4 years for two nodes, an area reduction of approximately 3/4, or a doubling of density every 2 years.
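That cadence arithmetic is easy to check directly (a small sketch; the 18- and 24-month figures are the two cadences compared in the discussion above):

```python
# Density growth for a given doubling cadence. Two full node shrinks in
# 4 years = 4x density, i.e. a doubling every 24 months, versus the
# classic 18-month reading of Moore's Law.
def density_growth(years, doubling_months):
    return 2 ** (years * 12 / doubling_months)

print(density_growth(4, 24))  # 4.0   -> matches "two nodes in 4 years"
print(density_growth(4, 18))  # ~6.35 -> what the 18-month cadence predicts
```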

  6. Fizban says:

    The Wii was a waggle controller in search of a game mechanic.

    I’d still say the Wii was more a game mechanic in search of developer support. Metroid Prime 3 controlled beautifully, I’ve never understood what people’s problem was with Skyward Sword (actually someone took a shot at explaining it here before, but I’ve forgotten, forgive me/remind me?), etc. The controller enabled new control schemes without preventing the old, there was never a problem with it. The problem was Nintendo being idiots and not supporting devs, while at the same time those devs were being drawn away by all the “oooooh graphics” of the other consoles.

    And I still look at all the VR controllers and control schemes and see a bunch of trash compared to the wii-mote from ages ago. Why would I want clippy things on my fingers, or a giant ball wand, or huge gun grips? Why are they covered in buttons and haptic disks when these are motion controllers? Okay, admittedly I haven’t seen all that many VR games or had an opportunity to try the hardware, but come on. Every game that I have seen seems to use all of one or two buttons, and either tacked on scroll menus (curse you into the sun Bethesda), or an over-focus on flailing around with physics objects. Arg. Anyway, same stuff I say every time.

    1. Geebs says:

      VR controllers look odd because their positional accuracy has to be an order of magnitude better than the Wiimote’s. The PS Move controllers used by PSVR are only slightly more functional than Wiimotes, and their tracking, while surprisingly accurate, is still almost unusable for many games.

      There are a lot of bad things to be said about the Vive wands (movement via analogue touchpad is pretty miserable, and the menu navigation in Bethesda’s VR games manages to be even worse than vanilla Skyrim), but their tracking is phenomenal.

    2. Echo Tango says:

      “an over-focus on flailing around”
      This happened on the Wii too; many games (including Mario Galaxy) crammed in motion controls where they made no sense[1] and had no way to adjust the amount of waggle motion needed. I quickly stopped playing any game that forced me to hurt my body.

      [1] See also, touch-screen controls on the 3DS.

    3. Hal says:

      Gonna agree with you here, with the caveat that a lot of Nintendo’s offerings didn’t seem to know what to do with the motion controls but used them out of obligation anyway. Some games tacked motions in where simple button pushing was just fine (e.g. Twilight Princess). In other cases, it seemed like a reason to port over DS games where stylus controls were a much more elegant solution, but which were feasible (enough) with the Wiimote. Other games just used them for QTEs, which really only served to demonstrate how obnoxious QTEs really are.

      Don’t misunderstand, when it was used well, it was a great mechanic, but it definitely was not managed as well as it could have been.

    4. Clareo Nex says:

      The wiimote’s killer app was the pointer, which was the couch-usable equivalent of a mouse.

      I was really hoping Prime 3 wouldn’t have lock on so I would have an actual FPS type of experience with, like, aiming and stuff. Of course it did and therefore I didn’t. Alternatively they could have had many enemies that reacted well to aiming somewhere other than centre mass.

      Apparently waggle distracted folk so much they thought the pointer was bad and threw it out too.

      1. Griffin says:

        Agree about the pointer. Sin and Punishment: Star Successor is a great and sadly unknown game that makes perfect use of the Wii’s combination of pointer and joystick controller. (Just don’t pay too much attention to the attempted story.)

        1. Asdasd says:

          The New Play Control! series did some good work too. Taking Pikmin 1 & 2 and upgrading them from analogue-stick, cursor-based gameplay (in 3D, no less!) to the elegance of the mouse-on-the-couch elevated what were already two of the finest games of the previous generation to new heights.

          Sometimes the motion controls were best when you *weren’t* getting the most out of them. I remember how in Mario Strikers Charged the input for body-checking an opponent called for a general-purpose shake of the wiimote. There was no sophistication required, any old thwack would do. But that’s what made it so satisfying; the gesture was a perfect evocation in miniature of the on-screen act of shoving someone bodily off the ball.

  7. Christopher says:

    Last gen was probably my favorite gen! Though that’s a statement that needs some qualifications.

    As good as some Wii games are (Super Mario Galaxy 1 and 2 remain legendary), wiimote waggling only worked for party games that don’t have a lot of long-lasting appeal, and for games with a lot of aiming, which weren’t plentiful on the console with the fewest shooters. The controller itself was a bit of a mess for anything else, and it hamstrung the control layout and gameplay design of every Nintendo game released on it. One of my favorites, Punch-Out Wii, might as well have been played on an NES controller.

    (The Nintendo DS was a platform I got a lot more use out of, with games like Starfy, the Pokemons, something like five Phoenix Wright games, Might & Magic: Clash of Heroes, the Osu! Tatakae! Ouendan and Elite Beat Agents games, the Castlevania DS trilogy, Henry Hatsworth and the Mario & Luigi RPGs.)

    That led me, and anecdotally a lot of people, to emigrate from Nintendo, where we had been stuck for decades. And while Nintendo has a lot of advantages (family-oriented products and designs rather than chasing realism trends, protags other than gritty white dudes, a focus on pushing gameplay over story/cinematic-focused experiences, the best place to play certain genres like platformers, a general lack of bugginess, a lot of polish, etc.), there’s a lot of stuff Nintendo doesn’t or didn’t do: few shooters, not a lot of small-time indies, open world games, character action games, fighting games, RPGs, especially of the WRPG variety, and so on. So seeing all of that for the first time on the 360 and PS3 was a great experience. Coming to the 360 in summer 2011, over the course of the next 12 months I could play Bastion, Alan Wake, Batman: Arkham Asylum and City, Super Street Fighter 4, Street Fighter III: 3rd Strike Online Edition, Asura’s Wrath, Mass Effect 1, 2 AND 3, Limbo, Borderlands 2, Kingdoms of Amalur: Reckoning, Tales of Vesperia, Bayonetta, Portal 1 and Half-Life 2, Bioshock, Skullgirls, Dragon’s Dogma, Skyrim, Saints Row: The Third, Just Cause 2, Assassin’s Creed 2, Dark Souls 1 and 2, Jojo’s Bizarre Adventure and Rayman Origins – more than ten of these bought, before I even got the console, for less than 120 dollars. Later I’d go back to play the MGS series and Metal Gear Rising: Revengeance, and get into Demon’s Souls, Journey, Dragon’s Crown, The Walking Dead, all that fun stuff.

    That’s not to say all of them are bangers(and I believe I also played a little Fable 3 during this time, too) – and I really came to miss the things Nintendo does better than anybody else. But you can see how that was a lot of great new experiences in a short time.

    Meanwhile, going with the PS4 for this generation, I’ve enjoyed Bloodborne a lot, Dark Souls 3 is alright, Horizon was okay, Yakuza 0 kicked ass, Tales from the Borderlands is janky and cheap but well written, Detroit: Become Human is stupid and tropey but well-produced and fun, Edith Finch was a great little time, Just Cause 3 is basically as good as 2 I suppose, I frankly love Street Fighter V and Persona 5 (though that’s also a PS3 game), Nier: Automata basically sucked as a game but sucked me in with its themes and story, Spider-Man was amazing, Breath of the Wild blew me away when I borrowed a Switch, and that’s about it. Some bangers. A lot of fluff. More of the same, in a lot of cases.

    This is all up to what we play and like to play, natch, but this gen I feel
    1) very surprised that people are even talking about next gen. I’m like, “was that it??”
    2) that I’ve mostly gotten ports of older games as opposed to new exciting experiences.
    3) that I really should’ve bought a Switch instead. Nintendo might have essentially skipped a gen now since the Wii U crashed and burned, but they got it right with the Switch, and I’m missing their games. See also: the 3DS.

    1. Thomas says:

      It’s my favourite generation too! Fallout: New Vegas, Journey, The Last of Us, Uncharted 2, the completion of the Mass Effect trilogy, Deus Ex: Human Revolution, Minecraft, Valkyria Chronicles, Tomb Raider 2013…

      For me, the PS2 generation had great games but felt very technologically handicapped – long loading times. Only towards the end were games like FFX beginning to really immerse you in the environment itself.

      The PS4 generation uses colour better, but every game is an open world action RPG collectathon. There’s barely a game where I’ve got to see the ending, and I’m normally exhausted by then.

      I’m still waiting for the first good ‘choose how your character looks’ story RPG.

      1. Christopher says:

        Color choices were really at their nadir from the beginning to the middle of last gen, at least in the AAA market. I wanna say Far Cry 3, Dragon Age 3 and Metal Gear Solid V were the big eye-openers for me that things were changing. Same consoles as the previous games, but both in general graphics and in color choices in particular, it was like we leapt a generation ahead. I’m so glad brown gave way to color.

      2. tmtvl says:

        > FFX
        > towards the end

        FFX was released in 2001. The PS2 was announced in 1999.

    2. Redrock says:

      The popularity of Edith Finch is one of the most mysterious aspects of modern gaming culture to me. That game is awful. And I say that as someone who enjoys walking simulators. I get why people like stuff like Fortnite even though I don’t share the sentiment. But the response to Edith Finch is an enigma.

      1. Christopher says:

        I dunno dude, maybe we just like different things. I don’t really care about most walking sims, ’cause they involve standing around listening to holograms talk, the kinda item ransacking you do in Skyrim or just basic walking and looking. Meanwhile, Edith Finch tells its stories in all these gameplay vignettes. I was sold when I watched a quick look of the game and it suddenly changed from first person woman rummaging through a house to a little girl, who turned into a cat, who turned into an owl, who turned into a shark, who turned into a sea monster. That’s an attention-grabber.

        So rather than walking around in the countryside listening to talking lights or rummaging around a regular environment, the house is just this weird little hub for cool gameplay short stories. Rolling down hills as a shark, the horror segment with a Tales from the Crypt vibe, chopping that fish with one hand while exploring fantasies with the other, the paranoia mounting until the segment with the vault dweller… I think that was great stuff. Goes a bit downhill for me after the bunker, I would’ve preferred something else, but I still enjoyed it a lot overall. I hope this method of indie storytelling gets more traction, ’cause I like it a lot more than Gone Home.

        1. Geebs says:

          I’ve said this before, and quite possibly on this site, but I hated Edith Finch. The devs came up with a couple of genuinely clever scenes, like the comic and the fish factory, and then gave up and spent the rest of the game trying to look edgy through tasteless and repeated child murder. The protagonist dying in childbirth is unnecessarily unpleasant and completely unrealistic, and the unattended baby drowning in the bath is really too much. I have no idea why reviewers didn’t call the developers on it, and can only hope that reflects none of them having kids of their own.

          1. Christopher says:

            I don’t think there’s anything edgy at all about those. Edgy is being immaturely HARD CORE, MAN. My character is a soldier from hell with spikes growing out of his elbows. This zombie virus attached itself to a little baby and you’re now forced to fight the tentacled blades growing out of its back. This dude is so tortured and sad he only wears black and kills people with guns ’cause they deserve it in this f’ed up society of ours.

            The deaths in Edith Finch, even when young children are the victims, are just sad, and that’s all down to the presentation. Yeah, it might have been a bit edgy if you just watched a kid drown in a bathtub. Downright vulgar. Very uncool. But presenting that as this playful fantasy metaphor world based on the parent’s letter? As just this abstract background? I didn’t think they were being tasteless or edgy at all; I thought they contrasted the morose themes and tragic stories with colorful exaggeration and creative presentation. “Here’s something terrible happening, but don’t worry, we’ll help ease you through it”. It could’ve easily been just a crying-porn game, with a concept like “a family where every family member dies young”, but with the layer of fantasy on top it never played as gross to me. Guess I might feel differently if I had a kid of my own.

            1. Geebs says:

              There were a couple of problems I had with that story-telling choice:

              1) there certainly are “doomed” families in the real world; for example heritable conditions where everybody gets cancer in their twenties. That’s certainly not beyond the bounds of something you could explore artistically, but the game doesn’t really attempt to convey the family’s humanity at all, and just goes for the Final Destination approach. People have a wide range of reactions to this sort of inherited illness, but I didn’t see any of them depicted here. Something like That Dragon, Cancer gets away with being occasionally whimsical because it also depicts the other aspects of its setting and because of the obvious sincerity of its developers. Edith Finch, for me, doesn’t earn that.

              2) The bit where I went from “I see what they’re going for” to “f*ck this game” was the drowning sequence. That recontextualises everything else in the game from “it’s awful, but what are you gonna do?” to “get social services”. Even the most inexperienced parent knows that you never, ever leave an infant in the bath by themselves. You don’t even step back from the bathtub. This is a family that has already lost multiple children, and the mother seems completely oblivious. That’s not normal, and I mean that in the sense that the people who thought this was OK to depict in a whimsical fashion are showing a disturbing absence of normal human concern. I think that art about actual child neglect needs to be treated extremely carefully and again, I don’t feel that Edith Finch’s creators thought twice about what they were doing.

              1. Christopher says:

                I don’t think they were mindless about it.

                Most of the deaths are based on carelessness and neglect, from either the people themselves or the people around them. It’s not just Gregory, it’s the kid eating things she shouldn’t after getting locked in her room, the kid swinging right out into the ocean from a swing hanging right by the edge of a cliff overlooking the ocean, posing with a not entirely dead deer, flying a kite in a lightning storm, and so on. That goes for Edith, too, climbing around trees while pregnant at 17.

                It entirely seems like the curse is a tall tale fabricated by Edie as a convenient excuse for her own neglect and recklessness(some tragic things do happen, but neglects aside, I don’t think a staggering amount of them), and her “celebration” of and obsession with everyone’s death is pretty clearly presented as the wrong way to go. Edith says as much in the end, how you should value the time and life you have rather than wallow in sorrow and focus so much on the dead you write stories about them, with her kid having no choice but to finally grow up someplace else than the tombs of Finch House. It’s sad, but with nobody to go home to, at least he won’t be haunted by a “curse” anymore.

                Gregory dying in the bathtub is completely in line with all of the others, and I personally felt that depicting his and the others’ death as fantastical stories is not just playing into the point of the game, or the excuse for these innovative ways of storytelling through gameplay bits, but the only way you could depict this sort of tragedy and not have it be a melodramatic crying game a la what David Cage pushes out. I dunno how That Dragon Cancer did it tho, I haven’t played it.

                Incidentally, since we’re in spoiler territory anyway, the true nature of the curse was a disappointment to me. I’d have preferred if there actually was a monster hounding them, ’cause the early short stories that talked about it, culminating in the vault, made me very paranoid and very terrified. Which is a good way to have empathy for how the family must’ve felt, fearing for the curse. Ultimately you want there to be a curse rather than just a series of tragedies, accidents and neglects. As disappointed as I was in the moment, placing it firmly in the Firewatch and Gone Home camp of “we made it seem like something more was going on to make it more exciting”, at least there’s a thematic reason for it.

                1. Geebs says:

                  Just wanted to say, thank you for the interesting discussion! It’s nice to have an opportunity to think through this stuff.

                  1. Christopher says:

                    Same to you!

  8. Jabberwok says:

    I haven’t played Prey (I expect it to be good), but I didn’t think very much of Human Revolution. The story was a mess, I found Adam Jensen boring and dour, the level design regressed from the original in a lot of ways, and the game spent way too much time railroading me [DX 1 dropped you straight into the first level. The whole first section of HR is an exposition dump, and not even a good one]. The conversation system was almost interesting, but the writing wasn’t. I think Bioshock is probably a bit overrated, but I certainly liked the writing much more than Human Revolution. And Bioshock’s gunplay wasn’t great, but its levels at least felt very open and sandboxy.

    1. Echo Tango says:

      Plus, the equipment and augments were all a mess of imbalance. Even ignoring the terrible boss fights (only combat options, no stealth, etc.), the game was heavily skewed towards guns. It’s a real shame, because even as a regular player I was finding ways the numbers could have been tweaked (cost of augments, inventory size of items, size of ammo mags, etc.). This wouldn’t even have required any expensive changes to the systems – just adjust the numbers that are already in the game!

      1. Jabberwok says:

        Your mention of augments just reminded me of the ‘you have to eat a chocolate bar to punch someone’ problem. I remember the game being very stingy with its aug power, such that I rarely used it. Except in the aforementioned boss fights, to spam typhoon over and over.

        This is what boss fights in a Deus Ex game should be like: https://www.youtube.com/watch?v=z7a0NnoLg4M

        1. Echo Tango says:

          Ugh. Chocolate in this game is broken thematically, and also balance-wise. The choco-bar balance was the first thing I actually noticed, which got me thinking about the rest (grenades vs mines, ammo, …). A giant jar of choco-bars restores the least amount of energy per inventory slot and also has the least granularity of usage – those combine to disincentivise the player from ever putting one in their inventory, because the one-slot or two-slot items[1] are objectively better in every way.

          Instead, they could have had a meaningful trade-off for the player to ponder, between amount of inventory space and granularity of energy restoration. i.e. “I often use weaker augs, which means I only want to spend one energy’s worth of chocolate, instead of eating three points of chocolate and wasting two. But the bigger chocolates fit more efficiently in my inventory, so if I only hold small chocolates, I won’t be able to use as many powers in total before needing to go back to a safe-zone or store…” At worst, the player would be figuring out the optimal ratio of small:medium:large choco-bars based on their augs and play-style, and at best, the player would be making level-to-level decisions about which chocolates to keep, eat, sell, or discard.

          In the game we actually got, there’s an optimal, best strategy, which is to keep only small bars and either discard or sell the other sizes (or scarf them on the spot or while you’re hiding in a closet), because they’re objectively worse than the smallest size in every way.

          /rant :)

          [1] I think the two-slot boxes of bars had similar problems to the large 4-slot jars, but to a lesser extent. Either way, the smallest size was optimal, I think. It’s been a while since I played the game.
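The energy-per-slot argument above can be made concrete with a toy calculation (the numbers here are invented for illustration; the real DXHR values may differ):

```python
# Hypothetical consumable stats: energy restored vs. inventory slots used.
# If the largest item is worst on energy-per-slot AND is all-or-nothing to
# consume, the smallest bar dominates and there is no trade-off to weigh.
items = {
    "small_bar":   {"energy": 1, "slots": 1},
    "box_of_bars": {"energy": 2, "slots": 2},
    "large_jar":   {"energy": 3, "slots": 4},
}

ratios = {name: s["energy"] / s["slots"] for name, s in items.items()}
for name, ratio in ratios.items():
    print(f"{name}: {ratio:.2f} energy per slot")
# With these numbers the jar is strictly worse: lower energy-per-slot and
# coarser granularity, so carrying only small bars is always optimal.
```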

          1. Philadelphus says:

            I actually put four chocolate bars into my backpack this morning, and now you have me wondering if I couldn’t have used that inventory space better. :)

        2. Mephane says:

          I am generally rather annoyed when games tie special abilities to the use of consumables. From a game mechanical perspective, since you don’t know (at least in the first play-through) what the game will throw at you in the parts you haven’t yet played, you might not want to blow your tightly restricted mana/energy/etc on something you can manage otherwise, lest you stand there drained when you really need it.

          It’s like the problem with one-time usable super-strong items (e.g. potions) – you are never sure whether you might need it later much more than now, and may end up completing the game with the thing still untouched in your inventory. Only worse, because now this applies to one of the core game mechanics and aspects of developing your character.

          In the particular case of DXHR, I remember that I had partial energy regeneration; I think when you were depleted it would passively regenerate one bar, but no more. So my strategy always revolved around only using one bar’s worth of special ability at any one time, and otherwise focusing on regular combat.

          1. Jabberwok says:

            Yeah, the one bar thing meant that I almost never used any ability that took more than one bar.

            The first game used bio-electric cells, but that was much less of a problem for me. Probably because I had complete control over how much energy I used for an aug based on how long I left it on. Plus, you could carry a ton of cells with you. HR’s tiered battery system was much more discouraging. Not to mention the stupidity of gaining super power energy from eating chocolate, or requiring energy just to knock someone out.

          2. Echo Tango says:

            This problem can be solved if you give the player information on how rare consumables are. If all consumables are purchasable, this can be indicated by their price. Actually, they should probably all be purchasable; otherwise the game is just inviting this kind of player frustration.

            1. Mephane says:

              My personal solution is full passive regeneration, with consumables acting as a means to quickly replenish yourself in tense situations. The regeneration could be relatively slow, or have a significant delay, or only happen outside of combat altogether (if the game distinguishes between combat/non-combat as definitive states).

              Alternatively, if for whatever reason any resource is to be only replenishable through the use of consumables, the game should provide a technically infinite quantity. As in, there is a method to reasonably acquire more that does not deplete itself, e.g. respawning crafting resource nodes, respawning enemies (that drop loot/money that you can use to buy more consumables).

              But really I prefer just passive regeneration. Also, everything I just mentioned applies not just to mana/energy/etc, but also health.

              1. Jabberwok says:

                Hmm, it really just depends on the game for me. For instance, I’m often not a fan of health regen in shooters. First, because it can push players into tedious behavior patterns. Second, because it can remove a layer of strategy from long-term play. If the game resets my condition automatically, the results of a firefight have no effect on the next firefight. Bad performance becomes equivalent to good performance, which removes an element of soft failure. This is why older shooters like Doom and Quake could make the player extremely powerful without ruining the threat of weaker enemies. Even if you’re unlikely to die, the knowledge that health is a limited resource means you still have to care about the outcome of each engagement.

                The same can be said of energy regen or allowing infinite consumables. Even if energy regen is slow, you might be pushing players into the boring behavior of waiting for it to recharge before moving on, or the even more boring behavior of grinding for more consumables after every single fight. And if you remove health and energy pickups in favor of automatic regen, you have fewer ways to reward the player for exploration (which definitely needs to be a thing in a Deus Ex game).

                Where the balance lies is going to be different per game, I think, and being off in either direction is going to push players towards tedious or annoying play styles. Just to give another Deus Ex example, the batteries in the first game were pretty common, and the inventory stack size was high enough that you knew you were in good shape if you went over it. And none of the game’s necessary interactions required energy. All of the augs just gave you an edge in various situations. So using cells frequently would give you a slight boost. Saving them up could reward you with a huge advantage at a particular time. For instance, when JC has to escape New York, I had saved up enough cells to leap off a roof with my leg augs, then cloak and waltz right past all of the guards in the whole level (recharging as I went). That was a pretty satisfying solution, and it would’ve been impossible to balance for that if they had gone with regen instead of a limited supply of consumables. Regen flattens the tactical options for any engagement because the player is always going to start with the same amount of resources. (And even if they don’t, the designers have to assume they will.)

                Even then, players can certainly feel like they should never use their bioelectric cells, but I think the potential depth can be worth that risk.

  9. Knul says:

    The bit about cryptocurrency is completely outdated. Mining using GPUs hasn’t been profitable in years (as in the electricity costs are higher than the coins mined) and ASICs (application-specific integrated circuits) are used instead.

    1. Cilvre says:

      Until January, mining various cryptocurrencies with GPUs was still profitable. Ethereum was the reason GPU prices skyrocketed last year, and it was quite profitable if you were in the bubble, not so much if you held out and didn’t sell when things seemed obscene. I paid for half my gaming rig by mining on my two GPUs, including the electricity costs.
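      For anyone curious how that math works, the break-even test is simple (all figures below are made-up placeholders, not real hash rates or prices):

```python
# Back-of-the-envelope GPU mining profitability. Every number in the
# example call is hypothetical; plug in your own rates.

def daily_profit(coins_per_day, coin_price, watts, price_per_kwh):
    """Net profit per day: mining revenue minus electricity cost."""
    revenue = coins_per_day * coin_price
    electricity = (watts / 1000) * 24 * price_per_kwh
    return revenue - electricity

# E.g. two 200 W GPUs earning 0.01 coin/day at $400/coin, $0.12/kWh:
print(daily_profit(0.01, 400.0, 400, 0.12))  # ~$2.85/day net
```

      Mining stays "profitable" exactly as long as that number is positive; once the coin price or difficulty moves enough to flip the sign, you're paying the power company to heat your office.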

  10. Cilvre says:

    Hey Shamus, just finished reading the other article. Another thing that shifted greatly, and that you seem to have missed, was hard drive speeds. In the last decade we’ve had SSDs come out that fully saturate a SATA 3 connection; so much speed has been gained that we’ve moved on to PCIe-based SSDs in the form of M.2 drives that are sometimes six times faster than a SATA 3 connection can even handle.
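    As a rough sanity check on that "six times" figure (ballpark numbers, not taken from any specific drive's spec sheet):

```python
# Rough sequential-throughput ceilings, in MB/s. These are approximate
# real-world figures, not exact interface specs.
SATA3_MBPS = 550       # SATA 3 tops out around 550-600 MB/s after overhead
NVME_GEN3_MBPS = 3500  # a fast PCIe 3.0 x4 NVMe drive
HDD_MBPS = 150         # a typical 7200 rpm hard drive, for comparison

print(NVME_GEN3_MBPS / SATA3_MBPS)  # roughly 6x, as claimed above
```

    The jump from spinning disks to SATA SSDs was mostly about latency; the jump to NVMe is where the raw bandwidth multiplier shows up.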

  11. Droid says:

    Well, Shamus, I don’t know about Bitcoin, but you really got your money’s worth out of those 100 Memebux by now.

  12. sheer_falacy says:

    This should probably have Experienced Points in the article title.

  13. Carlos García says:

    There was also the feeling I got that as PC capabilities improved, the extra speed and memory were used more as room for dirty code than to actually do more things, or better things. F1 games especially gave me that impression. If you compare 1997’s Monaco Grand Prix Racing Simulator 2 with EA’s F1 2002 or F1 1999-2002, or even Codemasters’ F1 2010, you find that the first was the most complete game and the best in everything that mattered. MGPRS2 had practice with a phantom car, which later games removed. It allowed you to save mid-race (good when you don’t have time for a full race in a single sitting but refuse to run shortened races), while some of the later ones didn’t even let you save mid-weekend! MGPRS2 also had more in-depth car settings, which F1 2010 simplified greatly. Its AI cars were better, capable of taking your presence into account; in later games you need to brake really late if you have a car right behind you, because otherwise they’ll ram into yours (and, to add insult to injury, the game will blame you!). It had a career mode with no set end, in which you could start at one of the lower teams and move to other teams or not depending on the season’s performance; F1 2010 recovered that, but limited it to three seasons AFAIK… A lot of good features and good gameplay were removed from the newer games just to have full 3D (MGPRS2 drew the car cockpit as a bitmap overlay on top of the 3D display of the circuit, the other cars, and your car’s wheels and mirrors). In the end, later games removed many features for a minimal gain in graphic quality, in exchange for a huge increase in system requirements.

    In fact, after FM09 I came to buy the Football Manager series every year, instead of every five years or so. FM05 had run kind of janky on my PC, and I bought FM09 expecting it to be just as hard to run, but found it ran much better! It needed less RAM, less HDD space… and had plenty of feature improvements, including the first 3D match engine! I had to reward FM’s programmers for being the first I’d seen whose game gained features over time while actually needing fewer resources.

  14. shoeboxjeddy says:

    The 360 definitely had maintenance troubles, but Microsoft fixed it both times I had an issue and paid for everything, including shipping. Versus the PS2, which broke more times and was only free to fix because we paid for an extended warranty up front. I feel like writing off the whole console because it didn’t work all the time shows a huge blind spot. My NES CONSTANTLY didn’t start properly, cue blowing into the games or sticking a pry bar into the console so the game would sit deeper, etc. My Toshiba laptop shipped to me already broken and had to be replaced. My TiVo had one of the baseline features (DVR) that NEVER worked out of the box. Technology just has mechanical faults; it doesn’t mean the device is crap because of that. I also think your opinion on games varies wildly from the agreed-upon consensus. You’d be laughed out of a serious discussion for saying that the consoles that housed Oblivion and Skyrim were a worse era for those series than Daggerfall and Arena. Or that GTA IV and V were worse than I and II. These opinions seem like trolling or pure contrarianism, to be honest. It’d be like you were talking about the low quality of certain TV shows that were very popular and then suddenly declared that TV was pathetic compared to the depth of silent films or radio. Even if you had a strong belief that those mediums were really good, this isn’t something you could convince a large number of people of.

    Specifically, your critique of Bioshock comes off as pretentious because you know for a fact the mainstream would find almost no enjoyment from the games you think are better. What’s the point of “these games… from the perspective of gaming in 1998 are far better. If you played them now, they’d seem pretty terrible,”? A remake of System Shock 2 could possibly overturn Bioshock as a more sophisticated game, but I would argue the remake would owe a debt TO Bioshock in a lot of ways, so things would still be questionable.

    1. Shamus says:

      It’s nice that Microsoft replaced your console for you twice, but that just proves my point. Your console died twice. Are you really suggesting the 360 had a lower failure rate than the PS2?

      “I also think your opinion on games varies wildly from the agreed upon consensus.”

      Hi, I’m Shamus Young. This is my very own website where I give my take on things. If I was interested in consensus I’d be reviewing Battlefield or streaming Fortnite. If you don’t want my opinion, then you are at the wrong website.

      “What’s the point of “these games… from the perspective of gaming in 1998 are far better. ”

      Gotta read the whole paragraph. I’m comparing it not just to 1998, but also to 2018.

      1. Cilvre says:

        My xbox 360 story: owned the console two months, kept it in well ventilated area and played about 20 hours at most on the console during that time. Red ringed. Got it replaced, sold it as soon as I got it back.

      2. shoeboxjeddy says:

        I couldn’t find a good resource for the failure rate of the PS2. When I searched for that, I did find about twenty different forums suggesting that the reliability of the first model of the PS2 was complete shit, which was certainly my experience with it. It’s worth noting that Microsoft admitted the problem and paid for a VERY expensive free replacement program while Sony did jack and shit about their problems.

        I don’t mind your opinions at all. I just think “the worst era” goes beyond a simple opinion of “I liked this and didn’t like this.” It seems like you were stating those things as a well accepted truism rather than a completely fringe opinion. I would be fascinated to read something on how the era of insane games like the one covered on this site by Rutskarn were superior to some of the most widely beloved RPGs on the market. I wonder if you didn’t mean that you like Morrowind more than IV and V and not “IV and V are the worst in the series.” Similarly, I think you probably meant you like GTA III, VC, and SA more than IV and V. If this is what you meant, that makes a great deal more sense to me. I’m sure tons of people feel that way.

        I feel like your experiences with System Shock 2 would NOT be had by a new player with a different set of references and experiences though. At the time, SS2 was actually more advanced than similar shooters instead of horribly, confusingly dated as it is now. I’m not trying to dispute it’s your favorite or anything like that, just trying to say when you make statements to other people, you have to consider the audience. I love the HomestarRunner cartoons, but would kids my brother’s age also like them? I’m not sure if they would. A lot of the humor was based on stuff like the Flash player and what web pages were like at the time. They might not have the framework to be surprised and delighted by the Youtube versions as I was by the Flash version a decade ago.

    2. tmtvl says:

      Daggerfall -is- the best ES game, though. Source

    3. Dreadjaws says:

      My NES CONSTANTLY didn’t start properly, cue blowing into the games or sticking a pry bar into the console so the game would sit deeper, etc. My Toshiba laptop shipped to me already broken and had to be replaced. My TiVo had one of the baseline features (DVR) that NEVER worked out of the box.

      There’s a major difference between “These things have not worked properly for me in particular” and “This goddamn thing doesn’t work properly for most people,” which is the case for the Xbox 360. I bought a Wacom tablet and it didn’t work out of the box. I had a Nintendo DS whose lower screen wouldn’t properly calibrate. I bought a PS2 gamepad that stopped working days later. But guess what? When those devices come up in conversation, their malfunctioning is never a subject that gets brought up, because it simply isn’t widespread; it’s the exception rather than the rule. But start talking about the Xbox 360 and I guarantee you someone will mention theirs being RROD’d.

      Also, man, you’re being very, very selective about how you like to interpret Shamus’ words here.

      1. shoeboxjeddy says:

        My point was that electronics failure rate is to be expected to some degree and then I specifically noted that the 360 actually had a free replacement plan unlike all these other things. But the NES cartridge failures were EXTREMELY widely experienced, so that was a weird one for you to complain about. “Blowing in the game cartridge” is a meme at this point.

        Not sure what this second point means.

        1. evilmrhenry says:

          I’m pretty sure all of the first run of 360s that were played for any length of time failed due to RROD. There have been other consoles with issues, but the 360’s problems were universal, occurred rapidly, and made the console useless.

          I’ve had a PS2 fail (laser issue), PS1 fail (not sure), DS fail (ribbon cable), Gamegear fail (capacitor failure), and a 360 fail (RROD). The 360 failure occurred after the lowest hours played of all of these except the Gamegear, and that one was just due to age.

          The first run of the Xbox 360 had a hardware issue, possibly the worst hardware issue of any major console. The PS2 might fail after a decade; the 360 would be lucky to last a year under the same conditions.

        2. Dreadjaws says:

          I specifically noted that the 360 actually had a free replacement plan

          You make it sound like that’s some kind of pro. It’s not. Of course it had that plan; it’d be suicide if they didn’t. That would have been the last console Microsoft ever sold. They weren’t doing it out of kindness, it was literally the only way to stay in business.

  15. Dreadjaws says:

    While writing this I was thinking about what a disaster the last gaming generation was.

    I’m glad I’m not the only one who thinks the same. Yes, some of my favorite games of all time came out that generation, but when I compare it to the previous one… Picking any random game in the PS2 era had a much larger chance of it being something you’d love to play, while the PS3 era was strictly hit-or-miss. Hell, many of the games people fondly remember from the PS3 era were actually HD remakes of PS2 games, like Resident Evil 4 or Shadow of the Colossus.

    1. Daimbert says:

      The thing is, though, I’m not sure you can say that this current generation is any better. What are the big games from the PS4 era? I find that I don’t actually HAVE a lot of PS4 games, and the system has been out for a while now. In fact, I have as many if not more VITA games than PS4 games (to be fair, a lot of those games were on the PS4 as well, but that’s not exactly a glowing recommendation for the PS4 either). Now, I’m playing less games, but there aren’t as many games that are interesting to me when I browse in the stores as there were in the previous generations. So while I agree that the PS2 generation was outstanding, I’m not sure I agree with the second half of Shamus’ comment that the current generation is better than the PS3 generation.

      1. Cubic says:

        I was a loyal playstation owner with droves of games but haven’t upgraded to PS4 yet. Part of it could be that the PS4 is just a glorified cheap, low-end PC. The magic is gone, man.

        By the way, I wonder why PS3 was considered so hard to program? The hardware is basically a more regular (and more powerful) version of the PS2. Were all the PS2 programmers taken out the back and shot before the release or something? Or is it as prosaic as ‘there’s no DirectX’.

        1. shoeboxjeddy says:

          The Cell processor was a bear to program for and basically nobody would argue that it wasn’t.

          1. Cubic says:

            Still … Cell had basically the same architecture as the PS2 Emotion Engine etc, except actually simplified as far as I can tell. (The PS2 units were asymmetrical for example.) So why not put the old PS2 hackers on wrestling the hardware?

            My best guess is there was a big toolchain difference. The Xbox was basically a PC with MS tooling while the PS3, like many previous consoles, came from the more primitive embedded systems side of things.

            Some more discussion: https://arstechnica.com/gadgets/2008/09/game-console-architecture-in-depth/
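            For what it’s worth, the usual explanation is the SPEs’ lack of a cache: each one only saw its own small local store, so every inner loop had to be restructured around explicit DMA-style chunking. A toy Python sketch of that shape (purely illustrative, not real Cell code or any actual SDK API):

```python
# Toy illustration: an SPE couldn't just read main memory through a
# cache - the programmer had to move data into a small local store,
# process it there, and move results back out, chunk by chunk.

LOCAL_STORE = 4  # pretend the local store only fits 4 elements

def spe_style(data, fn):
    """Process `data` in local-store-sized chunks, mimicking explicit DMA."""
    out = []
    for i in range(0, len(data), LOCAL_STORE):
        local = data[i:i + LOCAL_STORE]   # "DMA in"
        local = [fn(x) for x in local]    # compute entirely in local store
        out.extend(local)                 # "DMA out"
    return out

def ppe_style(data, fn):
    """Ordinary cached-CPU style: just touch memory directly."""
    return [fn(x) for x in data]

data = list(range(10))
assert spe_style(data, lambda x: x * x) == ppe_style(data, lambda x: x * x)
```

            Both functions compute the same thing; the point is that on Cell only the first shape ran fast, and real code additionally had to double-buffer the transfers to hide DMA latency. The Emotion Engine's vector units had similar quirks, but far fewer engines leaned on them this heavily.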

  16. Zekiel says:

    Keep in mind that I’m an incurable cynic who complains about everything, so positive appraisals like this one are a rarity for me.

    I appreciate positive Shamus when he makes an appearance :-)

  17. Cinnamon Noir says:

    Wow. I had no idea that you had such bile saved up for Bioshock, Shamus. Personally it’s one of my favorite games, admittedly not for its mechanics. What I loved about Bioshock was that it conjured up a believable fictional society with an incredible attention to detail. Bioshock was immersive for me in a way that many games aren’t because it let me explore an engaging setting. System Shock 2 didn’t really do that; a spaceship, even one with lots of compartments, doesn’t compare to a whole city in terms of interest.

    Why exactly do you think Bioshock is pretentious? It covered heavy themes, sure, but I actually found it to be quite thoughtful, certainly much more so than Bioshock Infinite, which lazily brought up issues like racism and class warfare without actually exploring them. Bioshock may have been shallow from a mechanical perspective, but I don’t think it had a shallow story or setting.

    Incidentally, I own both Thief: The Dark Project and System Shock 2, and I have never managed to get past the first couple of levels on either. Both games are confusing, crammed with details that are hard to keep track of, and incredibly unforgiving when you slip up. I appreciate these games for their unique achievements, especially Thief for its brilliant story and level design, but I don’t really enjoy playing them.

  18. parkenf says:

    Hi folks, would appreciate some help here – relevant to this article. My son has decided he wants to move to a gaming PC from his current PS4-based gaming – mainly because he wants to play Flight Simulator and AAA games on the same platform. I’m a bit clueless on this, and it’s late in the day for Christmas so we don’t have a lot of time for research, so we’re looking at pre-builds like this one on Amazon, but I don’t know what I should be looking for.
    Is there a difference between DDR3 and DDR4 RAM?
    Is there a significant difference between a Nvidia GTX 1050Ti 4GB, a Nvidia GTX 1050 2GB and a NVIDIA GeForce GTX 1060 GDDR5X 4K Pascal architecture? And Pascal architecture? Why not fortran?
    Is ADMI a good mark?

    I’m not an IT novice by any stretch, but I don’t know about PC gaming at all. Any assistance gratefully received.
