Experienced Points: What Makes Gaming Hardware Become Obsolete?

By Shamus Posted Tuesday Sep 8, 2015

Filed under: Column 81 comments

My column this week is aimed mostly at the hardware newbies who aren’t sure what things inside their PC are called, how they relate to each other, or why they need to be replaced. I doubt the audience for this sort of thing is very big, but I’m hoping it will unravel the mysteries for a couple of people.

Although, if you need advice on what graphics card to buy, you’re on your own. I can only suggest you get a red one.

 



81 thoughts on “Experienced Points: What Makes Gaming Hardware Become Obsolete?”

  1. Warrax the Chaos Warrior says:

    I think your next project after Good Robot needs to be a game called “Shoot Guy”. It can pioneer your bling-mapping tech. You’ve been teasing us with the concept for long enough, it’s time to make it happen :)

    1. Lame Duck says:

      It should also have a melee combat system and a female protagonist.

      1. Scourge says:

        And an anti-hero named Bad Robot

    2. Wide And Nerdy says:

      Just a couple of posts ago Shamus was complaining about how overworking can leave guys shot.

      SHOOT GUY IS CONFIRMED!

      1. boota says:

        ShootGuy 3 CONFIRMED!

    3. krellen says:

      You joke, but Good Robot basically is Shoot Guy.

  2. Abnaxis says:

    Credit-card spam notwithstanding, it’s fun to look at the comments for those old posts. I find something inherently funny about the hardware snobs of yesteryear. Y’know, the ones that would happily jump into any thread talking about hardware to lambaste you for the choices you made as a consumer?

    Are those people a thing any more? Well, I mean I know it’s a thing for console vs. console, but I haven’t seen it nearly as much for individual PC hardware. Silver lining to consolitus, maybe?

    1. The Snide Sniper says:

      Could be “consolitus”, but I’d like to propose an alternate theory:

      At this point in time, the hardware that “snobs” would buy is not significantly better than “budget” hardware. It might not even be noticeably different.

      To illustrate my point, consider the following:
      * When a dedicated audio card made a noticeable difference in sound quality, we had audio card snobs. They’ve disappeared, or have otherwise become unnoticeable.
      * When memory was low enough to affect everything, rather than just high-end software, we had memory snobs.

      These days, your typical CPU has a dedicated graphics chip good enough to run many modern games, built-in sound is as good as the old sound cards, and 4GB of RAM is considered normal.
      It’s not that people have stopped caring about performance, nor that they’re only complaining about consoles; rather, there’s so little difference between “bad” and “good” that even most “snobs” recognize that their complaints will sound hollow.

      1. GTB says:

        Lemme snob this up for you.

        I agree that I think we’ve sort of hit a plateau with sound and memory (though I know a few audio nutjobs here and there, and a few guys with stupid amounts of ram that they somehow work into every conversation about PCs.)

        But I disagree that “These days, your typical CPU has a dedicated graphics chip good enough to run many modern games.”

        Not… really. You could argue that with fast enough RAM and a willingness to turn down a -lot- of settings, you can make do with one of the newer A-10 AMD chips. I’m actually a proponent of the things, especially for cheap builds for people who want to play Bejeweled or run an emulator for SNES games or whatever. In fact, after doing several A-10 builds I’m pretty impressed with what you can get from a machine without a graphics card. (The Intel integrated GPUs are still pretty awful, compatibility-wise)

        But. We’re not at the point yet where a person who wants to game heavily on their system doesn’t need a graphics card. I think we’re close, but I also think it doesn’t make a lot of sense for CPU manufacturers to push the APU envelope any faster than they already have. The combo CPU/GPU fills a niche, but if they make them any more expensive, then it would be better for consumers to spend the money on a CPU and a graphics card. So I don’t think there will be any big leaps on the APU front any time soon. The one caveat to that is if console hardware pushes APU development faster. Both current-gen consoles use custom APUs from AMD, and I imagine that trend will continue, so PC users who want some kind of integrated thing may reap the benefits of that continuing development as new consoles require new APUs. That sort of makes me throw up a little bit in my mouth.

        But I do think that is the way it will go eventually, unless some crazy new hardware thing shows up. The Oculus Rift requires some pretty hefty GPU power, so assuming this VR thing takes off the way everyone hopes it will and doesn’t end up like the VR thing in the 90s and the 00s, that may keep the need for a separate card well into the future.

        1. Felblood says:

          It’s kind of amazing how much I’ve been able to squeeze out of my ancient AMD Q45 on-board chipset, but that doesn’t mean I’m not looking forward to plugging a real graphics card into this machine. (Estimated delivery date: This Saturday)

          Just getting access to modern shader models will be nice, since I’ll be able to run certain games, like Darkest Dungeon and Civ Beyond Earth.

          — but I think the biggest benefit will be getting to use my entire 2GB of RAM, instead of losing 266 MB of it to emulated graphics memory and shaders. I might even be able to play Terraria, Starbound or Warframe at my native resolution!

        2. Catiff says:

          Please Please please tell me that the A-10 AMD Chips are code named “Warthog”

    2. Agamo says:

      Oh yeah, they’re still around. nVidia vs Radeon and Intel vs AMD are probably the most common, though Radeon and AMD fans don’t really put up much of a fight from what I’ve seen. I’m sure you could find more lively fights in more niche hardware circles, though.

      1. GTB says:

        This is true. Typically the fight ends with “Yes, I agree that intel/nvidia is better, no, I do not want to pay that much more money for that percentage of ‘better.'”

  3. Jonathan says:

    Making things even more confusing is that an HDD is, in fact, the computer’s long-term memory… so we’re calling a memory component something other than memory, because “memory” has become shorthand for “short-term volatile memory.”

    My “gaming”?? PC is a 5+ year old XP box hand-me-down from work. Some time in the next few years I’ll probably upgrade to a free hand-me-down Win7 box from work, or do a rebuilding and go Linux. Right now, it runs every game I want, although CK2 is a bit on the slow side (1 second per day).

    Windows 10 can take their privacy violations, dump expired blood from the blood bank over themselves, and go swimming with the Great White Sharks.

    1. Ahiya says:

      Windows 10 has gotten my non-technical boss to finally agree to test driving Linux Mint. The idea that our PCs would be in P2P connections to other PCs was the final shove.

      1. AileTheAlien says:

        Whoa! I haven’t kept up with the Windows news, so the P2P thing is new to me. I can totally understand Microsoft wanting to use P2P to distribute updates*, since it lessens the load on their servers.** I really hope though, that there’s a setting to limit the bandwidth and/or total data transfer per month, so users don’t get their data caps used up. :S

        * At the end of this article, it’s covered briefly.

        ** Plus, the network could be set up to have less total traffic over any given connection on the internet by biasing transfers to happen between closer-together computers instead of farther-away ones. I.e., instead of needing to make 10000 transfers all bottlenecked on a fat expensive line from the server-farm(s), you have one smaller connection from Microsoft to the outside world, then let shorter hops take it and distribute it to everyone else.

        1. krellen says:

          You can disable the P2P aspect altogether.

        2. I understand it, but man, I HATED it with Blizzard. Their custom downloader was nowhere near as stable and fast as the torrent software I was already using. A couple of hours waiting when a direct download would have taken minutes – that can get frustrating quickly.
          But I sincerely doubt Microsoft has the ability to make OS updates as exciting as patch days in WoW used to be. Love to see ’em try it as an April Fool’s thing though.
          I also like to be able to control when my PC updates and reboots. With 8, I had to find and disable a service to stop the auto reboot, which is a bit of a pain (but far less of one than rebooting this thing as it seems to try to divide by zero fairly early in the OS boot process most of the time).

    2. MichaelGC says:

      Careful with the Win7 box if privacy is a particular concern – they’re rolling out some of the data-harvesting to older versions via Windows Update:

      http://www.ghacks.net/2015/08/28/microsoft-intensifies-data-collection-on-windows-7-and-8-systems/

  4. Ahiya says:

    I’ve just been trying to walk coworkers and family members though multiple hardware purchases, and oh man this is so true.

    Also, that old post is hysterical.

  5. NoneCallMeTim says:

    I thought I was pretty (self-taught) computer literate, but then you said that sounds are processed by the CPU.

    What is the difference between what the CPU does, and what the sound card does?

    1. Ingvar M says:

      The sound card takes a sequence of waveforms and turns them into Actual Sound. It may also have smaller segments of sound loaded into it and played on demand. It probably can take multiple sound streams and mix them in various ways. It may even be able to do 3D sound mapping (“here’s a sound, play it as if it is 10 metres behind and 2 metres to the left of the player”).

      But, just like the GPU, the sound card must be fed on a regular basis by the CPU.

    2. guy says:

      I think the CPU determines what sounds to play and then the sound card turns it into the sequence of electrical pulses that your speakers/headphones turn into sounds.

      Also, while there are all kinds of specialized pieces of hardware, the CPU can do basically anything you might want, just not necessarily very well. As CPUs have gotten faster, it’s probably become more practical to do most of the work on the CPU itself.

    3. Abnaxis says:

      In general, the way all peripherals (be they speakers, game controllers, printers, video cards, etc.) work is that they all sort of have their own purpose-built “smarts” to them. The CPU is the maestro that directs all of these devices, and coordinates them to whatever sort of task needs doing.

      When Shamus says sounds are “processed” in a CPU for a game, he’s talking about the processing the CPU is doing so it can properly tell the sound card what to do. For example, if terrorist #3117 is shooting at Shoot Guy, the CPU is triangulating the position of said terrorist’s gun, figuring out how loud of a sound needs to come out of each speaker for the *pop* of the bullet, and sending a request to the “sound card” to “please play X digital waveform at Y volume with Z balance.”

      The sound card then does some sort of black magic to transform the request from the CPU into vibrations on a speaker. That, in itself, requires a non-trivial amount of processing.

      Incidentally, this is also the way video cards work, except the video card is doing such a massive, monumental amount of processing at the behest of the CPU that it deserves a special mention.

      EDIT: See what giving a long-winded answer gets you? Two people beating you to the punch :p
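      In Python terms, the CPU-side arithmetic might look something like this tiny sketch — the function name, the falloff curve, and the balance formula are all invented for illustration, not taken from any real engine:

```python
import math

def speaker_levels(source, listener, base_volume=1.0):
    """Toy version of what the CPU does before handing off to the sound
    card: compute a volume and a left/right balance for a sound at
    `source`, heard by a listener at `listener` (2D positions)."""
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Simple inverse-distance falloff, clamped so nearby sounds don't blow up.
    volume = base_volume / max(dist, 1.0)
    # Balance: -1.0 = hard left, +1.0 = hard right.
    balance = 0.0 if dist == 0 else max(-1.0, min(1.0, dx / dist))
    return volume, balance
```

      The sound card never sees any of this geometry; it just gets the resulting “play waveform X at volume Y, balance Z” request.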

  6. Daemian Lucifer says:

    I always wondered if there is a way to completely preload a small game (below 1 gig*) into memory. One that was not made to do so itself, I mean. And would it make a significant difference if you were to do so.

    *Man, have we really reached a time where I can put something as large as 1 gigabyte in the category of small? That’s crazy!

    1. Mike S. says:

      You could create a virtual disk in RAM, aka a RAMdisk. (I remember discussions of using the technique to speed up floppy-based programs as far back as the early 80s, and I doubt it was new then– though who had that kind of RAM to waste?)

      Googling for discussions in the past year in a gaming context gets a lot of hits for a commercial product called DIMMdrive. Not sure if that’s because it works well or because they have a good SEO strategy.

      1. Daemian Lucifer says:

        That’s worth checking out. Thanks.

    2. guy says:

      Um, probably not in a straightforward manner. You could get it into the disk cache, which is a chunk of memory used to store data recently accessed from disk on the theory it’s likely to be accessed in the future, but that’s managed by the operating system and may not behave the way you’d like.

    3. Jabrwock says:

      This is essentially what hybrid HDDs do now. They load certain information from a disk drive onto an on-board SSD. The SSD is much smaller than the HDD, but is much faster (although not as fast as RAM). It’s a tradeoff.

      1. Peter H. Coffin says:

        Sadly the Hybrid (and SSD) drive data still needs to chunk through a comparatively slow drive interface link (which never had much reason to be faster than a spinning disc before), get uncompressed if necessary, stored and indexed where the application needs it, and THEN it’s ready for instant access. System disk cache can help with the interface slowness, but it still needs the other steps. The only real way to speed those processes up is to have the game itself be able to recognize that it’s on a box with 12GB of usable memory instead of 3 and adjust its preloading appropriately. In MMORPG terms that may be “when loading a map, keep the map you just left in memory, plus the maps for any fast-travel hub areas”. And as “simple” as that may be, it’s more than an arena game like TF2 or Call of Duty could do for practical purposes.

    4. alfa says:

      With Linux, that’d be quite easy to do, since it has the notion of a “tmpfs”, a “temporary filesystem”, that exists in RAM.

      With many Linuxen, you’d copy your game to /tmp, which is often a tmpfs (and hence cleared when power is off).

      That brings me to one thing Shamus didn’t mention in the article: swap – your computer doesn’t _crash_ when you run out of memory, at least not directly. Instead it writes some of its memory to your hard drive in an effort to free some space so new reservations can be served. Often this is quite slow (and if memory and swap are exhausted, the best Linux can do is invoke the “OOM killer” – where OOM stands for “Out Of Memory” – which goes around and stops programs – “kills” them, in UNIX-speak – in a more or less random manner).

      1. guy says:

        The running out of memory thing can cause a crash; the OS might simply refuse to allocate more memory than it physically has to a single program, or the program may need to insist* that a certain amount of data is in physical memory. Additionally, your memory addresses are finite in length and each process can only get as much “memory” as can be uniquely identified, though that might be higher (or lower!) than the amount of physical memory.

        Alternately, it might make the program so slow the OS thinks it’s locked up.

        *the OS itself, for instance, needs to keep the stuff used to control accessing virtual memory in physical memory at all times. I vaguely recall hearing that some operating systems let user programs get more limited guarantees, but I can’t remember if that applied to any in common use.

    5. Lanthanide says:

      Shamus is suggesting that loading times are caused by stuff being loaded from the slow hard drive.

      While this is a large portion of where loading times come from (especially in older games), there’s also a very large CPU component as well.

      When a game loads a map for example, it has to load the map geometry/models and textures etc into memory. But it also has to set up a lot of structures to deal with pathing, the position of objects in the world, what the weather might be, setting up all the AI for the enemies etc. All of those things are largely independent of the hard drive, but simply take time for the CPU to chug through it all.

      I recently upgraded my CPU, motherboard and RAM, while keeping my SSD and graphics card the same – I did go from 4GB of RAM to 8GB and the new RAM is ‘faster’, although I wouldn’t expect it to make a big difference. I was pretty surprised by the massive loading speed increase I had playing Ori and the Blind Forest. On my old CPU (Q6600 at 2.4GHz) the game would take about 45 seconds from starting to getting to the main menu screen. With my new system (i5-4590 @ 3.3GHz), it only takes about 15. Windows Task Manager shows it is using 1GB of RAM, so it seems unlikely the RAM increase is playing any part in this speedup – it’s almost all just from the CPU alone.
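      That split is easy to illustrate with a toy Python sketch — the level data and the spatial grid here are invented for the example, not from any real game:

```python
import json
import time

# "Loading" a level is partly reading bytes and partly CPU work turning
# those bytes into live structures (object lists, spatial indexes, etc.).
level_json = json.dumps({"entities": [{"id": i, "x": i % 100, "y": i // 100}
                                      for i in range(50_000)]})

t0 = time.perf_counter()
raw = level_json.encode()          # stand-in for the raw disk read
t1 = time.perf_counter()

level = json.loads(raw)            # CPU: parse bytes into objects
grid = {}                          # CPU: build a toy spatial index for pathing
for e in level["entities"]:
    grid.setdefault((e["x"] // 10, e["y"] // 10), []).append(e["id"])
t2 = time.perf_counter()
```

      On most machines the parse-and-index step dwarfs the “read” step, and it speeds up with a faster CPU, not a faster drive — which matches the Ori result above.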

    6. Richard says:

      Windows* already does this automatically.

      When a program asks Windows to open a particular file on the hard disk and starts reading it, Windows guesses that it’s probably going to want the whole entire file sooner or later, and starts copying the whole thing into a bit of spare RAM.

      The program keeps reading, processing, and doing whatever it needs to do with the file.

      If the program keeps the file open for long enough, the entire file will end up in RAM.
      It stays there until the program either closes the file or Windows runs out of ‘spare’ RAM and decides that something else is more important.

      If the program writes any data into that file, it gets written to RAM (really fast), and won’t get stored onto the actual disk until Windows thinks it’s a good time to do so. This might be a long time later.

      Windows also does a small amount of prediction as to which files a user is going to want to open, and starts loading them before you even run the program…
      (The boot-up hard disk thrashing is partly Windows preloading some commonly-needed files. It’s mostly the DLL libraries used by lots of different programs.)

      The program has no idea that any of this happened.

      This leads to a few surprising facts:
      1) It is much faster to open one big file than to open many small files, even if they contain the exact same data.

      2) It may be faster still to compress that big file, if you don’t need all of the data inside it at once – even though it takes CPU power to decompress the bits you need right now.

      3) Opening and closing files is slow.
      Reading and writing the data is fast.

      4) Just because you saved it, doesn’t mean it’s actually on the disk.
      – You’ve also no idea how long it actually takes to save to disk.

      5) You can have too many files open at once.

      * Linux and OSX almost certainly do the same kind of thing, but I don’t know much about their internal guts.
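      Fact (1) is easy to poke at with a rough Python sketch — treat the timings as a shape, not a benchmark, since they vary wildly with OS, disk, and cache state:

```python
import os
import shutil
import tempfile
import time

root = tempfile.mkdtemp()
chunk = os.urandom(4096)

# The same data twice: 500 small files, and one big file.
small_dir = os.path.join(root, "small")
os.mkdir(small_dir)
for i in range(500):
    with open(os.path.join(small_dir, f"{i}.bin"), "wb") as f:
        f.write(chunk)

big_path = os.path.join(root, "big.bin")
with open(big_path, "wb") as f:
    for _ in range(500):
        f.write(chunk)

t0 = time.perf_counter()
parts = []
for i in range(500):  # 500 open/read/close round trips
    with open(os.path.join(small_dir, f"{i}.bin"), "rb") as f:
        parts.append(f.read())
many = b"".join(parts)
t1 = time.perf_counter()

with open(big_path, "rb") as f:  # one open, one big read
    one = f.read()
t2 = time.perf_counter()

shutil.rmtree(root)
```

      The bytes are identical either way; the difference is 500 open/close round trips versus one, which is exactly the “opening files is slow, reading data is fast” point above.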

      1. Daemian Lucifer says:

        The thing is, games are not just a single file, but a bunch of separate files (usually). So while it does store a bunch of stuff in RAM, it doesn’t store all of it*. Which is why load times on HDDs are longer than on SSDs or ramdisks.

        1. Richard says:

          Actually, almost all games do indeed pack all those itty-bitty files into a small number of big ones.

          Doom used WADs
          Freespace 1 & 2 used “Volition Packages” (.vp)
          Source Engine games use “Valve Packages” (.vpk)
          etc.

          Half-Life 2 puts almost all (2.55GB) of the game data into a mere 25 VPKs.

          Some of these are compressed packages, some aren’t – though even then many of the files inside are individually compressed, using various different lossy or lossless techniques as appropriate.
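          A toy version of such a package format in Python — this layout is invented and far simpler than real WADs or VPKs, but it shows the basic idea of an index plus packed payloads:

```python
import json
import struct

# Invented layout: a 4-byte little-endian header length, a JSON index of
# name -> (offset, size), then the raw file payloads back to back.

def pack(files):
    """files: dict of name -> bytes. Returns one package blob."""
    index, payload, offset = {}, b"", 0
    for name, data in files.items():
        index[name] = (offset, len(data))
        payload += data
        offset += len(data)
    header = json.dumps(index).encode()
    return struct.pack("<I", len(header)) + header + payload

def unpack(blob, name):
    """Pull one file out of the blob without touching the others."""
    (hlen,) = struct.unpack_from("<I", blob, 0)
    index = json.loads(blob[4:4 + hlen].decode())
    offset, size = index[name]
    start = 4 + hlen + offset
    return blob[start:start + size]
```

          The engine opens the package once, reads the index, and can then seek straight to any asset — which is why the “one big file” trick pays off so well.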

  7. Daemian Lucifer says:

    mid-aughts

    I know it shouldn’t, but this still sounds weird to me. While the more inaccurate “2000s” sounds just fine.

    1. Syal says:

      Should be mid-Naughts. Or mid-Naughtys.

      1. Erik says:

        Naughties. Definitely Naughties.

    2. Mike S. says:

      I like “aughts”. It’s short and unambiguous (unlike “2000s”, which could refer to the decade, century, or millennium depending on context), which is all I really demand of a decade name.

      1. Daemian Lucifer says:

        I like “aughts”. It's short and unambiguous (unlike “2000s”, which could refer to the decade, century, or millennium depending on context)

        Um, aughts is just as ambiguous as 2000s, because depending on the context it can mean 2000–2010, 1900–1910, 1800–1810, … which is true for every ‘X0 shorthand. And really, until we near 2100, the 2000s will not refer to the century, just as it will not mean the millennium until we near the 3000s (also the same reason why ’20 still means the 1920s and not the 2020s).

  8. Blackbird71 says:

    Shamus, overall the article was a good read. While I personally am probably more familiar with computer components than the average individual, I still appreciate these sorts of basic breakdowns designed for those with less experience (particularly as they come in handy as references to send friends/family members to read so they can understand what I’m trying to tell them).

    I did have a small complaint though; I hope you find the feedback useful, but if not then feel free to disregard my opinion. You introduce the “primer” section of your post as this:

    “First, a primer. Skip this if you know what’s what inside a computer case.”

    To me, this implies that if you have no idea what’s inside a computer case, reading the next portion will educate you as to such. The upcoming section is even titled, “A Newbie’s Primer on the Parts of a Computer and How They Apply to Games”, giving even further indication that you are about to impart information on the parts of the computer itself.

    And for the most part, the primer delivers, giving good explanations and descriptions of hard drives, memory, and the graphics card. Where it fell apart for me was this sentence: “I think everyone has a good idea of what the CPU is.” This is making a dangerous assumption about the reader’s knowledge, and is the type of assumption that should never be made in anything purporting to be a beginner’s guide, or “Newbie’s Primer,” or any other such introductory material. To be fair, you do follow it up with a one-sentence rundown of the CPU’s job description, but the “everyone has a good idea” statement can really put people off. As soon as they read that, someone who doesn’t understand it will often think along the lines of, “oh, he expects me to know this already, so I’m probably not going to follow what he has to say,” and then they tune out. Whether or not they would have understood everything else, they have now been mentally primed to expect failure.

    What’s more, you haven’t even explained what “CPU” stands for; any introductory guide should always define the common acronyms as well as common terms used to refer to the object (which you did for the GPU, so I don’t understand why you skipped it for the CPU). On page 2 in your discussion of cores, you mention the “processor” and “processors” without ever explaining that this is referring to the CPU. The only part of your primer that specifically refers to a component as a “processor” is your discussion of the GPU, and it would be very easy for someone truly unfamiliar with the parts inside the case to assume that you were specifically referring to the GPU in this part of the discussion.

    Of course I’m sure that most of your readers got through this article just fine, and understood what you intended without difficulty. Still, personal experience has taught me that there is always someone who doesn’t have the knowledge base that you expect, and it is better to assume ignorance when delivering basic, ground-level information, and to cover your bases as thoroughly as possible.

    Again, this is just one reader’s opinion, take it for whatever you feel it is worth.

    1. Zekiel says:

      Actually I agree.

      Even though I do have a reasonable grasp of what the basic components of the computer are, I definitely benefitted from the primer Shamus provided. And I think that what a HDD does (just sit there storing stuff) is easier to grasp than what a CPU does.

  9. Hector says:

    Da red wunz awwaez go fasta!

    1. Groboclown says:

      But what’s a computer discussion without a bad car analogy?

      “Giving your computer a red video card is like adding yellow to your car. It’s an instant additional 2 horsepower.”

  10. Amstrad says:

    While I understand why you did it, I think you’re remiss in stating that the hard drive has no effect on load times. Now that SSDs are so much more common (especially in gaming rigs), the fact of the matter is: installing your games to an SSD rather than a traditional HDD will absolutely result in faster launch and loading times. This is a reality that anyone building or upgrading a system must be aware of so they can make properly informed decisions.

    1. lethal_guitar says:

      I agree, especially if you consider that installing an SSD drastically improves what people usually refer to when they claim “my computer is so slow!” (assuming there are no other problems, like malware/bloated OS install etc.)

      After all, most everyday computer use is almost entirely IO bound nowadays. The vastly reduced OS and program startup times make such a big difference in how fast and responsive a system feels.

    2. AileTheAlien says:

      SSDs also lessen the effect of swapping stuff between memory and disk. So instead of grinding to a halt, your stuff just slows down to a crawl. I’ve experienced this at work. Running a VM without enough free memory now means I can close programs to free some memory, and then get on with doing things. Before, my machine would be so slow that all I could do was hope like hell that one of my clicks would be registered to kill the VM, or to reboot. ^^;

    3. Alexander The 1st says:

      I think you were missing the other part of that statement – he said that making the hard drive *larger* wouldn’t have any effect on load times.

      If you change the hard drive type, you can get performance gains – but it’s not like the regular volatile memory that is RAM where the more you have of it, the less memory swapping your programs do and the more you can run at any one time.

      Hard drive size doesn’t make it run better – hard drive bandwidth does.

      1. Mistwraithe says:

        I agree with Amstrad. Yes Shamus was talking about larger hard drives but that was because he was writing as though the main thing that was technically evolving with hard drives was size. That was certainly true for decades but there was a quantum leap between HDDs and SSDs. You could argue that this is too technical a point for the newbies that the article was targeting, but the article was also written to cover the topic of performance problems with old computers.

        In that context the difference between HDDs and SSDs is important, as upgrading from an HDD to an SSD will make a far more noticeable difference to your general computer speed than most other changes you could make.

        For a while there SSD speeds were also increasing at a sufficiently fast rate that you could notice a further significant speed boost from replacing an older SSD with a new SSD (we’re probably approaching the point that only people with specialist data needs would notice a future upgrade from the current generation of SSDs, they can already pretty much saturate every other component of most computers).

        1. boota says:

          The thing is, though, when it comes to consoles, the arrival of SSDs is pretty much irrelevant, since none of the consoles feature an SSD (yeah, I know that it’s possible to install one, but it’s not something that developers can spec their game for, nor something that someone who’s not very technically skilled is going to do), so not including SSDs in a console-centric discussion was probably the right thing to do.

  11. Florian the Mediocre says:

    Out of curiosity: Did anybody actually skip the primer bit?

    I almost never listen when an article tells me to skip part X if I already know Y, and I’m wondering if that’s as common as it feels to me.
    (I guess reading what someone writes about something I’m familiar with helps me judge their authority on parts I know less about.)

    1. Rosseloh says:

      I always read stuff I ostensibly am supposed to know. Helps me get an idea of where the writer stands, and if I happen to find something (that makes sense) that I didn’t know before, we’re both better off for it.

      Nothing in this article was new to me, but we all already know Shamus isn’t a hardware guy.

    2. Joe Informatico says:

      I know the basics (if not much more), but I’m occasionally called upon to explain them to others. So it’s helpful to see how other people explain things for the layperson, especially someone like Shamus who’s pretty damn good at it.

  12. MrGuy says:

    I really like the car analogy because it so perfectly describes a lot of hardware snobs I’ve met.

    I’ve met more than a few people who brag about their car having a 10-to-1 compression ratio or dual overhead cams without really knowing what those things are or why they’re better (i.e., what the specific effect on performance is). All they know is “X is supposed to be good, and I just paid a lot of money for a lot of X.”

    It reminds me of the Megahertz Wars and their spawn so much it’s painful to remember. “My machine has 512M of L2 cache!” Wow, that’s great. What kinds of things do you use it for regularly that need that much L2? Blank stare.

  13. Rosseloh says:

    My dad got a popup saying his hard drive was full and asked if he needed to buy more memory

    While I try not to be smug about it (customer service and all), I do wish the distinction was more commonly known among the non-tech populace. I don’t know how many times I’ve had to correct people about this, because making sure they understand the difference is part of my job. It gets old after a while.

    but everyone else just sort of muddles through by absorbing it from the people around them

    Hear hear. My roommate, bless his soul, was for the longest time ADAMANT to me that the tower unit was called the hard drive. Here I was, the actual computer nerd, and he’d argue with me that his version of the terminology was right because “that’s what I was told by {guy from our church who knows some stuff but not as much as he could}.”

    …but I could go on all day about the crap we “retail” tech people deal with. And it’s not even what I went to school for…

    1. modus0 says:

      It’s even worse when you realise there are people who think you can download more RAM.

      1. Lanthanide says:

        Possibly because there are various dodgy ‘apps’ that advertise just that. They aren’t as prominent as they used to be a few years ago (especially during Windows XP times), but I have seen ads that literally said you could download more RAM.

        Generally they were ‘memory acceleration programs’ that would kill background tasks and thereby ‘free up’ memory.

        If your system was seriously low on memory and forced to use the swap, then this really would improve your computer speed. But if your computer wasn’t swapping, then it’s unlikely you’d get anything other than placebo out of it.

      2. Rosseloh says:

        To be fair, downloadmoreram.com is completely legit. A legit JOKE site, mind you, but legit.

        (Or it used to be, it’s been a long time since I visited.)

        1. mhoff12358 says:

          Yeah, I haven’t checked it out in a while either. I downloaded a /bunch/ a few years ago just in case something happened to the site and haven’t needed more.

  14. John says:

    I’d like to build a new PC sometime within the next year, so I’ve been reading up on the minimum system requirements for the games I’d most like to play. My not-at-all-rigorous survey of Linux games on Steam and GOG has suggested that I am much more likely to be bottlenecked by the graphics card than by the processor. I usually prefer strategy and role-playing games, and many of these seem to require no more than a dual-core processor with a clock speed somewhere between 2 and 3 GHz. That’s an easy requirement to meet. Five years ago, I did it accidentally when I built a PC featuring the very cheapest Intel dual-core processor that I could find. The processor requirements are also relatively easy to understand. All else being equal, more cores are better than fewer and more GHz is better than less.

    Graphics cards, on the other hand, are tricky like the devil. Generally, what I do is find a game’s minimum requirement and look it up on a benchmarking site. I then compare the requirement’s benchmark value to the benchmark value of a card that I might purchase. I’ve figured out–I think–that if I spend about $70, Pillars of Eternity will probably work. If I spend, say, $100, it will pretty much assuredly work. This is an obviously imperfect process. For example, as a person with opinions about Tolkien I very much do not want to play Shadow of Mordor. But if I did, I have no idea how much I would need to spend on a graphics card because I still haven’t figured out what card to look up when Steam says “nVidia 9xx series card or better with driver version 352.21 or later”. I’d forgo all the expense and confusion if I had not been so frequently disappointed by Intel integrated graphics.
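    The lookup-and-compare process described above can be sketched in a few lines of Python. The benchmark scores here are invented placeholders for illustration, not real benchmark data:

    ```python
    # Hypothetical sketch of comparing a candidate GPU against a game's
    # minimum-spec card using benchmark scores. The numbers are made up.
    BENCHMARK_SCORES = {
        "minimum_spec_card": 1200,  # card named in the game's requirements
        "candidate_card": 1550,     # card you're considering buying
    }

    def meets_minimum(candidate, minimum, scores=BENCHMARK_SCORES):
        # A candidate passes if its benchmark score meets or beats the
        # score of the card named in the minimum requirements.
        return scores[candidate] >= scores[minimum]

    print(meets_minimum("candidate_card", "minimum_spec_card"))  # True
    ```

    As the comment notes, this is imperfect: benchmark scores compress many workloads into one number, so a card that “passes” may still struggle in a particular game.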

    1. Scerro says:

      Graphics cards are so good these days: get an entry-level gaming card (Nvidia X50 GTX | Radeon R7 260X) and you’re good.

      You can always drop in a new card every 2-3 years if you’re feeling like it’s hurting you. If your processor/RAM is hurting you, you might as well build a new PC and have the old one to sell/give away/pass on to your kids or family.

      My personal preference is to get a really solid CPU/Mobo/RAM when making a build. Upgrading $70 here and there is way easier than dropping a whole new machine.

      Of course, I think you’re looking at a budget machine more than I am. Right now $700-900 is a real sweet spot for a machine, and in my opinion is your best choice for cost to performance.

      Keep in mind a lot of games will max your CPU. Cities: Skylines at higher speeds eats a ton of CPU power, and Civ V is also a CPU hog. I have the best Core i5 from last gen and Cities: Skylines with higher populations pretty much maxes all my cores.

      If you’re looking for really good advice on GPU tiers, just google “Tom’s Hardware best graphics cards for the money”. It has good layouts of prices and cost/performance ratios.

      1. John says:

        I looked at several “best graphics cards for the money” articles to come up with the $70 and $100 figures. The recommendations were all pretty similar.

        This is honestly the most research I have ever done before buying a PC. Two of the three PCs I’ve owned have been laptops ordered from Dell and HP. You don’t get too many GPU choices in those circumstances, and so my rule of thumb has been to go one step up from the bottom. When I built my current desktop, I had an extremely limited budget and couldn’t justify a discrete graphics card.

      2. Zekiel says:

        Is that really true? I’ve got a Radeon R7 250 (which is a couple of steps behind the 260X) and it is below the minimum specs for a lot of games from the last couple of years (Wolfenstein NO, Witcher 3, Shadow of Mordor etc etc). In the case of Witcher 3, the 260X is a couple of steps below the minimum graphics card required. So are those minimum system specs just a lie?

        1. They can be, or your standards for “does the game work” might be different from the developers’. I played City of Heroes on a graphics card one rung below min specs, and so long as I didn’t mind not being able to see anything in character creation, and could put up with the world looking quite weird (textures were fine, color was not, all characters were the same colors as the backgrounds with correct textures), it was fine. Not optimal, but playable.
          Tried the same with Oblivion, hello white screen with cursor. Not playable. (Of course we all know Oblivion was not optimized well).
          I suspect most min specs are based on “well, we don’t want the game to look or play worse than this,” instead of “will it run?”

        2. Robyrt says:

          Each game is different. Some games have a hard limit below which they will refuse to run at all, or will have really bad graphical problems (because your card doesn’t have the bling mapping it needs). Some games will valiantly attempt to run on a toaster. Total Annihilation could famously run on a ten-year-old computer, but it would take 20+ minutes to show the main menu.

    2. 4th Dimension says:

      As Scerro said, today pretty much everything will allow you to run things fine. I would like to add a couple of things.

      When I shop for graphics cards, I basically do it by buying something in the €75-100 range. Pretty much everything in that price range will allow you to run all games fine, even though you might need to knock down the details in some of them. But even with reduced details, games today look wonderful. If you are a westerner, what this boils down to is going to Tom’s Hardware and choosing the best-performing card in this price range.

      Another thing I need to point out is that clock speed is not really a good indicator of core performance. Remember, a long time ago we hit the 3GHz mark and found out that increases in clock speed past that don’t make processing speeds much better. Ever since, most cores occupy the space between 2.5-3GHz even though there have been improvements in processing speeds. These improvements were made possible not by increasing the clock but by building processors more efficiently, so operations take fewer clock cycles. Basically, what it boils down to is that a 4-core 2.5GHz processor could quite easily outperform a 4-core 2.7GHz processor, if the first one is a generation newer. Again, figure out how much you are willing to spend, and find the best-performing processor in that price range.
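      The point here, that effective speed is clock speed times work done per clock (IPC), can be sketched in a few lines of Python. The IPC figures are invented for illustration:

      ```python
      # Hypothetical illustration: throughput depends on both clock speed
      # and instructions-per-clock (IPC), not clock speed alone.
      # The IPC numbers below are made up for the example.

      def effective_perf(clock_ghz, ipc):
          # Rough single-core throughput, in billions of instructions/second.
          return clock_ghz * ipc

      older = effective_perf(2.7, 1.00)  # older generation: higher clock
      newer = effective_perf(2.5, 1.15)  # newer generation: ~15% better IPC
      print(older, newer)  # the slower-clocked newer chip comes out ahead
      ```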

    3. GTB says:

      If you are intent on running Linux, I would wait until the Vulkan API has landed and standardized before buying a new computer, just to make sure you get hardware that supports it best. It’s going to be a game changer for Linux… literally, I guess.

      1. John says:

        I have no idea what that is. Grumble, grumble, more work, I suppose.

        1. Richard says:

          Vulkan is the replacement for OpenGL.

          It’s a new way to talk to the graphics card, announced earlier this year.

          However, it’s not going to get fully ratified any time soon, and actual properly running hardware and drivers will come a little after that.

          My personal guesstimate is that Vulkan is a year to 18 months away from ‘mainstream support’, and all GPU hardware currently on the market will support it, because otherwise there won’t be any hardware for the GPU manufacturers to test against.

    4. Humanoid says:

      It might be technically okay on the requirements front, but buying a dual-core today is the wrong thing to do, and has been for some years now (including the whole of this decade). When CPUs start falling short in terms of being able to run modern programs, it won’t be because of single-threaded performance. The best advice is to buy the best upper-mainstream CPU you can get, and keep it for the next 5+ years. You can’t take this approach with video cards, but it works very well for CPUs.

      A simple example is the extremely popular 2011-vintage “Sandy Bridge” second generation quad-core, which still holds up very well today against a “Skylake” sixth-generation quad-core. Clock speeds are broadly the same since then: an i5-2500K ran at 3.3GHz and a newly released i5-6600K runs at 3.5GHz. IPC, that is, instructions per clock, has improved about 5-10% per generation (closer to 5% really).

      So what we have is a four-generation gap in this example, plus about a 5% clock speed increase: an estimate of performance gain to expect is 4×5% + 5% = 25% improvement using this rule of thumb. I Googled the actual tested difference as per professional reviews between the two and lo and behold, that number is pretty damn close.
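      The rule of thumb above can also be computed by compounding the per-generation gain rather than adding it, which lands in the same ballpark. A quick sketch (the ~5%-per-generation figure is the comment’s own estimate):

      ```python
      # Rule-of-thumb speedup estimate: ~5% IPC gain per generation,
      # compounded over four generations, plus the clock bump from
      # 3.3 GHz (i5-2500K) to 3.5 GHz (i5-6600K).
      generations = 4
      ipc_gain_per_gen = 0.05
      clock_ratio = 3.5 / 3.3

      speedup = (1 + ipc_gain_per_gen) ** generations * clock_ratio
      print(f"estimated speedup: {speedup:.2f}x")  # roughly 1.29x
      ```

      Compounding gives about 29% instead of the linear 25%, close enough that either version of the rule of thumb tracks the reviewed numbers.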

      _____________

      For GPUs, the opposite advice applies. You get better results buying a mid-range card every couple of years instead of buying a monster one and hoping it lasts for 4-5 years. While the rate of improvement in graphics tech is nowhere near as fast as it was last decade, it’s still fast enough that a mid-range card this year is as fast or faster than the flagship of two years ago. Actually in some cases, the mid-range card this year *is* the flagship of two years ago.

      EDIT: Another reason this approach works is because GPUs are by their nature drop-in upgrades. You buy a modern video card and can drop it in a PC from several years ago and it’ll work, PCI-E is designed to be backwards compatible. It’s therefore a trivial thing to replace regularly. On the other hand, if you buy a new Intel CPU today, it won’t even fit in a brand-new, newly-released motherboard from earlier THIS year. Intel love their planned obsolescence and you can expect to need to buy a new motherboard Every. Single. Time. Don’t play that game, buy a good CPU and keep it for as long as possible.

    5. Mistwraithe says:

      Like several others have said, I wouldn’t skimp on the CPU, particularly if you like strategy games. The next generation of strategy games is likely to farm AI across multiple threads (some already do), while single-core speed (MHz) is the limiting factor while waiting for the AI to finish its turn in a lot of the current generation of games (e.g. Civ 5). It is also easier to replace the GPU; in fact, depending on your exact requirements, you could possibly start with a top-level Intel CPU and no extra GPU at all for a while (i.e. use the built-in GPU), then add an extra GPU in a year or two. (Disclaimer: I don’t know anything about Linux, so I don’t know if it has decent drivers for the Intel on-chip graphics.) OTOH if you want Witcher 3 to look great you need to aim higher ;-)

  15. Scerro says:

    I feel like this imgur post really sums up building a PC: http://imgur.com/gallery/2liGJJp

    Building a machine is just so intimidating. Any reasonably smart person who’s careful can put together the seven-piece puzzle.

  16. 4th Dimension says:

    “But the awful thing about computer hardware is that this knowledge isn’t usually taught in schools as part of the general curriculum. (I’m sure this varies by region, but I’ve never met anyone who has taken some kind of mandatory computer literacy class. I imagine this will change in the coming years. But maybe not. We all drive cars but only a fraction of us can open the hood and say what’s what.)”

    You don’t get such classes in school*? I would expect the quality of the class to vary based on the teacher and the school, and that people forget since they are not interested in computers past how to open Facebook at that stage of school, but I expect a basic computer science class to be present in basically any curriculum. It wouldn’t cover much of the science, but it would talk a bit about the history of computing, the basics of construction, and how to perform basic operations in Windows and Office.

    * Well, not you, Shamus, since it has been a while since you were in school, but students of this millennium.

    1. guy says:

      I went to one of the top US public school systems, and we did have courses like that, but purely as electives. The supply of electives was very limited, so most people didn’t take those.

      1. Robyrt says:

        I went to a science & tech magnet school, and the Computers 101 course was mandatory for 9th graders (age 15). It covered basic hardware, programming and electrical engineering concepts – making the computer play its bell tone, making an oscilloscope do funny shapes, that sort of thing.

      2. 4th Dimension says:

        Huh. Comp Sci, or as it’s called over here, Informatika, is one of the basic subjects that all high schoolers must take during the first year of high school (~15 years old), along with others like Our Language, English Language, Math, PE, Physics, Geography and History. Other subjects can vary depending on which school or course you are attending, but these are a must.
        Now, the level of what is required from the student to pass can vary from school to school. In some it’s taken seriously while in others they mostly play games, but the subject itself is mandatory.

        1. guy says:

          In the US, the core subjects Math, English, Science, and History are near-universally required for four years of high school and some years of one foreign language are usually required. Requiring some PE has also been getting more common. In general, everything else is elective. This all varies by state and to an extent by school.

    2. Gravebound says:

      I went to a small-ish middle school (early ’90s) and we learned the basics (the very basics) of how computers operated (the teacher wasn’t very good at it, though, and few of the students thought computers were important in their lives :P). Nothing about assembly or the like, just the functions of the various components. We also learned binary code and basic DOS operations.

      Being a smaller school, there were only two non-‘faculty only’ PCs in the whole place, and one classroom of about twelve Macintosh SE/30s.

    3. Xeorm says:

      I graduated high school in ’08 and didn’t get any knowledge of the sort in my classes, (aside from the programming technical courses I took as part of entry-to-college stuff). If they taught us computer stuff, it was usually how to use it and general safety. Knowing what bits of the hardware do what seems kinda useless for the general population anyway.

      Much as computer geeks like knowing that stuff and dealing with it on a regular basis, it’s not surprising to see other people not care. I know I don’t care one bit about the parts in my car. Just as with computers for most people, all I care about is performance, maintenance, and safety. I don’t need to know why part X is better than part Y, just tell me which one I should buy. And give me the little warning symbols so I know to take it to someone who can fix it.

  17. drlemaster says:

    Looks like my graphics card just died, so I am reduced to using the onboard one for now. I did manage to scavenge one from the scrap pile at work. In honor of Shamus’ old post, I grabbed a red one.
