The New Computer

By Shamus Posted Tuesday Jul 30, 2019

Filed under: Personal 56 comments

Three months ago someone gifted me a gaming PC. I mentioned it on the podcast, but I know not everyone listens to the podcast. So I thought I’d give the machine a proper write-up.

The machine in question is a Corsair One. This is a pre-built gaming PC with a custom case. This machine is a first for me in a lot of different ways:

First time with a liquid-cooled machine.

It’s quite a slick setup. The tubes run over one side of the case to cool the CPU, then flow to the other side to pass over the GPU. It’s amazingly quiet. My old PC was a box of large whirring fans, while the new one doesn’t seem to make any noise at all, even under load. If it weren’t for all the lights, I wouldn’t be able to tell it was on.

First time with a really high-end card.

Those screens on the left are showing real reflections. If I was standing in the right spot, I'd be able to see my reflection in the glass.

The new machine comes equipped with a GeForce RTX 2080. I tried to put the card through its paces, but then I realized I didn’t have anything that could remotely challenge it. Even at 2k resolution and 144fps, I couldn’t get Shadow of the Tomb Raider to drop frames.

The RTX means it can handle NVIDIA’s new raytracing technology, which means I got to try the Quake 2 raytracing demo. Really interesting. I’ll probably do a column on this soon.

First time owning a machine with NO removable media.

It’s weird to have a machine that can’t read any disks, but this seems to be the norm these days. I picked up a USB DVD drive so I can install the disk-based programs I still use. There aren’t many.

First time since the 90s that I was able to appreciate framerates above 60fps.

I can't show you a screenshot that will illustrate smooth framerate, so here is another shot of Quake 2 with raytracing.

I’ve said many times in the past that I can barely tell the difference between 30fps and 60fps. I remember really noticing the difference back in the mid-90s, but over the last few years I’ve messed around with higher framerates and found they didn’t make much of a difference. I’ve always blamed this on my aging eyes.

As it turns out, I can tell the difference. It’s not even subtle. I’m not sure why I struggled on the old machine. Was the old monitor not showing the updates cleanly? Was 60fps too uneven on the old machine? I have no idea, but it makes a huge difference now. In fact, I can even tell the difference between 60fps and 144fps. It’s not as big as the jump from 30 to 60, but it’s still a noticeable improvement.

I’m trying hard to not become a framerate snob, but I’d really hate to go back to 30fps.

First time with a 2k monitor.

The large monitor on the left is the old one. The one on the right is the new 2k gaming monitor. You can't see the difference in this bloom-covered photograph, but it's pretty stark in person.

I honestly thought “gaming monitors” were a bit of a swindle. My standard monitors always looked just fine to me, so I couldn’t imagine what sorts of benefits there were to a gaming monitor.

But now I’ve got a 2k gaming monitor next to my old 1080p office monitor, and the difference is pretty drastic. The most notable thing is the higher dynamic range. On the old monitor, black areas of an image are dark grey, while the new monitor shows them as proper black. This makes the colors pop more. The old monitor is also quite a bit dimmer. You’d think it would be harder on your eyes to stare at brighter whites for extended periods of time[1], but that doesn’t seem to be the case.

Oddly enough, the extra resolution helps my work more than my games. I can have a lot more text on screen without sacrificing readability. This is really nice in complex applications like music production and programming tools. Nothing helps your workflow like being able to see more stuff.

Wrapping Up

This upgrade came at just the right time. My Wolfenstein series last year had a lot of screenshots that showed the game in “Ultra Crap Mode”. This was unfortunate, since one of my major complaints was that the game looked artistically terrible and demanded too much power given the substandard visuals. Playing on lowest settings left me open to the easy dismissal that the game would look just fine on proper hardware. To really sell my thesis, it would have been better to have a high-end machine so I could say, “This is how the game looks on max settings, and it’s still dull and repetitive.”

This upgrade has made my entire job easier and significantly less stressful. Thanks again to the benefactor who provided it. I’m grateful for this thing every day.

Footnotes:

[1] Particularly since I keep my office dark.




56 thoughts on “The New Computer”

  1. boz says:

    Framerate is one of those “ignorance is bliss” issues. Until you’ve seen the difference and played with it, you never know what you’re missing. It’s too easy to get used to mediocre LCD performance if that’s the only thing you’ve ever known.

    1. Mattias42 says:

      Used to scoff big time at wide-screen, 1080p monitors, though. Really didn’t see the point… the same games, videos & stuff, but you need an even more beefy rig, for less performance? Does not compute.

      Until my old 4:3 postage stamp of a screen got so smudged with old dirt & grease, that you could barely read with it, and I begrudgingly got myself a cheap, curved one, with decent reviews.

      (A Philips 278E8QJAB / 27″, if anybody cares slash is looking for recommendations. Think the old one was… 14 inches? Very old, very tiny little LCD thing, at any rate.)

      And, MAN, what a difference! You just have so much space! And it’s so much easier on the eyes! Doubly so since the curve makes your stupid monkey brain think it’s even flatter than an actually flat screen!

      Really one of those things you can never go back from, once you’ve experienced it yourself, yeah.

      Has really made me curious about 2K and 4K… but alas, that’s simply out of my budget right now.

    2. Yeah, I was recently playing Breath of the Wild using CEMU and had to go back to 60fps due to some technical glitches, and my God, the difference was night and day, even with a pretty crap monitor.

  2. Karma The Alligator says:

    So it’s NOT the first time you have a computer that looks like it’s been used as the sole light source in a disco?

  3. Redrock says:

    Eh, I still maintain that anything above 60fps is nonessential, outside of some really hardcore fps gaming. But, then again, these days I do my PC gaming on my living room TV and use a controller 99% of the time, so what do I know.

    I am getting stronger and stronger urges to upgrade these days, but with a GTX 1070 it’s still hard to justify. I can do 1080p on most games, and it looks great on my 4k TV. Hell, with so many games including resolution scaling these days I can do 60-70% of 4k with decent framerates. I’d like a new CPU – made the mistake of getting a 6600k back in the day, and the poor thing really struggles with some of the worse-optimized open-world stuff, like Ubi titles and FF 15. But a new CPU means a new mobo, and then you might just as well change up everything but the RAM.

    1. Raygereio says:

      Eh, I still maintain that anything above 60fps is nonessential, outside of some really hardcore fps gaming.

      Same.
      Imo framerate has become a marketing thing where everyone just wants it to be higher because we’re trained that higher == better, without really taking into consideration what makes a moving picture actually look good.
      The real answer is of course that stability is the thing that’s actually important. It doesn’t help that benchmarks often give only the max or average fps. Okay, that card achieved 160fps in a Doom4 scene. Great. Those dips down below 24fps will still make it look annoying and shitty.
      Yeah, if you directly compare a stable 30fps with 60fps, 90fps & 120fps, the scenes at the higher framerates will look more fluid. But on its own, the 60fps gameplay will look good. And honestly, even the dreaded 30fps will give you a fine gameplay experience. As long as it’s stable.

      If you have the money and want to spend it on the necessary hardware, go nuts. But I never felt it was worth the increased cost to push beyond 60fps.

      1. Duoae says:

        Yes, but most reviewers worth their salt moved to 90th/10th percentiles and min/max fps with an average literally years ago. I don’t know which sites you’re visiting for your technical reviews but they are past their prime… Or are you referring only to “benchmarks”? In which case, do you only review the games you buy? It’s a strange statement to my understanding of the issue….

        It also depends on the context of play as well – even though these crappy (sorry, they’re not to my taste) popular experiences like CoD, Fortnite, Overwatch, TF2 and PUBG are not that graphically intensive, they are competitive games and higher frames result in better performance (this is academically tested) at a given latency. Even worse, some games couple output with their fps…. meaning that you can (or could) deal more damage over the same period of ‘real’ time. Of course, this is an exception to the standard now but still sometimes happens.

      2. Wide And Nerdy says:

        If you have the money and want to spend it on the necessary hardware, go nuts. But I never felt it was worth the increased cost to push beyond 60fps.

        Having an RTX 2070 myself and being able to experience 144fps on my G-Sync monitor, I’d say the difference is noticeable, but yeah, you can live without it. I do like having at least 60fps in games, though, and if the framerate gets below that I find it annoying, especially since, unlike Shamus, I actually bought my video card. The experience is just so much smoother at 60 and above.

        I find it really annoying when I get screen tearing, which happens worryingly often despite the fact that tearing probably shouldn’t happen at all on a G-Sync monitor (like, what am I paying for?).

        Where having this card has really paid off for me and where it will pay off for Shamus if it hasn’t already is with VR gaming. These cards deliver rock solid smooth VR gaming experiences even on Fallout 4 VR and Skyrim VR where I’ve piled up the graphics mods. And remember VR has to render twice at 80 to 90 fps depending on which headset you’re using.

      3. DeadlyDark says:

        Speaking from experience, the difference between 60 fps and 144 fps is noticeable even in Windows itself (I hope Shamus didn’t forget to switch the frequency for the Windows desktop). Games being able to run at such frequency is also a noticeably nice thing. Though my video card and CPU are a limiting factor here.

  4. John says:

    Now that you have a computer with a radiator, automotive analogies are more apt than ever!

    Except that the automotive industry is moving toward radiator-less electric vehicles. Curse you, March of Progress!

    1. Stanley Raymond says:

      Electric motors aren’t 100% efficient, though. If they’re big enough, they’d still need some type of cooling, since their heat would scale with their volume, but their ability to dissipate heat would scale with their surface area.

      1. John says:

        That is a more serious reply than my deliberately silly comment deserved. However, I’ll just note that the pure electric vehicles I see on the street–various Teslas, the Nissan Leaf–don’t have visible radiator grills. The largest electric vehicle of which I am aware, the Rivian electric pick-up truck, may or may not have one. There’s a thin strip just above the front bumper that could possibly be a very small radiator grill. It’s hard to tell from the images available to me. The Rivian truck is not a production vehicle–i.e., you can’t buy one yet–and the final version may be different.

        1. Raygereio says:

          Electric cars like the Tesla or Leaf do have radiators. The concept of a heat exchanger isn’t going anywhere.

          What is being toyed with is the grille as a design concept. Electric cars don’t need the huge amount of air flow a grille provides for combustion engines, so car designers have some freedom to play around with it.
          On the other hand, a car like the Jaguar I-Pace still has that huge grille because the grille is one of the design features that makes a Jaguar car a Jaguar.

  5. Dev Null says:

    If it’s not a sensitive question; do you know what the pricetag on that rig is? I’ve been a last-gen gamer for decades and never felt like I missed the cutting edge, but it’s always good to know what it would cost for the upgrade…

    1. Geebs says:

      The Corsair One is about $2500-$3500, depending on the GPU. You could build the same box for about $500-1000 less if you don’t care about the form factor.

      1. Rabbit Sage says:

        This is what I did, although my PC isn’t nearly as high-end as a Corsair. I bought the components for my PC individually and spent about $700-$800 altogether.

        I admit it was the best investment I ever made into a good PC, but once this one gives out, I’m already saving up for a better, ultra-2k PC, like Shamus’ new rig.

    2. Dreadjaws says:

      It’s right there on the link he gives.

  6. Stanley Raymond says:

    But now I’ve got a 2k gaming monitor […] The most notable thing is the higher dynamic range.

    The resolution technically isn’t related to the dynamic range, although there might not be any low-res HDR monitors on the market.

    1. Shamus says:

      I was attributing the HDR to the “gaming” part of the monitor, not the “2k” part.

      Although, it seems like really strong HDR would be MOST important for an artist. For gaming, it’s nice, but for (say) image manipulation it’s essential. But I guess “2K Artist Monitor” doesn’t sound terribly sexy to marketers.

      1. Stanley Raymond says:

        Thanks; the phrasing was a bit ambiguous.

        As for artists vs gamers and HDR – for your average action game, I think you’d be correct. However, any kind of game that deals with dark shadows would benefit from HDR. The stealth parts of the Metro games often have light-bulbs right next to pitch black shadows. Horror games of all varieties might also benefit, especially if they can get the player reliant on their super-vision, and then slowly turn down the HDR when monsters are nearby, so they can more easily spook the player from the now-murky shadows.

      2. Calmre says:

        They’ve gotten heavily into “creator”-branded products over the last few months. Motherboards, monitors, even laptops marketed toward and branded as “creator” options. It’s like gaming-branded stuff but with double the premium of the plain “gaming” stuff. It’s quite ridiculous, actually.

  7. Duoae says:

    I’m not a snob about framerate or resolution (I play on consoles as well, after all!), but I don’t really see as big a benefit past 1440p and 100-120 fps. For me (although I don’t own a FreeSync/G-Sync monitor that can do 1440p and 120 fps), I think 120 fps would be a good middle point to sync with movies/TV and be ideal for gaming as well.

    Most films are around 24-25 fps (some exceptions are 48 fps, but that’s still a multiple), TV is basically 30 fps (okay, 29.97) and games are usually trying to lock to either 30 or 60 fps, with some games uncapped.

    120 fps is 5×24 fps, 4×30 fps, and 2×60 fps. It’s the perfect frame rate for marrying our current systems. So WHY IN THE NAME OF [insert figure of cultural relevance to yourself here] are high-end monitors 144 Hz?
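    The divisibility argument above is easy to sanity-check with a short script (a sketch; the list of candidate content framerates is my own choice, not from the comment):

```python
# For a given monitor refresh rate, list the common content framerates
# that divide it evenly. Even division means each source frame is shown
# for a whole number of refreshes, i.e. no judder.
def clean_rates(refresh_hz, content_fps=(24, 25, 30, 48, 50, 60)):
    return [fps for fps in content_fps if refresh_hz % fps == 0]

print(clean_rates(120))  # [24, 30, 60]
print(clean_rates(144))  # [24, 48]
```

    So a 120 Hz panel lines up evenly with film, 30 fps TV, and 60fps games, while 144 Hz only lines up with 24 and 48 fps content.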

    Now, here’s the confusing part (and maybe the answer to Shamus’ original problem seeing the difference between 60/120/etc frame rates): the monitor and TV manufacturers mis-state refresh rate on purpose. So, it’s possible that Shamus didn’t see the difference of higher frame rates because he was comparing them on a monitor with a locked refresh rate which was lower than the frame rate.

    How does this fit into the monitor 144 Hz question? I’ve no idea… but I’m not sure about the relation between gaming monitors and refresh rate “trueness”. So, is a 240 Hz gaming monitor working at 120 fps with inserted black frames or somesuch? Does that mean that a 144 Hz monitor is actually only displaying 72 fps of actual content? It’s very confusing….

    As for what I *AM* a snob about? FOV. Goddamnit!! Developers better give me at least a 100 FOV or I’m annoyed – even on console I prefer a bigger FOV than the measly 50-60 they usually provide…. it’s so claustrophobic! (I’m not actually claustrophobic….)

    1. Guest says:

      If the monitor frame rate is higher, you get duplicate frames. Blacks would flicker. If the frame rate is lower, yeah, it gets capped.

      Syncing up arithmetically really doesn’t matter, especially since frame rate is constantly changing, and the point of that g sync stuff is to prevent tearing by syncing up the refresh rates.

      1. Duoae says:

        Well, in theory yes. But since frame pacing/ timing is a thing, it doesn’t work out that way.

        Duplicate frames work when all frames are delivered at the correct rate and without any dropping. So, even with a 60Hz display, there can be irritating issues with frame stuttering and such.

        (See some digital foundry articles on eurogamer, they’re one of the most technically informed groups I’ve come across)

        In regards to black frame insertion resulting in flickering… if you’d read the article, you’d see that it doesn’t, unless the implementation is done badly. It actually reduces the blur caused by the interpolation our own visual processing does.

        And syncing up arithmetically, as you suggest, does matter, because if the implementation is not delivering pure rendered frames, it will cause image stability issues.

  8. deiseach says:

    Is it corny for me to say that this makes me happy? You deserve it, dude, you really do.

    1. Simplex says:

      It’s not corny if it’s true ;) It almost literally pained me that Shamus was slumming on a low-end PC all these years because so many other expenses were of higher priority. So that PC was a godsend and the person who gifted it deserves praises.

  9. Hector says:

    I want to do a joke social media post like: #Jealous. That’s a nice setup there. I’ll have to upgrade myself next year, but nothing like that’s in the cards.

    To whomever donated: You’re breathtaking!

    1. Duoae says:

      Keanu dedicated the PC to Shamus?!!! :D

    2. Wide And Nerdy says:

      I just saw “You’re breathtaking” the other day at the end of some kid’s otherwise sincere post about his crippling issues and thought it was just an odd word choice on his part. Now I know it’s a meme.

      Also Shamus, glad you’re going to be able to keep up with the Joneses for a while. You deserve more than just current gen hardware for the work you do.

      1. Hector says:

        It came from Keanu Reeves’ appearance for Cyberpunk at E3.

        And him portraying Johnny Silverhand makes Johnny Mnemonic even funnier in retrospect.

  10. Wide And Nerdy says:

    I’ve always been nervous about introducing liquid cooling to my rig because an accident could damage the machine, but now I want to try it. I’d heard about the benefits in terms of improved cooling, but it never occurred to me that the machine would run more quietly due to a lack of fans. Plus I’m already on all solid-state drives, so that should eliminate the rest of the noise the machine makes. Of course, I keep some kind of media on in my room most of the time, so I’d seldom be able to appreciate it, but it still might be worth it.

    1. Len says:

      It’s a common misconception that water cooling is quieter or better at dissipating heat. Water cooling also requires a radiator and fans, and can be more expensive. For some numbers on temperatures, see https://youtu.be/hr0qLLv3dKc

      The main advantages of water cooling are that it looks better and is generally more compact, so it can fit into smaller cases more easily.

      1. Richard says:

        My previous build was liquid cooled, my current one is air-cooled.
        The new system is slightly quieter, however the old system dissipated considerably more heat. The new PC is just more efficient overall so it didn’t need the same cooling capacity.

        Direct air-cooling is limited to what will physically bolt to the top of the chips.
        A direct-air heatsink generally tops out at around 1kg of ‘stuff’ in total before it gets so big as to be either impossible to transport or just impossible to fit at all without breaking the motherboard, and the ‘radiator’ section has to fit around the motherboard components.

        Liquid cooling doesn’t have those physical limits.
        You could have several tonnes of liquid in the system and a radiator the size of the Hudson. Or in the Hudson, should you so desire.

        A pretty sensible size of liquid system might have 1500ml of liquid, and a 1kg radiator. That’s 2.5kg of ‘stuff to heat up’, and the radiator can be placed outside the case and so have a much larger surface area in much cooler air.
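        As a rough back-of-the-envelope for what that thermal mass buys you (the specific-heat figures and the 200 W burst are my own assumptions, not from the comment):

```python
# Estimate the temperature rise of a cooling loop absorbing a burst of
# heat, assuming ~4186 J/(kg*K) for water and ~900 J/(kg*K) for an
# aluminium radiator, and (pessimistically) zero dissipation meanwhile.
def temp_rise_c(heat_joules, water_kg=1.5, radiator_kg=1.0):
    heat_capacity = water_kg * 4186 + radiator_kg * 900  # J per kelvin
    return heat_joules / heat_capacity

# A 200 W spike sustained for 60 seconds:
print(round(temp_rise_c(200 * 60), 1))  # 1.7
```

        A couple of degrees per minute of burst load, even with no dissipation at all, gives the radiator plenty of time to catch up during quieter sections.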

        1. len says:

          The increased thermal mass of liquid coolers is good if your workloads are short and bursty, but under a sustained load (e.g. gaming), will make no difference.

          1. Richard says:

            True, though even gaming is somewhat ‘bursty’.

            A few kg of thermal mass will give the system a chance to dissipate the ‘intense’ sections during the less intense ones.

            But yes, for a sustained workload the long-term gain is due to being able to have a larger radiator, with bigger (hence quieter) fans further away from the rest of the PC than is physically possible with a direct air system.

    2. Chris says:

      Don’t water-cooled rigs use demineralized water or some sort of oil for better cooling and to avoid shorting the system with a leak? Because then you won’t instantly fry your system if it breaks.

      1. Mattias42 says:

        Seen some cool (:D) builds using all sorts of liquids, honestly. The important bit seems to be that they (a) store heat, and (b) aren’t reactive with the cooling system.

        One of the semi-standard cool/exotic builds I’ve seen is mineral oil cooling, wherein you basically sink an entire PC into an aquarium full of… well, mineral oil. They’re a bit of a pain to get right, apparently, but offer some pretty good cooling and look really dang cool.

        Linus Tech Tips made a whole series on building one, and some of the head-aches they got afterwards:

        https://www.youtube.com/watch?v=2V06LLTNxc4

        Most amusingly crazy build I’ve seen was easily the vodka cooled gaming PC, though!

        https://www.youtube.com/watch?v=IYTJfLyo_vE

    3. Philadelphus says:

      Water cooling doesn’t necessarily mean you don’t need fans; my current water-cooled rig has five fans, for instance. Though due to the efficiency of transferring heat my CPU never gets hot enough for them to really ramp up, so they basically spend all their time at their slowest speed and you can hardly tell the computer’s on if you’re more than a few feet away. So yeah, not perfectly quiet, but I’ve certainly never been bothered by noise from my computer.

  11. Chris says:

    Gratz, you’ve made it, Shamus. It took years to build up the community, but you got it. Time to cash out and live on your private island with your new computer.

  12. Mike P. says:

    The thing that makes ME want a liquid cooled PC is that it would mean that I don’t have to constantly clean pet hair out of the damn thing. =/

    1. Cilvre says:

      As long as you have fans in the system, you’ll be doing that. Though I recommend adding a fine mesh cover to the vents to grab as much as it can.

      1. Richard says:

        In theory you could put the radiator of a liquid cooled system in your fishtank, thus avoiding the need for any fans at all.

        In practice the fish probably wouldn’t like it much, and you’d be cleaning algae out instead. Swings and roundabouts.

      2. Guest says:

        A lot less though. Radiator fan will be your only fan getting enough use to need regular cleaning.

        1. Nessus says:

          This. Since you no longer need internal airflow, you can basically hermetically seal the case, and only ever have to dust the one external radiator. That’s enough to make it tempting for me.

          Pity it costs so much to set up. Or it did last time I looked into it (which admittedly was a while ago). Plus most graphics cards come with integrated air cooling, so I’m not up on how/if one adds liquid cooling to them without voiding the warranty.

    2. Jeff says:

      You just need a better case.

      For example, I’m using the Fractal Design R4, and all the vents have filters on brackets. You just slide them out to clean, and there are no other access points unless you took off a port cover in the back and left it empty.

      1. Simplex says:

        I also have this case. It’s great and very quiet.

  13. Dreadjaws says:

    I’m so happy for you and not even a little bit jealous.

    I. SWEAR.

  14. Cilvre says:

    Hey Shamus, a step I always take when moving to a machine without removable media (since the netbook days) is to convert the discs you often use into ISOs. Windows 10 will mount them when you open them, and you can eject them normally to remove them from the mounted disks.

  15. Mr. Wolf says:

    It’s really bad, but every time you upgrade (or somebody upgrades for you), I want to say:

    “Finally you can play Oblivion properly!”

  16. Raynor says:

    I don’t understand why people keep calling 1440p monitors 2k. If we go by vertical lines it’s 1440p, by columns it’s 2.5k, and the pixel count is 3.7m. There is nothing 2k about them.

    1. Zak McKracken says:

      I had to scroll down to this post to even get what Shamus means by “2k”, since it’s not a regular old 1080p monitor.

      So technically, I have an actual 2k monitor (2048 x 1152), but there’s no way I would have associated 2560×1440 with the expression. Are they marketed that way in the USA?

      The least implausible reason to call them that would be that they have nearly twice as many pixels as full HD monitors, which have 1080 lines, but that’s mixing units, so it’s a bit nonsensical.

  17. @Shamus, when you say 2K do you mean https://en.wikipedia.org/wiki/2K_resolution ?

    2K = 1080p

    The “k” thing is based on the width of a display, and as such you take the width and then round up or down to the nearest full 1000 value.
    So 1920×1080 = 2K.

    I’ll assume you meant 1440p though so 2560×1440? That would be called 3K if going by the K prefix.

    But things can get messy when you add super widescreen into the mix, like 2880×1080, which is also 3K.

    Then there are super widescreen resolutions like 5120×1440, which is 5K or 1440p depending on which direction you are referencing.

    My advice is to use the full resolution, like 1920×1080, or if really nerdy use (1920×1080)=~2 megapixels, (2560×1440)=~3.7 MP, (3840×2160)=~8.3 MP (4 times the pixels of 1080p).

    I guess you could also use 1440p 16:9 as the aspect ratio would make it less ambiguous but 2560×1440 is just as easy to type.

    Megapixels plus framerate determine how hard a resolution is to drive. HDMI and DisplayPort have limitations on how much bandwidth they carry (the latest standards have improved this, though), so 4k60fps may not be possible with older cables or cards.

    As a side note, pixel density is also important: you want the pixels dense enough (small/close together) that you can’t see jagged lines, but you do not want the display too small either. Ideally you’d want around 300 pixels per inch. It’s been a while since I calculated this, but I think a 6K (5760×3240) display with a size of 22 inches is close to this; if you’re sitting an arm’s length away you should not see the individual pixels at that point.

    An 8K (7680×4320) display at 29 inches would also be close to 300PPI. If you wanted a 58 inch display you’d need 15K (15360×8640) resolution (for some reason this is called 16K though).
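    Those PPI figures can be checked with the usual formula, pixels along the diagonal divided by diagonal inches (a quick sketch, not from the comment):

```python
import math

# Linear pixel density: pixel count along the diagonal / diagonal length.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(5760, 3240, 22)))    # 300
print(round(ppi(7680, 4320, 29)))    # 304
print(round(ppi(15360, 8640, 58)))   # 304
```

    All three land at roughly the 300 PPI target mentioned above.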

    Incidentally, the max DirectX (and max Windows desktop) size is 32768×32768, so a 32K or 32768p resolution is the max, which is over 1 billion pixels per frame; at say 120Hz that would be roughly 129 billion pixels per second. Forget about raytracing, even a simple sprite UI change may tax today’s GFX cards at resolutions and framerates like that. (You’d also need a framebuffer of 4GB per frame at 32 bits per pixel.)

    I’d like to see VA and IPS, or preferably Micro LED, panels: a 4320×2160 (2:1 aspect ratio, aka 18:9) FreeSync 32-inch display able to do 150 Hz. Why 150? Because 120/2=60fps, 150/3=50fps, 144/3=48fps, 150/5=30fps, 150/6=25fps, and 144/6=24fps, so video playback would be stutter-free whenever the refresh rate divides evenly by an integer framerate.
    I did not check whether the latest HDMI and DisplayPort standards can handle 4320×2160@150fps losslessly in 10-bit, though, so compromises may need to be made.

    1. Shamus says:

      It’s 2560×1440. Apparently, this resolution is properly called Quad HD? I’ve never even heard of that. That same Wikipedia page claims that 3K is exactly 3000×2000. Go figure. I really thought that 2k was the next step up from 1080p, so I sort of assumed that 2560×1440 was “2K”. It’s true that it doesn’t make a lot of sense to refer to 1440p as “2k”, but resolutions always seem to have randomly confusing names so I didn’t question it.

      So now after reading your comment and educating myself on Wikipedia, I’m better informed but much more confused.

      1. “I’m better informed but much more confused.”

        Haha. I know how you feel, and it reminds me of this classic XKCD https://xkcd.com/927/

      2. Liam O'Hagan says:

        The ‘quad HD’ comes from WQHD, which is wide quad high definition.

        The quad part is because it’s 4x the resolution of HDTV (720p)

        I’ll go on record as saying that 2560×1440 is my current favourite resolution; high enough that it looks crisp (on my good monitor at least. The same resolution on my work monitor looks pretty murky) but not too taxing for my old PC (GTX770 runs it without issues)

  18. Wiseman says:

    Is the Experienced Points column done? We haven’t seen new ones the past few weeks.
