Diecast #223: GPU, Linux vs. Windows, Net Neutrality

By Shamus Posted Monday Aug 27, 2018

Filed under: Diecast 50 comments

In the show I claimed we answered ALL mailbag questions. Technically we cut one because neither one of us had anything to say on the topic. Still, the ratio of useful / cut questions usually goes in the other direction so this was a pretty good week for the mailbag. As always, the address is in the header image.

Hosts: Paul, Shamus. Episode edited by Issac.

Show notes:

01:03 Graphics card naming

Dear Diecast

With Nvidia’s announcement of their new RTX 2000 cards and all their talk of Giga Rays it reminded me of a problem I used to have back when I worked in gaming retail.

Because most PC games had/have limited-use one-time codes, we had a no-returns policy on PC games, which is obviously a problem for people who do not know if a particular game will run on their hardware. Customers would often ask me if a particular game would run on their PC and I would try to answer to the best of my ability, but it was not always easy to help. Sometimes their hardware would be older or newer than the listed requirements and I would have to guesstimate how it compared, or they would ask why two graphics cards were so different in price or performance when they had the same amount of VRAM, etc.

My question at the end of all this: could there be a better way to classify and quantify hardware performance, so that people who are not experts can more easily figure out what they have and what they need?

Regards Eric

10:51 Linux vs. Windows

Dear casters of die,

recently Steam released Proton, their custom Wine implementation allowing Linux users to run Windows games. Do you guys think this could cause a shift in the market?

As always, thanks for the great show!


Looking into this after the show, I can see why I didn’t hear about this sooner. Proton isn’t really ready for casual adoption yet. You can tell because you don’t get it from Valve. Instead, you get it from GitHub.

On one hand, getting Windows games running on Linux is a HARD problem to solve. On the other hand, Valve has deep pockets and it is solvable. It’s too soon to tell right now, but it will be very interesting to see how this works out in the long term. Maybe this will be another SteamBox style failure, or maybe this will be another paradigm shift like Steam itself. Either way, it’s a good thing for PC gamers. Anything that keeps Microsoft from getting too complacent is good for us.

19:20 Satisfactory

Given you’re Minecraft and Factorio fans, thoughts / impressions / hopes for Satisfactory?



Link (YouTube)

22:16 Bully

Dear rollers of analogue pseudo random number generators,

I am back and I have an inquiry upon your opinions in regards to DLC. You might want to cut out the next bit for brevity.

The history of DLC since its inception is quite intriguing and turbulent. First there was cosmetic stuff and useless bits (Horse Armor). That was followed by the wave of cut-out content (the Battle of Forlì in AC2). Then there was the phase of some additional missions without anything of real interest in terms of gameplay and/or story. And lastly we have had an increase of either mini-expansions or experimental story/mission pieces. Things like the Fallout DLC come to mind, or the DLC packs for Mafia 3 and the newer Far Cry games (4 and beyond). The last one I find especially interesting, given that these are sometimes used to prototype mechanics and ideas for future games. Question: What do you think about this practice of using DLC as a prototype testbed?

An addendum:
You stated San Andreas is the best Rockstar game because it had heart. I personally think Bully is the best one for similar reasons even if most characters are asshats. Have you ever played Bully or would you consider trying it out?

Furthermore, I was wondering which kind of speedrun you prefer: the ones where one glitch is repeated over and over and abused ad absurdum, or more the technical kind where a lot of different skills are used?

There is a lot of bashing going on in the direction of Ubisoft due to their open world games all being the same. Or at least same-ish. I personally have the feeling that this open-world sameyness already existed long before the Ubisoft formula was established. Sure, open world games are samey within their genre, but if you compare entries from a specific developer you will always find quite a bit of sameness. Just Cause and Mad Max are somewhat self-similar. But given your recent experiences with some Rockstar games, would you say that they are similar to each other? I personally would extend that similarity to all Rockstar games, be it GTA, Bully or Red Dead. Please discuss.

Cheers and a nice weekend from Austria,

PS: Would you be willing to share the distribution of your readers over the globe if you have access to such data?

I should add to what I said on the show: I loved Bully, but the classroom minigames were pretty bad.

26:20 Distribution of readers

Like I said on the show: We’re coming up on the anniversary of the site. I’ll do a post with a bunch of random factoids about the site. I suspect that post will happen next Tuesday, the 4th. If you have any questions about the readership of this site, the database that holds it, or the technology that holds it all together, now is the time to ask.

26:59 Net Neutrality

To the last die caster.

What are your thoughts on Net Neutrality now that it was officially repealed as law in the US?

I know that when you first wrote about it Net Neutrality was an unspoken rule no-one broke because it would ruin it for everyone, but since then companies started to break it and ruined it for everyone, which in turn forced the US government to sign it into law.

Now that companies are free to break it (and since they’ve clearly paid handsomely for this “benefit”, they will), how would it affect American customers? And how would it affect other countries that have their connections go through US cables, like South American customers trying to access servers in Europe and the like?

RCN, your Brazilian fan

Whenever I caution people against talking politics, I always get a little pushback in the form of, “But Shamus, EVERYTHING is politics!” Note here that when I say “politics” I’m talking specifically about partisan politics of the classic Red vs. Blue variety. These arguments will always spiral into ugly tribalism. There will always be one or two people who feel the need to unleash their pent-up frustration and irritation on anyone from the Other Party. They mean well, but these arguments are a dead-end and I don’t care to watch my readers tear each other apart over things we can’t change.

To put it more specifically: I really enjoy abstract thought experiments about how you balance the desire for freedom against the need for regulation, and the drawbacks of going too far in either direction. That’s an interesting discussion. For contrast, I don’t care what you think of Donald Trump or Ajit Pai, and I don’t care to moderate the argument that will result if you decide to start an argument centered on American partisan politics.

There are a lot of interesting topics this week. Please don’t make me close the comments because you can’t participate in this discussion without picking a fight.

40:27 Modding

Have you had any experience making mods? If so, what’s the most/least fun you’ve had doing so?

Thanks, Echo Tango!

Here is the Doom2 mod I made back in 1995. That’s the last major mod I made. It’s also just before I got married. (The release date is actually just a couple of weeks after the wedding.) I imagine these two facts are at least somewhat related.


From The Archives:

50 thoughts on “Diecast #223: GPU, Linux vs. Windows, Net Neutrality”

  1. Knut says:

    You can actually get Proton from Valve if you enroll in the Steam beta. I tried it this weekend, and it seemed to work fine, although not many games are supported yet. It just installed the DirectX library when I started the game for the first time from Steam.

    1. bigben01985 says:

Apparently games without official Proton support can also work; they’re just not tested, hence they’re not in the list. I read an article claiming that, at least.

      1. tmtvl says:

        According to GamingOnLinux people are busily reporting test results for a lot of games. Having looked at the Google Doc I can see that there’s a bunch of differing results from different people for the same game.

    2. Echo Tango says:

      I’m downloading the Steam beta as I type this. I always had to run Steam in PlayOnLinux[1], but maybe I won’t have to do that anymore? It was always pretty buggy, and I only got something like 75% of smaller (i.e. less beefy computer needed) indie games to work, and 25% of bigger / better-graphics titles to work. Plus, the specific font used in the Steam store was some comes-with-Windows thing. Not all of Steam, just the store. So I could play games just fine, but I could never purchase anything from the store while I was running Steam in faked-Windows mode.

      [1] basically a pretty wrapper over Wine, which knows what specific versions of Wine it needs for each game, and other game-specific things like that.

      1. Echo Tango says:

        Gosh, finding the section of the settings to allow Proton for all games is needlessly hidden. It’s called “Steam Play”, but the tool is called Proton. At least this naming is still better than video cards… ^^;

        This doesn’t bode well. The first game I tried (Into The Breach) is on the official list from Valve, and it’s a simple title (2D graphics), AND it was one that worked just fine with Wine/PlayOnLinux. It just sits here with glitched-out graphics in the window, without loading anything. :S

        1. Echo Tango says:

          Well, rebooting helped. I guess Steam thinks Linux should act like Windows… :S

    3. rabs says:

      At least, Good Robot works flawlessly with Proton 3.7-3 currently bundled with Steam beta.
      Proton 3.7-4 beta is also listed, and people can build their own if they want.

      Don’t know if it shows up in dev stats, I guess there are some metrics with players OS and stuff like that.

      1. John says:

        It’s not too surprising that Good Robot runs well under Proton. It already ran perfectly well under Wine. As far as I can tell, the major difference between Proton and regular old Wine is support for DX11 and DX12 and I’m fairly certain that Good Robot is an OpenGL game.

  2. Lee says:

    I’d love to hear about the tech stack behind the site. WordPress is the main driver, but what OS version do you run? Pretty sure you said you run a dedicated server, so that’s up to you, right? How much RAM, CPU, disk space?

    Maybe I should just stop. ;)

  3. kunedog says:

    On my first computer, I modded nibbles.bas into a Tron speedcycle game, almost certainly my first act of programming ever.

    High school was a post-Doom and pre-internet time period for me. I bought a couple books about Doom modding and spent hours making levels and showing them to a couple of friends. Word and curiosity got around (most of the kids knew about Doom, but had no idea custom levels were possible) so one day I showed a larger group my latest work on the library computer: a recreation of the basketball gym, complete with the upstairs film room and classroom (sans some rooms directly above or below others, which is a well-known “2.5D” limitation of the Doom engine). For the classroom, I even copied that Doom room with a Baron of Hell teaching(?) a room full of imps.

    It went over well, so I started a project to recreate the whole school building. I then got into Duke 3D level-making partly because of the 2.5D thing, and decided to use it instead.

    I only finished the school’s front yard and never followed through with the rest, which is probably for the best because Columbine happened within a few months of me starting it.

  4. default_ex says:

    That graphics card problem seems to be getting worse. Now we are more often running into the problem of thermal throttling. So you can easily pick up an upgrade and yet get worse performance than your existing graphics card, because it’s being throttled down to abysmally low speeds trying to save itself from itself. With how ridiculous graphics card prices have become, that’s a massive problem that is very likely going to result in a class action lawsuit against one or more offenders. It’s not like there aren’t options available for better coolers; they just aren’t sticking them on the cards, or they dress them in so much plastic that they can’t radiate heat.

    My current graphics card had a plastic shield over the heat sink; the only opening was for the fan. It took a while, but it did thermal throttle the crap out of me at the worst possible times in games. I removed the plastic shroud and saw a massive improvement. The only time it has thermal throttled since was when my room went over 80F while I was cleaning the AC (I normally keep the room at 72F).

    1. Paul Spooner says:

      Just wanted to throw in my opinion here as a mechanical engineer who is ostensibly competent to design heat sinks. Regarding performance in general, heat sinks work on a thermal gradient, so they work better when the room is cold. Obviously it works better at 72 than at 80, but the designer doesn’t know your room temperature, which is my point.

      Concerning the plastic cover, yes, they do reduce environmental air flow, but they are (supposed to be, anyway) designed to channel the high-velocity air from the fan across the full surface of the heat sink. Perhaps the specific card you have wasn’t well designed, and removing the plastic cover makes it work better. Or perhaps it made it work worse, but other factors made up for it. Depending on your case layout, it’s possible to entrain hot air from the power supply and cpu into the graphics card fan, in which case removing the plastic case could improve the overall system performance, even though it’s making the heat exchanger worse as a sub-system.

      The other thing is, you should clean the fans and heat sinks often. Dust buildup will radically reduce the heat transfer. This problem gets worse when you have a single fan providing airflow, so taking the cover off could help to bypass this problem, but it’s a short-term solution.

      You can’t just add a bigger heat sink, because then it costs more, and then you buy a different graphics card. The solution is a self-defeating one. I’d expect most consumer hardware to be designed for a clean 70F environment, so if your room was 10 degrees over design temperature, and the fins were dirty, and your case setup is less than ideal, it’s no wonder that you hit the thermal limit.

      Point being, it’s complicated. Would certainly be nice if all of our computational devices were perfectly efficient and didn’t generate heat, or if we could afford giant heat sinks everywhere, but there are tradeoffs. The plastic cover is supposed to be part of the solution, whether or not it was operating properly in your specific case. I’m not saying it’s entirely your fault that your graphics card was thermal throttling. I’m just pointing out there are very plausible scenarios in which the designer can’t really be blamed for it either.
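To put rough numbers on the "thermal gradient" point, the first-order model is just chip temperature = ambient temperature + dissipated power times total thermal resistance. A toy Python sketch of that (every number here is a made-up illustration, not a spec for any real card):

```python
# Toy steady-state model: chip temp = ambient + power * total thermal resistance.
# All values below are hypothetical, chosen only to illustrate how a card can
# sit just under its throttle point at one room temperature and over it at another.

def chip_temp_c(ambient_c, power_w, r_sink, r_interface=0.1):
    """Chip temperature in deg C: ambient plus power times thermal resistance (C/W)."""
    return ambient_c + power_w * (r_sink + r_interface)

THROTTLE_POINT_C = 83.0  # hypothetical throttle threshold

for ambient_f in (72, 80):
    ambient_c = (ambient_f - 32) * 5 / 9
    temp = chip_temp_c(ambient_c, power_w=180, r_sink=0.22)
    status = "throttling" if temp > THROTTLE_POINT_C else "ok"
    print(f"{ambient_f}F room -> chip at {temp:.1f}C, {status}")
```

With these made-up values the 72F room stays under the limit and the 80F room goes over it, which is the whole "10 degrees over design temperature" scenario in miniature. Dirty fins or a worse case layout just raise the effective resistance.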

      1. Nessus says:

        Well, one problem I’ve had with previous cards was that I couldn’t get to the fins to clean them without removing the shroud, and the shroud was mounted to the heat sink on the inside and to the board from the outside, with warranty-void stickers over the screws. The shroud basically doubled as the bracket holding the heat sink to the chips.

        So if I merely wanted to clean the fins, I’d have to not just remove the plastic shroud, but actually remove the entire heat sink from the GPU chip, and void my warranty in the process.

        I don’t know how widespread that practice is. I kind of hope not very.

    2. Mephane says:

      That graphics card problem seems to be getting worse. Now we are more often running into the problem of thermal throttling. So you can easily pick up an upgrade but yet get worse performance than your existing graphics card because it’s being throttled down to abysmally low speeds trying to save itself from itself.

      There was a time when this was correct, but in recent years not so much. Newer cards are nowadays generally more efficient, meaning more computing power per unit of consumed energy (and thus, quite directly, of produced heat). That means even if your new card cannot reach its full capacity due to heat throttling, your old card at that same or a lower temperature would have weaker performance.

      And the inverse is also true: if you don’t want to use the extra performance of a stronger card for more FPS or graphical bling, the new card will then reach a lower temperature, the fans will be quieter, etc.

  5. John says:

    Shamus, I think you’re confusing the concept of the Nash equilibrium with The Prisoners’ Dilemma. A Nash equilibrium is a set of strategies such that each player is making the best possible response to the other players’ strategies. The Prisoners’ Dilemma is a specific game in which each of two prisoners has the option to inform on the other prisoner or keep silent. The dominant strategy for each prisoner is to inform on the other player and the game’s unique Nash equilibrium (some games have multiple equilibria and some have none) is that each prisoner chooses to inform on the other, even though they would have been better off if they had both kept silent. Not all Nash equilibria are like this. It depends on the game in question.

    Frankly, A Beautiful Mind has a lot to answer for, because the advice that the Nash character gives his friends in the scene you described–while it may or may not be good–does not constitute a Nash equilibrium. If none of the other men approach the most attractive woman, then the best response for the final man is to approach the most attractive woman. If any of the other men approach the most attractive woman, the best response for the final man is to approach a different woman. Consequently, if the men take Nash’s advice, none of them are making the best response to the others’ actions. The game’s real Nash equilibrium is this: one man approaches the most attractive woman and the others approach the other women. Incidentally, this is where we run up against the limits of the Nash equilibrium as a predictive tool. Note that the Nash equilibrium I described is not unique. In equilibrium, only one man approaches the most attractive woman but that man can in principle be any of the men. There’s no way to predict which one.
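The equilibrium check itself is mechanical enough to brute-force. A quick Python sketch using the usual textbook Prisoners’ Dilemma payoffs (the numbers are illustrative; any payoffs with the same ordering give the same answer):

```python
from itertools import product

# Classic Prisoners' Dilemma payoffs: years in prison stored as negative
# utility, so higher is better. Textbook numbers, purely illustrative.
PAYOFFS = {  # (row strategy, col strategy) -> (row utility, col utility)
    ("silent", "silent"): (-1, -1),
    ("silent", "inform"): (-3, 0),
    ("inform", "silent"): (0, -3),
    ("inform", "inform"): (-2, -2),
}
STRATS = ("silent", "inform")

def is_nash(row, col):
    """True if neither player can gain by unilaterally deviating."""
    u_row, u_col = PAYOFFS[(row, col)]
    best_row = max(PAYOFFS[(r, col)][0] for r in STRATS)
    best_col = max(PAYOFFS[(row, c)][1] for c in STRATS)
    return u_row == best_row and u_col == best_col

equilibria = [p for p in product(STRATS, STRATS) if is_nash(*p)]
print(equilibria)  # [('inform', 'inform')] -- the unique equilibrium
```

Note that (silent, silent) fails the check even though both players prefer it to the equilibrium outcome, which is exactly what makes the game a dilemma.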

    Oh, and I’m sorry. That’s maybe more detail than you (or anyone) really needed, but thinking about this stuff used to be my job. Sorry again.

  6. Steve C says:

    Regarding graphics cards names etc, it is rare that I disagree with everything said on a Diecast topic. This is one of those times… Coming up with useful product names for consumers is a solvable problem. It is even fairly easy as business problems go. It is called the product mix. It consists of the product line length, product line width, product line depth and product line consistency. It is covered in any undergraduate business degree.

    Having a small number of brands was part of Apple’s philosophy when Steve Jobs was alive, to make it easier on consumers. That is one option. Even with a large number of products it is still very possible. Proctor & Gamble sell a ridiculous number of products and are constantly changing them every year. For example, a partial list. Detergents: Ariel, Ariel oxyblue, Ariel bar, Tide, Tide naturals, Tide bleach, Tide plus. Shampoos: Head & Shoulders, Head & Shoulders anti dandruff, Pantene, Pantene damage repair, Pantene pro-v. And then within all those brands are various scents, hair types, etc. that further segment their lines. P&G carries more products than GPU manufacturers, and yet the supermarket aisle is understandable to consumers.

    All there needs to be is a brand that is built to be understandable, with a standard naming convention. For example, the date of the product could be built into the name. A name like “RTX 2017Q2” automatically provides more useful consumer information at a glance than “RTX 2000”. And that’s not even a good name. But a date is at least something meaningful, whereas “RTX 2000” is just a bad name.

    Yes, there are countless chip manufacturers which makes it more complex. Except that does not matter. Those chips are being manufactured under license. The terms of that license could include a mandatory standardized naming convention. It can also prohibit manufacturers from using certain terms. If Nvidia wanted to use “Enterprise” or “Gold” to mean specific things, they have it within their power to make those licensing companies obey.

    The confusing product names will not be due to language reasons. This is an international multi-billion-dollar industry. Guaranteed they have marketing employees who are fluent in the language, the market and the importance of branding. Nvidia and AMD don’t need to be American to figure this out. Their English divisions are huge in their own right. More likely these companies have market research showing that confusing names are more profitable than clear ones.

    1. kdansky says:

      The confusing names of nVidia GPUs are not a matter of incompetence. It’s because they want to mislead customers into spending more for less. There are way too many examples of nVidia deliberately labelling products incorrectly.

      The whole 9 -> 10 -> 20 jump is just a marketing trick, as is going from GTX to RTX. This new line is a mediocre product, sold at the highest price they have ever charged. Hell, they made it pretty clear that instead of having the xx80 as the flagship and the xx70 in second place, this time the xx80 is in second place (with the xx80 TI now taking over the flagship spot), and the new xx70 is really just a xx60.

      And yet the new xx70 is $100 more than the current xx70 (both MSRP at release time).

      If you buy an RTX 2080, you’re essentially getting the GTX 1170 for the price of a GTX 1080 TI, but the name tries to cover up that you’re getting scammed. And since nobody has an alternative, and nVidia keeps pushing their proprietary tech into games (Hairworks screwed Witcher 3 performance), they can get away with it. Worst of all from the perspective of an enthusiast: the new cards don’t have impressive specs at all. The current 1080 will probably be faster than the new 2070 (which should really be called the 1160).

      This isn’t all. nVidia says their new 20-series is on 12nm, when in reality it’s just a tuned 16nm process. It’s all lies and marketing.

      1. Shamus says:

        I didn’t even know about this. I knew NVIDIA had a stupid terrible naming scheme, but I attributed it to sloth rather than deception.

        1. kdansky says:

          The crazy part is that their scheme was perfectly sensible. It works like this: AB0 – A is the generation, B is the relative power level (8 is best, 7 is enthusiast, 6 is average, 5 is budget, 4 is trash and 3 is not even a GPU). A is either 6, 7, 9, 10 or 20 (nice jump there), and there is optionally a TI suffix, which basically denotes the midpoint to the next tier. There is a trailing 0 because that’s cool. Power levels above 8 are just called “Titan”.

          So a 1070 TI is the tenth generation (Pascal), and basically the 7 + 0.5 on a power scale of ~5 to ~9.

          But the new scheme jumped from 10 to 20 (usually 11 comes first, but apparently neither nVidia nor Microsoft can count), and also moved the power level curve downwards. They had a mostly working system that made sense, but now they abuse it to upsell the new generation.

          But they only had that system for a short time, and there are a ton of exceptions where they deliberately mislabelled a card.
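Just to illustrate how regular that AB0 scheme was while it lasted, here is a throwaway Python parser for it (the tier names follow the description above; the many real-world exceptions and mislabelled cards are deliberately ignored):

```python
import re

# Tier labels as described above: B digit of the AB0 model number.
TIER_NAMES = {8: "flagship", 7: "enthusiast", 6: "average", 5: "budget", 4: "trash"}

def decode(model: str):
    """Split an NVIDIA-style model number into (generation, tier, is_ti).

    Follows the 'AB0' pattern, e.g. '1070 TI' -> generation 10, tier 7, TI.
    Purely illustrative; real-world exceptions are not handled.
    """
    m = re.fullmatch(r"(\d+)(\d)0(\s*TI)?", model.strip(), re.IGNORECASE)
    if not m:
        raise ValueError(f"doesn't fit the AB0 scheme: {model!r}")
    gen, tier = int(m.group(1)), int(m.group(2))
    return gen, TIER_NAMES.get(tier, "?"), bool(m.group(3))

print(decode("1070 TI"))  # (10, 'enthusiast', True)
print(decode("2080"))     # (20, 'flagship', False)
```

That a two-line regex covers the whole scheme is exactly the point: the system was simple, right up until it was bent for upselling.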

      2. WarlockOfOz says:

        I’m substantially more positive about the recent Nvidia cards than you, even though I don’t intend to buy them.

        As I understand the new cards, the xx70 is roughly comparable to the previous generation’s xx80, similar to previous refreshes. It also gets the raytracing woo, though it’s too soon to know whether that is going to be a big thing. If I were planning on buying a new video card in that price range, I’d choose a 2070 over a 1080: about the same price, about the same performance in current games, potentially superior in future ones.

        What they haven’t delivered is +50% performance for the same price like previous generations, but I doubt they’ll have trouble selling the new cards. I’ll be interested to see how their competition responds.

    2. Shamus says:

      You act like you’re disagreeing with me, but I agree with almost everything you said, so I’m not sure what you’re arguing with?

      Yes, coming up with product names is totally do-able. Agreed. Getting all these manufacturers to embrace a single coherent system is the hard part. They’re free to make the cards however they like: cheap fans, extra memory, smaller footprint. This is how they compete with each other. Assuming they continue to do this, how would you solve this problem?

      If you tell them to use a fixed list of product names, then you wind up with the problem where the AlphaWidget 7 from one company is fundamentally different from the AlphaWidget 7 from another company. Result: Confusion.

      If you give each unique card a unique name, then you’re back to the situation we have now where one chip generation has a thousand variants and the consumer doesn’t have a good way to search through them except to read the technical specs. Result: Confusion.

      When I said “they” needed to have a grasp of English, I was talking about “Obscure cardmaker #672”, not NVIDIA. These places put NVIDIA chips on their own circuit boards, and they have no marketing expertise. Now, if NVIDIA is going to do the marketing FOR them, then NVIDIA needs to start imposing standards on these companies, which leads us back to the need for some fundamental change in the way the system works.

      Yes, there are countless chip manufacturers which makes it more complex. Except that does not matter. Those chips are being manufactured under license. The terms of that license could include a mandatory standardized naming convention. It can also prohibit manufacturers from using certain terms. If Nvidia wanted to use “Enterprise” or “Gold” to mean specific things, they have it within their power to make those licensing companies obey.

      This is the part I think is difficult. If NVIDIA has the power to impose this sort of structure then they absolutely should, but I doubt it’s that easy. If I were one of these companies, I would resist any change that forbids me from differentiating my product from the competition. We (and NVIDIA) WANT them to make a small, coherent list of products with simple names. They want to fuss around with little details: cutting corners in one place to undercut the competition, adding an extra somewhere else to carve out another sub-sub-market of medium-high end, or inventing some gimmick to create an ultra-high end that draws rich customers away from the high-high end.

      1. Hector says:

        I had been thinking that the GPU manufacturers could assign every major hardware release a “year,” just as a basic promise (not a guarantee) that if your PC otherwise meets the specifications, you should be able to play anything from that year and newer, bad software optimization notwithstanding. This is easy to understand and simple to remember, and at least provides a good starting point for manufacturers to add their own clarifications. You might object that it’s impossible to foresee the needed specifications for what’s coming down the technology pipeline, but I’d respond that Nvidia and Radeon are in exactly the position needed to estimate this.

        I sometimes wonder if Nvidia realizes that people *don’t* upgrade because it’s such a pain to untangle and research. For myself, I definitely don’t feel like paying for the latest GPUs when I can’t trust that they actually work as claimed.

        I also want to see external GPUs developed more. They make using and “installing” a GPU far easier for the layman. I find it a pain to grab a computer, pop open the case, install an upgraded power supply, use proper cable management, add a better case fan and so on, even if it’s not actually hard. For an ordinary consumer, that’s a huge step, and almost nobody is going to build a custom PC themselves anymore. (Yes, it’s probably fun for techheads. It’s also a huge hassle for everyone else.) And pre-built computers are good but very pricey compared to consoles.

      2. Steve C says:

        Maybe I misunderstood you? I listened to that section of the Diecast a second time. I believe I did understand what you both were saying. (Nvidia is just an example. The same thing applies to AMD.)

        I’m saying there is nothing special about GPUs compared to every other kind of product made. From laundry soap to oil filters to power tools: GPUs are not special. They are not more complex than other goods in terms of product mix. Proctor & Gamble sells the PGC25574. The Amazon version of it is the B00FT48O6Y. Nobody cares about the codes. That is a 24oz bottle of Ivory Dish Detergent, Classic Scent, Liquid. Which has a different code from the 709ml version (also 24oz, but different for product purposes), the 12oz version, the lemon version, the French version, the yellow packaging instead of white, the powder, or the version renamed for a store’s signature brand… you get the idea. That is a single bottle of dish soap. It has countless variations, all covered by the product codes. A consumer doesn’t need to sort out what PGC25574 means. They’ll just know it as a “medium bottle of Ivory dish soap”.

        Think about that level of product complexity, then add in all the possible chemical variations of “dish soap” and now compare it to how many different options of GPU there are. GPUs are not special. They are not more complex. The name the customer sees should reduce the complexity. Nvidia can solve this problem if they wish to do so. (They don’t.)

        Cheap fans. Extra memory. Smaller footprint. This is how they compete with each other. Assuming they continue to do this, how would you solve this problem?

        With contractual agreements attached to the license that mandates a naming convention common to all Nvidia + a set of names to subdivide quality for specific chip manufacturers.

        For example, Nvidia might have 3 terms (e.g. Gold, Platinum, Titanium) and Gigabyte might have 3 terms (e.g. Plus, Premium, Ultra). So a product name might be “Gigabyte 2016 Premium Platinum”. The details of fan speed, footprint, etc. don’t matter for that advertised name. Those are listed as the part number in the specifications. What matters is for the brand name to express the relationship of one product to another. The relative value, not the absolute value. It says it is a 2yr old card that was middle of the road out of both Nvidia’s and Gigabyte’s offerings when it was manufactured. Is that what you want? Maybe? You can check the specs once and now you have a means of evaluating the rest of the product mix. Maybe that’s your old card. You just want the same thing but newer – maybe the Asus 2019 Premium Platinum next time. Oh wait, the Asus 2019 Ultra Platinum is only $10 more. I’ll get that.

        I was talking about “Obscure cardmaker #672”, not “NIVIDIA”.

        Nvidia can give all cardmakers a list of mandatory words, a list of prohibited terms, a list of recommended words, and a ranking system. Obscure cardmaker #672 must call their most expensive product produced this year the Ultra 2018. Is it worth it compared to cardmaker #673? That’s up to #672’s and #673’s build quality and marketing. Obscure manufacturers might not like it, but Nvidia is calling the shots with their license. And there is still plenty of room for manufacturers to maneuver.

        However, what GPU manufacturers have decided to do is more similar to the oil-filter method of product mix. (The B&S 492932s, WIX 57035, and Kohler 12-050-01 are all the same oil filter, but slightly different if cut apart.) Nvidia doesn’t care and the confusion is probably better for them.

        1. Echo Tango says:

          “Nvidia doesn’t care and the confusion is probably better for them.”

          It might be better for short-term sales, or for pulling money from the top X% of people who have enough money to waste on new cards frequently. However, I’d wager that many people simply go an extra year or two on their current hardware rather than pay a large amount for what is effectively a gamble. I do it myself, and somebody else’s comment pointed out that this is a behaviour customers can choose. Why should I take the risk of getting ripped off because I can’t decipher the mess of product names? Average hardware specs go up over time, so if I wait a couple of years until the average laptop / graphics card can play all the games I care about, I guarantee that I’m at least not getting ripped off by a large amount.

          1. utzel says:

            But the comparison can be so easy, just look at this handy chart ;)

            To be serious though, I don’t think it’s really that hard with GPUs if you just look for the chip first and ignore the vendor stuff. (Or it would be, if Nvidia and AMD could stick to their own naming schemes.) Generally speaking the chip is the most important part; there aren’t many variants with different RAM sizes around any more, and if you aren’t tech-savvy you probably won’t miss or even use a special feature. There hasn’t been a hard cut in a while where you can’t even start a game, so a card not supporting DX12, for example (are there DX12-only games except MS Store exclusives?), would be so ancient it would be too slow anyhow. Same with PCIe: a new card in an older 2.0 slot will just work, just a tiny bit slower.
            With a specific chip you got a small performance and price range, and can check if that’s in your ballpark (or if you want to play in the next one over that’s even bigger, but with higher entry fee).
            All the confusing vendor version names can mostly be ignored; the chip name and RAM size are included somewhere. They will only have different heat sinks and fans (only really interesting if you care about noise and/or size) and overclocked variants (a bit faster, but also more expensive, which closes the gap between two chips). Generally not worth worrying about if you just want something to play on and don’t have any particular needs (like a tiny case, a PC in the bedroom that needs to be really quiet, or wanting to overclock).

            1. Echo Tango says:

              Unfortunately for me, a computer that’s smaller than my microwave, and quieter, isn’t an optional feature. That’s a slight exaggeration, but only slight: modern computer parts are made for people who want raw power at any cost, and don’t mind having a computer that throws out as much thermal energy as a typical space heater. Computer parts / cases are roughly 50% bigger on average now than when I last cared about making my own PCs, back in the early 2000s. :S

    3. Nessus says:

      “P&G carries more products than GPU manufacturers and yet the supermarket aisle is understandable to consumers.”

      Not nearly as true as you’re framing it. For example: “Head & Shoulders” is an anti-dandruff shampoo. That’s always been its entire brand persona. If I see a bottle of “Head & Shoulders” next to a bottle of “Head & Shoulders: Anti Dandruff”, I get confused, because that wording implies that one isn’t an anti dandruff shampoo. It might mean one is regular strength, and the other extra strength, but that’s not the naming convention being used: the convention that is being used implies the non-anti-dandruff one is the original, and the “anti-dandruff” one is a new variant.

      I see this all the time. Big companies, in their zeal to compete for finite shelf space, over proliferate variants of a given product and vaguely word the labels so customers can’t parse which variant they should choose for their particular needs. Instead of “something for everyone” variety, you just have a confusing morass where you might as well ignore the labels and do blind buy-and-try. Y’know: just like it was with multiple brands offering their own versions of the same product.

      Cynical as I am, I suspect this is deliberate. If every customer has to try multiple variants to find the one they need, that means more total sales, and more distribution of sales across the product sub-line, making those variants less risk to produce. It’s not about giving the customer choices, it’s about denying them the ability to choose competitors by hogging store shelf space*. That’s not a failure from a short-term profitability standpoint, but it is a failure from a branding standpoint. If a competitor can give me a smaller, more clearly explained product line, I’ll be more likely to buy from them as they’ll have me feeling much better about my ability to make an informed choice, and thus about my chances of a successful choice.

      *I suppose it’s tempting to think internet shopping would spell the downfall of this strategy, but no: just replace “physical store shelf space” with “search algorithm optimization”, and it still works.

      1. Droid says:

        There are brands like that where I come from. They’re usually discounter-held brands, but they do a good job explaining themselves to you. The ones I have in mind are S-Budget (no good images to be found) and Jeden Tag (“every day”) which come really close to the obligatory XKCD stance on the topic.

        1. Droid says:

          Did I just manage to post a comment with two links in it without it going into moderation?

          (I AM GOD!!!) *ahem* Must be my lucky day!

      2. Echo Tango says:

        Sure, *some* customers might try multiple different brands to get the best thing, but many will simply go with what they already know, or choose the cheap discount brand, to lessen the gamble they’re taking on undecipherable brand names and product names. Also, we live in the age of smartphones and customer reviews: I can pull a device out of my pocket that lets me look up what other people thought of a product – no purchase necessary!

        1. Nessus says:

          Looking up reviews for everything on one’s phone takes forever when you’re actually standing in the aisle, and is a pain in the ass even under ideal circumstances because of how much both mobile browsers and mobile web pages suck. I know literally no-one who actually does this in the field on any kind of regular basis for anything other than picking restaurants off Yelp.

          Most people, as you initially say, just grab whatever they’ve heard of, have already used, or whatever randomly catches their eye, instead of actually comparing anything. But the fact that that’s what most people do doesn’t mean it’s the best or even a smart thing to do. Quite the opposite, often enough.

  7. Cainis says:

    The Downloads folder problem is caused by a setting you can change: right-click the Downloads folder and select Properties, click the Customize tab, then choose ‘General items’ in the ‘Optimize this folder for:’ list. That should fix it. Explorer is trying to decide whether it needs to generate thumbnails for the videos/images in the folder, and changing it to ‘General items’ prevents it from scanning all the files in the directory. IDK why they decided it was a good idea, but here we are.
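    (For reference, Explorer stores that per-folder choice in the folder’s hidden desktop.ini; a sketch of the relevant section as it typically appears on Windows 10, though exact entries vary by version:)

    ```ini
    [ViewState]
    Mode=
    Vid=
    FolderType=Generic
    ```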

    1. Droid says:

      Assuming a huge folder is not set to “[small/medium/large] icons” but rather to something more reasonable like “list” or “details”, does that setting still affect load speed? Those latter two don’t have thumbnails at all AFAIK, only icons.

      1. Cainis says:

        Yes, I always use detail view when I can. Explorer still scans all the files in the folder to get length, generate a thumbnail, etc. That’s how it shows different columns in detail view: a folder of pictures gets ‘Tags’, a folder of videos gets ‘Length’, for example. Gathering that information is what makes it take so long to respond. IDK why they don’t do a better job of caching the results, maybe comparing the last-modified date instead, but it scans constantly.

        This doesn’t apply to network shares, even if it’s a mapped drive.

    2. Paul Spooner says:

      That would explain why I’ve never encountered this particular problem, as I don’t have folders in my downloads directory, and keep it fairly clean. Still no excuse for making the file browser nonresponsive in the name of “optimization”. How is it optimal to not be able to interact with the operating system?

      BTW, the “Save my name, email, and website in this browser for the next time I comment.” checkbox seems to be working perfectly. So, that’s nice.

      1. galacticplumber says:

        Huh. So it is. Yay. I can finally use this hassle free again.

    3. FortrFire says:

      Just wanted to chime in on the download folder lag thing: I had the same problem, and that solution fixed it for me. It also happens to be the first google result I get for it, which makes me think that Shamus has likely already tried it. :(

  8. galacticplumber says:

    I think I actually like the new theme better now that I’ve had time to settle in. It’s simpler on the whole, but comments are easier to read due to what I think might be a difference in reply nesting? Also the typing fields for necessary criteria are back to being reasonable from the left as expected. This is excellent.

  9. Olivier FAURE says:

    Net neutrality is a very complicated topic, and a lot of people get really angry about it; that combination means it’s really hard to get good information on the topic under the sea of unfounded speculation and fear-mongering (eg the 20-yo image that goes around with the different websites listed by price tag, “THIS IS WHAT YOUR FUTURE LOOKS LIKE WITHOUT NET NEUTRALITY”).

    Fortunately I did some personal research on the subject a few months ago and I kept some notes, so here’s what I have.


    Just kidding. But seriously, even after spending dozens of hours on the subject, I still hesitate to say that anyone is right or wrong. The only thing I know for sure is that the people indignantly shouting that *of course* the end of NN regulations means that ISPs will start throttling everything have no idea what they’re talking about.

    I’m not going to go in depth into BGP, IXPs, CLECs (and why they’re necessary and underrated), CDNs, or Google Global Cache, but I’d like to point out that these things exist and that they’re central to the internet architecture, yet almost nobody knows anything about them (me included).

    That being said, there are generally a few things people worry about when Net Neutrality comes up. Let’s say our Brazilian commenter downloads a video from a Canadian company (we’ll call it CanadaTube), and the video transits through a Canadian ISP (call it CanadaNet), an American ISP (let’s say Comcast) and a Brazilian ISP. Here are some ways the ISPs can influence the speed of that download:

    – Comcast might slow down (or speed up for a price) the download because CanadaTube is their competitor. In other words, Comcast might tax the CanadaTube data it’s relaying.

    – CanadaNet may try to charge CanadaTube for additional peering ports (basically saying “I can send more of your data to the internet at once, but I’ll charge you more for it”). Alternatively, CanadaNet may sell a faster connection to CanadaTube; this connection would benefit Canadian users more than Brazilian users, who still need the data to go through Comcast & co.

    – The Brazilian ISP may agree with CanadaTube to install some servers keeping a cache of CanadaTube’s most popular videos. That way, when these videos are downloaded, the data only transits through Brazil instead of the entire continent.

    – Alternatively, CanadaTube may buy server space in Brazil, and redirect Brazilian users to this server for video downloads, without ever interacting with any ISP.
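    The caching idea behind scenarios 3 and 4 can be sketched in a few lines (a toy model, not any ISP’s actual software):

    ```python
    # Toy edge cache: serve popular videos from a local copy when possible,
    # falling back to the distant origin server (the long haul through Comcast & co).
    class EdgeCache:
        def __init__(self, fetch_from_origin):
            self.fetch_from_origin = fetch_from_origin
            self.store = {}

        def get(self, video_id):
            if video_id in self.store:
                return self.store[video_id], "local"   # transits only within Brazil
            data = self.fetch_from_origin(video_id)    # crosses the whole continent
            self.store[video_id] = data
            return data, "origin"
    ```

    After the first download of a popular video, every later request is served locally; that is essentially what Google Global Cache and Netflix Open Connect appliances do inside ISP networks.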

    Scenario 1 is what most users are afraid of, and it violates net neutrality, but I’ve seen almost no evidence of it ever happening on a large scale. Also, I think technical constraints make it more difficult than it looks, and cases like this in the US can be handled by the Federal Trade Commission since the FCC reform.

    Scenario 2 is standard business practice. In fact, I suspect that 90% of the “ISP X throttled content provider Y” articles people share on Reddit are variations of Scenario 2. People might say that CanadaTube should be given as much bandwidth as it needs, and that withholding bandwidth if CanadaTube doesn’t pay more counts as throttling and violates Net Neutrality, but I strongly disagree. Bandwidth (and super-fast transit) is a scarce service that CanadaNet provides to CanadaTube (yes, I know you’re tired of me repeating “Canada” over and over again). In a healthy market, they should be able to sell that bandwidth however they want; people don’t like the idea of scarce things being more expensive, but allowing that to happen is how you get infrastructure built, which eventually leads to more competition.

    Scenario 3 and 4 are rarely mentioned. They’re Google Global Cache, and Netflix Open Connect, and CDNs like CloudFlare. I don’t think they’re too controversial, except in that they’re another way in which big companies like Google have an edge over small companies (though in practice small companies can totally buy distributed server space from CloudFlare & co).

    Anyway, my personal opinion is that a lot of fears people have over Net Neutrality are unfounded and based on a lack of comprehension of how internet architecture really works.

    Also people in the US should worry less about NN and more about why their internet is so expensive for its speed, but that’s another rabbit hole.

    1. Paul Spooner says:

      Both internet speed and public transit in the USA share the same root cause: the country is large, has low population density, and is egalitarian. I thought Shamus had made a post about this in the past, but I can’t seem to find it. Anyway, yeah, you can get twice the bandwidth for half the cost if you live in Singapore (and nearly free public transit too), but you can also walk across the whole country in a day, and it has the same population as Nebraska and Kansas put together.

    2. Zak McKracken says:

      I think one problem with these scenarios is that users will often find it hard to tell the difference between them. If the video loads slowly in Brazil, most users will either blame their ISP or CanadaTube. So then Brazilian ISPs and Canadian content providers will have an incentive to increase transmission speeds through the US, and at least one way to do that is to pay the bridge troll… having cached copies on the other side of the US will help but will only get you so far.

      Another problem with this is that several ISPs around the world (I know of some instances in Germany and the UK) have already attempted to sell non-neutral access, where the traffic to paying providers is handled regularly but everything else is throttled. Again, for a consumer, it is very hard to tell the difference between this scenario and one where “the pipes” are actually too narrow. Although, generally, they’re not too narrow, so I’d be reluctant to believe any ISP claiming otherwise.

    3. Zak McKracken says:

      There is also an extra problem in a market like the US, where most people don’t have much choice of ISP: ISPs can use non-NN trickery to squeeze smaller competitors out of the market (by treating traffic from their competitors worse than other people’s traffic), keeping things nice and cozy for themselves. They can then treat their own customers successively worse, because the only choice is between the huge ISP and nothing, which isn’t actually much of a choice.

      I understand that “make it illegal” may look heavy-handed to some, and I’m not actually certain whether there would be other ways of dealing with this. In much of Europe, the power grids and the power suppliers used to be the same companies, which led to problems. After splitting them up, things have become quite a bit better, and the grids are more stable: the company generating power pays another company to transport the power to the customer, and the grid operator is heavily regulated so it can’t give preference to some power company’s wishes, or charge different power sources differently. It’s similar with DSL access: there used to be state-owned monopolies on phone and data grids, but the now-privatized companies still have to obey strict laws which force them to give all their competitors equal access to those lines, to allow a market to develop. Otherwise they’d just be squatting on the nice grid they inherited from the state and charging users through the nose for it, because they could effectively keep their customers hostage. None of that directly requires NN rules, but it does require relatively strong regulation, or else it would end up a straight-up monopoly, and nobody (except the monopolist) profits from that.

  10. John says:

    Unfortunately, I’m a bit skeptical about Proton. I do almost all my gaming on a Linux machine. I mostly buy and play games with Linux ports, but I’ve run a variety of Windows games, from the Windows 95 era through DirectX 10, on the same machine using Wine. Take it from me, it’s a finicky process. Some games just work. Some games, somehow, work even better with Wine than they do on the same hardware under Windows. (It’s pretty rare though. I’ve only ever seen it happen once.) Some games will work with just a little tweaking. Sometimes it’s as simple as specifying the correct version of Windows for the game in the Wine config utility. But a lot of games require substantial tweaking. Maybe you’ve got to install the game–and Steam!–in a 32-bit Wineprefix. Maybe you need to enable, disable, or redirect a particular dll. Maybe you need to edit some ini file somewhere. And some games just flat out won’t work because you haven’t got some obscure video codec even though the game’s page at the Wine AppDB says it should work just fine. Eh, not that I’m bitter.
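    (For readers who haven’t seen it, the 32-bit prefix dance described above usually looks something like this; the prefix path here is just an example:)

    ```
    # Create a fresh 32-bit Wine prefix and open the Wine config utility
    WINEARCH=win32 WINEPREFIX="$HOME/.wine32" winecfg

    # Then install Steam (and later the game) into that same prefix
    WINEPREFIX="$HOME/.wine32" wine SteamSetup.exe
    ```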

    The point of all this is that there’s more, often a lot more, to getting Windows games to run on Linux than providing DX11 and DX12 support, which is as far as I can tell the major difference between Proton and Wine. It requires, or at least can require, a lot of game-specific effort involving work by actual humans. I don’t think that’s something you can automate. The problem is that I have a hard time believing that Valve is willing or able to sink a lot of man-hours into doing this sort of thing. Now, you can in principle crowd-source it, which sounds like a Valve move. If you took the Wine AppDB and PlayOnLinux and sort of smushed them together you might get something workable. For some people. Some of the time. But for as much success as Echo Tango seems to have had, PlayOnLinux has never really worked properly for me. I once used it to get the Steam client for Windows running, but it’s been a total failure when it comes to actual games. So I’m not confident that a crowd-sourced approach would produce something that meets the reliability we (should be able to) expect in a piece of commercial software.

    In the end, the most I expect from Proton–as distinct from Wine–is support for a relatively small number of relatively big games using recent versions of DirectX that don’t already have Linux ports. I suppose that’s an improvement. It’s not the world I’d like to see, though. The world I’d like to see has more Linux ports in it. People often say that there are no games on Linux. That isn’t true, even if we ignore Wine. There are a lot of games on Linux, some of them quite big. Outfits like Feral and Aspyr port big games–Civilization, Shadow of Mordor, Deus Ex, Hitman, to name a few just off the top of my head–on a regular basis. I think things have been moving in a positive direction and I’d like them to keep moving in a positive direction. If Proton discourages a publisher or developer from making a Linux port because “oh, Valve will take care of it” then that’s not an improvement, not as far as I’m concerned.

    1. Echo Tango says:

      The one thing that Proton might do for Linux gaming, is that all players will count as Linux, and not count for Windows. (Allegedly, Wine might count as Windows, but I don’t know if Valve is already counting regular Wine for Linux.) So, if publishers see that the market share is larger, they might make more native Linux ports. However, as you point out, Proton might just make publishers even less likely to make a proper port.

      A game’s problems running on Linux could be the fault of Steam, the game engine (plural, if the “engine” is a mix of different graphics, audio, networking, etc, engines from different companies), the Linux distro the player is using, or the actual game itself. These problems also exist for Windows and Mac, but since there’s larger numbers of people using those systems, there’s more money being spent fixing the problems in each of those layers / different companies.

  11. Rick C says:

    Hmm. That Doom mod (and the other ones mentioned) sure seem like shams.

  12. Olivier FAURE says:

    Hey Shamus, I made a really long comment with a ton of links about net neutrality, I think it was swallowed by the spam filter. Please help :(

    1. Shamus says:

      Wow. That didn’t just go into moderation, that went all the way into the spam abyss. Which is strange, since it only has one link and no red-flag keywords. If you hadn’t let me know, I never would have found it.

      It’s restored now.

      1. RCN says:

        The system doesn’t tell you what flagged a comment?

        You’d think a spam filter would at least give you that much data.

        1. Paul Spooner says:

          IIRC “the system” is five different layers of spam filters. The lowest level ones don’t notify because then there would be hundreds (or even thousands?) of notifications per hour. They also don’t flag triggers, because then spammers could just install the filter on their own dummy site and use the flags to engineer spam that would sneak through the filters undetected.

  13. Leipävelho says:

    “This is a nine-level WAD file with new graphics, music, and sound FX. The levels are very detailed, and are designed to mimic the “style” of the original shareware Doom. I have always thought that the best official Doom levels were the shareware levels. Even though they were too easy, and didn’t have the same weapons/monsters variety that Doom 2 does, they LOOKED better than any of the other levels. I think this is because Doom is much better at simulating man-made structures than it is at simulating the “hell” type levels.

    This is an attempt at bringing back the “classic”, realistic look of the Doom 1 levels. The story is the same as Doom 1 – Knee Deep in the Dead. You must fight your way through the Phobos moon base to the anomaly that monsters are pouring through. Check out the additional notes at the end of this file for more level info.”

    Somehow that description is exactly what I expected.
