Hey man, I need a new toaster. You know all about kitchen stuff. Have any suggestions?
The KitchenAid 4000 series just came out.
Are those good?
I have a KA4510, and it’s really good.
Does it have 4 slots?
Oh you want 4 slots? Well, the KA4510 XN goes up to four slots, but it only toasts one side.
Let’s pretend I want to toast both sides.
Then you probably don’t want a KitchenAid. Their 4000 series 4-slicers aren’t very good. You could get one of the old KA3510 XN or XNS for cheap these days, but they take like, twenty minutes to toast the bread.
Er. What else is there?
The Cuisinart 7000 series is comparable to the KA 4000 series. The 7420 and the 7520 both do four slices. Just don’t get any of the SIP models because they can’t do bagels.
What’s SIP?
“Slim Insertion Port”. The units are small, but only regular sliced bread will fit. KA has the same thing on many of their units. Actually, if you want to do bagels with a KA you’ll need the ASI units.
ASI?
“Adaptable Slot Interface”. It just means it can handle bread of varying widths.
So I should get a Cuisinart ASI?
No no no. That’s nonsense. In Cuisinart the units all handle wide bread unless they are SIP.
My head hurts. So I want a Cuisinart 7000 series, but not a SIP, right?
Pretty much. Now, the 7000 series is actually two generations. You don’t want anything before the 7400, because the pre-7400 units actually took up two wall plugs. The 7100 and 7200 four-slotters were actually two dual-slot units strapped together, so they had two cords. Plus, they didn’t have a timer so you had to stand over them yourself.
All I want is to toast bread! Four slices! Both sides!
Then the C7520 T series is for you. You can pick one up at Wal-Mart for about $400 these days.
FOUR HUNDRED DOLLARS! I could buy an oven for that! I could just go out to eat every morning for that kind of money!
Ah, if you’re worried about price then the KitchenAid 4510 ES is a good pick. It’s only got three slots but it’s retailing for about $90.
I’m looking in the Wal-Mart flyer, but I don’t see that model.
Sure you do. Right here: The “Magitoast 7”. See how underneath it says “KA4510 Ex”? That means it’s the KitchenAid 4510 ES or the KitchenAid 4510 EP, just with a brand name slapped onto it.
KitchenAid and Cuisinart don’t actually sell models directly. They make the insides parts of toasters, then other companies buy them, put the fancy shell on them, and give them a new brand name. But if you want to know what you’re getting, you have to look at which design the unit is based on.
Ah! I get it! Then why don’t I get this “TastyToast 2000”, which is like that 7520 you mentioned earlier. This one is only $50.
Er. That’s not the same thing. That’s a 7520 OS. The OS means “One Slice”. Total bargain unit for suckers. Same goes for the 6000 series and anything with an MRQ after it.
You know what? I’ve decided I don’t want toast anymore. I’m switching to breakfast cereal.
I’m shopping for a graphics card, and this is exactly what I’m going through, except I don’t have a know-it-all to help me out. I have never seen such rampant ineptitude at marketing products. I’m even savvy enough to know what I’m looking for, but the endless chipset numbers and sub-types and varying configurations make it impossible to get any sort of handle on the thing. It’s actually worse than my example above, since higher numbers aren’t always better. I’ve searched around, and I have yet to find a breakdown as clear as the conversation above. What is the difference between these two generations of cards? What does this suffix mean? Why am I seeing this chipset in one place for $119.99 and elsewhere for $299.99? Is this the same product with a huge markup, or is this second unit different in some way I can’t discern?
Features get added in the middle of numeric series. Like, an NVIDIA 7800 supports 3.0 pixel shaders, and earlier 7000 models don’t. (Or don’t list it among their features.) So it’s impossible to do any real comparison shopping until you’ve memorized all the feature sets for all the chipset numbers for both NVIDIA and ATI. Yeah, let me get right on that.
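The complaint above is, at bottom, a lookup problem: a model number alone tells you nothing, so any honest comparison requires a per-model feature table. A minimal sketch of that idea in Python (the model numbers and feature entries below are illustrative stand-ins, not real product data):

```python
# Hypothetical per-model feature table. In reality a shopper would have
# to memorize something like this for every chipset from both vendors.
FEATURES = {
    "7600": {"pixel_shaders": "2.0"},
    "7800": {"pixel_shaders": "3.0"},  # feature added mid-series
    "7900": {"pixel_shaders": "3.0"},
}

def supports(model: str, feature: str, value: str) -> bool:
    """A bigger model number doesn't imply a feature; only the table knows."""
    return FEATURES.get(model, {}).get(feature) == value

print(supports("7800", "pixel_shaders", "3.0"))  # True
print(supports("7600", "pixel_shaders", "3.0"))  # False, despite same series
```

The point of the sketch is that `supports("7600", ...)` can't be derived from `supports("7800", ...)` by arithmetic on the model numbers, which is exactly why comparison shopping degenerates into memorizing spec sheets.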
Game developers who keep cranking up the system specs are killing themselves. They’re making sure that their only customers are people who are willing to wade through this idiocy, fork over hundreds of bucks, and then muck about inside of their computers to do the upgrade. You shouldn’t need to be Seth Godin to realize most people would rather drop that same $400 on a console and have done with it. In fact, it’s pretty clear that this is exactly what people are doing by the millions.
The main advantage of the PC as a gaming platform was its sheer ubiquity. But while PCs are probably more common than televisions, PCs which are equipped with the latest hardware are pretty rare, and graphics card manufacturers seem to be doing their level best to keep it that way.
This is the second time this year I’ve looked into upgrading, and both times it seemed like such a stupid, pointless hassle. Like our toaster-buying friend above, I know what I want, but it’s the seller’s job to tell me what they’ve got. Offering someone a Fargleblaster 9672 XTQ is stupid and meaningless.
It really is a shame to watch this aggregate stupidity suck all of the fun out of this hobby. Buying other electronics is fun, but buying graphics hardware is homework. ATI and NVIDIA need to adopt a policy of sensible naming of product lines, fewer products, greater differences between products, and (most importantly) clearly delineated graphics generations, so that consumers can look at a product and know what it is without needing to read the long list of specs. In an ideal world, they shouldn’t even need to understand the meaning of things like DirectX 9.0c and 3.0 pixel shaders. They should know that X is better than Y, and buy accordingly.