Experienced Points: Why the PS4 Doesn’t do PS3 Games

 By Shamus Mar 5, 2013 106 comments

The recently-announced PS4 is getting a bit of bad press lately because of the lack of backwards compatibility. My column this week explains why this was inevitable.

While reading up on how the Cell architecture works for this article, my eye began to twitch as I pictured just what a complete mess it would be to emulate this beast. The Cell lets a bunch of processors share different levels of cache. There’s all this stuff governing when memory writes are performed, and it’s basically a bunch of distributed computers shoved into the same case. That’s awesome if you’re doing brute-force cryptography or building a render farm, but as a system for making interactive games it comes off as… weird.

I’ll repeat the borderline conspiracy theory I’ve floated in the past:

The Playstation 2 has the largest and most impressive library of any console, ever. It was competing against the still-not-ready-for-prime-time Xbox and the under-supported GameCube. The PS2 had solid hardware, a good price point, and weak opposition, which gave it a powerful and self-sustaining market dominance. All the developers went for the PS2, which means all the tools and engines were aimed at or optimized for the PS2, which made the games better, which made the platform more attractive to both consumers and developers, which led to more games, etc.

My crackpot theory: Sony looked at this perfect storm of fortune and assumed they were just the best by divine right. A majority of console gamers were PS2 owners, and those people would just naturally buy whatever Sony offered, and so they could use their position as leader to reinforce their position as leader. They chose the six-headed cell design not because it made technological sense, but because it would make porting FROM the Playstation 3 very, very hard.

Then they released their system long after the rival Xbox 360, at a stratospheric price point, with a shameful collection of launch titles, and sketchy or non-existent backwards compatibility. They promoted the platform based on “graphical performance” but the actual visual difference between the Xbox 360 and the PS3 was lost in the noise. Worse, Lair – one of the early launch titles – had abysmal performance, which undercut the whole “The PS3 is a powerhouse” idea. Sony discovered that the PS2 didn’t lead the market by divine right or brand loyalty, but based on crazy concepts like features, library, and cost. While it was the best Blu-Ray player on the market, as a game system the PS3 was a day late and a dollar short.

Then their own hubris came back to bite them. That developmental wall they built around the cell-based system wasn’t keeping developers IN. Instead it was keeping them OUT, thus making the library even smaller.

I could be wrong about the reasons why they chose the cell. That’s conjecture on my part and there’s no way to know the truth. But the outcome was clear. What a shame.

EDIT: The gamecube was under SUPPORTED, not under POWERED, Shamus. Fixed.



  1. lazerblade says:

    Second to last paragraph, second sentence:

    “That developmental wall the built around…” Should be -> “That developmental wall they built around…”

  2. Shamus, if they let you play your PS2 games on the PS4, then they couldn’t charge you again to play it on the Vita.

    Backwards-compatibility, ladies and gentlemen. >_>

  3. bickerdyke says:

    “What? Sony and Hubris??!?! Never EVER!!!”

    Yeah… whatever…

  4. Dragomok says:

    Order of the Stick is updating daily, Experienced Points are updating weekly and Firefall will most likely get a huge patch this month. What is this, heaven?

    Anyway, as I once said: while the article on the Escapist is informative, the one on your site seems a lot more interesting. Good job as always.

  5. Erik says:

    Now, what I wonder is what Microsoft will do with this. Considering they didn’t experiment with weird architecture they could just make a more powerful Xbox and keep backwards compatibility. If they do I imagine they’ll have a massive leg up on Sony.

    But then again, they might think “Sony didn’t have backwards compatibility, so then we are not obliged to have it either”.

    • GiantRaven says:

      If I was Microsoft, I’d be rushing around to make sure backwards compatibility was included in the next console. Having it whilst Sony doesn’t would be a massive leap ahead.

      • TouToTheHouYo says:

        They’ll probably restrict it to XBL Gold members in yet another pathetic attempt to justify the cost of their pitiful service.

        Gotta make money off them old games somehow.

        • Tony Kebell says:

          Pitiful service, how? Why? Explain your reasoning please, I’m curious.

          • EagleEye says:

            I don’t think he means the service itself is pitiful, but rather that the fact you have to pay Microsoft money for seemingly no reason (they don’t provide you internet service like an ISP, after all) to connect the Xbox to the internet is pitiful.

              • James says:

                PSN is free and does the same thing. PlayStation Plus is not free, and you get huge benefits. Xbox Live Gold is like if Blizzard charged people to be on battle.net. (No, Bobby, no! Stay away!)

                • Cannibalguppy says:

                  Yeah, and they dont even encrypt your credit card data or have proper firewalls, so random idiots online can hack your service and steal your identity. Xbox Live DOES protect your info. Free is not always better.

                  • Asimech says:

                    “Paid” doesn’t mean the money you pay goes to things that benefit you. The costs of proper security aren’t the reason why Sony sucks so hard in that aspect; it’s because the company is run by a different sort of idiot than Microsoft.

                    The costs of running XBL shouldn’t be higher than Steam’s. And they run ads on XBL. Microsoft is demanding payment not because they need it, but because they get away with it.

                    Edit: Also a lack of profit motive on Sony’s part isn’t a reason for them to suck at it, since they benefit from proper security. Customers that have lost their money to thieves are customers who can’t afford to buy stuff from you.

              • Klay F. says:

                Isn’t that what the absurdly intrusive ads are for?

  6. Daemian Lucifer says:

    I get why PS4 isnt backwards compatible.What baffles me is why windows 7 isnt backwards compatible.It even has a built in emulation for older windowses,and yet even that doesnt work most of the time.

    As for this console thing,we really need to stop with all these different console builds.I mean look at the PC:It doesnt matter who is making your graphics card,or your memory,or your processor,or your mother board,all of those will always come together nicely*.Why cant we have the same with consoles?It would be great if I could finally just buy a game,and play it on whatever machine I have,without stupid porting.

    *Yes,I know that there is some synergy involved and that certain builds work better than others,but thats besides the point.

    • Dragomok says:

      What baffles me is why windows 7 isnt backwards compatible.It even has a built in emulation for older windowses,and yet even that doesnt work most of the time.

      Furthermore, the emulator is available only for 2 out of 5 versions of the system as far as non-corporate users are concerned (gaah, why did I have to buy Home Premium?).

    • X2-Eliah says:

      Because the entire benefit of a console is the ability to access very well-known hardware at a very low instruction level, without needing a whole abstraction layer that converts generic instructions into a specific instruction set catered to a particular video card/memory chip/processor. That’s the reasoning behind every claim that “you can’t compare PC and console game performance hertz by hertz” and so on.

      Also, the kind of console you described already exists, and it is called a pc. The business model of xbox and playstation divisions relies entirely on having a custom os on their computer so that developers have to code the games from scratch for their platform. Somehow, it works.

      • Daemian Lucifer says:

        You misunderstood.I dont mind consoles being of a fixed hardware/operating system.I mind them being of non-compatible hardware/os.I mind that you cannot run a psp game on xbox,or pc.And really,how much easier would it be for developers if all the porting they needed to do revolved only around different controllers and nothing else.

        Yes,I get the thinking behind “console exclusive games are what is giving us money”,but take a look at nvidia and amd graphics cards.You dont see any game that are exclusive for just one of these companies,and yet both sell their cards by the truckloads.

        • Scampi says:

          Wrong. I don’t know if it has been fixed by now, but back when GTA IV was published, it wouldn’t run on systems with ATI GPUs, only Nvidia. I know, since I was a GTA fan forever and GTA IV was the first one I never bought, for several reasons:
          a) the absurd amount of DRM measures
          b) its inability to run on systems with Radeon GPUs
          I had a similar problem with Arkham Asylum, to the point where I curse the time I wasted even buying it. Glad the retail store took it back… TWICE(!) I tested it on 3(!) different GPUs of several generations, installed in a system far beyond minimum requirements, well into ‘recommended’ territory; none of them were supported. (AA also caused severe damage to my HD, but that’s another story.)

          Then I finally understood, what ‘The way it’s meant to be played’ is supposed to mean.

          • Daemian Lucifer says:

            Those are mostly flukes,and not something that was deliberately planned.And they usually have issues with just one series of cards,and not all of the cards from one company,because the developers didnt think to check for that one feature the series has.

            Compare that to console exclusive games,that are deliberately made to work just for that one console,and nothing else.Thats the difference I was talking about.

        • 4th Dimension says:

          Sooo what you are proposing is that all consoles have the same hardware and OSs? Then why choose to buy one product over another?

          nVidia and ATI sell cards by the truckload because they keep pushing the goalposts ALL the time, so every once in a while you need to buy a new one. Also, both have a diversity of cards for different markets. And the only reason they don’t try locking things in (some games couldn’t be run on some cards because of driver issues (ATI was infamous for that)) is because they both depend on the PC platform. If either made a step into non-compatibility it would starve itself, because each is not the whole solution but part of it, and the other one is always there to step in and take its market share.

          • Daemian Lucifer says:

            “Sooo what you are proposing is that all consoles have the same hardware and OSs?”

            No,not the same,compatible.

            • Jabor says:

              The entire purpose of console hardware is to homogenize specs so that:
              1. The user doesn’t need to worry if they can run a particular game – if it’s released for that console, it’s guaranteed to work.
              2. You can get higher levels of performance from the same hardware if you can tune your software specifically to that.

              … It honestly sounds like you just want another PC ecosystem that plugs into TVs instead of monitors. Why is that actually necessary, when we already have PC gaming?

              • Ciennas says:

                I know what DL is saying. They’ve said it earlier- they want to be able to play any disc based game on any disc reading console, and no nevermind to the hardware manufacturers, beyond what we do for PC purchases- reliability, power, whatever.

                The Big 3 will block any attempt at this. They have enough money and power that they can sabotage any company that would attempt such a thing, similar to what happened to FM radio.

                These would range from legal issues (proprietary formats or whatever,) to outright stupid (but then where would we possibly make money?)

                There would also be issues with the linux crowd- basically, those people who are versed enough in computer lore that they can unlock everything that the companies attempt to lock up.

        • Fuzzyghost says:

          “I mind that you cannot run a psp game on xbox,or pc.And really,how much easier would it be for developers if all the porting they needed to do revolved only around different controllers and nothing else.”

          What incentive would there be to buy an Xbox 360, if the Wii U would play its games also? If the companies made their consoles in the exact same way, then whoever came out first (by even just a day) would get the market lead, and there would be no incentive to get the others. Companies would duck out of the market because the possibility of profit is gone.

          • Daemian Lucifer says:

            What incentive is there to buy ati graphic cards when nvidia ones do the same things?What incentive is there to go to burger king when mcdonalds sells you the same things?What incentive is there to buy bmws when mercedes does the same job?Etc,etc,etc…

            My point is:Just because they run the same programs doesnt mean they have to be identical to the T.

            • Fuzzyghost says:

              “My point is:Just because they run the same programs doesnt mean they have to be identical to the T.”

              For the same software to run across the board, they’d have to be. Otherwise, you’d have to code multiple versions of the software anyway, or just deal with OS as a middle man (my post below goes more indepth there).

              To make this clearer, an example: When the XBox 1 came out, it used an nVidia GPU. Every game made for it was programmed specifically for the hardware in it, not for any other piece of hardware on the market.

              Now, when Microsoft made the XBox360, someone in the technical department decided they should use an ATi GPU. When they did this, they had to use software emulation, which left some games unplayable for a while.

              In short, it isn’t easy to “just make it all work together.” The hardware vendors have their vision of how their product should work, they operate in different ways, and interoperability is lost in the process. The only way that “they’ll all work together,” is if there is only one monolithic system, which no company will go for.

      • Cannibalguppy says:

        IF they based everything on the x86 architecture none of that would matter.

    • 4th Dimension says:

      But then it wouldn’t truly be a console now, would it? It would essentially be a somewhat locked-in PC. Also remember, even if all the next consoles base themselves on PC architecture, it would depend on their OSes whether you could run the same build of software on all of them. After all, even on PC we have different OSes that are not compatible.

      Also, a big part of the appeal in the eyes of console makers and developers is that consoles are stable. On one hand, makers can sell a stable, proven product that you simply use to play some games; no other small company can come along and sell something better that can play all your games. Also, since the hardware is not changing, you can set up a long production run to lessen the costs. To top it off, it allows you to spread the development costs over a long period of time, since your console is going to be bought for quite some time and won’t be supplanted in 6 months like in PC land.

      On the game dev side, if you know exactly what hardware you are programming against, you can simplify your problem massively. No longer need you worry about forward or backwards compatibility, or weird hardware combinations. On top of it all, if you get familiar with the hardware you can do a lot of hardware-specific optimizations that won’t work elsewhere but give a significant boost. That’s one of the only reasons that today’s consoles have comparable graphics to computers which beat them in pure horsepower.

      Oh great I made a probably badly worded wall of text.

      Oh, and what problems do you have in Windows 7 with compatibility. I myself am using Win7 x64 and have no noticeable problems. Maybe some really old or obscure games would cause problems but those are exceptions.

      • Daemian Lucifer says:

        “But than it wouldn’t truly be a console now would it? it would essentially be a bit locked in PC.”

        And whats wrong with that?Also,how would that be not a console?Just because it would have a hardware combination you could cobble yourself with a custom os wouldnt somehow make it different from a current console.

        “Oh, and what problems do you have in Windows 7 with compatibility.”

        Well the most recent,and most confusing,example was baldurs gate 1.On win7,it simply didnt want to work.Whatever compatibility option I tried.On xp,it ran without a hitch.

        That is,before gog sold me their version that works on everything.Damn that site.

        • 4th Dimension says:

          Because as I said before, consoles are a package: hardware and software in sync. That is their advantage. If you develop for XBOX you know EXACTLY what hardware will be in the console, and can thus use low-level optimization to suit that hardware. On PC you need to cover all possible hardware combinations, and even then you get stuff like RAGE not working on a particular set of cards because the manufacturer didn’t make the proper drivers available.

          It’s a closed system (package) and that probably makes development costs cheaper since once your dev guys familiarize themselves with a console they don’t need to learn how to use new hardware every year.

          On the other hand, if you made consoles that anybody could assemble from whatever parts they wanted, you would lose that closed-system edge.

          And in the end, the reason many people buy consoles is that they DON’T want to think about what is inside, or assemble them, or look up their capabilities. They also don’t want to hear that brand new COD 322136 won’t work on their console because they have nVidia<> and not nVidia<>.

          Oh and about Baldurs gate, I expected it was some old title. Simply put that game probably used a lot of exploits that were based on WinXP, in order to do some things. Those holes and exploits were removed from Win7 because they weren’t a sane design and probably hurt security. And on top of it all it was probably considered that the amount of people that play BG were an insignificant minority. So they sacrificed you, to give the rest of us a really good sane OS.

          • Daemian Lucifer says:

            Again,not what Im saying.I dont mind them being closed,I mind them being not compatible.

            And lets use the call of duty example:Imagine what would happen if the next call of duty was announced as ps4 exclusive,because they cannot port it to the new xbox(dont know yet what itll be called).

            Now imagine how much easier it would be for the developers if they didnt have to port it,but just make a single game that would work no matter what console you use.The consumers would like it because they wouldnt have to wait for porting to be done,the developers would like it because they wouldnt have to port the thing,the publishers would like it because they could release it sooner.The only ones who wouldnt like it are the console makers because they would lose their exclusives.But this last generation had much less exclusives than the previous one,and Im sure the next one will have even less,so they are going to lose that no matter what.And what is better for them,to get a huge chunk from the few exclusive titles,or a small piece of a bunch of non-exclusive ones?

            That is,not counting nintendo,but they always were a curious breed.

            • Fuzzyghost says:

              “Now imagine how much easier it would be for the developers if they didnt have to port it,but just make a single game that would work no matter what console you use.”

              How would that work? Either you make a box that has unknown hardware, with a game that deals with the middle man to make calls to the system (PC), or you make a box that has one set of hardware, that gives developers direct access to system resources (Console).

              If you’re saying that the game developers should code ONE COPY of the game to WORK ON ALL PLATFORMS, then you are left with disadvantages:

              You can’t optimize the game software for the hardware; you now have to code one piece of software to work on multiple platforms, rather than have separate departments work on each version.

              Hardware agnosticism will cause issues. With all the hardware that you’ll have to code the software for, it may not work properly. You also have to get the vendors to decide on one medium (DVD, Blu-Ray, other?).

              Operating systems are not the same. Not only do you have different forms of hardware to deal with, but different forms of software. The Microsoft platforms (Windows & XBox) are the only ones using DirectX, but the Wii & PS3 use an OS built off of Linux, and use OpenGL.

              Think technically about how you could solve these problems, and get all of the companies to buy into it (they each reap a profit). The only way this would work is if there was only one company for video games; one set of hardware, one distribution medium, one set of software. Any company in that market trying to deviate would just end up going bankrupt.

    • guy says:

      The main reason why Win7 isn’t backwards compatible is that there were many things deeply wrong with MS-DOS. I’m serious here.

      See, there were a lot of flaws or bugs or just suboptimal implementations in MS-DOS. In theory, most of them could be fixed without causing old programs to stop working, so long as they still did what was specified. However, many programs depended on unspecified behavior. That is, they depended not only on the final result of something in the situations it was designed for, but how that result was reached/the result of situations it wasn’t designed for. As a result, Windows maintained backwards compatibility via continuously accumulating terrible design choices. Finally, Microsoft decided to get rid of all that in the name of not letting programs get admin access when run by a non-admin user and the like.

      They also got rid of 16-bit drivers for the 64-bit versions. I dunno what was up with that.

      • Daemian Lucifer says:

        But its not just win7.I had emulation problems with xp as well.And not with the same games either.

        Plus,there are user made emulators that circumvent these problems(though with various degrees of success),so its not an impossible task.Its just that the built in emulation was never done well.And when people who have access to all the original code can not do something they are paid for,while people with no such access can do it for free,that just screams “laziness”.

    • Raygereio says:

      I mean look at the PC:It doesnt matter who is making your graphics card,or your memory,or your processor,or your mother board,all of those will always come together nicely*.

      It may not matter to you, the user.
      But it does matter to the developer. In fact, it matters a whole lot. They have to ensure that their game runs on multiple versions of Windows, on multiple brands and types of graphics cards, with all kinds of driver versions.

      The user may generally not notice the problems all the possible hardware and software configurations can cause, but that just means the developers did a good job. It does not mean the developer didn’t get a massive migraine from fixing those problems.

      What baffles me is why windows 7 isnt backwards compatible.It even has a built in emulation for older windowses,and yet even that doesnt work most of the time.

      Microsoft used to be really good about backwards compatibility. There’s an old story about SimCity. That game did something with memory that was a bit of a no-no, but worked just fine on DOS. On Windows, though, it would crash. The testers on the Windows team at the time went through various popular applications, testing them to make sure they worked okay, and found that SimCity kept crashing. The developers then disassembled SimCity, found what it did with memory, and added special code to Windows that checked if you were running SimCity and, if you were, ran the memory allocator in a special mode.
      All of that, just so that a game that ran on the previous OS, would run on their new one.

      I don’t know much about the internal politics of Microsoft, but it’s my understanding that there are two factions. One saw the obvious advantages of ensuring backwards compatibility (not pissing off your userbase) and championed it. The other disagreed, and figured that if an application relied on undocumented behaviour or did something wrong, it should just break when the OS is updated.
      Neither side is really wrong. Sure, backwards compatibility is very important to your userbase. But on the other hand, ensuring backwards compatibility can become a nightmare. That SimCity story is cool, but imagine having to create individual exceptions like that for hundreds of applications.

      For a long time the compatibility group was dominant, and the fact that your old applications would work on newer versions of Windows was a very important selling point. Then the other faction scored a win. I believe it was Visual Basic .NET that dropped backwards compatibility with VB 6.0, and it was something significant: it was the first time that, when you upgraded a Windows product, your old data (the code you had written) could not be imported neatly.
      Shockingly enough, there weren’t any real repercussions for Microsoft. Sure, VB 6.0 developers shook their fists in anger, but no one cared about them and they were a dying breed anyway.
      And so it became okay for Microsoft to change things. It did make things easier for Microsoft (some of the things they did to ensure backwards compatibility actually caused a lot of problems, such as security risks) and it does seem to be working for them.
      For now, at least.

      • Daemian Lucifer says:

        What is infuriating about that is that they are pretending like they have backwards compatibility.They still have that “run in XYZ mode” option,which works maybe half the time.

        At least sony is honest about this(though I guess they dont have much of a choice).

        • Khizan says:

          If they have 50% backwards compatibility with no more than a checkbox, they’re doing pretty freaking well, man. Backwards compatibility is a bear.

          Picture this. You’re designing a new OS, for newer computers. It’s expected to run on new hardware, take advantage of modern resources and new architecture. It’s a modern OS for modern computers and designed as such.

          And then some guys come in with wheelbarrows filled with ~15 year old software and say “Oh, yes. And it also has to support all of these 15 year old programs on the off chance that we have a customer who wants to run them.”

          That’s awful.

          In the Sim City example, they had to essentially disassemble the game, crawl through the code to find out what was wrong, and change how the computer allocated memory when it was running SimCity, because the game had a bug that prevented it from working in Win95. Specifically, SimCity freed memory and then pointed at it afterwards, which is a cardinal sin of programming. However, Win 3.x would let that slide. Win95 stopped allowing that, and so to get it to work they basically put in a special mode for memory allocation that let you point at freed memory when you were running Sim City.

          That’s the kind of thing required to ensure backwards compatibility. At the release of Win7, BG1 was 11 years old. Expecting them to do that kind of thing for 11 years of games strikes me as a bit much.

          • Raygereio says:

            Expecting them to do that kind of thing for 11 years of games strikes me as a bit much.

            I once had an internet argument with someone who expected VtM: Bloodlines to magically be updated for Windows 7, complete with the addition of widescreen support, by a studio that no longer exists.

            He refused to understand why his demand was absurd. Some people are stupid.

            • Khizan says:

              You can still play Bloodlines on Win7. You can even do it in widescreen.

              You just need to download the fan patches and fixes that are out there, because the fanbase for the game is completely freaking amazing and they’re still putting out patches of it. The latest unofficial patch is from 2013-02-12, so maybe 2-3 weeks ago.

            • SKD says:

              Heck, VtM: Bloodlines had problems on the OS it was designed for. If the studio couldn’t get those completely ironed out and has since gone out of business, then it is completely unreasonable to expect them to ensure patches for new OSes for the next couple of decades.

              I loved VtM: Bloodlines, but it had deep-rooted flaws. I would really like to see more backwards compatibility in consoles. At least with PCs you can often get old games to work in virtual machines if they won’t work in the new OS, in my experience.

          • Daemian Lucifer says:

            I wouldnt mind,if not for one small thing called dosbox(and other emulators).If a non-official program,made for free by people that didnt have access to the source code*,can emulate the old system,why cant an official program,made for a paycheck by people that do have access to all the code,do the same?

            *At least not legitimate access.

    • I assume you’re talking about Win7 compatibility with games? Because apps tend to be pretty solid. I’m even running a SCSI driver on Win7 that was discontinued back before Vista was even talked about. Freehand MX still runs on it fine, as do all versions of Photoshop I’ve installed (one of my versions required an install of 4.0 to verify, and that had no trouble, either). I even have a very old copy of Adobe Streamline running under Win7.

      Games do tend to break, but if that’s the yardstick for an OS… well, I’m not sure that’s such a good benchmark. A lot of games do weird and funky things trying to maximize resources that other programs don’t do, so it’s no wonder they’d cause trouble. That’s another reason I like Steam and GOG. I get to play my old games, and I don’t have to go hunting for patches the fans have written to make them run. Still, fan-made patches are available for a lot of software as well.

      I’d say overall, Windows 7 (and previous versions) have done a bang-up job with backward compatibility. I haven’t run Win 8 yet, so we’ll see whenever that becomes an issue.

      • SKD says:

        Windows 8 is an abomination from a UI standpoint, but a lot of the behind the scenes stuff seems to be an improvement.

        • False Prophet says:

          The back-end is improved. Shell out $5 for Start8 or other app that restores access to the Win7-style desktop–there’s even a free alternative out there IIRC–and you’ll barely notice the difference at all.

      • Sabrdance (MatthewH) says:

        My recollection is that Microsoft’s backwards compatibility for games is a wonderful byproduct of their real target: business applications. By that standard, I think we can predict what will be backwards compatible for Microsoft based on how long an IT department will keep around a hacked-together kludge solution before they are willing to redo it to work on the new hardware.

        10 years doesn’t sound like a bad first estimate.

    • Kdansky says:

      Because you are asking for 20 years of backwards compatibility. Whatever works on Vista works on W7, and 99% of all stuff that works on XP also works on W7. It gets hairy when you want Win95-support, because that was a very different system.

      The only thing that doesn’t work well is drivers, and that makes sense: if you want to change your driver model, you will have to drop support for ancient hardware. It’s like trying to plug an iPod into a steam train: not going to happen.

  7. Wedge says:

    “under-powered Gamecube”
    Huh? The GameCube was *more* powerful than the PS2. The PS2′s dominance was mostly due to network effects, and the fact that it was already the market leader in the PS1 era. Nintendo didn’t shift its strategy away from competing on graphics until the Wii.

    • Shamus says:

      Gah!

      The gamecube was under SUPPORTED, not under POWERED.

      Edited the original post. Thank you.

    • GragSmash says:

      In fairness, the move to those little discs didn’t help it as a PS2 alternative, as it shut the door TIGHTLY on it being used as a DVD player.

      • Wedge says:

        I suspect the smaller disc format allowed faster loading times due to lower seek times. Nintendo, for some reason, was *TERRIFIED* of long load times, which is why they stuck with cartridges for the N64 when developers were clamoring for more space. Even after the PS1, with its miserable load times, ate the N64′s lunch, Nintendo still seemed gun-shy about having bad load times.

        I doubt they would have included a DVD drive anyway, though. One of the GCN’s selling points was that it was cheaper than the PS2, and adding a DVD player increases the console’s production cost. Making their consoles into a multimedia platform has never been one of Nintendo’s goals–after all, even the *Wii* doesn’t play DVDs even though it uses a DVD drive for its games!

  8. Dave B. says:

    I might be repeating one of your points, but something really struck me about this situation. By designing the PS3 in this way, and then abandoning the cell architecture for the PS4, they have sabotaged two generations of their console! First by limiting the PS3′s library, then by choosing to scrap that library and start over in the next generation. The PS4 might recover by making the porting process much easier, but that’s a lot of damage to undo.

    • Thomas says:

      It really was a terrible decision. It’s stunning that they didn’t suffer for it like they should have (or maybe they would have massively outstripped the 360 if they hadn’t made those mistakes).

      It’s going to be interesting to see how much that decision hurts the PS4. If it’s true that the 720 won’t have BC, I guess it technically won’t cause much damage, but people will be a lot slower to switch. And on the other hand, we all know that most games are going to be multiplatform now, which we didn’t last time round, so libraries should build up a little faster.

  9. Factoid says:

    That’s not a conspiracy theory at all. That’s just plain economics. Sony also benefitted from a few other “perfect storm” elements with the PS2. Their primary competition had floundered badly in the previous generation. The N64 had weak third-party support, a high price point, less than stellar sales, etc… Sega had two or three flopped consoles in a row, depending on how you count the Genesis add-on modules like the Sega CD.

    Sony was also riding high on the success of the Playstation, which was sort of a sleeper hit console that took the field by storm after a year or so.

    Likewise, your crackpot theory doesn’t seem far off the mark either. The PS3 was widely panned as expensive for consumers and difficult to develop for, although people have mostly come around in the last 3-4 years as the tools have matured. Even Valve, which was notoriously anti-PS3, has gotten on board.

    Glad to see they’ve acknowledged their sins and are moving in a new direction with the PS4. It will make things better for everyone. Backwards compatibility isn’t going to be a huge problem for most games. I imagine everything popular will get ported to the PS4 as a downloadable title.

    • Wedge says:

      One other thing that gave the PS2 a huge boost: the fact that it had a DVD drive. At the time the PS2 came out, DVD was just starting to take off, and while the PS2 was $300 at launch, paying $200 for a set-top DVD player was pretty typical. $100 more for a DVD player AND a console? I know a lot of people whose only DVD player, for *years*, was their PS2.

      Plus the PS2 had 100% backwards compatibility with the PS1, which meant it launched with a huge library of games, and PS2 was really the first console to ever do this. The fact is, the PS2′s success was part luck and part Sony very intelligently leveraging their existing market dominance. They showed a lot more hubris and a lot less good sense when developing the PS3.

      • Daimbert says:

        I very much resemble both of those remarks. I originally bought the PS2 because I wanted to get away from PC gaming — which wasn’t really successful — due to upgrading issues and wanted a DVD player, so it seemed like a good fit. Having access to the old PS1 games even though I hadn’t had one was the icing on the cake.

        With the PS3, I only bought it when I got an HD TV and could actually play Blu-ray disks. And while I’ve built up a library on it now and mostly play on it when I’m not playing TOR, I still have a much larger library of PS2 games than on the PS3.

      • Trevel says:

        I know people *now* whose DVD player is a PS2.

        Plural intentional.

      • Peter H. Coffin says:

        And the PS3 was and still is the reference platform for BluRay players, and until just a couple of years ago, was the cheapest way to get one. It’s also still an awesome media stream player. Sure there are other boxes that do that *now*, but doing it better is tough, and competing with that installed base is hard.

        On the backward compatibility issue, the early PS3s out there support PS3, PS2 and PS1 games, which means there’s a HUGE number of titles that do run on them. The limited number of new titles is mitigated by the fact that all your old games run just fine on the new thing, and in an era when people had a limited number of available inputs on their televisions, that meant they didn’t have to swap cables around.

        • False Prophet says:

          Yeah, I know several people who have a PS3 as their home entertainment centre and barely ever play games on it. It has that role in my house, but I also compared the list of 360 exclusive titles with the PS3′s, and preferred the latter–and most of the former were also on PC anyway. (Fable II remains the only 360-exclusive I was halfway interested in that doesn’t seem to have a PC port. Puzzling, when 1 and 3 do.) Granted, there were a lot of great exclusives on XBLA too, but eventually they all ended up on Steam.

  10. > My crackpot theory: Sony looked at this perfect storm of fortune and assumed they were just the best by divine right.

    Actually, mistaking luck for competence is a pretty common phenomenon in the game industry.

  11. drlemaster says:

    “Pie-crust promise…” Thanks for the Mary Poppins reference in the original article.

  12. Psithief says:

    I guessed the content of the article when I saw the title. Hooray for basic hardware knowledge!
    Sony not understanding their market has been an oft-explored topic, going on my vague recollections of the past 10 years.

    Of course, now I’m wondering what the massive downsides of developing for the ol’ x86-64 architecture are going to be. I don’t think there are a lot of developers out there that actually make decent use out of the multi-core CPUs, probably because developing for multi-core AMD and multi-core Intel are very much different things when it comes to bottlenecks.

    • 4th Dimension says:

      Actually there are generalized tools (at least there are in .NET) with which you manually make multiple threads and then split the work among them. And they are quite effective (you can get multithreaded code to run on a 4-core processor 3-3.6 times faster than it would if it weren’t multithreaded).

      The biggest problem is splitting the workload so threads don’t have to wait for each other or communicate among themselves. I guess if you don’t have to plan for different processors, you could get MUCH better results.
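      The fork/join pattern 4th Dimension describes can be sketched in plain C++ (standard library threads rather than .NET; the 4-way split and the summing workload are my own illustrative assumptions):

```cpp
#include <cstddef>
#include <numeric>
#include <thread>
#include <vector>

// Split a summing workload across n_threads workers, join, then combine
// the partial results. Each worker writes only its own slot, so no locking
// is needed while the threads run.
long long parallel_sum(const std::vector<int>& data, unsigned n_threads) {
    std::vector<long long> partial(n_threads, 0);
    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t begin = t * chunk;
        // Last worker also takes any remainder that didn't divide evenly.
        std::size_t end = (t + 1 == n_threads) ? data.size() : begin + chunk;
        workers.emplace_back([&, begin, end, t] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0LL);
        });
    }
    for (auto& w : workers) w.join();  // wait for every worker to finish
    return std::accumulate(partial.begin(), partial.end(), 0LL);
}
```

      The speedup 4th Dimension quotes (3-3.6x on 4 cores) comes from exactly this shape of code: the work is split up front, so the threads never wait on each other until the final join.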

    • Peter H. Coffin says:

      Of course, now I’m wondering what the massive downsides of developing for the ol’ x86-64 architecture are going to be.

      Considering how much of the entire x86 line of architecture is backward compatible, feature by feature, bug by bug, traceable through the Pentium line to the 486, 386, and 286, right down to the lowly 8008, 8080, and 8085 processors? The Power chips are only hanging on to 20 years of legacy design, not nearly 40…

  13. Hitchmeister says:

    I can understand why Sony wouldn’t come right out and say it, but I think there’s a bit of logic in embracing a more common x86-64 based architecture: it would make porting from existing PC games (particularly if anyone starts developing Windows and Linux versions in parallel) comparatively simple, as opposed to current Cell-based ports. That could lead to rapid growth of the PS4 library, offsetting the lack of backward compatibility. These days it’s a lot more common to have multiple systems plugged in at the same time than even just a few years ago at the start of this console generation. People who want to play their old PS3 games can just keep their PS3 around, and new people buying a PS4 will (hopefully) have the choice of superior PS4 versions much earlier in the console’s cycle. Of course, the elephant in the room that Sony wouldn’t want people to focus on is the possible very rapid decline in new PS3 titles as soon as the PS4 is released. I believe the last new PS2 release was in November 2012.

  14. Ingvar says:

    Hm, IIRC (and that is mostly from having read a bunch of Cell architecture papers and having actively considered porting a Common Lisp compiler to the architecture) the SPEs are actually identical to each other, have a stream-oriented instruction set, partitioned memory and the like.

    So, at least in theory, they OUGHT to be emulatable with N instances of “general purpose GPU programs”. Probably infeasible if you’re looking at running a Cell emulator on a random PC, but it should be possible on a console with known hardware across the fleet. Maybe.

  15. I wouldn’t say the Cell looks really weird, although it isn’t like I strongly disagree with anything you’ve said, and it certainly isn’t chemtrails-level crackpottery as theories go.

    My main point would be that both the 360 and PS3 were developed under shifting sands. The 360 got an early implementation of unified shaders (while the PS3 launched a year later with the kind of split vertex/pixel shader design that is so out of date that even mobile GPUs are moving to unified shaders today) and the PS3 got a very slim traditional CPU (exactly 1/3rd of the 360′s tri-core/hexathread design) bulked up with a close (ie shared memory levels) implementation of several dumb SIMD units, the things that thrive on what today you would offload to a GPGPU task.

    That is why the Cell is so hard to map onto modern design structures: the CPU part of it is today split between a GPU for fast, highly threaded, dumb execution and a specialised CPU with great branching, op caching, and OoO performance. The Cell (to my lay viewpoint) looks like it was heading down the path that GPU compute ultimately proved to be the right answer to, via someone asking what would happen if you really beefed up the SIMD units on a classic CPU design of the era.

    • Peter H. Coffin says:

      See also “Condor Cluster” from barely more than two years ago. There’s even a not-sucky article about it on The Escapist in December 2010.

  16. Zak McKracken says:

    Alternative theory:
    At the time the PS3 was introduced, something happened in the supercomputer market: while the old guard of vector processors was almost gone and PC processors (and models derived from them, like the Opteron, Itanium, and Xeon) dominated, IBM entered the market almost from the top with their Blue Gene architecture. This is almost the same idea as the Cell processor: a bunch of smaller processors packed together in one compact module, many modules packed together on one mainboard, two mainboards per server tray, and voila, the highest density of computational power and the highest FLOPS/Watt ratio seen until then. That system was developed for massively parallel problems, and for throughput over complexity. This means: do a few operations to a chunk of data, then take the next one.
    This was not ideal for everyone in the supercomputing world, but it fit the bill nicely for the type of task that a console would be performing (process a polygon, take the next one…).
    … at least that’s probably what someone at Sony thought when they jumped on the bandwagon. In a way they were right, because these days even smartphones have multicore processors, but they probably didn’t consider that software for supercomputers is already massively parallel and only needs to be made a bit more parallel, given different memory management, and sent through a different compiler, while with games … it’s not quite so easy if you want to really make use of the system’s capabilities. Still, based on the specs, the PS3 outdid any other console, and afaik there were even a few computing clusters made of actual PS3s.

    Still, the “walling off” thought must have been in their mind, too. Maybe it was a combination: 1: “This console is gonna be sooo much faster and more efficient than anything x86-based, so it will rule the market!” 2: “This gives us the power to lock in game companies!”

    • Sabrdance (MatthewH) says:

      This would be consistent with the last time Sony botched a technology innovation. Betamax was another piece of innovative, gee-whiz-bang technology that… ultimately was too clever by half.

      I shall have to add this to my list of examples of persistent organizational cultures.

    • The Rocketeer says:

      “There were even a few computing clusters made of actual PS3′s.”

      Right you are, and none other than the US Air Force was doing so. They realized that, for the money, PS3′s were an insane value compared to equivalent supercomputing hardware, so they strapped almost two thousand of the things together. It’s the fastest known computer in the DoD.

      It’s still around, but- and this is funny and sad to me at the same time- after Sony released the patch restricting custom OS’s on PS3′s, they lost the ability to replace units in the cluster if they wore out, since new PS3′s couldn’t be configured to work in the cluster anymore. So as each unit wears out, the cluster gets slower and slower.

      Look up ‘Condor Cluster’ if you’re curious.

  17. Kylroy says:

    Another factor that hindered the PS3 but probably benefited Sony: the Blu-Ray player in every PS3. Sony had lost damn near the exact same format fight in the videotape days (technically superior, Sony-owned Betamax versus inferior but easily-licensed VHS), and wagered their console superiority as a way to make sure they won the HD video format war. I don’t know enough about how Sony handles Blu-Ray licensing to know if they’ve eased up since the Betamax days, but the fact that every Sony console gamer already owned a Blu-Ray player gave them a massive advantage in driving HD-DVD to extinction.

  18. “While reading up on how the Cell Architecture works for this article, my eye began to twitch s I pictured just what I complete mess it would be to emulate this beast.”
    Should be “as”.

  19. Cat Skyfire says:

    Sony lost its dominance with the PS3 because it wasn’t compatible with the PS2. Part of the advantage of the PS2 was that you could play your old games while building up a new stock. Eventually, you’d give up the old ones (after all, the graphics start to suck in comparison…). If the PS4 were compatible with the PS3, they’d probably do better. (I don’t think you need to go back two systems, but at least one.)

    I think they may be erring in thinking we want something new yet. If it doesn’t have something significantly worth it…

    • Thomas says:

      Deadpool brought it up below; it’s worth remembering that the very first PS3 was BC with the PS2. In fact, I can almost guarantee that PS3′s sold at a higher rate after BC was removed than when they had it.

      People argued that they already had PS2′s so the PS2 game library wasn’t an incentive to upgrade (I remember people saying they’d bought a new PS2 for the ridiculous cheap price they were selling them rather than a PS3).

      The PS3 lost its fanbase because of ridiculous prices, very few games at the start, coming out a year later, and maybe being hard to develop for, which ruined its supposed tech advantage. When it got rid of BC and lowered its price correspondingly, that’s when it started picking up momentum.

      • Peter H. Coffin says:

        I can still play Gran Turismo, the first one, on the PS3 tucked into my media center. That game is *why* I bought a PlayStation in the first place.

        • Thomas says:

          You’re almost certainly the exception to the rule, though. In the first years of release (with BC) there were a lot of stories about the PS2 outselling the PS3 some months, even though the PS3 could play all PS2 games. Heck, the PS2 was still selling until last year.

    • Sabrdance (MatthewH) says:

      Care to wager how many people would buy a PS4 if they remade Final Fantasy VII?

      • False Prophet says:

        FF VII ended up on the PS1 because Square and Nintendo had a massive falling out. That probably played a role in making the PS1 successful, as many FF fans jumped ship to Sony to play the next installment. Does Squeenix still feel obligated to release FF games as Sony exclusives? Doesn’t appear so.

  20. Even says:

    It’s good that we have services like Steam that at least try to make your library available in as many places as possible. If only we could convince all the big players that it’s in their best interests to get rid of exclusivity and try to give better service instead.

  21. Brandon says:

    Excellent article, though I think you give the PS2 hardware a little too much credit. Ars Technica has an old article discussing the PS2, the GameCube, and the Xbox and their different approaches to 3D rendering, and the PS2 had arguably the pickiest and most difficult design, for minimal benefit to boot. The PS2 succeeded because of DVD capabilities and carry-over success from the PS1. It was only later in the system’s life that Sony offered tools that actually made the system reasonable to develop for, and even then you still had to do lots of optimization to get the best performance.

    But you are definitely right. Sony thought transitioning to the PS3 would have the exact same inertia as their first system transition, and they made the exact same mistake in that their new system was unconventional and a PITA to dev for. Only this time, both of their competitors had already gone to market.

  22. Deadpool says:

    Notice that when the PS3 was first released, it WAS backwards compatible. My roommate bought the behemoth, extra-expensive PS3 and I can still play my PS2 AND PS1 library on it…

    It wasn’t until the Virtual Console showed that people are willing to pay new money for old games that the PS3 was suddenly not backwards compatible anymore…

    You’re also not the first one with this “crackpot” theory… What makes it more amusing is that something similar happened before: The PS1.

    The PlayStation was, amusingly enough, designed by NINTENDO. Nintendo pulled out of the deal with Sony for a CD-based system AFTER Sony had announced it, publicly embarrassing them and potentially costing them quite a bit of money (back then, Nintendo wasn’t exactly the nicest company). Sony put together what they had and actually released a damned system.

    Nintendo was a juggernaut at the time, beating its only competition soundly in every generation, and decided to stick with the cartridge model for the Nintendo 64, which was harder and more expensive to program for and to port to and from. The PlayStation was easier and cheaper to work with, so *gasp* everyone and their grandmother went to PlayStation.

    Sony, in a rare moment of wisdom (or luck), managed to broker several exclusive deals with software developers, which was a BIG part of their “perfect storm” PS2…

    • Wedge says:

      That’s not why the PS3 removed backwards compatibility. The early PS3′s used hardware emulation–essentially, the PS3 had a PS2 built inside of it! (This is more or less how other backwards-compatible consoles, like the PS2 and the various Game Boys, provided BC.) One of the main reasons the PS3 struggled initially was the exorbitant cost, and effectively cramming PS2 hardware into the PS3 added significant per-unit cost, so when Sony decided they needed to reduce the cost of the PS3 to compete, the PS2 hardware was one of the first things cut, replaced with the vastly inferior software emulation.

  23. Sabrdance (MatthewH) says:

    Can someone explain to the non-computer expert why they couldn’t use the quadcore architecture to emulate the PS3?

    I’m sure this is wrong, but here’s what I am gathering and envisioning: the PS3 has 6 processors doing 6 different things. A modern computer has 4 processors that can all do everything. So why not put 6 processors that can all do everything in the PS4, use the parallel processing speed for non-cell-architecture games, and then -when playing a PS3 game -simply designate each processor as one of the original cells?

    I can imagine you wouldn’t do it for cost reasons. Is there a computer design reason not to?

    • Bubble181 says:

      I was wondering much the same, myself.

      • Jonathan says:

        >has 6 processors doing 6 different things. A modern computer has 4 processors that can all do everything

        That’s an easy misunderstanding to get from the article, but the hardware of each cell is identical; it’s the software* that is doing 6 different things.

        *And I use the term software very loosely here. If I understand correctly, these SPEs are very dumb math-throughput machines that can do little in terms of predicting and organizing work: what data to work on, where the data comes from, and where it goes. The bulk of deciding exactly what math to do where is dumped on the big PowerPC core, and that “producer” role is not done directly like it is in shaders on modern GPUs (the closest somewhat analogous hardware); there are a number of storage and communication layers separating the commander from the worker bees.

  24. Klay F. says:

    I could see Sony maybe getting away with no backwards compatibility for the PS4. After all, they had no problem taking it away mid-way into the PS3′s life cycle, and nobody seemed to raise much of a fuss except for people that are crazy into Linux.

    However, I just don’t see how Microsoft could get away with the same behavior. The usual solution to no backwards compatibility is to just keep your old console. That just isn’t viable for the 360. I firmly believe that in 10 years, there won’t be a functioning 360 in the entire world.

  25. Lalaland says:

    The PS3 had perfect backwards compatibility as it had an entire PS2 built onto the motherboard. I know because that’s about all I used my launch PS3 for, apart from Blu-Ray films & docs, until GTA4 came out.
    Edit: Later revisions removed the chip to save cost; I think it was the Slim model.
    The idea that the PS4 would have backwards compatibility was a bust from the moment they decided it would be x86. You can only really emulate very old and very slow h/w architectures successfully in s/w.

    The benefit of b/c is that it gets you over that ‘no new titles’ drought that accompanies new console architectures. As the new consoles are x86 and GPU which we already have mature toolchains for this shouldn’t be an issue. In fact I’m more hopeful for this generation because of the mature toolchains meaning more investment can go to the games and not the tools (much as happens late in the console cycle).

    • Peter H. Coffin says:

      Nah, it didn’t wait until the Slim. The hardware PS2 circuitry died with the 3rd and 4th front USB ports, and a lot of the original PlayStation compatibility died with the hardware emulation, but the PS2 compatibility lived on as software emulation until it was finally dropped with the release of the Slim.

  26. wererogue says:

    I don’t know if I credit Sony Entertainment with enough Machiavellian forethought to try to lock developers in with an obscure architecture. Maybe someone there thought it would be a bonus.

    At the time Sony were building the PS3, the common wisdom was that MOAR CORES was the future of computing power – and, to be honest, they were probably right. Unfortunately, the *near* future proved to be more powerful processors (again) and the concurrency revolution that they were betting on never really materialized.

    Once you’re programming with concurrency in mind, it actually isn’t that hard to port to other architectures. SPU (the little PS3 cores) C++ is almost identical to regular C++, and IIRC it can be compiled for a PPU anyway. If you’re already doing job scheduling and transformation gating (or data dependencies), then all you really need to do is scale the number of job consumers, and you’re golden.

    The problem is porting code designed for a non-concurrent architecture, which, as you’ve said, kept developers away from the PS3.

    I would definitely imagine a PS3 “emulator” to be achievable on an x64 architecture. Since they have all the original system libraries, you’d be looking at something more akin to WINE than Snes9x. A bit harder than other emulators, but not much.

    • Jonathan says:

      In that case, the 8-way Jaguar used by the PS4 and the next Xbox has more “cores” (well, just one more, but I wouldn’t call the SPEs full cores) than the Cell. They are just made much more sanely by a competent engineering team, namely AMD.

      It’s not so much the number of cores as the speed. The Cell runs at 3.2 GHz; these new cores run at 1.6 GHz*, and if I understand correctly they have narrower SIMDs than the Power core had; so unless you plan on having the main thread run at best 1/4th the speed* of the PS3/Xbox 360 version, emulation isn’t just difficult; it’s completely impossible.

      *Not implying that AMD’s x86 cores are 1/4th or even 1/2 as fast as PowerPC; apples to oranges and all. Even in the x86 world, during the NetBurst/K8 days an AMD 2 GHz processor would easily outclass an Intel 2 GHz processor. These days the IPC (compute throughput per clock) situation between Intel and AMD has reversed, though not completely; in terms of compute per joule they are comparable outside of the desktop computer market.

  27. Dork Angel says:

    I bought my PS3 just before they nerfed it and did away with the backwards compatibility and 4 USB ports. The main reason was I didn’t want to lose the backwards compatibility (after all that, I’ve never put a PS2 game in it) and I fancied a Blu-ray player (my current total of Blu-ray discs stands at 4, I think). The PS3 still impresses me though. It’s given me Oblivion, Skyrim, Fallout, a few Tekkens, numerous CODs and Dead Island (my favourite game ever despite its failings). Oh, and it still works many years later. Can’t really see the point in the PS4 yet though. It doesn’t have the equivalent novelty of the Blu-ray player and I don’t think I’d notice better graphics than what the PS3 provides. The only thing that would tempt me would be some really impressive launch titles, and I can’t see that happening.
