Big things are going on in the world of graphics APIs. A graphics API is what a programmer uses to talk to the graphics hardware. This is a complicated job. You write some videogame code, which talks to the graphics API, which talks to the graphics driver, which makes the graphics card give up the shiny pixels for the player.
For a lot of years, there were really only two players in town: OpenGL and DirectX. OpenGL is so old that the original code was written in hieroglyphs on stone tablets, and all of the documentation was localized for Mesopotamia. The first version was released in 1992, back when developers were still living on Pangaea. It was built in a development world very unlike the one we have today. Before C++ rose to become the language of choice for AAA game development. Before shaders existed, and indeed before consumer-level graphics cards evolved.
This means that the OpenGL API looks pretty weird to modern coders. There’s an alternative, but…
The only other alternative is DirectX, which is controlled by Microsoft. That means it’s only available in places where Microsoft chooses. So if you make a game using DirectX and want to port your game over to (say) Linux but Microsoft has decided they don’t care to make a Linux version available, then you can’t. You don’t have access to a platform unless Microsoft goes there first, which gives them a scary amount of power over various platforms. This is a big part of how Microsoft works: Slowly get a stranglehold over something and then use that control to strong-arm rivals and reward obedience. Every game made with DirectX makes them just a little stronger. If you’re a developer taking the long view, then you’d probably rather not add to this.
So by making a game with DirectX 12, you’ll effectively turn your game into a sales pitch for Microsoft Windows™.
On the other hand, Microsoft can write good software. I mean, when they want to. When they don’t care you get Games For Windows Live, a program that would qualify as Malware if it wasn’t so buggy. But when they care they make awesome stuff like Visual Studio. And DirectX is indeed one of the things they care about. I’m not riding the cutting edge with the hotshot AAA developers, but the buzz seems to be that DirectX is significantly better and faster.
So for years developers have had to make a choice: Do I go with the open platform that sucks or the closed platform that puts me at the mercy of Microsoft? This choice is no fun. So what do we do?
AMD tried to “help” by coming out with a third graphics API, called Mantle. Note that AMD is one of two major players in the graphics card industry, with the other being NVIDIA. They claimed it would be an open standard, meaning it could work (or could be made to work) on NVIDIA hardware, but these two idiots have been cock-blocking each other at every turn, and there’s no way one would accept a standard devised by the other. I’d be willing to bet that AMD would design Mantle to favor their hardware at the expense of NVIDIA hardware (so games using Mantle would run faster on AMD cards than NVIDIA cards) and I’m sure NVIDIA would design their stuff to preemptively sabotage any such move. This entire enterprise sounds like a non-starter to me. And even if it worked, you’d just be at the mercy of AMD instead of Microsoft, which is just a different devil.
At the same time, Apple rolls out yet another API, which is (because things aren’t confusing enough yet) Metal. (Not Mantle.) The names Mettle, Molten, and Mental are still available if anyone else wants to enter the fray and make things more confusing. (Mercifully, Mattel is taken.) But it’s Apple, which means it has all the drawbacks of Microsoft. EDIT: Also, as Nick points out below, Metal is only for iOS, so it’s only useful if you’re targeting Apple devices.
So what else can we do? Lots of graphics engines act as “wrappers” for the big two: They use DirectX where that makes sense, and OpenGL where DX isn’t available. You just talk to the graphics engine and don’t worry your pretty little head about what’s happening under the hood. That’s nice, but graphics engines are expensive, and they end up being yet another layer between you and the hardware. This might mean that your game runs slower than if you used OpenGL or DirectX natively. It also means you might end up cut off from certain features. Maybe there’s some exotic (or new) thing that graphics cards can do, but it’s not supported by your graphics engine. Either the developers didn’t know about it, didn’t think it was important (“Why would anyone do that?” is one of the most natural yet infuriating questions an engineer can ask), or it didn’t exist yet when the engine was written. Because you’re so far from the hardware, you won’t be able to use that feature.
So… expensive, slow, and limiting. This isn’t an attractive option either.
Of course, you could just write all your code twice. Write a DirectX version of your game and an OpenGL version. Have fun doing three times the work for code that will be a support nightmare. Loser.
Valve has been funding an open-source implementation of OpenGL called Mesa. This could fix (or perhaps has already fixed) some of the speed problems of OpenGL, but it can’t really help with the slightly idiosyncratic and very dated OpenGL API. It would get faster, but not easier to use.
It should be clear by now that the real solution that would make everyone happy (except Microsoft, who is the main beneficiary of this mess) would be for OpenGL to stop sucking. The Khronos Group (the not-for-profit, member-funded industry consortium that maintains OpenGL) has been trying to do this for years. I’m not in the loop enough to understand all their moves because I’m so far behind the tech curve, but the buzz I generally hear is that their changes are too few, too rare, and too incremental. They seem to be an incredibly cautious and conservative bunch, and they’re not eager to make Big Scary Changes. To be fair, I value these attributes in engineers. Generally I think the daring, devil-may-care types are best using technology, and the people who invent the technology should be slightly obsessive and paranoid. The astronaut should be bouncing around in space and giving thumbs-up to the world while the eggheads at home wring their hands and worry about everything that could ever go wrong. Seems to be the best way to Get Stuff Done.
But things have gotten bad enough – or at least worrisome enough – that they seem to be doing exactly that. They’re working on glNext, which is a complete re-write of OpenGL. Now, I recognize that re-writes are generally foolish and self-indulgent, the work of engineers who are obsessing over code “cleanness” and not the usefulness of the product. But if re-writes are ever warranted, then I think this would be such a case.
On the other hand, if this is a complete re-write without regard for backwards compatibility, then from the perspective of a game developer it’s basically just another API entering the field. If – like me – you’ve been faithfully working on OpenGL stuff for years, you’re not going to get a magic speed boost for nothing. You’re going to need to re-write your code to do things the new way.
On the Oculus, it’s not just framerate that matters, but also latency. It’s possible to have a demo running at 75fps where each frame is slightly delayed by a fraction of a second due to some clog in the operating system. In this case, you’ll turn your head and the thing you’re looking at will seem to move with you, then snap back to where it should be. It looks like everything vibrates slightly when you turn your head, with the vibrations getting more extreme the faster you turn. This is called “juddering”. It’s not pleasant.
For me, the more immediate problem is that the Oculus SDK doesn’t really properly support OpenGL right now. For a product going for a broad, multi-platform release, this strikes me as being really odd. The OpenGL version of the SDK is incomplete, and there’s almost nothing in the way of example code if you’re looking to figure out which parts work and which parts don’t. Specifically, you can’t render directly to the device. You have to set it up as an extra monitor, then create a maximized window on that second monitor. That ends up being really laggy for a lot of annoying reasons. Your rendered images get squeezed through some extra layers of Windows processing to be turned into images on your “desktop”, and that latency makes the Rift really uncomfortable to use.
Since I got the Rift because I wanted to experiment with simulation quality and how to minimize VR sickness, this basically puts me out of business. Trying to measure smoothness and user experience in this mode is like trying to play Jenga on a rollercoaster. The noise in the system is larger than the thing I’m trying to measure. I can either drop what I’m doing and go learn DirectX (a massive and frustrating investment of time) or I can shelve my Rift until the SDK is finally updated. And there’s no ETA on when they will add OpenGL support, so I could end up waiting a long time.
There are no easy answers. Only very ugly tradeoffs.