As before: In the process of going through this I'm bound to commit minor omissions, errors, misunderstandings, grammatical errors, or war crimes.
Times are approximate.
41:00 “I don’t mind blocking for the 1.2 milliseconds it will take for this to come in from flash.”
Carmack is talking about the difficulty of loading resources while rendering. This is mostly a problem with multi-threading.
The idea is that your program has several independent threads. One is running the game itself. Another is pumping all the data to the sound card. Another is loading in geometry or textures. They all do their own thing and in an ideal world they wouldn’t interfere with each other.
But what happens when some asset isn’t ready just yet? Like, it’s time to draw a rusty crashed alien spacecraft. Maybe you’re still missing the doorknob for the spaceship. Maybe you’re missing the texture for the “I brake for Strogg” bumper sticker. Whatever. It’s time to draw the scene, there’s something you need, and you don’t have it just yet. Maybe you’re waiting 1.2 milliseconds for the asset to arrive from flash memory. Or maybe you’re waiting for something in the neighborhood of 12 milliseconds to grab it from the hard drive. Or maybe you’re looking at a terrifyingly long wait of 100ms (a tenth of a second) for it to arrive from (shudder) optical media like a DVD.
In this specific case, Carmack is probably just talking about the time it takes to move stuff in from flash memory. He’s saying he wants rendering to STOP until the asset is in place, since the game can’t proceed without this asset. (For the sake of argument let’s say this thing is critical.) Instead of the game doing a painstaking inventory and making sure everything is ready BEFORE it begins drawing, it’s way easier if you just start drawing and stop again if something isn’t ready yet.
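To make the idea concrete, here’s a toy sketch of “just block if it’s not ready yet.” All the names and timings here are my own invention for illustration: a loader thread fetches assets in the background, and the render step simply blocks on each one until it arrives instead of auditing everything up front.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def load_from_flash(name, latency_s=0.0012):
    """Pretend to fetch an asset from flash; roughly 1.2 ms."""
    time.sleep(latency_s)
    return f"{name} (loaded)"

def render_frame(pending):
    """Draw the scene; if an asset isn't ready, block until it is."""
    drawn = []
    for name, future in pending.items():
        # future.result() blocks the rendering thread until the loader
        # thread delivers the asset -- the "I don't mind blocking for
        # the 1.2 milliseconds" approach.
        drawn.append(future.result())
    return drawn

with ThreadPoolExecutor(max_workers=2) as loader:
    # Kick off loads in the background, then just start drawing.
    pending = {
        "spaceship_doorknob": loader.submit(load_from_flash, "spaceship_doorknob"),
        "bumper_sticker": loader.submit(load_from_flash, "bumper_sticker"),
    }
    frame = render_frame(pending)
```

The alternative (checking every dependency before drawing a single pixel) means writing and maintaining that painstaking inventory code; blocking on demand gets you the same correctness with far less bookkeeping, at the price of an occasional stall.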
Hardware vendors HATE this idea, since it makes their graphics hardware perform more slowly for things that aren’t their fault and are out of their control.
56:00 “CRT’s back in the old days had essentially no persistence.”
Some of you young folks might not remember the early 90’s laptops, which had ghastly persistence. Persistence is where it takes time for a pixel to stop shining once it’s no longer wanted. If you’re writing white text on a black background and you backspace to remove a letter D, how long does the after-image of the D linger onscreen?
On a CRT (Cathode Ray Tube, those big heavy monitors that are quickly going extinct) the pixels would go dark instantly. On the LCD of a laptop in the early 90’s, it would linger for a long time. I don’t actually have the numbers, but it felt like a second or so. This was really strange when you would wave the mouse around the screen and it would leave this trail of after-images behind it.
LCD’s have gotten much, much better, but they’re still not quite as crisply on/off with their pixels as those old CRT’s were. Sure, they’re better at just about everything else (weight, space, power usage, flat viewing surface, etc) but we did take on this persistence problem. You don’t really notice it very much during normal usage, but when you’re wearing the screen on your face I gather it becomes important.
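One common way to think about persistence is that each refresh, a sluggish pixel only moves part of the way toward its target instead of snapping there. This is just a toy model (the numbers are made up, not real display measurements), but it shows why an ideal CRT erases instantly while a bad LCD leaves a ghost:

```python
# Toy model of display persistence: 'persistence' is the fraction of the
# old pixel value that lingers after one refresh.
# 0.0 = CRT-like instant snap; high values = early-90s LCD ghosting.

def refresh(shown, target, persistence):
    """Blend each currently-shown pixel value toward its target value."""
    return [p * persistence + t * (1.0 - persistence)
            for p, t in zip(shown, target)]

white_d = [1.0, 1.0, 1.0]   # a lit white letter "D"
erased  = [0.0, 0.0, 0.0]   # what we want after backspacing it

crt = refresh(white_d, erased, persistence=0.0)   # gone immediately
lcd = refresh(white_d, erased, persistence=0.9)   # 90% still glowing
```

Run the `lcd` case through `refresh` a few more times and you get exactly that fading mouse-trail effect: each frame the after-image dims but never quite vanishes.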
59:00 “We update 60 times a second, like clockwork […] and that’s why we have vertical tearing.”
He’s talking about this visual problem:
[Image: Vertical tearing. (Simulated.)]
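Tearing happens when the game swaps in a new frame while the display is still scanning out the old one. A minimal simulation (rows and frame labels are invented for illustration) makes the mechanism obvious:

```python
# Simulate a display scanning out rows top-to-bottom while the game
# swaps buffers partway through -- the classic cause of tearing.
ROWS = 8

frame_a = ["A"] * ROWS   # the frame being scanned out
frame_b = ["B"] * ROWS   # the frame the game finishes mid-scan

def scan_out(old, new, swap_row):
    """Read one row at a time; from swap_row onward, the buffer has changed."""
    shown = []
    for row in range(ROWS):
        source = old if row < swap_row else new
        shown.append(source[row])
    return shown

# Swap mid-scan: the top of the screen is frame A, the bottom is frame B,
# and the seam between them is the visible tear.
torn = scan_out(frame_a, frame_b, swap_row=5)

# Waiting for the scan to finish (vsync) means the viewer only ever
# sees a complete frame.
clean = scan_out(frame_a, frame_b, swap_row=ROWS)
```

This is also why the tear moves around: the swap lands at whatever scanline the display happens to be on, which varies from frame to frame unless you lock updates to the refresh, as Carmack’s “60 times a second, like clockwork” does.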
1:13:00 “Robustness first, then predictability, then performance.”
This is kind of a turning point for games. The reverse has been true for over two decades. An entire generation of programmers has risen in this world where you’re fighting for every CPU cycle. They would bend over backwards coming up with tricks to get the most graphical bang for their buck. The code wasn’t pretty, but it was fast and that’s what counted. And if the code was ugly, who cared? You were going to throw most of it away in four years when technology changed and you needed a completely different set of hacks to make things work.
And now we’re finally reaching that point where people are saying, “We’ve got plenty of power and we’ll probably still be using this code eight years from now. Let’s make sure the code is reusable, clean, easy to read, and properly annotated.”
It will be very interesting to see this new mindset percolate throughout the industry and see how it changes our approach to making games.
1:15:00 “OpenGL won.”
I actually have no idea where the battle between the Big Two is at any given moment. You can render with OpenGL, or you can render with DirectX. Each of these libraries gives the intrepid programmer a way to talk to the graphics hardware and make it draw stuff. Which do you use?
GL was first by a long way, and was designed by and for people who programmed in vanilla C. DirectX came along years later and was more of a C++ creature. There was a tug-of-war for years, but by the middle of the last decade I assumed that DX was winning mindshare based on the number of big-name engines that used it.
Back in the 90’s I dabbled in both and I hated the way DirectX worked. There were more steps to do simple things and there was a lot more typing involved. I remember DX for its obnoxiously long variable types and for its unforgivably horrendous documentation. I’m sure the latter got better, but I already knew my way around GL and I never saw any reason to make the switch. Today, almost 15 years later, I’m still using OpenGL.
1:18:00 The functional programming stuff.
This is a big topic and I know people are kind of eager to hear my thoughts on it. The thing is, it might take me a while to do it justice. I have another programming series planned after I’m done with this Quakecon stuff. I’m writing a kinda-game thing and I might end up talking about this stuff in the process. It doesn’t make sense for me to delay writing about my own code, spoil what I’ll have to say later, and still not give the subject the attention it deserves.