You know what the hardest feature of this game has been so far? Fullscreen mode. Well, not so much fullscreen as the ability to let the user change the window size, which crops up most often when toggling fullscreen mode. I don’t think it was the most time-consuming feature, but it was certainly the most effort for the most mundane and uninteresting payoff.
It’s not something you can skip, really. I’m long past the point where I’ll put up with a game that wants to restart because you changed the resolution. We’re all accustomed to being able to smack alt-enter to toggle fullscreen or to resize a window by dragging. It’s just part of making civilized software and not something a developer can leave out. On the other hand, it’s infuriatingly difficult and troublesome and adds all kinds of unwelcome complexity to systems that would otherwise be graceful and elementally simple.
Ever wonder why some old games (and a few modern ones) won’t change the resolution until you restart the program? I’ll tell you.
The problem has to do with how video memory is allocated. Let’s imagine this blank canvas is your video memory:
When the game starts up the silly user has the resolution set to a hilarious 640×480. So we need to allocate enough memory to hold that image. Actually, we need two of them. These blocks of memory are called the front and back buffers. The idea is that the game shows you one image (the front) while it’s busy drawing the other (the back) and then swaps the two. It’s always showing you one and drawing on the other. This swap is the end of a frame, and doing this switch is what gives us the “framerate” thing that the kids are always talking about.
So when we start up OpenGL or DirectX we grab some memory for our front and back buffers:
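If you’ve never seen this part, it goes roughly like this. (A bare-bones sketch; I’m using GLFW as a stand-in for the window plumbing here, which isn’t necessarily what any given engine actually uses.)

```cpp
// Minimal double-buffered setup, sketched with GLFW.
// Creating the window is what makes the driver allocate the
// front and back buffers in video memory.
#include <GLFW/glfw3.h>

int main() {
  glfwInit();
  // A 640x480 window means a 640x480 front buffer AND a 640x480 back buffer.
  GLFWwindow* window = glfwCreateWindow(640, 480, "Game", NULL, NULL);
  glfwMakeContextCurrent(window);

  while (!glfwWindowShouldClose(window)) {
    glClear(GL_COLOR_BUFFER_BIT);
    // ... draw the entire scene into the back buffer ...
    glfwSwapBuffers(window); // flip: the back buffer becomes the front
    glfwPollEvents();
  }
  glfwTerminate();
  return 0;
}
```

That swap at the bottom of the loop is the frame boundary I was just talking about.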
Now, we might also need a few more buffers: Depth buffer, stencil buffer, accumulation buffer, or whatever those fancy folks working on the Cry Engine use to make their pixels so shiny. But you get the idea that we need to take a chunk of video memory to hold the stuff we’re drawing.
We’ll also need to grab some more memory to hold our texture data. We might have a small number of huge textures, a modest number of medium textures, or a multitude of small textures. It kind of depends on the game. Either way, in most cases textures will take up more video memory than anything else.
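In OpenGL terms, shoveling one texture into video memory looks something like this. (UploadTexture is just a name I made up for illustration.)

```cpp
// Hypothetical helper: push one texture's pixels into video memory.
// Every texture in the game goes through something like this at load time.
GLuint UploadTexture(int width, int height, const unsigned char* pixels) {
  GLuint id;
  glGenTextures(1, &id);            // reserve a texture name
  glBindTexture(GL_TEXTURE_2D, id);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
  // The expensive part: the pixel data is copied across the bus into
  // whatever chunk of video memory the driver decides to use.
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, pixels);
  return id; // a handle, not a pointer; the driver owns the actual memory
}
```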
To the other coders: Yes, I know memory is linear and not arranged on a grid like this. But that’s not important for the conversation we’re having and this is much easier to visualize.
We might also have some vertex buffers. That’s where we take a bunch of polygons and send them to live in video memory so we can refer to them quickly. Instead of sending all 10,000 polygons of Master Chief’s head every frame we can just say to the graphics card, “Hey man. Remember that head I gave you a while back? Draw that.”
So let’s put some vertex buffers into video memory:
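Something like this, say. (Hypothetical helper names again, and I’m using the old fixed-function drawing style to keep the sketch short. The buffer calls need an extension loader like GLEW on Windows.)

```cpp
// Sketch: park a mesh in video memory once, then draw it forever.
struct Vertex { float x, y, z, u, v; };

GLuint UploadMesh(const Vertex* verts, int count) {
  GLuint vbo;
  glGenBuffers(1, &vbo);
  glBindBuffer(GL_ARRAY_BUFFER, vbo);
  // One copy at load time...
  glBufferData(GL_ARRAY_BUFFER, count * sizeof(Vertex), verts, GL_STATIC_DRAW);
  return vbo;
}

// ...and every frame after that is just "remember that head I gave you? Draw that."
void DrawMesh(GLuint vbo, int count) {
  glBindBuffer(GL_ARRAY_BUFFER, vbo);
  glEnableClientState(GL_VERTEX_ARRAY);
  glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (void*)0); // offset into the VBO
  glDrawArrays(GL_TRIANGLES, 0, count);
}
```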
And finally if we’re doing bump mapping or somesuch then we need vertex and fragment shaders. These things are just bits of code, so they’re small. But they do exist in video memory.
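Compiling one is just handing the driver a string of source code. (This sketch assumes GLSL through the GL 2.0 entry points.)

```cpp
// Sketch: a shader is a string of code the driver compiles for you.
// The compiled result lives on the driver's side, in video memory.
GLuint CompileShader(GLenum type, const char* source) {
  GLuint shader = glCreateShader(type); // GL_VERTEX_SHADER or GL_FRAGMENT_SHADER
  glShaderSource(shader, 1, &source, NULL);
  glCompileShader(shader);
  return shader;
}
```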
So fine. We’re drawing this 3D scene. We’re rolling along, rendering stuff and giving the user aliens, Nazis, Zombies, or robots to shoot. Everyone’s happy. But then the user decides they don’t like 640×480 mode and they switch to something like 2048×1152. Suddenly…
…many times more memory is needed for those front and back buffers. (And the depth, stencil, etc.) The driver makes this space by purging everything from video memory and starting over. This always happens, even if there’s tons of memory left.
I’m sure there’s a good reason for this. That’s not sarcasm. I don’t know enough about the low-level video systems to know why this purge happens, but a lot of people have been obsessing over these software layers for over a decade, and there’s no way this is the result of “laziness”. I’m sure the answer is very technical and comes down to performance.
It doesn’t matter. The point is, at any given frame we could suddenly discover that every single thing we’ve ever sent to video memory has been wiped out. Keep in mind that establishing these resources is a big percentage of the time you spend at the loading screen. If you’ve ever wondered why videogames tend to enter this massive funk just after you change the resolution, this is why. The program has to stop and re-load every dang thing. (The textures take far longer than everything else combined. My images above are not at all to scale. If they were, then the canvas would be about sixty times larger than the front/back buffers, the textures would be at least twice the size of the LARGE buffers in the last image, and the shader programs would be too small to see.)
It’s really annoying to have to design a system to handle mid-game re-initialization like this, but it’s what you have to do. Making the user restart is obnoxious and barbaric, but if we try to do ANYTHING without restoring the lost data we will end up drawing a heap of garbled mess. And then crash. (Trying to render from a missing vertex buffer is suicide.)
There’s other stuff that’s not kept in video memory that also ends up getting killed by this reset, like OpenGL display lists. In my case, this means it kills all of the fonts and the HUD.
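For the non-GL folks: a display list is a recorded sequence of GL calls that the driver “bakes” and stores on its side, which is why it dies in the reset along with everything else. Roughly:

```cpp
// Sketch: "bake" a sequence of GL calls into driver-side memory.
// Handy for fonts, but it evaporates whenever the video system resets.
GLuint BuildGlyphList() {
  GLuint list = glGenLists(1);
  glNewList(list, GL_COMPILE);
  // ... immediate-mode calls that draw one font glyph ...
  glEndList();
  return list;
}
// Per frame: glCallList(list); replays the whole baked sequence.
```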
The user re-sizes the screen and suddenly you have to re-init textures, vertex buffers, lists, shader programs, and fonts. You have to do these in the right order, since some systems contain others. What a mess.
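The re-init path ends up looking something like this. (All of these function names are hypothetical, but the dependency order is the point.)

```cpp
// Hypothetical video-reset path, called after the window is resized.
// Order matters: fonts build display lists that reference textures,
// and the HUD references the fonts, so everything rebuilds bottom-up.
void VideoReset(int new_width, int new_height) {
  RecreateContext(new_width, new_height); // new front/back/depth buffers
  ReloadTextures();      // re-upload every image from disk or system RAM
  ReloadVertexBuffers(); // re-send every mesh
  ReloadShaders();       // recompile every shader program
  RebuildDisplayLists(); // fonts, HUD: anything "baked" into the driver
}
```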
On the upside, this acts as a great stress test. If your program can survive several successive re-sizes without crashing, leaking a bunch of memory, or displaying a bunch of ugly artifacts, then it (probably) means you’ve got a solid foundation under your program.
Performance-wise, I’m pretty happy with the program.
Some Trivia for the Curious
The project has ~40k lines in 84 source files. 2,500 of those lines are comments. The largest source file is 15k lines of code, but that’s a silly OpenGL thing called the OpenGL Extension Wrangler. It’s mostly a bunch of conditional statements to figure out what OpenGL features are available on the user’s system and how to get to them. The largest part that I’ve written personally is one tenth the size of that: the robot AI is just 1,550 lines of code.
I’d never heard of cyclomatic complexity before, but according to the source-examining tool I just downloaded, my McCabe VG complexity score is 4,175. I’m sure somewhere out there is a computer scientist who knows what that means and will find it interesting.
We’re currently on week ten of the project.
It’s a total of two seconds from the moment I start the program to the moment when I can start playing. I hate loading screens and I think I’m going to be able to meet my goal of making a game that doesn’t have one. So far I can fly from the starting area to the end of the game in a single unbroken journey. There’s a tiny little stutter when you move down to the next level, but I’m sure I can get rid of that.
Memory-wise, I’m doing okay:
My in-game memory usage is a good bit less than the memory used by Pac-Man at its main menu. I could get that even lower, but this is already ridiculously low. You can see it’s already using less memory than a couple of browser tabs in Chrome. (I think one of those is Facebook.) I’m trying to make efficient software, not mash my program down to some teeny tiny size for bragging rights.
How much more am I going to use in the way of resources? I think I’m basically done consuming texture memory. I won’t need more unless I fill my atlas texture, and I’m using less than a third of it:
I suspect I’ll eat some more memory with new sound effects. Other than that? I guess this is about what I can expect from the final product, minus optimizations.
I’m still not to the point where I’m ready to start optimizing and testing to see what the final system requirements need to be. But at this point I’m feeling confident that I’m not making any egregious errors or doing anything obviously stupid.