I’ve implemented erosion simulation a few times in my career. My usual technique has you begin at a single spot on the map, landing like a virtual raindrop. From there, you examine the immediate surrounding points and see which one is lowest. Then you move there and repeat the process. As you go, you shave a tiny little bit off of each point. If you want to get fancy, you look at the steepness of the slope. If you’re going down a nice steep slope, then you assume this theoretical trickle / stream / river is moving fast. You might pick up a little extra dirt (thus making the riverbed deeper) and take it with you. When the land levels out, you assume the flow has slowed and you drop off a bit of what you’ve collected.
|And if the erosion code doesn’t pan out, maybe we can repurpose it into a sled-riding simulator.|
Eventually you hit the ocean or find yourself at the bottom of a hole. Here you drop off whatever you’ve collected, which ought to help form sloping beaches, river deltas, and gradually fill in craters. At this point you’re done. Now pick another point on the map, drop another raindrop, and start the whole thing over.
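On the CPU, that whole raindrop walk fits in a few lines. This is my own illustrative Python, not code from any of those old projects; the carry and deposit constants are made up, and the "flat ground" threshold is arbitrary:

```python
def erode_droplet(height, x, y, carry_rate=0.01, deposit_rate=0.5, max_steps=1000):
    """One virtual raindrop: walk downhill, shaving a little off each
    point, picking up more dirt on steep slopes and dropping sediment
    when the land levels out or we hit the bottom of a hole."""
    rows, cols = len(height), len(height[0])
    sediment = 0.0
    for _ in range(max_steps):
        # Look at the four immediate neighbors and find the lowest.
        neighbors = [(x + dx, y + dy)
                     for dx, dy in ((0, 1), (0, -1), (1, 0), (-1, 0))
                     if 0 <= x + dx < cols and 0 <= y + dy < rows]
        nx, ny = min(neighbors, key=lambda p: height[p[1]][p[0]])
        drop = height[y][x] - height[ny][nx]
        if drop <= 0:
            # Bottom of a hole (or the ocean): dump everything we carry.
            height[y][x] += sediment
            return
        # Steeper slope -> faster flow -> pick up a little extra dirt.
        picked_up = carry_rate * drop
        height[y][x] -= picked_up
        sediment += picked_up
        # Nearly flat ground -> slow flow -> deposit part of the load.
        if drop < 0.01:
            height[y][x] += sediment * deposit_rate
            sediment *= 1 - deposit_rate
        x, y = nx, ny
    height[y][x] += sediment  # ran out of steps; drop the rest here
```

One nice property of writing it this way is that dirt is only ever moved, never created or destroyed, so the total volume of the map stays constant.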
It’s not perfect. This would be useless to a geologist or other science-type person trying to study science-type stuff. But it’s perfectly good for making a plausible landscape.
This is nice, but shaders simply can’t do this kind of processing. You can’t say, “I’m done with grid coord X, Y, now let me move next door to X+1, Y”. You can’t just store values globally, and you can’t stop processing when you feel you’re done. (Unless you don’t want to generate any output. In which case you just wasted your time.) When your shader is executed, you’re dropped into a situation where you’re expected to do the calculations of one pixel, and only that pixel. You can’t change any adjacent pixels and you can’t carry values between them using variables. You can LOOK at adjacent pixels, but because processing is heavily parallelized, you can’t even guarantee that points will be handled in any particular order.
So how do we handle large processing jobs like this, where we have a lot of inter-dependency and changes that need to propagate in unpredictable ways?
There are probably a lot of solutions, but the system I came up with requires one new texture (for holding erosion data) and two new shader passes.
So our new texture is called the “erosion map”. We look at whatever pixel we’re given, and check the heightmap for its immediate neighbors to the north, south, east, and west. (I experimented with using all 8 ordinal directions, but to my surprise it looked worse.) We see which one is lowest. This means there are five possible outcomes: north, south, east, west, and none. (“None” happens if we’re lower than all our neighbors.) We jam this value into the red channel of our erosion map.
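Per pixel, that flow-direction pass boils down to something like this. It's a Python stand-in for the shader; the direction codes and the y-axis convention are my own assumptions, and I've made the lookups wrap at the edges since the world is supposed to tile:

```python
# Direction codes to stuff into the red channel; names and numbers are mine.
NONE, NORTH, SOUTH, EAST, WEST = 0, 1, 2, 3, 4

def flow_direction(height, x, y):
    """Return which of the four neighbors of (x, y) is lowest, or NONE
    if this cell is lower than all of them. The map wraps at the edges."""
    rows, cols = len(height), len(height[0])
    candidates = [
        (height[(y - 1) % rows][x], NORTH),
        (height[(y + 1) % rows][x], SOUTH),
        (height[y][(x + 1) % cols], EAST),
        (height[y][(x - 1) % cols], WEST),
    ]
    lowest, direction = min(candidates)
    return direction if lowest < height[y][x] else NONE
```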
Next we store the number 1 in the green channel. I’ll explain that in a minute.
Next we look at the red channel of our four neighbors and see how many of them are pointing at us. If any are, that means they are tributaries of ours. We take their green channels and add them to our own. This means the green channel is effectively the number of cells upstream of us. If we’re at the exact pinnacle of a mountain, this will be 1. (Remember we set the green channel to 1 a couple of paragraphs ago.) If we’re at the bottom of a valley then we could have hundreds of tributaries. (Although remember that since we’re saving this to a one-byte color channel, we can only hold 256 unique values. Which means we basically lose track after 255.)
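Here's the same tributary-counting logic as a whole-grid pass in Python. The shader does one cell at a time, of course; this is just my illustration, reusing the made-up direction codes from before:

```python
NONE, NORTH, SOUTH, EAST, WEST = 0, 1, 2, 3, 4

def tributary_pass(direction, green):
    """One update of the tributary count (green channel). Each cell
    starts at 1, then adds the counts of any neighbors whose red
    channel points at it. Values clamp at 255, like a one-byte channel."""
    rows, cols = len(direction), len(direction[0])
    result = []
    for y in range(rows):
        row = []
        for x in range(cols):
            total = 1
            # A neighbor is a tributary if its direction points at us.
            if direction[(y - 1) % rows][x] == SOUTH:
                total += green[(y - 1) % rows][x]
            if direction[(y + 1) % rows][x] == NORTH:
                total += green[(y + 1) % rows][x]
            if direction[y][(x + 1) % cols] == WEST:
                total += green[y][(x + 1) % cols]
            if direction[y][(x - 1) % cols] == EAST:
                total += green[y][(x - 1) % cols]
            row.append(min(total, 255))
        result.append(row)
    return result
```

Running this repeatedly on a four-cell chain that all flows east shows the slow propagation mentioned below: it takes three passes before the downstream cell knows about all three cells above it.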
Doing things this way is a bit odd. If we have a chain of tributaries 100 units long, it will take 100 updates for everything to fully propagate.
After we update the erosion map, we do another pass and update our heightmap. The heightmap looks at the erosion map. The more tributaries a point has, the more elevation is eroded away. Then after we update the heightmap we have to update our normal map again, since the topography has changed a bit.
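The heightmap update is the simplest pass of the bunch. Something like this, where the erosion rate is a constant I invented for the sketch:

```python
def erode_heightmap(height, tributaries, rate=0.0005):
    """One heightmap update: the more tributaries a cell has
    (green channel of the erosion map), the more elevation is
    shaved away."""
    rows, cols = len(height), len(height[0])
    return [[height[y][x] - rate * tributaries[y][x] for x in range(cols)]
            for y in range(rows)]
```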
|The erosion has begun to work its magic. That river was carved by our simulation. You can also see ridges forming.|
So we have a nice little chain of updates going now. 1) Update the erosion map to see where erosive forces exist. 2) Update the heightmap to apply those erosive forces. 3) Update the normal map, since the shape of the hills has changed. And since the hills have changed, we need to go back to #1 again.
The texture for our world is pretty big: 2048×2048. That means the heightmap, the normal map, and the erosion map all need to be that size. The erosion processing is pretty expensive. If we tried to update the whole thing in one frame our framerate would tank. So we update it in patches, spending 16 frames updating the whole thing. Then – just to keep things nice and unified – we spend another 16 frames updating the heightmap, and another 16 doing the normal map. So the whole cycle takes 48 frames. Since we’re running at 60fps, this means the erosion system completes a full cycle a little better than once a second. That’s fine. The changes it makes are incredibly slight. If they were big, the terrain might feel too chaotic, and it wouldn’t last very long. The mountains would melt away into speed bumps and (since we don’t have any other geologic forces in play) the lowlands would flatten out to an immense and very boring plane. (Which would technically also be a plain.)
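The bookkeeping for those patches is trivial but worth spelling out. Assuming the 16 patches form a 4×4 grid of 512×512 squares (my guess at the layout), the patch to update on a given frame is:

```python
def patch_for_frame(frame, texture_size=2048, grid=4):
    """Which square patch of the texture to update this frame.
    grid * grid frames (16 here) cover the whole map."""
    patch = frame % (grid * grid)
    size = texture_size // grid          # 512 pixels per patch
    px = (patch % grid) * size
    py = (patch // grid) * size
    return px, py, size, size
```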
|Instead of forming gentle valleys, it’s gouging ever-deepening ruts.|
It’s not awesome. It’s too brute a force. Once some erosion (we can think of it like water, although no water is actually depicted) gets down into a crevice, it tends to just dig deeper and deeper, forming these ugly trenches. Also, these trenches have a habit of lining up with the grid and then going perfectly straight, which makes them look just awful.
|This was a single mountain, but the rivers have carved everything away, leaving these sheer vertical knife-ridges.|
So I add a new feature. While the erosion pass has a cell examining its neighbors, I make an average of all of the green channels. (That’s the tributary count.) I stuff this value in the unused blue channel. This is basically “tributary count, except blurred”. This value is now used for erosion when we update the heightmap.
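The blur is as cheap as it sounds: one sample plus the four neighbors, averaged. Whether the cell's own tributary count belongs in the average is a detail I'm guessing at here, and as before the map wraps at the edges:

```python
def blurred_tributaries(green, x, y):
    """Blue channel: the tributary count averaged with the four
    neighbors, i.e. a one-step box blur on a wrapping map."""
    rows, cols = len(green), len(green[0])
    total = (green[y][x]
             + green[(y - 1) % rows][x] + green[(y + 1) % rows][x]
             + green[y][(x + 1) % cols] + green[y][(x - 1) % cols])
    return total / 5
```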
|Gentler, more realistic hills. The nasty seam is due to a bug that I eventually got around to fixing. The world is supposed to tile seamlessly. During the normal pass, it wasn’t wrapping values properly when it got to the edge of the world.|
This is basically good enough. It rounds off the sharp edges. There’s a lot more we could do: Maybe we could mess around with river banks, making sure they sought a plausible angle of repose. Maybe if the erosion value of a particular cell got too high then it would no longer be considered a good place to flow, which would make river beds wider when they had a large number of tributaries. We could fiddle with this stuff all day, but we’ve basically met my initial goal of making a sophisticated, multi-pass process run on the GPU.
|Close enough, I guess.|
It’s interesting, the things you learn from a project like this. It’s the kind of stuff that might not appear in the docs. It’s the stuff you sort of work out through trial-and-error as you learn, and then take for granted when someone else comes along asking questions.
Notes / comments so far:
- I got a sense for how much work we can expect to get done in a single frame. Right now doing erosion is pretty expensive, and updating our texture in 256×256 or 512×512 chunks is roughly the sweet spot on my machine. This doesn’t mean I know how it will run on other machines, or tell me how other shaders might perform, but at least I have SOME frame of reference.
- I got a sense of when the GPU starts clamping and truncating values: Only at the last possible moment. You can have nonsense color values (like negative numbers, or gigantic values) or non-normalized normals, or other data that’s supposedly invalid. The shader programs will tolerate whatever nonsense you throw at them until the moment you draw a pixel to the screen.
- Converting between floating-point values and single-byte values is safe and reliable. You can take the extracted color value (which runs from 0.0 to +1.0) and multiply by 255. Treat it like an int, do whatever you like with it. As long as you divide by 255 before you send the color value to the screen, you’ll get out exactly what you put in. I wasn’t sure if there was other clipping going on that would make this difficult, but this entire erosion system would have collapsed if this conversion was the slightest bit unreliable.
- Having said that, I really wish I could query the color values directly as ints. Right now the GPU looks up the color value, converts it to a floating point value, then I convert it back to an int so I can use it, then I convert it back to a float so GLSL will accept it, then the GPU converts it back to a single byte value. We’re doing four conversions, when zero would do.
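That round-trip is easy to demonstrate on the CPU side. Here it is in Python (the GLSL side works with the 0.0–1.0 float directly, but the arithmetic is the same):

```python
def byte_to_float(b):
    # What the GPU does when you sample a one-byte color channel.
    return b / 255.0

def float_to_byte(f):
    # Reversing it: recover the exact integer that was stored.
    return int(round(f * 255.0))

# Every one of the 256 possible byte values survives the round trip.
# The whole erosion system depends on this being exact.
assert all(float_to_byte(byte_to_float(b)) == b for b in range(256))
```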
We’re not done yet. More shader shenanigans to follow.