Raytraced Quake II

By Shamus Posted Sunday Jan 20, 2019

Filed under: Random 47 comments

Two months ago I wrote about raytracing. At the time I said, “Dear games industry: Good job. That’s nice, but don’t make me upgrade my graphics card for this. It’s nice and all, but it’s not ‘five hundred and ninety-nine U.S. dollars’ nice.”

The technology struck me as a fun curiosity but not remotely worth the required jump in processing power. Then this week I came across this story at Rock Paper Shotgun talking about how someone added raytracing to Quake II.


Link (YouTube)

I love it. For whatever reason, I actually think the effect is more interesting on the lower-fidelity scene.

I’m curious to know if this required new art assets. The textures look very accurate to what I remember, so I initially assumed this technology was just drop-in and was working with the original texture maps. Now I’m looking at things a little more closely and wondering if I was wrong.

If you look at the floor at the start of the video, you’ll see that light behaves a little differently on the floor tiles compared to the space between the tiles. Check out this image:

I really do think Quake II is the best of the Quake series, even without this mod.

To me, it looks like the edges of those floor tiles are picking up specular highlights. I dunno. Maybe this is just a trick my eyes are playing on me.

In modern games, you don’t want everything to have a uniform shine. Some surfaces should be glossy, like a glass bottle or polished brass. Others should have a bit of shine, like brushed metal or wet stone. Other surfaces should be completely dull, like dirt or concrete. To get these surfaces looking the way you want, you usually have an extra texture map that will tell the renderer which areas should be glossy, and how much. Obviously Quake II didn’t have that sort of thing, which means to do it right someone would need to make those assets and add them to the game.
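
To make that concrete, here’s a minimal sketch of the idea in Python (all the names are mine, not anything from Quake II or Q2VKPT): the renderer samples a gloss value per texel alongside the regular colour texture and uses it to scale the specular highlight.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(diffuse_colour, gloss, normal, light_dir, view_dir, light_colour):
    """Blinn-Phong style shading where gloss (0..1, read from a gloss map)
    scales the specular term for this texel."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))   # half vector
    diffuse = max(dot(n, l), 0.0)
    specular = gloss * max(dot(n, h), 0.0) ** 32        # gloss = 0 means completely dull
    return tuple(dc * lc * diffuse + lc * specular
                 for dc, lc in zip(diffuse_colour, light_colour))

At gloss = 0 the surface stays completely matte; near 1 you get the polished-brass look.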

Then again, I looked at the Q2VKPT homepage[1], and none of the credits say anything about new assets. From the way it’s described, it sounds like Q2VKPT is a simple drop-in program.

Either way, I’m less excited about how cool it looks and more excited that you could get lighting this robust for so little effort. It would be amazing if we could go back to the workflow of the 1990s. This would give small teams the ability to make fancy shooters without being forced to evoke the 90s design style. Mixing the comparative ease of 90s development with the sizzle of modern graphics is the kind of innovation I can get behind.

Then again, these cards start at $800, so it’ll be a few years before any of this makes sense for either developers or consumers.

 

Footnotes:

[1] Q2VKPT is short for Quake 2, Vulkan, Path Tracing. Vulkan is the rendering API used, and path tracing is the more correct term for Raytracing in this case.




47 thoughts on “Raytraced Quake II”

  1. Kamica says:

    It’s definitely technology that I think the industry wants, so I do think it’ll become cheaper with time. I just hope it’ll stay affordable, what with the bit-coiners snatching up all the medium-cost cards on the market…

    1. Echo Tango says:

      Proof-of-work crypto currency is a really expensive waste of time. I’d love if people just moved past it, to more fruitful experiments. :S

    2. Agammamon says:

      Right now all the cryptocurrencies worth mining are past being mineable with GPUs. I think they’re pretty much all ASICs now.

  2. Grey Rook says:

    Whoa, cool. I’ll admit that the tech talk goes a bit above my head, but getting lighting that good in an engine as old as Quake 2 is impressive. Though, am I parsing your statement correctly if I say that you’d need a new graphics card to use this kind of lighting system even in a game as old as Q2?

    1. Lanelor says:

      Yeap, you need an RTX 2060 or above. The site uses a 2080 Ti and it runs at 1440p resolution.

      1. Sleeping Dragon says:

        It’s a good thing I’m not that into graphics since I just upgraded my PC last year and it’s highly unlikely I’ll be able to repeat this feat for the next couple years.

    2. Kamica says:

      Basically it’s a new rendering technique, unlike what all graphics cards in the past decade (two decades?) have worked on. Shamus wrote an article about it a bit ago going more in depth, but basically the graphics card you’ve got uses rasterisation for all its 3D stuff, which basically requires you to perform ALL THE HACKS in order to get photorealism. This new technique uses path-tracing, which works basically how Greek philosophers thought vision worked =P. (Rays shoot out of each pixel of the screen and bounce around a few times, collecting colours, and then at the end it displays that onto the screen)
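
      In very rough pseudo-Python, the loop being described is something like the toy sketch below (the scene and camera objects and their methods are made up for illustration, not anything from Q2VKPT):

MAX_BOUNCES = 3
SAMPLES_PER_PIXEL = 8

def render(scene, camera, width, height):
    # Shoot several rays through every pixel and average what they bring back.
    image = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total = [0.0, 0.0, 0.0]
            for _ in range(SAMPLES_PER_PIXEL):
                ray = camera.ray_through_pixel(x, y)   # hypothetical camera API
                colour = trace(scene, ray, MAX_BOUNCES)
                total = [t + c for t, c in zip(total, colour)]
            image[y][x] = tuple(t / SAMPLES_PER_PIXEL for t in total)
    return image

def trace(scene, ray, bounces_left):
    hit = scene.intersect(ray)                         # nearest surface hit, if any
    if hit is None:
        return (0.0, 0.0, 0.0)                         # ray escaped the level
    if hit.emissive:
        return hit.emission                            # the ray found a light
    if bounces_left == 0:
        return (0.0, 0.0, 0.0)
    bounced = hit.random_bounce(ray)                   # pick a new direction off the surface
    incoming = trace(scene, bounced, bounces_left - 1)
    return tuple(a * b for a, b in zip(hit.albedo, incoming))

      Averaging a handful of noisy samples per pixel like this is also why these renderers lean so heavily on denoising filters.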

      1. Michael says:

        It’s not exactly new. Ray tracing, at least at a concept level, has been around for decades. The problem has been processing power. So ray tracing has gone from something you’d use as a quick “hey, does this have a line of sight? I need to check” test, to being able to simultaneously send out every possible ray and see what it lights up.

  3. ??????? ????? says:

    Hi, Shamus! Looking at the videos, my concern with most new games is the lack of reactivity and real alternative story. Most games after 2015 look pretty nice, but are becoming boring to play as the same goody character for the last 25 years. When can I stop saving the universe and start my own gang of thugs, develop a crime ring, plant a puppet as the head of a country … Or just be on the other side of the usual struggle? In short, adding ray tracing to the same old formula is just as dumb an idea as going for photorealistic graphics in 2010+.

    1. Kamica says:

      I don’t think it quite ticks all your boxes, but have a look at Kenshi? =P. (I’ve been really enjoying it, but it’s not for everyone I reckon)

    2. tmtvl says:

      Enclave, Dungeon Keeper, Overlord, Evil Genius, War for the Overworld, Pathfinder Kingmaker, Way of the Samurai 1 – 4,…

    3. Echo Tango says:

      Play indie games. The big budgets are all focused on chasing graphics. (Very rarely, you’ll get something like Prey 2017 or Prey: Mooncrash, that’s got better mechanics.)

    4. Paul Spooner says:

      Beyond supporting Dwarf Fortress development you mean?

  4. Asdasd says:

    It makes me uneasy somehow. Like my brain is being tricked, or trying to be.

  5. Zak McKracken says:

    Is the hardware requirement for Q2 so high because even for the simple geometry raytracing is so expensive, or is it because only the high-end cards support the feature? I would think that the computational requirement scales linearly with the number of pixels on the screen, and in some other way with texture resolution, polygon count and number of bounces that are being computed (i.e.: how much indirect lighting/reflection is involved).

    So, I’d imagine that a future low-end card with the right capabilities should be able to run the game just fine in a screen resolution befitting the low-res models.

    1. decius says:

      My guess from the name ‘path tracing’ is that it scales with the pathfinding complexity of the geometry of the level times the number of light sources times the number of pixels on the screen.

      Or it could use only the brightest light source, which I think could be reduced to (pathfinding complexity plus the number of lights) times pixels on screen.

      1. Geebs says:

        From the video it’s pretty clearly using multiple light sources.

        It’s a pretty good plan to use the early Quake games for demonstrating path tracing since the BSP level format is already highly optimised for line of sight (Q2 ran in software, after all) and there’s rarely ever more than three NPCs on the screen at a time.

        @Zak McKracken – yeah, the raytracing feature is only fully supported on RTX hardware, which Nvidia is charging an arm and a leg for. The new RTX 2060 could probably run this at 1080p, but doing full lighting requires far more rays per frame than just a few localised reflections (like Battlefield 5, which does most of its lighting using standard raster techniques).

        Basically, sit the RTX 20xx series out, since the RTX 40xx series will be released by the time anybody makes a commercial game with properly implemented raytracing.

        1. decius says:

          You can have multiple light sources and still only use the brightest one. Instead of finding a path to every light source and combining them together, find the best path to ‘any light source’ using the rules that apply to light pathfinding.

          1. Geebs says:

            They discuss this on their website; they cast a ray towards one random light source from both the directly visible surface and indirectly visible surface (I think this refers to the first reflection bounce). The two random light sources are selected from the locally-relevant lights identified in the level data structure.

            Looking into their site further, the really fascinating thing is exactly how much work their temporal reconstruction filter is doing to remove noise. I do rather wonder how much of a filtering effect this has in terms of increasing the relative importance of the strongest light source in the immediate locale i.e. do you end up with effectively single source lighting per pixel after you’ve introduced all of those random effects and then squished them into an actual image using a highly conservative process.
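
            For what it’s worth, that “one shadow ray toward one randomly chosen local light” scheme looks roughly like the sketch below (all the names are invented for illustration; this is not the actual Q2VKPT code):

import math
import random

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def length(v): return math.sqrt(dot(v, v))
def scale(v, s): return tuple(x * s for x in v)

def direct_light(scene, hit_point, normal, local_lights):
    """Estimate direct lighting at hit_point by sampling one random local light."""
    if not local_lights:
        return (0.0, 0.0, 0.0)
    light = random.choice(local_lights)              # one random light per sample
    to_light = sub(light.sample_point(), hit_point)  # a point on the (area) light
    dist = length(to_light)
    direction = scale(to_light, 1.0 / dist)
    if scene.occluded(hit_point, direction, dist):   # hypothetical shadow-ray query
        return (0.0, 0.0, 0.0)
    cos_term = max(dot(normal, direction), 0.0)
    falloff = 1.0 / (dist * dist)
    # Only one of N candidate lights was sampled, so weight the estimate by N.
    # (A proper area light would also need an area/solid-angle term; omitted here.)
    return scale(light.emission, cos_term * falloff * len(local_lights))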

            1. decius says:

              Huh. That suggests that a lot of dim lights make a scene darker?

      2. Zak McKracken says:

        From the video, I’m fairly certain you can’t really count the lights, as they use area lights — any ray that ends on one of the surfaces which provide light makes the surface it came from brighter. In this kind of system, making the area lights smaller (and more intense, to get the same amount of light) will actually increase render times since you need to cast more rays to hit the light sources sufficiently often to get a non-noisy brightness distribution on all surfaces.
        And they’re using multiple sources alright — but as I said, that may actually even reduce render times because in most cases, a ray that hits a light source can just end there, as opposed to one which hits a surface.

        So it’ll be something like n_pixels x n_polygons x n_bounces^f … times some other things. f will be a number <1 related to how many rays end, on average, after a bounce, so it’s related to how much reflectivity/absorption/light source surfaces are in the scene. On top of this, there’ll be some nonlinear function relating to cache/memory/execution efficiency, which may then make the whole thing more or less nonlinear, depending on a bunch of real-world hardware restrictions and memory/cache optimisation. For example, testing ray collisions may not actually scale linearly with the polygon count as I assumed, if there’s some parallelisation at that stage, or a good way to test the most probable polygons first, or somesuch.

        1. decius says:

          Testing ray collisions from the camera is expensive and trivial: you already have to do it once anyway, even if you do it the very hard way by using a z-buffer and drawing all the things.

  6. Zak McKracken says:

    I’d tend to think that they must have updated the assets in some way as all the lights, explosions etc. are now area lights, and while I didn’t really play the original, I don’t think it had a light-emitting material property. So maybe they also added something to auto-generate reflectivity maps from bump and texture maps? Also note that in some screenshots the floor looks like the relief was still just painted on and doesn’t reflect the lighting situation. So maybe they just gave the assets a quick and dirty make-over and added reflectivity maps only where they could auto-generate them?

    And I think, too, that the two main benefits of raytracing are that developers can get dynamic lighting and reflections for free while also setting up realistic-looking scenes much more quickly and cheaply. I could also imagine seeing pre-baked shadows and lighting for static parts of scenes at some point — which could be done automatically. The raytracer would then only need to add the dynamic objects and lights. That would probably require some more development work on the algorithm side but reduce the hardware requirements further.

    1. Olivier FAURE says:

      Yeah, I had the same thought. My (imaginary) money is on auto-generation for at least 80% of textures.

    2. decius says:

      Pre-baked shadows for static items will interact poorly with shadows for dynamic items.

      1. Zak McKracken says:

        I think that would very much depend on how you implement it.
        If you have your pre-baked scene and a dynamic object moves through it, what you do is you shoot your rays as you would usually. If they hit a static object, you already have a diffuse colour stored for that (you’d still need to compute reflections, as they are PoV-dependent), and you could also have stored which light sources have visual contact to it in the static scene. Then you just cast rays between that point and the relevant light sources, check for collisions with dynamic objects and adjust accordingly — done! All the interactions with static objects are already in the initial value which you had stored.

        That becomes more complicated if your lighting model also includes multiple-bounce indirect lighting or caustics caused by dynamic objects and lights. In that case, there’d be more iterations and the advantage of pre-baked static scenes would decrease. Depending on how much of the scene is dynamic, the logistics of it all could be more expensive than just computing everything dynamically, but that’s too applied for me to have a strong opinion on.
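
        In pseudo-code, the static-hit case might look roughly like this (every name below is invented for illustration, not taken from any real engine):

def shade_static_hit(hit, dynamic_objects):
    # Start from the lighting that was baked for the static scene.
    colour = list(hit.baked_diffuse)
    # Re-check only the lights recorded as reaching this point,
    # and only against the dynamic objects.
    for light in hit.baked_visible_lights:
        blocked = any(obj.blocks_segment(hit.position, light.position)  # hypothetical occlusion test
                      for obj in dynamic_objects)
        if blocked:
            # Remove the contribution this light baked into the surface.
            colour = [c - lc for c, lc in zip(colour, light.baked_contribution(hit))]
    return tuple(max(c, 0.0) for c in colour)

        Reflections and multi-bounce effects from the dynamic objects would still need real rays on top of this, which is where the pre-baking advantage starts to erode.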

        1. decius says:

          No, I meant that if you have a static box with pre-baked shadows and push a dynamic box up to it, the area where the shadow of the dynamic box meets the shadow of the static box is going to be jarring.

      2. RichardW says:

        I would think they’d have simply removed the shadowmap textures from the game if they’re going for a fully raytraced approach; there’s just no need for them, and they would’ve caused visual errors if they’d stuck around, like you said.

    3. I seem to recall reading they changed no textures, but did add material properties to the renderer, which could explain the specular effect seen.

      The plasma shots etc. fizzle out sooner or are almost invisible with raytracing, as the old Quake engine over-exaggerated them.

      So while the lighting and shadows are more correct and natural now, some of the artistic lighting and shadows and related effects have been lost.

      Also, 60FPS was mentioned, with a 2080ti at 1440p? That would imply a 2080 could do 1080p at 60fps, and a 2060 might do 720p at 60fps.

      In theory this should work on the GTX 1660 (or whatever it’ll be called) if it has Tensor cores. None of the RTX stuff yet (not even Battlefield) actually uses the RT cores; they just use the Tensor cores (which makes no sense to me, wouldn’t the RT cores be even more powerful than just the Tensor cores?).

      Another odd thing is that the YouTube video of all this is at 48fps, which makes me wonder if perhaps vsync was turned off and the 60 fps was just a peak, and YouTube’s algorithm saw an average framerate of 48 (by looking at frame-to-frame changes) and assumed 48 fps.

      The result is microstutter in the video, which I’m unsure if it’s due to the youtube encoding or the game recording.

      I believe the Q2 RTX renderer is open source, so I guess we’ll see more videos, and possibly benchmarks, soon.

      Edit: And what we’re not seeing is the RTX off / RTX on toggle.
      That would show how bland and nerfed a game looks when there is no RTX (and no traditional game rendering).
      This is partly why RTX adoption is so slow: game devs need to do a normal renderer and an RTX renderer, which doubles the work needed.

  7. RCN says:

    It’d be cool to have a side-by-side comparison of this with the original quake 2.

    It looks really good, but I played Quake 2 a long time ago, and it looks good in a way that “it always looked this good”, even though objectively I know it didn’t. This falls precisely in the gap of “it looks like you remember it, not as it actually was.”

  8. tmtvl says:

    For me the raytraced version of Quake 2 hits a type of uncanny valley, I really don’t like the way it looks. Kind of reminds me of an HDR version of UT99.

  9. Maryam says:

    I didn’t play the Quakes, but is that how the water looked originally? I’m assuming that at least has to be an updated texture.

    1. Joe says:

      No, and probably. They’ve done something to the water, certainly.

      1. decius says:

        The water is definitely different. It might be that the water used some of the things they changed, just like the plasma bolts did. I think the original plasma bolts produced light around them but didn’t produce shadows?

    2. Droid says:

      I don’t know about Quake II, but the original one certainly had these grimy, green-grey swirls on its surface that made it seem really dirty, as if a thick oil film was on top of already polluted water, but was in the process of being broken up in certain places while still lingering in others. I remember wondering how jumping into it could possibly NOT hurt you instantly.

  10. Abnaxis says:

    What does the ‘K’ in “Q2VKPT” stand for…?

    1. Shamus says:

      As far as I can tell, it stands for VulKan.

      1. Sleeping Dragon says:

        Maybe they were German? “Kan ve do zis? Yes Ve Kan”

  11. Matthew Downie says:

    I couldn’t tell the difference between that video and my memory of playing Quake 2 twenty years ago.

  12. EwgB says:

    I think you’re a bit off on the price there. The $800 is about right for the 2080, which is I believe the second most powerful after the 2080 Ti (which costs well over a thousand), but the 2070 can be had for $500-600. And the next one down, the 2060, came out recently for $350-400. The price drop-off is pretty sharp here.

  13. ccesarano says:

    Unrelated to Ray Tracing but to your statement on the quality of Quake 2: I didn’t really get into first-person shooters until Halo: Combat Evolved. I really got absorbed into the genre, enjoying even some of the worse games simply because it was all new territory for me. I still recognized the lesser games as lesser, though my comparison point was what many PC gamers at the time mocked as being a “substandard” FPS.

    That said, I didn’t play Quake 2 until 2005, when I bought it cheap on a whim online and decided to give it a go. I was surprised at how engrossing it was, and as a result turned Quake 4 into one of my must-have games when I bought my first Xbox 360 the next summer.

    ……..Quake 4 was immensely disappointing…

    I’d love for a team to go back and give another try to that Strogg War setting of Quake, but I’m not sure anyone at ZeniMax or Bethesda would think it worth investing in.

  14. Jabrwock says:

    It’s possible it tries to do some auto-bump-mapping based on the texture itself. Like if there’s high contrast between adjacent pixels, assume it’s trying to texture a surface with edges, and try to map based on that.
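
    Something along those lines would be easy to do automatically. A rough sketch of the idea (pure speculation, not a claim about what Q2VKPT actually does): treat the texture’s brightness as a height field and turn its local contrast into a bump/normal map.

import math

def luminance(pixel):
    # pixel = (r, g, b), each 0..255
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def bump_map_from_texture(texture, strength=1.0):
    """texture: 2D list of (r, g, b) tuples. Returns a 2D list of unit normals."""
    h = [[luminance(p) / 255.0 for p in row] for row in texture]
    rows, cols = len(h), len(h[0])
    normals = [[(0.0, 0.0, 1.0)] * cols for _ in range(rows)]
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            dx = (h[y][x + 1] - h[y][x - 1]) * strength   # horizontal contrast
            dy = (h[y + 1][x] - h[y - 1][x]) * strength   # vertical contrast
            inv = 1.0 / math.sqrt(dx * dx + dy * dy + 1.0)
            normals[y][x] = (-dx * inv, -dy * inv, 1.0 * inv)
    return normals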

  15. C__ says:

    For comparison purposes, does anybody know which level from Quake 2 this footage is from?

    1. decius says:

      The first one.

  16. Anachronist says:

    Hmm, for some reason this made me wonder what Minecraft would look like raytraced…

    1. Exasperation says:

      You wouldn’t be the first; do a search for “minecraft ray tracing” – you’ll get results going back to at least 2010. It’s kind of interesting to see the progression of image quality as the technology improves over the years. And yes, the most recent results involve RTX.

  17. Simplex says:

    “these cards start at $800”

    $349: https://www.pcgamer.com/nvidia-geforce-rtx-2060-review/

    But I’m not sure if that cheapest card could handle raytraced Quake 2 at a reasonable framerate.
