Since the unveiling of Nvidia’s RTX 2070 and 2080 cards a couple weeks ago, I’ve started to hear a lot of chatter that’s pretty dismissive of the technology. While I’m not sure this first gen is going to be the thing that will convert people, or whether the pricing Nvidia has gone with makes sense for the market, I think people are underplaying just how big ray tracing is. This tech, more than almost any graphics tech since we started using dedicated 3D acceleration cards to play Quake, has the potential to change how games are developed and played.
If you’ve followed our coverage of the RTX cards, you might already be familiar with just what ray tracing is, but let’s lay the definition out here to make sure we’re all talking about the same thing.
Ray tracing isn’t just another step in adding more flash to games. It’s a fundamental piece of 3D rendering tech that has been a sort of holy grail for game development for decades. Ray tracing means following individual rays of light through a 3D scene to accurately determine how each surface they strike should look. This affects not just lighting but reflections as well. Movie studios have been using it for years. When you hear Pixar talk about Toy Story taking thousands of hours to render, ray tracing is why. Any time you introduce light into a 3D scene, you can go about rendering that light in a couple of different ways. You can ray trace it, which involves tracking billions of rays of light to see where they go and then rendering the scene from the result. Or you can simulate it with some tricks.
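At its core, each of those rays is just a geometry problem: fire a line into the scene and ask what it hits first. Here's a minimal sketch of the most common version of that test, ray-versus-sphere intersection, in Python. This is illustrative only; a real renderer runs tests like this billions of times in parallel on specialized hardware.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a ray (with a normalized direction) to the nearest
    point where it hits a sphere, or None if it misses entirely."""
    # Vector from the sphere's center back to the ray's origin
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * c  # quadratic with a == 1 for a unit direction
    if discriminant < 0:
        return None  # the ray never touches the sphere
    t = (-b - math.sqrt(discriminant)) / 2
    return t if t > 0 else None

# Fire one ray straight ahead at a sphere of radius 1 sitting 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A renderer fires a ray like this through every pixel on screen, finds the closest hit among all the objects in the scene, and then shades that point based on where the light is.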
For example, let’s say you have a room with a table and a window, with a glass vase on the table. Sun is streaming in through the window. You can see in your head how it would all play out in real life, right? When you open the window, even though the light is only coming in through this box on the wall, the whole room lights up because light bounces. If you have a white tile floor and bright white walls, the room will light up one way, while a David Lynchian combination of earth tones and rough materials would stifle the light. Light filters through the glass vase, drawing a distorted version of the vase on the table. If the vase is full of water, the light will bend differently, while flowers will create their own silhouettes on the table if the light is right.
Right now, if you want to create this room in a game, you have to basically do a bunch of small magic tricks to make the whole magic trick work. You have to put in that main light source, of course, but then you have to put in smaller, invisible light sources to simulate where the light might bounce and make the space visible. You have to put in invisible texture maps that tell reflective items what reflections to show when you look at them. And further, you have to decide what stuff to “cull” or cut out of the scene, because modern graphics cards can’t handle rendering the whole world around us.
The stuff you can’t see when you’re navigating a game like Battlefield simply isn’t rendered, while the server tracks the locations of things like other players and any bullets they’ve fired to tell whether you’re getting shot. If an explosion happens behind you, you’ll hear it, but you won’t see it unless the developer adds an effect to show that an explosion happened. They have to draw the light source and the reflection, and then make sure both things work the same way. A lot of the reflections and lights we’d see in real life simply have to be dropped when rendering these scenes, because our cards can only handle rendering what’s actually on your screen at that moment.
Game developers have used all kinds of wild tricks over the years to simulate stuff like this. In games with mirrors, developers have created entire rooms behind the mirrors, in which a double is rendered and moves with the same inputs our character does, just flipped, so that it looks like a reflection. Really, though, it’s a mindless double, like something out of a Twilight Zone nightmare.
With ray tracing, a lot of this goes away. You tell that table in the room how much light to absorb and reflect. You make the glass a reflective surface. When you move up to the window, you might be able to see your character if there’s enough backlighting. Getting up close to that vase will show you a funhouse mirror version of your face. The room lights up with one source of light, and no one had to draw those reflections or that vase’s bent shadow on the table. Instead, they’re just there, and they just work, because ray tracing takes care of all that stuff. That’s why it’s both so difficult for consumer-affordable hardware to do and so desirable for game developers. It saves a lot of time spent manually simulating things and lets the creators just create.
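The reason reflections come along "for free" is that a ray tracer just bounces the ray off the surface and keeps tracing from there. The bounce itself is one line of vector math, sketched below in Python using the standard mirror formula d - 2(d.n)n; a full renderer would apply this recursively at every reflective surface a ray hits.

```python
def reflect(direction, normal):
    """Mirror a ray's direction about a surface normal: d - 2(d.n)n."""
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return tuple(d - 2 * d_dot_n * n for d, n in zip(direction, normal))

# A ray traveling right and downward strikes a floor whose normal points up;
# the bounced ray keeps its horizontal motion but now heads upward:
print(reflect((1, -1, 0), (0, 1, 0)))  # (1, 1, 0)
```

Whatever the bounced ray eventually hits is what shows up in the reflection, so the vase, the window, and your character's face all appear in reflective surfaces without anyone authoring them by hand.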
On the game developer side, that’s going to save tons of time. That could mean fewer game delays, because developers aren’t spending time fixing cube maps to make reflections seem real enough, tweaking lights to get a scene to light up correctly, or anything like that. There’s less to debug because instead of 10 different things simulating the effects of one light source, there’s just one light source, and the other elements around it block or bend the light simply by being there. It could also mean less time spent on stuff like that and more time spent on more interesting features. Instead of trying to make lighting look realistic, the lighting will just look realistic, so the artist can focus on making more interesting objects in the space and more realistic textures. The layout person can stop worrying about whether the room lights up as expected, and the particle effects artists don’t have to create both the effect and its reflections.
So we won’t just get flashier games out of this, we’ll get more interesting games and maybe even less buggy games. It’s a lot easier to make one magic trick work than it is to get 10 working in concert.
On the gaming side, though, things will change, too.
We make all kinds of calculations in daily life using our many senses and our understanding of how the world works. You might be more willing to take a blind left turn at night because you know that if someone’s there, you’ll be able to see their car’s lights reflected off the trees and signs around you. You know your computer just woke up in the next room because you can see the monitor’s light reflecting off the walls in your office and flooding out into the hallway, even though you can’t see the monitor itself or hear the computer.
In a horror game, you could deliver scares through reflections in puddles or in rain-spattered glass: a flash of a face in a window, or a quick glimpse of a tail slithering above you as you run down a hallway. A flashlight changes completely in a situation like that. In a war game, you can make assessments about the battlefield just from looking at your surroundings. An accurately rendered shadow, rather than a painstakingly simulated one that's guesstimated at best, could be the difference between life and death in a match's clutch moment. All those reflections in racing games that we've been marveling at for years will be actual reflections instead of simulations.
We’ll also start seeing games based around the idea of ray tracing, which could lead to more interesting puzzles that simply weren’t possible before. In a stealth game, that mysterious light on your character’s back could become a liability, but you might be able to slide a mirror under a door and get an accurate reflection of the room on the other side that shows you what’s actually happening in there.
It’s a bit like putting glasses on for the first time. You didn’t know you couldn’t see very well before. Once you take the glasses off, though, it becomes apparent just how blurry everything is without them.
I don’t want to be overeager here – it’s going to be a little bit before ray tracing goes mainstream. The RTX cards are the first hardware to make ray tracing “consumer grade,” and adoption will be pretty slow at the current prices. It’s a lot to ask for someone to spend $600 minimum on just a graphics card, and while there are games coming out this fall that support the tech, it’ll be a while before it’s ubiquitous.
The biggest mistake Nvidia is making right now is not releasing piles and piles of videos showing how much of a difference ray tracing makes. There should be side-by-side comparison videos for every game that will have RTX tech embedded this fall, as well as some custom Nvidia-produced demos that are a bit easier to visually parse. Videos like the one below look great, but they tell the layperson absolutely nothing useful about RTX; to the untrained eye, this is just a nice hallway. We need to see this stuff side by side.
But make no mistake – this isn’t like other video tech we’ve seen come out in the past few years. It’s not a gimmick like 3D, and it’s not a nice-to-have like HDR. It’s something that will make its way into every card on the market eventually, and every game. Even if it takes a while, it’ll get there. This is just the beginning, and it’s already awesome.