NVIDIA Unreal Engine ray tracing demo showing physically-accurate light dispersion in real-time

youtube.com/watch?v=ttghcA4kUQU

Holy moley, this is insane. You can actually simulate physically-based light caustics through prisms, and it all renders in real time.

I can't even believe this is happening in 2020, I thought this stuff would be at least a couple of years away.

Attached: image_201486.jpg (1280x720, 140.37K)

GOD, LIGHT TRANSPORT ALGORITHMS TURN ME ON

Attached: file.png (749x351, 672.83K)

Mmm yes

We still can't get something like Toy Story 1 in real time, but we can cheat and get something that is visually far more appealing.
I really don't know where I stand with RTX. It's technically correct, but it's also a resource hog. We can cheat our way to a similar result for a fraction of the cost with shaders (although something as detailed as what's shown in the video wouldn't be doable).
Given the ghosting in RTX, the cost of the tech, and how little I care about being "technically correct" as long as the art direction is good, I'm still on the fence with this thing. It's cool tech, a nice visual boost, but it's not really making better games.
IDK.
Nice vid.

You didn't know any of this existed or was being worked on until you watched the video, retard.

It does allow light-beam-based mechanics to be more accurate, so I would expect some weird RTX-required games in the future

>reflections in glass and puddles and shit
woooooow

Ray tracing might be the future, but it's nowhere near ready to replace old lighting tricks. I'll only be truly impressed once they can actually calculate all the lighting in real time instead of relying on smoke and mirrors, but right now even those demos have artifacts like the colors of the shadows slowly filling in as you start looking at them.

Attached: 0-2020-10-07_02.20.42.webm (1084x610, 2.09M)
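That slow fill-in is the telltale sign of temporal accumulation: the per-frame ray-traced result is too noisy to show directly, so the denoiser blends it with a history buffer over many frames. A toy sketch of the idea (the blend factor here is an illustrative value, not what any particular engine uses):

```python
def accumulate(history, current, alpha=0.1):
    """Exponential moving average used for temporal denoising.

    A small alpha keeps the image stable frame to frame, but it also
    means new lighting takes many frames to fully show up -- the
    "shadows slowly filling in" effect."""
    return (1 - alpha) * history + alpha * current

# A pixel that suddenly goes from fully lit (1.0) to fully shadowed (0.0):
value = 1.0
frames = 0
while value > 0.05:  # until within 5% of the correct shadowed result
    value = accumulate(value, 0.0)
    frames += 1
print(frames)  # 29 frames, i.e. about half a second at 60 fps
```

Raise alpha and the shadow snaps in faster, but then the per-frame ray noise shows through as flicker; that trade-off is exactly the ghosting people complain about.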

>unreal

Attached: Xt8drk1_d.jpg (450x450, 31.6K)

Ray Tracing will only become standard in next next gen.

It won't. GI isn't a standard yet on this gen.

GI?

Not exactly reflections, it's calculating the light as it passes through the glass and splitting it into the different colors of the spectrum, as it would in real life.

Attached: DWL Presentation.jpg (1280x1024, 172.06K)
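For anyone curious what "splitting the light" means mechanically: the glass's index of refraction depends on wavelength, so Snell's law bends each color by a slightly different angle. A rough sketch using Cauchy's equation, with illustrative crown-glass-like coefficients (not values from the demo):

```python
import math

def cauchy_ior(wavelength_nm, A=1.5046, B=4200.0):
    """Cauchy's equation: index of refraction as a function of wavelength.

    A is dimensionless, B is in nm^2; both are illustrative
    ballpark values for crown glass, not measured data."""
    return A + B / wavelength_nm ** 2

def refracted_angle(incidence_deg, wavelength_nm):
    """Snell's law going from air (n = 1) into the glass."""
    n = cauchy_ior(wavelength_nm)
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

# Violet sees a higher index than red, so it bends more:
# a single white ray fans out into a spectrum.
for name, wl in [("red", 700), ("green", 550), ("violet", 400)]:
    print(f"{name:6} {wl} nm -> {refracted_angle(45.0, wl):.2f} deg")
```

The per-wavelength angle difference is under a degree here, which is why you need the long glass path of a prism (and a lot of rays) before the spread becomes a visible rainbow.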

Global illumination.

Attached: Screenshot_2020-10-06 Quantum-16 jpg (JPEG Image, 1920 × 1080 pixels) - Scaled (89%).png (1707x475, 1.13M)

Hunt Showdown has had real-time GI for the past 2 years. It's not raytraced, but it looks good and most importantly doesn't tank performance.

GI will probably be good next gen, and in next next gen we will see fully ray-traced games. The PC market will obviously have that tech 2-4 years before consoles, as always. But the future of consoles is still uncertain too; MS might want to go with a full streaming/service model instead of releasing a console at all.

holy shit, fucking science copying pink floyd

This is what happens when 100% of your GPU can be used to do this. This is not the case with videogames.

Doesn't ray tracing include global illumination?

OK but will that make games fun?

>games come with realistic lighting
>hair and skin textures still look like Play-Doh

this won't make video games more fun or better even one bit.

Attached: bored kid.jpg (300x240, 9.28K)

You could probably have a neat puzzle game with light as the mechanic

>30-40 FPS
>For an empty room with 4 prisms and 5 lights
It's insane that people still think anyone will use anything from those tech demos

Videogames don't exist for your pleasure user.

>Gushes over how light makes incredibly beautiful rainbow colors due to dispersion
>Reveals in the video he's colorblind
Kek, really picked the right topic here, dude

I don't want art to just "be like reality", I can just go outside. Ray tracing is the worst fucking meme.

There’s no point posting this on Zig Forums where 5% of posters actually understand what is going on

>We still can't get something like Toy Story 1 in real time
I'm calling bullshit on this. What the fuck was 3DS Max using in the 90s that we don't have standard in game engines by now? The only thing Toy Story 1 should still have over us is the animations, which were hand-crafted frame by frame, and expecting that out of a simulation is insane.

Toy Story still used ray casting, which is still stupidly expensive even on meme cards.
Video games cheat by using very few rays with limited rebounds.

The big difference is that every individual hair on the head of each character is actually modeled and reacts to physics individually. Right now, you can't get 60 fps of real-time rendering out of what Pixar rendered frame by frame with an army of computers. So instead, we shove multiple strands of hair onto one triangle and call it a day. It's enough for our needs, but it's incorrect from a purely technical standpoint.
The problem is not so much what 3D Studio Max could do back then, but how much power you needed to render your scene with proper physics, lights, and that number of triangles.

Ray tracing looks best when done with particles and diffuse light. Reflections are probably the least interesting part and, unlike the others, can be done with raster in real time.