DLSS 2.0 is a game changer. 4k gaming is no longer just a novelty.

Attached: 68D41B0B-0456-434E-8342-41F39D3CCEC3.jpg (1280x720, 113.91K)

Does it work on every card, or do you need the latest one, which defeats the purpose?

i think dlss is cool and all but exactly how many times a day do we need to have these threads? it's getting old and tired.

rtx cards that first came out two years ago

>glorified poorfags fake 4k
>90ms input lag

Attached: Index122.webm (432x298, 91.36K)

It is?
Like a fucked up Vsync?

It's not. AMDfags keep spouting out nonsense.

vsync delay is literally a fraction of a frame

If this becomes common in games how is AMD gonna compete?
>t. not a nvidia shill just curious and uninformed

they'll add cores to cards to utilize dlss, it's not like you need to do the effort of processing stuff on a supercomputer twice

who are you quoting?

me

I wore the gold suit for the rest of the game once I got it. Thing is snazzy.

>glasses off
>glasses on

unrelated to the grafix but did people like control? I played it a few days ago and I thought it was just gonna be a mediocre scp game but it was a pretty great scp game, and the combat was pretty fun
it was a bit clunky but it was never too demanding or precise for that to be a big issue
I liked it so much I got the dlc after I was done playing it, the frame rate did shit itself sometimes though

i just put the normal outfit back on, even after post-game, i didn't really like how the others looked

A similar technology is also available when watching TV/movies using the Nvidia Shield. It can take any quality below 4K and enhance the visuals and it actually looks fucking great. I thought it sounded like a gimmick until I started watching 1080p sources on my 4K OLED and used it, and it makes the picture quality look as close to 4K as you can get. Even 720p sources still look incredible. It's only a matter of time before this AI enhancement technology is implemented in more devices.

Attached: 1560637739443.jpg (1280x854, 357.28K)

You can't quote a signature

Attached: 1587027675815.jpg (500x375, 25.65K)

Eh
I liked the not SCP stuff but the combat with the respawning enemies was a chore.
The ending was extremely disappointing for me, not sure if the DLC is good or not.
Oh and FUCK that worm boss, that took me a lot of tries.

>details visible
Not kino, repeat not kino. Games should look like movies filmed by a cameraman with parkinsons who smeared vaseline on the lens or else they have no soul.

>I liked the not SCP stuff but the combat with the respawning enemies was a chore.
It would have been better if it was slightly more manageable, it was annoying to have to try and run around and get them to spawn, especially for the missions
>Ending
Eh, the climax of the game was 100% the maze and while the rest of the game was nice it was mostly just a tapering down.
>Worm
See I thought that was bad at first too, then I realized that telekinesis absolutely shits on it since you can use the exploding rocks on it.
I also thought the missiles homed around the rocks you used as cover, but they didn't, so once I found out those things I killed it in one go.
Tommasi's post-game fight was the hardest, only because of the invisible thing that showed up halfway through and could deal like 90% of your health. What I ended up doing was maxing out my pierce, slapping 3 damage mods on it, and then hiding in the rafters and sniping him so the invisible thing couldn't hit me.
I was surprised (but pleased) that "Hey go out and get supplies for us" only ended in them turning into zombies once, and it was for a relatively meaningless sidequest too.
I haven't actually played the dlc yet, but it seems nice enough, and then there's a second one coming out soonish.
I never played alan wake or quantum break but even though I really liked this I don't think I would like them since when I looked into it they were pretty different mechanically and the whole not SCP flavor probably covered up any gameplay misgivings I had too.

How the fuck does the image quality get way better and sharper and you get an FPS boost?

some techfag that can explain?

the nvidia driver guesses what the image should look like based on the real pixels and adds pixels to fill up the 4k res
There's only 1080p worth of real pixels being rendered so the fps is higher
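rough pixel math on why that helps, if anyone wants the numbers (just arithmetic, the 1080p-to-4K "performance mode" ratio is the commonly cited setup, not something from nvidia docs):

```python
# Back-of-the-envelope pixel count for DLSS-style upscaling:
# the game shades a 1080p frame, the upscaler fills it out to 4K.

render_w, render_h = 1920, 1080   # internal render resolution
output_w, output_h = 3840, 2160   # target (4K) output resolution

rendered_px = render_w * render_h  # pixels the game actually shades
output_px = output_w * output_h    # pixels in the final frame

fraction = rendered_px / output_px
print(f"shaded {rendered_px:,} of {output_px:,} pixels "
      f"({fraction:.0%} of the 4K frame)")
# Only a quarter of the output pixels get rendered; the other 75%
# are reconstructed from the low-res frame plus motion vectors.
```

so the shading work is roughly a quarter of native 4K, which is where the fps headroom comes from.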

did she poop her pants?

while it is amazing, the whole game needs to be "prerendered" by ai for it to work and you have to go directly thru nvidia. most developers are too retarded to even make their engine work on more than a single core, let alone the requirements for dlss

It's going to be a lot cheaper to get a high quality picture. I can just use my 4k TV

Would actually mean something if there was a budget RTX card. Upgrading to a 4k monitor for this will just fuck up anything that doesn't support DLSS.

The computational cost of having the upscaling AI guess what the frame will look like at 4K is cheaper than actually rendering the frame at 4K
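that tradeoff as a toy frame-time model (the millisecond numbers below are made up for illustration, not measurements; the point is just that upscaling wins whenever low-res render time plus the fixed AI cost is under the native render time):

```python
# Toy model: a frame is faster with upscaling whenever
#   t_render_low + t_upscale < t_render_native

def frame_time_ms(render_ms: float, upscale_ms: float = 0.0) -> float:
    """Total time to produce one output frame, in milliseconds."""
    return render_ms + upscale_ms

# Hypothetical numbers: shading cost scales roughly with pixel count,
# while the upscale pass is a fixed cost per frame.
native_4k = frame_time_ms(render_ms=20.0)                  # native 4K render
upscaled = frame_time_ms(render_ms=5.0, upscale_ms=1.5)    # 1080p + AI pass

print(f"native 4K: {1000 / native_4k:.0f} fps, "
      f"upscaled: {1000 / upscaled:.0f} fps")
```

with those made-up numbers the upscaled path is ~3x the framerate; if the AI pass ever cost more than the saved shading time, native would win instead.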

Not anymore, any game that supports TAA is able to support DLSS 2.0. The future is here.

oh shit, that's crazy. Does it still look as good as a 4k native resolution?

Depends on what the image is comparing. Obviously NVIDIA will only pick what makes them look good.
They might be comparing 1080p -> 2160p vs 1440p; that would be the most reasonable explanation.

ah, so thats the reason RTX cards dont have any fps impact with taa enabled

I'd say it looks better.

some games have a cheap AA solution and the AI solution can look better than a shit native 4k with cheap AA
Digital Foundry showed that in their Death Stranding video

One "side" of the GPU is processing everything in 1080p and the other "side", which only exists on RTX, is optimizing the 1080p output.

>Does it still look as good as a 4k native resolution?
No, it isn't perfect. It's more like a better alternative to MSAA and upscaling/sharpening solutions
We would need a 2160p (upscaled from 1080p) vs native 2160p comparison to see that

This idiot overspent on a Radeon instead of overspending on an RTX.

gtx 3080 fucking when? I aint wasting my money on a 2080 meme.

September

It uses the tensor cores on the RTX cards.