LOL

Apparently it isn't: archive.is/DSfsD

Monitors.

4k, 240Hz, 30-bit, surround multi-head, stereoscopic, multi-angle, VR, volumetric, etc.

Plus, who knows, maybe RTRT driver patches will happen?
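
On the monitors point, some napkin math (illustrative Python; assumes 4K UHD at 240 Hz with 30-bit color, ignores blanking intervals and DSC) on why driving panels like that is non-trivial:

# Uncompressed bandwidth for a 4K 240 Hz 30-bit signal (rough; ignores blanking/DSC).
width, height = 3840, 2160     # 4K UHD
refresh_hz = 240
bits_per_pixel = 30            # 10 bits per channel, RGB

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"{bits_per_second / 1e9:.1f} Gbit/s")   # ~59.7 Gbit/s, more than DisplayPort 1.4's 32.4 Gbit/s raw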

Basically yeah, it's still a useless card for 4K at 60 fps/Hz because once again RTG dropped the ball. Be prepared for another two years of nothing.

(x+1)80 being slightly faster than the x80 Ti is pretty much par for the course for the past 10 years (shit, remember the 960? Before Nvidia started gimping Kepler, it was performing on par with the 760). The big difference is that it used to be a little over a year between new series. Going from 900>1000 took over a year and a half, and going from 1000>1100 has taken more than two years.

These cards won't be any use until new games on new engines use real-time ray tracing, and even then they'll be slow as fuck: 30 fps at 1080p in those modes.
Gonna wait with my 1080 until 202x

Graphics stagnate due to a lack of even passable coders; games stagnate due to corporate BS. Game industry pay is shit and studios treat their staff like shit, so anyone who can get another job does. That leaves the incapable diversity hires (see Ubisoft) who couldn't make a new engine even if they could write it entirely in Python.
AA is dead, and AAA is exclusively design-by-committee with the goal of not leaving out a single market, which means easy cookie-cutter games.
Since modern games lack advances in tech and appeal in general, nobody but graphics designers buys cutting-edge graphics cards, so there's less money put into development.

Bitcoin is currently below the price where it's profitable to mine even with dedicated hardware.
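
Quick sanity check on that with assumed numbers (S9-class ASIC specs, $0.10/kWh power, a guessed BTC price and network hashrate; none of these are live figures):

# Back-of-the-envelope mining profitability. All inputs are assumptions for illustration.
hashrate_ths     = 14.0           # TH/s, roughly an Antminer S9-class ASIC
power_watts      = 1400.0         # wall power draw
electricity_kwh  = 0.10           # USD per kWh
btc_price_usd    = 4000.0         # assumed spot price
network_ths      = 40_000_000.0   # assumed total network hashrate in TH/s
block_reward_btc = 12.5
blocks_per_day   = 144

# Expected BTC per day = your share of network hashrate * daily issuance
btc_per_day = (hashrate_ths / network_ths) * block_reward_btc * blocks_per_day
revenue     = btc_per_day * btc_price_usd
power_cost  = power_watts / 1000 * 24 * electricity_kwh

print(f"revenue/day: ${revenue:.2f}, power/day: ${power_cost:.2f}, "
      f"profit/day: ${revenue - power_cost:.2f}")

With those numbers the power bill alone outruns the coins, which is the point.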


Having seen that tech in person, you have to have a very loose definition of what constitutes 'real time' to call it that.

The DGX Station (which costs $69,000) with its 4 V100 GPUs achieves ~2 fps at 4K, from what I remember seeing at GTC 2018.
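
For scale, some napkin math (assuming 4 samples per pixel and 2 bounces, purely illustrative) on the ray throughput different frame rates would need at 4K:

# Rays per second needed at 4K for a low-spp path tracer (illustrative assumptions).
width, height  = 3840, 2160
samples_per_px = 4            # low sample count, leaning on the denoiser afterwards
bounces        = 2            # secondary rays per sample beyond the primary hit

rays_per_frame = width * height * samples_per_px * (1 + bounces)
for fps in (2, 30, 60):
    print(f"{fps:>2} fps -> {rays_per_frame * fps / 1e9:.2f} Grays/s")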


What's depressing is that most game studios don't even employ GPU programmers; they contract out the optimization of their graphics pipeline to AMD and Nvidia.

Yeah, I know what you mean. It's still very low ray/path counts, and they use that approximation/denoise shit.
It's still better than traditional non-real-time ray tracing, but I really don't see graphics going anywhere new for the next decade or more now that VR, high resolutions, and silicon limits have finally hit the wall.
youtube.com/watch?v=tjf-1BxpR9c
The only way forward I think is highly optimised, parallel dual/quad/octa-core GPU modules sharing embedded HBM gen3+ memory with fiber-optic bridges between cards, which would make it prohibitively expensive for home use, so gen 10+ gaming PCs and consoles will probably go for streaming instead.

Actually they use a neural net which runs on the Tensor cores to denoise the image.
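
Not the actual Nvidia denoiser (that's a trained network that also eats auxiliary buffers like normals, albedo, and motion vectors), just a toy numpy sketch of the general idea: average a few noisy low-sample estimates and run a cheap spatial filter over the result.

import numpy as np

# Toy stand-in for the denoising step, not a neural net: average several noisy
# 1-spp estimates, then box-filter the result to trade noise for blur.
def box_filter(img, radius=1):
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += pad[radius + dy : radius + dy + img.shape[0],
                       radius + dx : radius + dx + img.shape[1]]
    return out / (2 * radius + 1) ** 2

rng = np.random.default_rng(0)
clean = np.ones((64, 64)) * 0.5                          # pretend ground-truth radiance
noisy_samples = clean + rng.normal(0, 0.3, (4, 64, 64))  # four noisy 1-spp estimates

denoised = box_filter(noisy_samples.mean(axis=0))
print("noisy MSE:   ", float(((noisy_samples[0] - clean) ** 2).mean()))
print("denoised MSE:", float(((denoised - clean) ** 2).mean()))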

Why the fuck are they calling it GameWorks then? Fucking marketing stunt.