LOL

LOL
archive.is/ZUSPc
archive.is/MwAqz

Attached: Screenshot_20180810-121758_Samsung Internet.jpg (899x400 238.63 KB, 79.79K)


I don't get it. Is it the 132% vs. 8% comparison, or the fact that hype is known to disappoint?

Is this a typo?

What is there even to use a new card on? We've got indieshit, console ports, and even Epic now makes games built to run on recycled macbooks like fartknight. The only game that still has graffix is shekel shillizen and it's not even a game.
Do I buy this so I can hit 600fps in warfarm? Is it only useful for datamining and buttcoins?

I just play old SNES games my dude

with new hardware there should be games that let you toggle better graphical effects, otherwise there's no need to buy. make it an ultra-ultra setting somewhere in the menu. oh, and don't just drop in vendor-locked bullshit, make it settings that take advantage of ALL hardware, not vendor-specific high-end features, bra

Do graphics cards and games suffer from stagnation just like the rest of tech? I haven't been gaymen for some time now, but this sounds a lot like a case of diminishing returns. No more giant leaps like in the past, where the difference between a card supporting dx8 or dx9 made a big difference in visuals.
Face it, Nvidia is scared shitless because the market is dying, hence retarded marketing things like "founders edition". It's like they're creating a cult or something.

What happened is a combination of normalfags and the turd world getting online and dragging everything down with their shit hardware and shit taste.

Maybe to some extent, but at a certain point, are people really going to care whether Lara Croft's left ear consists of 200 or 300 polygons?
Innovative or just really solid gameplay with Source-tier graphics > triple-normal ambient-occluded foliage.

fortnite has terrible performance. it's been slowly getting better, though. still, in tilted towers and retail row (but not pleasant park for some reason) your framerate will drop to shit

Apparently it isn't: archive.is/DSfsD

Monitors.

4k, 240Hz, 30-bit, surround multi-head, stereoscopic, multi-angle, VR, volumetric, etc.
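Back-of-envelope on why specs like that would actually need a new card: the raw uncompressed bandwidth for a single 4k 240Hz 30-bit panel. This is just the pixel math as a sketch; it ignores blanking intervals and link encoding, which push the real figure higher.

```python
# Raw pixel bandwidth for a display mode, in Gbit/s.
# Ignores blanking intervals and link-layer encoding overhead.
def bandwidth_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

raw = bandwidth_gbps(3840, 2160, 240, 30)
print(f"{raw:.1f} Gbit/s")  # ~59.7 Gbit/s of raw pixel data
```

Roughly 60 Gbit/s of raw pixels, versus the ~25.9 Gbit/s payload of a full DisplayPort 1.4 link, which is why modes like that need compression or newer link standards at all.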

Plus, who knows, maybe RTRT driver patches will happen?

basically yeah, it's still a useless card at 4k/60fps because once again RTG dropped the ball. be prepared for another 2 years of nothing.

(x+1)80 being slightly faster than x80 Ti is pretty much par for the course for the past 10 years (shit, remember the 960? Before Nvidia started gimping Kepler, it was performing on par with the 760). The big difference is that it used to be a little over a year between new series. Going from 900>1000 took over a year and a half, and going from 1000>1100 has taken more than two years.

There won't be any use for these cards till new games with new engines use real-time raytracing, and even then these cards will be slow as fuck: 30fps at 1080p in such modes.
Gonna wait with my 1080 until 202x

Graphics stagnate due to lack of even passable coders, games stagnate due to corporate BS. Game Industry pay is shit and they treat their staff like shit, so anyone that can get another job does. That leaves the incapable diversity hires (see Ubisoft) who couldn't make a new engine if they could write it entirely in Python.
AA is dead, AAA is exclusively design-by-committee with the goal of not leaving out a single market, which means easy cookie-cutter games.
Since modern games lack both technical advances and general appeal, nobody but graphics designers buys cutting-edge graphics cards, so there's less money put into development.

Bitcoin is currently below the price where it's profitable to mine even with dedicated hardware.


Having seen that tech in person, you have to have a very loose definition of what constitutes 'real time' to call it that.

The DGX Station (which costs $69,000) with its 4 V100 GPUs achieves ~2fps at 4k, from what I remember seeing at GTC 2018.
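For scale, ~2fps at 4k works out to the following camera-ray throughput, assuming a single primary ray per pixel per frame (path tracers shoot many secondary rays and samples on top of that, so this is only a lower bound, and the ~2fps figure itself is from memory):

```python
# Rough rays-per-second implied by ~2fps at 4k, assuming one primary
# ray per pixel per frame (a lower bound on real ray throughput).
pixels_4k = 3840 * 2160          # ~8.3 megapixels
fps = 2
primary_rays_per_sec = pixels_4k * fps
print(f"{primary_rays_per_sec / 1e6:.1f} Mrays/s")  # ~16.6 Mrays/s
```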


What's depressing is that most game studios don't even employ GPU programmers; they contract out the optimization of their graphics pipeline to AMD and Nvidia.

Yeah, I know what you mean, it's still very low trace path counts and they use that approximation denoise shit
It's still better than traditional non-real-time raytracing, but yeah, I really don't see graphics going anywhere new for the next decade or more now that VR, high resolutions AND silicon limits have finally hit the wall
youtube.com/watch?v=tjf-1BxpR9c
Only way forward I think is highly optimised, parallel dual/quad/octo-core modules sharing embedded HBM gen3+ memory and optic-fiber bridges between cards, making it prohibitively expensive for home use, so gen 10+ gaming pcs and consoles will probably try streaming instead

Actually they use a neural net which runs on the Tensor cores to denoise the image.
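To be clear, the actual RTX denoiser is a trained neural network running on the Tensor cores; the details aren't public. As a stand-in for the concept, a dumb box filter over the noisy radiance buffer shows the same input-to-output shape of the problem (noisy per-pixel samples in, smoothed estimate out):

```python
# Toy stand-in for a denoising pass. The real thing is a trained neural
# net on Tensor cores; this plain box filter just illustrates the idea of
# reconstructing a smooth image from noisy per-pixel radiance samples.
def box_denoise(img, radius=1):
    """Average each pixel with its neighbours (img is a 2D list of floats)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

noisy = [[0.0, 1.0, 0.0],
         [1.0, 0.0, 1.0],
         [0.0, 1.0, 0.0]]
print(box_denoise(noisy)[1][1])  # centre pixel pulled toward the local mean
```

A learned denoiser beats this kind of filter precisely because it can keep edges and detail that a blind local average smears away.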

Why the fuck are they calling it GameWorks then? Fucking marketing stunt

Graphics programming has stagnated due to the opacity of the GPU as a platform turning optimization into guesswork. Instead of providing proper docs, they have developers blindly release "good enough" code that barely runs and then use driver updates after the fact to "fix" the game by injecting their own shader code developed by coders who are actually allowed to look at the docs. Blame NVidia and AMD if anyone.

We could do RTRT in 2000, entirely in software, on a single Athlon core, without SIMD:
youtube.com/watch?v=CFRZfCgIs2E

640x480@30FPS, sure, but I'm certain the 1-2 teraFLOPS of raw compute between modern CPUs & modern GPUs via Vulkan would bridge that gap.
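The core of one of those old software tracers really is tiny. A from-scratch toy ray/sphere intersection (not the actual code from the demo), which is the inner loop those Athlon-era renderers ran per pixel:

```python
import math

# Minimal software ray tracing kernel: intersect a ray with one sphere.
# Solves the quadratic |O + tD - C|^2 = r^2 for the ray parameter t.
def trace(ray_origin, ray_dir, center, radius):
    """Return distance to the nearest ray/sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(ray_origin, center)]
    b = 2 * sum(d * o for d, o in zip(ray_dir, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c            # ray_dir assumed normalized, so a == 1
    if disc < 0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersections
    return t if t > 0 else None     # hits behind the origin don't count

# A ray fired down -z from the origin hits a unit sphere centred 5 units away.
hit = trace((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0 (distance to the sphere's front surface)
```

At 640x480 that's ~9.2M of these tests per second for 30FPS with one sphere and no bounces, which is exactly why the full effects need orders of magnitude more compute.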

AMD actually releases docs describing how their architecture works, so it's really just Nvidia.


The difference is the effects which make it look more realistic; doing simple shadows like in your vid is easy, and game engines have been doing it since the early 2000s. DOOM 3 had far better lighting than that tech demo.

Compare it to this
youtube.com/watch?v=lMSuGoYcT3s

Only real benefit of next-gen tech should really be savings:
you can do many effects with fewer tools,
say you ray trace (with really optimized algorithms and hardware) what you used to do with 3-4 different complicated techniques, at lower, similar, or slightly higher cost

and dev time too
tech to do more with what you have

You might also notice a number of other features that still don't exist in modern Gouraud-shaded games, like true refraction, true reflection, zero-penalty dynamic global lighting, and non-polygonal geometry (unlike the demo you linked, though that's a problem with their assets rather than the renderer itself).

Nah, not happening until the 7-3nm stuff in 202x; nvidia and amd are sandbagging in case Raja's new 2020 Intel gpu is any good

The RT in RTX stands for ray tracing. All it really means is that your polygons are going to be oiled up like hookers so you can see those "Real Time Reflections™". That's it, that's your Premium Geforce experience. You get to see reflections in Ultra HD, until it dies out like the mega tessellation they threw into the Crysis games.

Attached: 2019.png (2011x323, 444.52K)

wtf?

...

...

Ray-traced reflections are irrelevant if they can't compete with very optimized SSR solutions
If it looks more or less the same as some customized SSR solution that runs way faster, nobody will bother using it

whatever is raytraced has to have a clear graphical or perf advantage
i'm not seeing any of that
getting the quality past good SSR requires resources and graphics cards that no normies can afford at the moment
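For reference, the SSR being compared against boils down to a depth-buffer ray march. A toy 1D sketch of that idea (real implementations march in 2D screen space with thickness tests, hierarchical steps, and fallbacks when the ray leaves the screen):

```python
# Toy 1D screen-space reflection march: step a reflected ray through
# screen space and report where it first dips behind the depth buffer.
def ssr_march(depth_buffer, start_x, start_depth, step_x, step_depth, max_steps=64):
    """Return the buffer index the ray hits, or None if it exits the screen."""
    x, depth = float(start_x), start_depth
    for _ in range(max_steps):
        x += step_x
        depth += step_depth
        ix = int(x)
        if not 0 <= ix < len(depth_buffer):
            return None              # ray left the screen: nothing to reflect
        if depth >= depth_buffer[ix]:
            return ix                # ray passed behind visible geometry: hit
    return None

# A flat floor at depth 10 with a nearer wall (depth 5) at the right edge.
scene = [10.0] * 8 + [5.0, 5.0]
print(ssr_march(scene, start_x=0, start_depth=2.0, step_x=1.0, step_depth=0.5))  # 8
```

The "leaves the screen" case is SSR's fundamental weakness: anything off-screen simply cannot be reflected, which is the one thing true ray tracing fixes for free.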

same problem with VR but even worse,
it has no games that can compete with proper single-player experiences; motion controls were in fact ABANDONED by consoles a few years ago and there are almost no new titles using them

Plus the tracking is still terrible

Vile and disgusting.
invidio.us/watch?v=KJRZTkttgLw

Attached: Ooga booga.jpg (1073x590, 294.81K)

Because their tracking was shit compared to now and when used with just a TV. In VR it makes a lot more sense as the controllers are an actual part of your body and instead of having to carefully aim at a small area on a screen at the other end of the room, you can do shit like blind firing around corners etc. Most people can't touch type either so it's simpler for them to memorize a controller too.

>>>Zig Forums

>>>/zoo/

Attached: diversity.png (1257x333, 449.53K)

go away nigger lover

Shut the fuck up.