Who else skipping this shit?

Attached: Screenshot_2018-08-21-09-37-20-32-01.jpeg (1487x545, 71.62K)

It's fake anyway. It won't work in games: if your back is to the light source, that geometry gets culled and can't light the room. And if a game is modified to support the tech and tries to render a simplified light/model behind the player when the real geometry is culled, the lighting will shift dramatically because it's no longer bouncing off the same geometry. Games that want it to look right will have to disable frustum culling and take an absolutely massive performance hit.
They presented this shit as simple as "flick a switch", but it absolutely will not work that way in reality.
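
A rough sketch of what that means in practice (pure Python; the scene data and the keep_for_lighting helper are made up for illustration, not anything from a real engine). View-frustum-only culling throws away the wall behind you, but that wall is exactly what a light behind the camera needs to bounce off, so a renderer that wants correct bounces has to keep geometry that merely influences visible lighting:

import math

def in_frustum(pos, cam_pos, cam_dir, fov_deg=90.0):
    """Crude frustum test: is the point inside the view cone?"""
    to_obj = [p - c for p, c in zip(pos, cam_pos)]
    dist = math.sqrt(sum(x * x for x in to_obj)) or 1e-9
    cos_angle = sum(a * b for a, b in zip(to_obj, cam_dir)) / dist
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

def keep_for_lighting(obj, lights, cam_pos, cam_dir):
    """Keep an off-screen object if any light can hit it, since its bounce
    may still illuminate what the camera actually sees."""
    if in_frustum(obj["pos"], cam_pos, cam_dir):
        return True
    return any(math.dist(obj["pos"], light["pos"]) <= light["radius"]
               for light in lights)

cam_pos, cam_dir = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
lights = [{"pos": (0.0, 0.0, -2.0), "radius": 6.0}]           # light behind the camera
objects = [
    {"name": "wall_behind_player", "pos": (0.0, 0.0, -3.0)},  # frustum-culled
    {"name": "wall_in_front",      "pos": (0.0, 0.0,  5.0)},
]
for o in objects:
    print(o["name"],
          "in frustum:", in_frustum(o["pos"], cam_pos, cam_dir),
          "kept for GI:", keep_for_lighting(o, lights, cam_pos, cam_dir))

The wall behind the player fails the frustum test but still has to stay resident (and lit, and textured) or the bounce it provides disappears, which is the sudden lighting shift described above.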

You can get a nicely loaded complete system for that much.

Me. Current games run fine at 1080p with a 970; I'm not paying a shitload for a graphics card and then having to justify it by spending hundreds on a 4K monitor, or even more on a VR headset. Not to mention the main games that will take advantage of it will be overpriced AAA bullshit that only lasts 8 hours unless you buy exorbitant DLC to wring a fraction more "fun" out of it. Even then, the only benefit is not having to mess with a couple of AA settings, and it looking marginally better than before.
With crypto gone to shit and me not being that interested in cracking WPA2 passwords, there aren't even any non-vidya applications I could use it for.

Thought so, I knew there would be a catch.
Hope those fuckheads like a GPU that's absolutely useless in normal games as well.
Suddenly my GTX 1080 and Vega don't look so bad.

Already did. Bought a GTX 1070ti last month.

Attached: MSI-GeForce-GTX-1070-Ti-Titanium-8G_1-740x592.png (740x592, 255.45K)

Isn't the new Vega supposed to be out in Q4?

also skipping. waste of money.


news to me if so. afaik that's Navi, due Q4/Q1, but it's likely targeting midrange. Navi should be 7nm, so greater power efficiency. if so, we could see miners buy them up and nobody will get one for months.

loooooooooool the salt next month is gonna be so funny

Attached: ClipboardImage.png (1677x993, 1.02M)

Bought a MSI R9 390 8G in October 2015. Why would I need a new card?

because it's slow and hot as fuck?
t. 390x from 2016

fuck me this is insane

Attached: ClipboardImage.png (1041x508, 299.58K)

Why would I buy it? It's not as if the AAA games with cutting-edge graphics that may conceivably warrant this in a few years' time have any gameplay worth mentioning. And the GOG games I do play probably don't need a graphics card that costs more than the rest of my system combined.

Attached: EVGA NVIDIA GeForce GTX 1080 VRM Catches Fire Caught On Camera.webm (1280x720, 1.51M)

I'm really happy with my Rx 480 overclocked to 1350/2100.

Anybody worth keeping here.

Intel HD 3000 masterrace.

Attached: terry.jpg (802x854, 205.91K)

How many of you actually need more GPU power?

LOL, 30 FPS min / 35 FPS avg in RTX mode
Lol, I knew this was snake oil. Watch the video on this page; it never goes above 60 FPS and hovers between 30 and 55 FPS @ 1080p: pcgameshardware.de/Grafikkarten-Grafikkarte-97980/Videos/Nvidia-RTX-2080-Ti-Performance-in-Shadow-of-the-Tomb-Raider-1263244/
>b-but it's not finished
Bullshit. No wonder they didn't demo or give out real benchmarks; it's just green smoke blowing out Jensen's braaapper.

Attached: Screenshot_2018-08-21-22-14-12-08.png (1920x1080, 2.02M)

Don't worry, they'll fix it before release with asynchronous space warp or some other frame interpolation technique

Lol no, you can't just double a card's raw power.
And idiots said these cards weren't cut down but high end. My fucking ass.

I would like a more powerful GPU for VR dev. But not in the $1300 for 20% more performance way.

Better question, for those of us not in the several-hundred-dollar-GPU demo: Is this going to push down prices on older cards, especially in the used market?

Attached: 2000px-EBay_logo.svg.png (2000x800, 70.18K)

Lower than they already are? Depends on how hard the dumb miner fucks offload.

Me

Attached: old scraps.png (332x16, 3.64K)

Don't tell me you buy one every year, you dumb fuck.

The last time I bought a video card was a 380X, more or less on release. It lasted me two years, then I got it RMA'd and it works just fine. I don't see the need for new cards every year; this is honestly one of the worst aspects of planned obsolescence.

Got a very nice 1080, no reason to upgrade for a while.

streamable.com/973rc
Ati was doing this 10+ years ago too lol

[1] It's Nvidia. If you don't skip it, you should stop consuming soy.

[2] My 290x can do what it does with raytracing using FOSS software from 2011 (luxrender).

All I've been playing lately is TF2, L4D2, and Stalker occasionally. No new card can make stalker run better since it's laughably unoptimized

GTX 960 btw

I'll be sticking to mid-tier AMD cards for the foreseeable future. Actually good out-of-the-box Mesa drivers trump the 10% better performance you might get with the equivalent Nvidia card running their binary driver.

And Denuvo will probably make performance even worse.

Those are just standard fake reflections. We do much better fake reflections today by building cube maps with light probes; even shit like Unity supports it natively. But they've got a lot of weaknesses: the point of reflection isn't precise unless you spam them everywhere, they don't react to in-game changes, they require a lot of manual labor, they need to be adjusted as you make edits, etc.
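
For reference, the core of that probe trick is tiny; a minimal Python sketch (the cube-face addressing and the example numbers are made up for illustration, not Unity's actual API):

def reflect(d, n):
    """R = D - 2(D.N)N for a view direction d and surface normal n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dn * b for a, b in zip(d, n))

def cube_face(r):
    """Pick the cube-map face by the dominant axis of the reflection vector."""
    ax = max(range(3), key=lambda i: abs(r[i]))
    return ("+" if r[ax] >= 0 else "-") + "xyz"[ax]

# The fragment's world position never enters the lookup at all: every point on
# a floor with this normal fetches the same probe texel, which is exactly the
# "point of reflection isn't precise unless you spam probes" weakness.
view, normal = (0.6, -0.8, 0.0), (0.0, 1.0, 0.0)
r = reflect(view, normal)
print("reflection vector:", r, "-> face:", cube_face(r))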

you are retarded

Just pretending, or legit?

Consider that global illumination cabin they showed and suspiciously didn't move around in. The presentation suggests you just flick a switch and suddenly dynamic GI, the holy grail! The reality is that the textured geometry needed to cast rays against and pick up surface color wouldn't be updated on the card when it's not on screen, and there's been a decade of papers on attempts to fudge that. So you have options: disable culling, or come up with a hack. But wait, the promise was that we'd not have to use hacks anymore! Muh flicked switch! Yeah, that's the lie.
In reality, this would likely be done by sending an unculled low-res mesh to the card for lighting, probably one of the LoD models, and then using the solved lighting to paint the visible geometry (sketched below). But that's still somewhat expensive, and now it's also chunky, low-res lighting. A scene with a lot of small spaces will have issues with them not having shadows, and scenes with a lot of contrasting colors will get the wrong ones. And so more hacks: probably requiring the artists to manually mark areas that need more shadow, or to maintain yet another LoD model just for lighting where the problem areas are left high detail (and high cost), and then redo all of that whenever the base model is edited.

Attached: tell me lies.png (999x561, 626.21K)
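
Here's a toy version of that "light a low-res proxy, paint the real mesh" hack in Python (the meshes, the light, and the nearest-vertex transfer are all invented for illustration; it just shows where the chunky lighting comes from):

import math

def lambert(vertex, normal, light_pos, light_color):
    """Direct Lambert lighting at a vertex (no bounces, just for illustration)."""
    to_light = [lp - vp for lp, vp in zip(light_pos, vertex)]
    dist = math.sqrt(sum(x * x for x in to_light)) or 1e-9
    ndotl = max(0.0, sum(n * t / dist for n, t in zip(normal, to_light)))
    return tuple(c * ndotl / (dist * dist) for c in light_color)

# Coarse proxy mesh (never culled) and the visible high-res vertices.
proxy_verts  = [((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
                ((4.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
detail_verts = [(x * 0.5, 0.0, 0.0) for x in range(9)]
light_pos, light_color = (2.0, 3.0, 0.0), (1.0, 0.9, 0.8)

# 1) Solve lighting only on the proxy vertices.
proxy_light = [lambert(p, n, light_pos, light_color) for p, n in proxy_verts]

# 2) Paint each detailed vertex from its nearest proxy vertex (chunky on purpose).
for v in detail_verts:
    nearest = min(range(len(proxy_verts)),
                  key=lambda i: math.dist(v, proxy_verts[i][0]))
    print(v, "->", tuple(round(c, 3) for c in proxy_light[nearest]))

With only two proxy vertices, nine detailed vertices snap to two lighting values; that's the low-res, wrong-in-small-spaces result, just exaggerated.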

TL;DR: RTX is a con made to steal $$$$ from unsuspecting 20xx buyers? Sounds like classic Nvidia.

Damn, and I was thinking of getting an RTX 2070 to replace my RX 480 yesterday. Guess I'll be getting a Vega 56 then!

Something else that I noticed: no benchmarks whatsoever, only some vague "higher FPS than Titan XP and 1080Ti guize!" bullshit. Suspicious?

The smaller, localized effects, like the better shadow blending in those highly custom areas of Womb Raider they showed, would be the more common use IMO, but I don't know if anyone would care much. It's like PhysX: it just ended up adding more trash and clutter because that was the easy thing to do with it. I think people would rather have more FPS.
The really cool thing, effortless dynamic GI lighting in all games, is bullshit, but that's probably what people think they're buying.

To me, all this other bullshit doesn't matter. Modern graphics is not perfect.
Ray tracing is not perfect either, but it usually produces aesthetically better renders.

The only thing that matters here is 60 FPS. If they can't hit 60 FPS on their new cards, they simply won't sell at that price. That seems unlikely right now, but we'll find out soon enough.

This is totally meaningless, as it's up to the developer to choose the FPS target. An 8800 GTX from 10 years ago can run 1440p@60 if the scene is simple enough. Gaymers are willing to eat shit as long as it's shiny, so devs are happy to drop you below "silky smooth" territory for a bit more pizzazz. That means that no matter how fast hardware gets, a new AAA game will still run at the same FPS on a new card (exceptions being console ports that don't track PC hardware because zero fucks given).

go away

Ignoring the gimmick, it's pretty safe to assume they've slotted them in at no more and no less performance than necessary. They have zero competition right now so can safely drag their feet way behind the state of the art and not take risks.

I've bought one Nvidia card since 2004. Never again. Sticking to AMD for the rest of time. I used to think it was the poorfag choice, but honestly, Nvidia is the retard-with-money choice. AMD is also going to keep getting my money as long as they have good open-source drivers on GNU/Linux.

At some point between the ATI days and now, they seem to have become the crashy, shit Linux driver company. I have no trouble at all from our Jewvidia Linux desktops at work, other than some settings not saving via the GUI on Debian.

Me, still using a 770.

This. I remember how many of the biggest N64 games hung around 15 FPS, capped at 20, and then a few rare gems like F-Zero X were built to maintain a locked 60, even if it required a rather spartan aesthetic.

Unlike then, hardware has been more than good enough throughout the 2010s that maxing out resolution and framerate is perfectly possible with very good looking graphics. There really is no excuse other than marketing bullshots.

Framerate was one of the things that made the arcade version of Daytona USA look so good.

That's where you start dropping the eyecandy, but PC gamers went from performance tweaking hackers, cobbling together a solution that works even if it isn't pretty, to plug and play retards, who just install, run the autoconfig and start playing.

HAHAHAHA I SKIPPED PASCAL THANK GOD

so just keep all the rooms and objects in memory whose light sources reach the player?

BSP is back on the menu boys

Attached: bsp_poly.gif (711x331, 4.26K)
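
Pretty much. A toy sketch of that "keep the rooms whose lights can reach the player" query in Python, over a made-up room graph (room names, adjacency, and the hop-based light reach are all invented; a real engine would hang this off its BSP/portal structure):

from collections import deque

rooms = {  # adjacency via doors/portals
    "hall":    ["kitchen", "stairs"],
    "kitchen": ["hall"],
    "stairs":  ["hall", "attic"],
    "attic":   ["stairs"],
}
lights = {"kitchen": 2, "attic": 1}  # room -> how many portal hops its light carries

def rooms_within(start, hops):
    """BFS the room graph up to `hops` portal crossings from `start`."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        room, depth = queue.popleft()
        if depth == hops:
            continue
        for nxt in rooms[room]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return seen

player_room = "hall"
keep = {room for room, reach in lights.items()
        if player_room in rooms_within(room, reach)}
keep.add(player_room)
print("rooms kept resident for lighting:", keep)

Swap the hop count for a real attenuation/occlusion test and it's the same PVS-flavored bookkeeping the old BSP engines did, just for lights instead of visibility.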

youtu.be/mLIgfT04wsY
DF doesn't even mention the shit 30-60 FPS @ 1080p with RTX mode on.

What GPU are you running? I skipped Kepler and sold my Maxwell, it sucked.
Higher-end Asscal is alright.

It needs a fully textured model to get sexy colored light bounces. So it'd have to be the real geometry or one of the LoD models. I don't think many devs will use that GI stuff, they'll just use it for prettier spot lights.
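
A minimal sketch of why the texture matters (the hit data and the tiny "textures" below are invented for illustration): the bounced light is tinted by whatever albedo you sample at the hit point, so tracing against a bare, untextured proxy gives you colorless, wrong bleed.

def sample_albedo(texture, u, v):
    """Nearest-texel lookup into a tiny texture given as rows of RGB tuples."""
    h, w = len(texture), len(texture[0])
    return texture[min(h - 1, int(v * h))][min(w - 1, int(u * w))]

def bounce_color(light_color, albedo, cos_theta):
    """One diffuse bounce: incoming light modulated by the surface color."""
    return tuple(c * a * max(0.0, cos_theta) for c, a in zip(light_color, albedo))

red_brick   = [[(0.7, 0.2, 0.2), (0.6, 0.2, 0.2)],
               [(0.7, 0.3, 0.2), (0.6, 0.2, 0.3)]]
white_proxy = [[(1.0, 1.0, 1.0)]]  # untextured stand-in geometry

light, hit_uv, cos_theta = (1.0, 1.0, 1.0), (0.25, 0.75), 0.8
print("textured wall bounce:", bounce_color(light, sample_albedo(red_brick, *hit_uv), cos_theta))
print("bare proxy bounce:   ", bounce_color(light, sample_albedo(white_proxy, *hit_uv), cos_theta))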

I don't know about the proprietary stuff. I'm specifically talking about the open source drivers, which nvidia lacks. ATI used to suck back in the day in this regard, not the case anymore.

Meant for

Real time ray tracing...for all the new Soy Boy nuGamez with PC approved Diversity crammed in for maximum demographic exposure and sales?
Nah.

The latest gimmick seems to have created quite the buzz, though.

you must be under the impression that raytracing is something like the voxel meme
this is pretty big

I'm under the impression that all the sites are talking about it.

such is the life of a vidya connoisseur

Is there even a ppc64le driver blob for this?
>>>/trash/

Would like to play some old shit in higher res and framerate but otherwise yeah new "games" have about as much worth as an entertainment form as having your balls sewn into a cloth that reads "I'm gay".

Can't wait for the "tech press" to fellate Novideo, telling us how great gay tracing is; after all, they had no problem signing the new and improved NDA. Hardware Cucks is already shilling on tawtter after the nuTomb Raider leak. Funny thing: the Metro devs said they are aiming for 1080p 60 FPS with RTX features on a 2080 Ti, LMAO.

I still have a GTX 760, and don't have any pressing need to upgrade. Or the money.

The 2080 is barely 30-40% faster for double the price

Attached: Capture.png (1121x592, 269.91K)

Bought an RX 580 and never looking back. Nvidia can go fuck itself.

I'll get a Vega 56 when my GTX 1080 dies.
Fuck Nvidia breh

Such is a world where AMD is not competitive.

you'll wish you'd skipped this once 7nm hits
from both NVIDIA and AMD

We need to go back to the days when there was more than just AyyMD and Jewvidia.

Attached: eh.PNG (923x753, 968.74K)

12nm Zen is fast as fuck
5-3nm is what I'm waiting for

But will it require 2 cases to fit and the contacts will snap in half whenever I need to reseat the card?

When is THAT coming out?

imagine if a millennial saw a CRT

What I don't get is how RTX manages to look visually worse despite the meme AI AA and denoising.

Whoah, ultra gaming, must buy now so I can be an ultra gamer.

Hory shit, u gaiz, I just thought of something. Could it be we're getting closer to:
REALTIME WAIFU2X?

Attached: waifu.jpg (2000x1000, 304.58K)

Faster, but in good goy points, not in frames per second.

Well, shitposting Australian scientists experimentally proved that reality doesn't exist until observed, so he's not wrong.
anu.edu.au/news/all-news/experiment-confirms-quantum-theory-weirdness

Goyflops per pixel

Ray tracing hardware does make a big difference. It's just not a very marketable difference. I expect this to be a relative failure in the consumer market but huge in the CGI and film industry, as a way to improve workflow when working with lighting and shadows, things that are typically artistically expensive as well as computationally expensive to render out.

The CGI film industry is already beyond what not-vidia offers via custom software and scripts.
And anyway, the big film companies are turning away from CGI effects to embrace the classic special effects again.

I honestly hope you're right about that. I want to believe CGI special effects were a meme. Practical effects will always have superior suspension of disbelief with the tradeoff being more effort required

Fucking coin niggers ruining everything.

Vega 20 will have ~50% better compute than the RTX 2080 Ti (20 TFLOPS vs the 2080 Ti's 13.4) and twice the RAM. Really, the only reason to get a 2080 Ti is if you use the CUDA Toolkit and don't want to pay eight fucking thousand dollars for a V100.
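
Quick sanity check on that ratio in Python (the 20 and 13.4 TFLOPS figures are just the ones quoted above, nothing more official than that):

vega20_tflops = 20.0
rtx2080ti_tflops = 13.4
advantage = vega20_tflops / rtx2080ti_tflops - 1.0
print(f"Vega 20 FP32 advantage: {advantage:.0%}")  # ~49%, i.e. roughly the ~50% claimed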

Personal use? Very skip, already have a 1080Ti. Might end up needing something for ML at work, but since it's still 11GB it will need to be a lot faster than 1080Ti to warrant the price difference.

Attached: Breath of Fire (USA)-180520-154400.png (1411x1080, 870.16K)

The green goblin is preparing to brainwash the cattle via approved review sites and youtubers
hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/

Sauce?