PS5 is underpowered before release

notebookcheck.net/Nvidia-claims-RTX-2080-Max-Q-is-faster-than-PS5-Xbox-Series-X.448695.0.html
gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Mobile-Max-Q-vs-Nvidia-RTX-2060/m704710vs4034

>The nvidia RTX 2080 Max Q is barely stronger than the current entry level RTX 2060

How's PS5 supposed to compete?
Also, new GPUs release this year and the rtx 3060 is confirmed to be at rtx 2080 level, which is twice as fast as the PS5.

Attached: technology.png (1315x729, 282.8K)

cogconnected.com/2020/05/unreal-engine-5-runs-better-laptops/

it's almost like the exact same thing happened when the ps4 was released

wasn't the PS4 supposed to be at HD 7970 level? which at that time was already a bit obsolete, if i'm not wrong.

>gaming notebooks

Attached: 1483998134614.png (1920x1389, 2.86M)

more like an hd7850, which was less than half as fast as the hd7970

Attached: chadgamer.jpg (647x659, 40.42K)

>rtx 3060 is confirmed to be at rtx 2080 level
[citation needed]

>link is from 2019
shut up Zig Forums you stupid cuck board

cope

can't even do 60fps

wow a 500 dollar console is going to be less powerful than a 1000 dollar gpu in a 2000 dollar rig?

What do you expect of pc cucks who just pay money without any prior knowledge of the specs

And console users don't do that?

Of course Jensen, who would have zero experience with the current consoles, would say this.

Absolutely everything is pointing to a minimum of RTX 2070 performance, with the Series X being more like a 2080.

Jensen doesn't want AMD to ever hold any spotlight

Babystation fags feeling really perturbed by the looming disaster on the horizon.

Attached: 1386287805595.jpg (399x480, 22.36K)

microcenter.com/product/618231/evga-geforce-rtx-2060-ko-overclocked-dual-fan-6gb-gddr6-pcie-30-graphics-card
300 dollars and it already outperforms the PS5; expect prices to drop even more.

>Nvidia claims
Nvidia is just throwing another hissy fit because Sony and Microsoft told them to fuck off once again.

Attached: hot.assses.jpg (681x462, 78.11K)

PS5 is around a 2070 non-S and Series X is between a 2080 Super and a Ti. Series S is around a GTX 1660 since it will only do 1080p. The latter will still have 8 cores and 16 threads though, since that will be the bare minimum you need to run next gen games regardless of resolution.

based irish chad

>imagine being this ignorant

About what? It's simple math.
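For anyone who wants to check the "simple math": FP32 throughput is just shaders x 2 ops/clock x clock speed. Rough python sketch using the officially announced console specs; the Turing numbers are reference boost clocks, and cross-architecture flops are ballpark at best:

# rough FP32 TFLOPs from announced specs: shaders x 2 ops/clock x GHz / 1000
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0
print(f"PS5:      {tflops(36 * 64, 2.230):.2f} TFLOPs")  # 36 CUs @ 2.23 GHz -> ~10.28
print(f"Series X: {tflops(52 * 64, 1.825):.2f} TFLOPs")  # 52 CUs @ 1.825 GHz -> ~12.15
print(f"RTX 2070: {tflops(2304, 1.620):.2f} TFLOPs")     # reference boost -> ~7.46
print(f"RTX 2080: {tflops(2944, 1.710):.2f} TFLOPs")     # reference boost -> ~10.07

On paper that puts the PS5 at or above a stock 2080; whether that survives contact with real games across two different architectures is another story.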

>1660
>only 1080p

A 1660 Super does 1440p/60 in almost every modern game at high/max with small things like volumetric fog turned down. (I have yet to find a game where you can actually see a difference between the lowest and highest fog setting, aside from the 7-10 extra fps you get from turning it down.)

>500 dollar
SIX NINETY NINE

Yes, but for next gen it will have to do 1080p with the assets and fidelity next gen provides, not current gen games.

The CPU won't matter because the consoles are locked to such a low frequency.

If the leaked spec sheets are to be believed, then for raw compute performance the "3060" should land between the 2080 and 2080 Super. Of course that doesn't mean it will translate directly into gaming performance, but who knows, which is why it's fucking stupid to speculate.
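To show why paper flops don't translate directly, here's a small self-contained illustration within the same architecture. Reference boost clocks; the ~5% gaming gap is a rough recollection of review aggregates, not a measured number:

# paper FP32 throughput at reference boost clocks: shaders x 2 ops x GHz / 1000
flops_2080 = 2944 * 2 * 1.710 / 1000   # ~10.07 TFLOPs
flops_2080s = 3072 * 2 * 1.815 / 1000  # ~11.15 TFLOPs
print(f"paper FP32 gap: {100 * (flops_2080s / flops_2080 - 1):.1f}%")  # ~10.8%
print("typical gaming gap: ~5%, varies per title and resolution")

Memory bandwidth, clocks under load and drivers all eat into the paper number, so a "3060 at 2080 flops" could land anywhere.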

> PS5 is around a 2070 non-S and Series X is between a 2080 Super and a Ti
Delusional

PS5 is on the level of a 1070ti, plus no multi-bounce ray tracing because it's underpowered. They prove it in the screenshots, while the current pc rtx version has more bounces and multiple ray traced effects.

They can lock it lower because they have dedicated hardware to offload decompression from the CPU and free up cores to do other things. On pc you'll have to brute force all of that on your cpu, not even mentioning the DRM fuckery you have to deal with that hampers cpu performance.
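If you want a feel for what software decompression costs, here's a minimal python sketch with zlib. This is not the Kraken/BCPack codecs the consoles actually use, so the throughput is purely illustrative of the idea, not a real number:

import os
import time
import zlib
# ~64 MB of compressible data (repeated 1 KB blocks), compressed once up front
raw = os.urandom(1024) * (64 * 1024)
blob = zlib.compress(raw, 6)
# time the CPU-side decompression that a dedicated I/O block would offload
start = time.perf_counter()
out = zlib.decompress(blob)
elapsed = time.perf_counter() - start
print(f"decompressed {len(out) / 2**20:.0f} MB in {elapsed:.3f}s "
      f"({len(out) / 2**20 / elapsed:.0f} MB/s on one core)")

Sony is claiming 8-9GB/s of effective SSD throughput after decompression; at software rates like the above you'd need a pile of cores to keep up, which is the whole argument for the dedicated silicon.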

I don't think SX is above a 2080 desu

"Next gen games" wont actually exist until like 2023 though.

The I/O decompression will only come into play with PS5 exclusives like R&C when it's portal hopping; most multiplats won't bother with that.

Is it worth getting an RTX 2060 with an R5 2600X? How bad would the bottlenecking be?

>userbenchmarks
fuck off and die retard, at least post a real benchmark

with this gen i think we got it with assassin's creed unity, which ran like shit on consoles but played fine on pc with a gtx 960 at the time. all the games since have looked like shit compared to it.

We'll see, I guess. When I play MHW on pc, just turning on the high resolution textures bumps my CPU load up by like 20% compared to having them off, on my 6 core 12 thread cpu. But that's a current gen game.

>buy a $400 gpu
>every other driver update fucks performance in mhw

At least consoles don't have to deal with this shit. And don't even @ me with >amd because I'm talking about nvidia. I literally lost 10fps with the most recent "game ready" driver.

Probably because of the higher quality LODs; higher res textures shouldn't typically stress the CPU.

Still runs the same on my 2060 Super

i have an i5 + rtx 2060 and it does fine on every game, i can play 1080p full settings and i always get 90+ frames.

It's almost as if consoles have never been competitive with PCs for more than a very brief while if ever. Hmmm.

Idk, I have a 1660S; it was getting a solid 60 at 1440p with most settings at max, motion blur off and fog set to low.
Now I have to turn on the FidelityFX scaling to keep it from dropping to ~53 every 10 seconds.
February's drivers worked perfectly. Maybe I'll try switching off dx12.

idk if the 2600x got the same refresh as the base 2600 but if it didn't you should just get a non-x.

The 2600 has been 12nm from the start; the refresh you're thinking of is the 1600 AF from late last year, which is basically a 2600 on the same 12nm silicon. They're within like 5% of each other and you can get one for ~$100 on sale.

They gotta lower performance for Turing and Pascal cards to get ready for Ampere's release. Gotta get those new benchmark comparisons when they come out

Attached: jewish.gif (320x240, 2.03M)

I wish I didn't believe that, but I do.

nice user, but where are the benchmark comparisons? you know, the only actual way to compare two different devices.