AMD's upcoming mid-range GPU has 16GB VRAM

An engineering board just leaked on a Chinese site; it has a sticker claiming 16GB of Samsung memory on a 256-bit bus. That virtually guarantees it's only the mid-range model, because high-end chips need much higher memory bandwidth to feed the GPU cores or they'll be severely bottlenecked. Big Navi will likely have 24GB VRAM.
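For rough numbers, peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch in Python; the memory speeds here are assumed, since the leak only shows the 256-bit bus and the 16GB capacity:

# Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 14.0))  # 448 GB/s: 256-bit with 14 Gbps GDDR6 (RTX 3070 class)
print(bandwidth_gb_s(256, 16.0))  # 512 GB/s: 256-bit with 16 Gbps GDDR6 (assumed for this board)
print(bandwidth_gb_s(320, 19.0))  # 760 GB/s: 320-bit with 19 Gbps GDDR6X (RTX 3080)

On raw bandwidth a 256-bit card tops out well below a 320-bit GDDR6X card; whether that actually starves the GPU depends on cache and compression, which the sticker says nothing about.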

This is why you don't buy the 3070 / 3080 with 8/10GB. Wait for AMD.

>videocardz.com/newz/alleged-amd-radeon-rx-6000-engineering-sample-spotted

Attached: AMD.png (1213x1024, 119.41K)


Yeah, but I don't give a fuck about muh VRAM because I'm still using 1080p in 2020, because anything higher is a meme.

>Zig Forumstards still think VRAM means anything

even the most demanding games like RDR2 at 4K only use 6 gigs MAX

WHY ARE WE EVEN MAKING IT THAT HIGH? I'M NOT SAYING GAMES SHOULD LOOK BETTER, BUT FUCK, THIS IS UNNEEDED

As if I care. The drivers are going to be so trash that you won't be able to get anything done.

The 3070 Ti and 3080 Ti have already leaked, and they both have 16GB of VRAM. Wait for those.

>256 bit memory bus
slower than RTX 3070 confirmed

to market it? you're a really stupid motherfucker aren't you?

That's today; future games will need more.

>JUST WAIT BRO

LOL

>Just wait, okay? Just wait a little longer. You won't be disappointed if you wait. You NEED to wait, it will be totally different this time. Just. Wait.

Attached: 1599093448461.png (653x726, 94.59K)

The 3070 also has a 256-bit bus.

The XSX only has 13.5GB of total memory available for games. You won't need more than 8GB even for 4K on PC.

And no drivers. Pass.

don't care still buying nvidia

Be happy you cornered the CPU market and shut up, AMDrones; shilling here won't make your stock rise any further.

Nvidia cards have been significantly faster at the same bus width for a long time. The RTX 2080 Super was 256-bit, and the best AMD could do against it was the 5700 XT.

The 5700 XT exists. AMD is shitting on Intel. I don't see the problem.

>videocardz.com/newz/alleged-amd-radeon-rx-6000-engineering-sample-spotted
Literally cooled by a CPU tower cooler.
With that being said, looking at that setup it seems fake and gay.

The only reason I use AMD is because the Linux drivers are infinitely better than Nvidia.

I only play at 1080p, so a 3070 will be more than I'll ever need. Hopefully Big Navi is competitive though, so they drop the price a bit more.

That RTX IO thing Nvidia showed off is just a crutch because they didn't want to shell out for more GDDR6X or HBM on the 30 series.

>Upgrading when you're still on 1080p

Literally why? A 970 is still good for that shit.

I have a 1060 3GB and I actually want to play games at 144Hz; the 1060 struggles to even hit 60 in most cases on medium settings.

Remember when AMD GPUs had HBM2 RAM? And then they came out and literally no one cared? Because I do.

Get one with a better cooler; I bet my second PC's 970 smokes it, runs at 1600MHz all day (G1 Gaming).

3070ti SOON

Attached: 1579185657773.gif (250x270, 803.57K)

I'm looking to get a GPU this year, but it's hard to decide.

Nvidia is likely taking a big jump forward over their previous cards. But we also know they're making false equivalencies: 2x the cores for RTX is not 2x the performance when they're slower cores. But the example footage does look good, to be fair.

AMD has their dominance in CPUs, and the people responsible for Vega/VII have gone to Intel now, so AMD may turn things around and rock a great card again. But it will be their first go at DXR (RTX), and it would be a hell of a comeback to expect (even the massive CPU success was not as overnight as it looked).


Nvidia also has a slight emulation advantage, which does factor in for some of us. This is going to be a hard choice, and I have a feeling that by the time the real-world data is in, the supply of both will be gone or jacked up past MSRP.
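On the "2x the cores is not 2x the performance" point, the paper math alone shows the gap between marketing and frame rates. A rough sketch using Nvidia's published core counts and boost clocks (the comparison cards are my pick, not from this thread):

# Theoretical FP32 throughput: CUDA cores * 2 ops per clock (FMA) * boost clock in GHz.
def fp32_tflops(cuda_cores, boost_clock_ghz):
    return cuda_cores * 2 * boost_clock_ghz / 1000

print(fp32_tflops(4352, 1.545))  # ~13.4 TFLOPS: RTX 2080 Ti
print(fp32_tflops(8704, 1.71))   # ~29.8 TFLOPS: RTX 3080

That's over 2x on paper, but the doubled FP32 units share their datapath with INT32 work, so real-game gains land well short of the theoretical ratio.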
GPUs are getting used for more these days. If you do 3D design or image/video editing you can very quickly need a lot of VRAM, and regular RAM too for that matter. And 4K is being pushed pretty hard, which means bigger textures, which eat up memory (rough numbers below).

I don't play a lot of AAA, but I gave Sims 4 a shot the other day and even it, as an old game, was using more than 4GB, and it's fairly simple overall.
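On the texture point above, here's some back-of-the-envelope VRAM math. The sizes and formats are illustrative assumptions, not measurements from any game:

# Uncompressed texture / render-target size; a full mip chain adds roughly 1/3.
def size_mib(width, height, bytes_per_pixel, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size *= 4 / 3
    return size / (1024 ** 2)

print(size_mib(4096, 4096, 4))                 # ~85 MiB: one 4096x4096 RGBA8 texture with mips
print(size_mib(3840, 2160, 4, mipmaps=False))  # ~32 MiB: one 3840x2160 RGBA8 render target

Real games use block-compressed formats (BC1/BC7) that shrink textures by 4-8x, but the scaling is the point: doubling texture resolution quadruples the footprint, and an engine keeps hundreds of textures plus several render targets resident at once.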


Can someone explain this meme? I've used AMD cards in the past and not had issues.

I dunno man, vanilla FO4 with the HQ texture pack DLC takes around 6.8GB at times, and Division 2 takes around 6.5GB most of the time, both at 1920x1080. The former is a 2015 game.

No, at that point I might as well buy a used 1080 Ti. It's not worth buying low-to-mid-range Pascal or Maxwell because games are going to keep getting more demanding even at 1080p. A 970/1060 is fine for low-settings 60fps at 1080p, but they won't ever be anything more than that.

Clock speed doesn't equal performance between different cards. Which game, how demanding it is, and the target resolution and frame rate are big factors.

AMD's drivers are unstable and early adopters are basically beta testers. The RX 5700 to this day still has issues with flickering.

I just meant whatever you get.
talkin bout a 970, oc'ed to 1600-1700 will outperform a single fan 1060

Smart user. Why is Zig Forums so retarded at cost-effective decision-making?

>spending 400 shekels on a used 10nigger80 ti instead of spending 499 shekels on a 3070
>Smart user. Why is Zig Forums so retarded at cost-effective decision-making?

>talkin bout a 970, oc'ed to 1600-1700 will outperform a single fan 1060
Meaningless. Game, benchmark, target frame rate, target resolution.
For all we know that 970 is running at 720p on low settings to get a stable 60fps in Stardew Valley, while his 1060 is playing a more demanding game where he's pushing up settings and frame rate... I'm exaggerating of course, but your claim has no context, and his is even missing some info, like which game struggles.


Yeah, everyone says that, but in what way? Do they crash a lot, render badly, fail to meet the performance projected by their hardware? I had an HD 7000 series card and an R9 at one point, which were damn awesome. But those are from a while ago and pre-date the Vega/VII disappointments.

Just curious on this; if I had to buy one today I would probably buy Nvidia too. But say its performance is 10% under what it should be due to shit software, yet the cost-to-performance is still really good at a particular card's price, then it could still be a good card to get, if that makes sense.

On those old cards I did run beta drivers a lot, which worked better. With Nvidia I always ended up having to roll them back. So I've had about an equal experience, but I haven't got a new card in a long time.

Radeon VII also had 16GB VRAM, and look how that turned out.

No context needed when you know what the two cards are.

>2020
>AMD GPU
What, you still buy Intel too?

Attached: 1401580658634.jpg (522x610, 182.92K)