Graphics card

I'm using a GTX 460. I've been wanting to get a new graphics card for a while, but the prices were fucking outrageous.

The 460 is a great card, but it only has 768 MB of VRAM.

What Nvidia graphics card should I get?
A 1080 Ti or a 2070?

AMD is trash and doesn't even compare, so I never even bother.

Attached: header_productshot1.png (551x346, 146.27K)

use the sticky

If you buy Nvidia, you'll be stuck with their weird proprietary drivers and miss out on neat shit like Wayland and kernel modesetting (really useful for RetroArch).

Attached: 81d90389ee5f16185423b7920638b62eb2b43f9ffe30674a6bd3b8e1f6eda373.jpg (800x600, 35.63K)

1/10
you tried.

What's the point of tech if you can't even discuss tech?


I'm asking for real.

None; get an AMD card.

read the rules or go back to fucking /g/

FreeSync too.

webm related, it's you, that's what you sound like


Mostly depends on what you wanna do.
Some around here lament having bought something powerful, only to end up playing low spec indie titles and emulators.

I'm on an AMD 7970 here and play mostly emulators recently.
Breath of the Wild ran at 1080p with 60 FPS locked, Persona 5 runs at 30 FPS locked (the 60 FPS patch breaks stuff).
Nier: Automata, after that fan patch came out, ran in the 40-50s maxed out without AA.

So really, I don't see the point of upgrading. Invest in a FreeSync monitor and those 40s to 50s suddenly feel smooth.

A 1070 Ti with a discount; get a proper model that doesn't suck.
Or if you're budget-oriented, get one of those RX 480s/RX 580s at a discount for midrange.
Ray tracing will be a meme for at least a decade before it's worth it.

no u

The AMD = trash meme is FUD. Don't believe benchmarks too much. Buy Polaris-based AMD. Also, install Gentoo.

They won't.
Everything lower than a FreeSync'd 100–144 Hz is a fucking slideshow.

Forgot to add: on a 144 Hz monitor, but that should be obvious anyway.

I OC'd my monitor from 60 to 78 Hz.
The improvement in smoothness is awesome. CS:GO and arena shooters feel great at those high refresh rates.

But those numbers are not easily achievable with the sorts of titles I enjoy.
My CPU won't push BOTW above 70 FPS, for instance.
I won't deny the awesomeness of HFR gaming, but it's often unachievable unless you've got quite a lot of cash to throw at it.

Don't. Nvidia is proprietary and therefore evil.
They have extremely scummy marketing schemes that constantly push against fair competition.

You're right! You should get the latest Intel i9 and four 1080 Tis in SLI mode, spare no expense! You deserve the best!

Cervantes was right. People shouldn't be educated - once they start reading, they will think they know shit about anything, when in fact they're just conceited fools.
That's you, OP.

what are you, poor?

...

I hope this answers your question...

Attached: nvidia fuck you_1.webm (720x480, 11.17M)

Get a used RX 460. More than enough for most games at 1080p. It also has the open-source Radeon drivers and Vulkan.

How the fuck does KMS not work with Nvidia hardware? Or are you making that up? Just go into their proprietary driver, find where it does modesetting, and copy it; it shouldn't be that hard.

LCD degenerates pls go

Hmm, this is a good point. Most LCDs wouldn't let you go down to 40 Hz or even 50 Hz before VRR came out; running 50 FPS on a 60 Hz display will indeed be choppier than a real 50 FPS display.
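The choppiness falls straight out of the arithmetic: on a fixed-refresh vsync'd display, each frame stays on screen for a whole number of refresh intervals, so 50 FPS content on 60 Hz can't pace evenly. A minimal sketch (plain Python, written for illustration, not from any real driver):

```python
import math

# Frame pacing of constant-FPS content on a fixed-refresh, vsync'd display.
# Each frame stays on screen until the first refresh at or after the next
# frame is ready, so its on-screen time is a multiple of the refresh period.

def present_times(fps, hz, frames=10):
    """On-screen duration (ms) of each of the first `frames` frames."""
    interval = 1000 / hz              # one refresh period in ms
    durations, last_flip = [], 0.0
    for i in range(1, frames + 1):
        ready = i * 1000 / fps        # when frame i finishes rendering
        # displayed at the first vsync at or after `ready`
        flip = math.ceil(ready / interval - 1e-9) * interval
        durations.append(flip - last_flip)
        last_flip = flip
    return durations

# 50 FPS on 60 Hz: every fifth frame lingers for two refresh intervals.
print([round(d, 1) for d in present_times(50, 60)])
# -> [33.3, 16.7, 16.7, 16.7, 16.7, 33.3, 16.7, 16.7, 16.7, 16.7]

# 50 FPS on a display actually refreshing at 50 Hz: perfectly even 20 ms.
print([round(d, 1) for d in present_times(50, 50)])
```

That recurring 33 ms hitch is the "choppiness"; VRR avoids it by letting the monitor refresh exactly when the frame is ready.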

Fun fact: 60 Hz on a sample-and-hold LCD (what 99.99999999% of you gaymers have) looks choppy compared to 60 Hz on a CRT (and my CRTs go up to 160 Hz). Not to mention the measly 3000:1 contrast ratio (a VA figure, which you won't get anywhere near on your TN gaymer screen) looks like shit compared to any CRT whatsoever.

It's not overclocking. Your GPU generates a video signal and the monitor synchronizes to it; if you go outside what the monitor can do, it will simply display a black screen saying "out of sync". Also, almost every LCD ever made supports 75 Hz officially.

When a monitor says "60 Hz", the actual signal could be 60.001Hz, 60.002Hz, 60.003Hz,...60.050Hz (with respect to the monitor's clock), and the monitor would not give a shit, as long as the timings are in the range of what it supports. CRTs and LCDs need to support a large range of timings because the GPU's clock is not perfectly accurate and can't generate timings at exactly 60.00000000000000000000000001 Hz in atomic-clock time. So if an LCD officially supports only 60 Hz, you could probably go up to 62 Hz or who knows, and a few Hz down as well.

LCD manufacturers didn't even start marketing high refresh rates until 20 years later, when they saw a bunch of idiots on forums comparing each other's refresh rates. Before that, CRTs only went up to 85 Hz at recommended resolutions because that's what was needed to eliminate flicker (and as a side effect you can get rates like 160 Hz at 800x600 on a CRT that's meant to run 60 Hz at 1280x1024).
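To put numbers on the "a '60 Hz' signal is never exactly 60 Hz" point: the refresh rate is just the pixel clock divided by the total (visible plus blanking) pixels per frame. A small sketch; the 1080p totals below are the standard CEA-861 ones, and the clock offsets are illustrative assumptions:

```python
# Refresh rate of a video mode = pixel clock / (htotal * vtotal),
# where htotal/vtotal count visible pixels PLUS blanking.

def refresh_hz(pixel_clock_hz, htotal, vtotal):
    return pixel_clock_hz / (htotal * vtotal)

# Standard 1920x1080 timings: 2200 x 1125 total pixels per frame.
print(refresh_hz(148_500_000, 2200, 1125))   # 60.0 on paper
print(refresh_hz(148_351_648, 2200, 1125))   # ~59.94, the NTSC-friendly clock
# A GPU clock that runs 0.05% fast shifts the rate by the same 0.05%:
print(refresh_hz(148_500_000 * 1.0005, 2200, 1125))  # ~60.03
```

The monitor just locks onto whatever rate those timings produce, which is why small deviations around the nominal figure are normal and tolerated.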

hey dickhead! buy AMD mann! CUDA will fuck u up with spyware - u'd wish u were never born & streamed ur sis taking a shower using opengl accelerated codecs!!! :D