Input latency

Want proof? Go disable SMT/HT in BIOS and enjoy better FPS and input response. Disable dyntick and you will get even better input response. On Windows 8/10 you just do bcdedit /set disabledynamictick yes in admin CMD and reboot. Windows 7 is safe from this "feature." On Linux I believe it's a kernel parameter so it's not so trivial to test.
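For reference (firmware and kernels vary, so treat this as a sketch): on Windows 8/10 in an admin CMD it's
bcdedit /set disabledynamictick yes
(and bcdedit /deletevalue disabledynamictick to undo it), then reboot. On Linux the equivalent is the nohz=off kernel boot parameter, e.g. appended to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then regenerate your grub config and reboot.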

>Complaining about 400 nanoseconds latency
Holy shit Intel shills are getting desperate.

Extremely based, I never thought to visualise it like that. Isn't it sad how much we throw away?

>>>/4chan/
>>>/reddit/

congratulations, you argued semantics and missed my point. any perceptible input lag is solely caused by slow software, not fucking #T

No, there is no such point. Vsync is an engineering problem, and this is why peons shouldn't be talking about it. The whole reason we have a "no vsync" option in the first place is fundamentally that the game failed to meet the 1/refresh-rate time constraint. So now you have a game that varies from 30fps to 300fps on a typical mid-range box, and it looks like utter shit because such large framerate variance causes judder.
There are no guarantees with vsync off. If you ran at 120FPS on 60Hz and these were exact frequencies (instead of the approximations you get in any real system), you would get two tears that stay perfectly still in the middle of the screen. If the frequencies are slightly off (which is what happens in practice), the tears slowly crawl down the screen. And even if the tears don't stay in one place (e.g. each frame has its tear in a completely random spot), it still looks like shit, even without judder.
Running without vsync causes other artifacts besides tearing: for example vertical compression, or hiding strobed animations (e.g. a sprite flashes between two colors, but because of "no vsync" it shows as just one color). And if you use a real monitor like a CRT or an LCD in strobed-backlight mode, then unless you have vsync you will get double images or worse (triple images, quad images, etc.).
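If you want to convince yourself of the stationary-tears claim, here's a toy calculation (made-up numbers, ignores real swapchain behaviour): the scanout sweeps top to bottom once per refresh, and a buffer flip tears at whatever scanline the sweep has reached at that instant.

REFRESH_HZ = 60.0
LINES = 1080

def tear_lines(fps, refreshes=4):
    # With vsync off, a flip happens every 1/fps seconds; the tear lands at the
    # scanline the sweep is on at that moment.
    out, t = [], 0.002            # small arbitrary offset so flips don't land exactly on vblank
    while t < refreshes / REFRESH_HZ:
        phase = (t * REFRESH_HZ) % 1.0
        out.append(int(phase * LINES))
        t += 1.0 / fps
    return out

print(tear_lines(120.0))   # exactly 2x the refresh: the same two scanlines every time
print(tear_lines(119.7))   # slightly off: the tear positions crawl down the screen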
Now that peons have memed their invalid claims about vsync for 20 years, we have the misconception that is VRR (gsync/freesync) baked into every monitor for no good reason.

are you the dumbest poster who ever stepped onto a shitty tech forum? CPUs don't cause input lag. If they do, it's because they merely perturbed a bug in your dogshit game. Do you know what "input lag" means in the context of this thread? We're talking milliseconds or higher. Humans can't feel microsecond-scale lag, and even if they could, no system remotely supports that, due to limitations of the screen and input devices.
even if an interrupt took 1ms to process, that would only mean adding 1ms of latency to a keyboard or mouse event.
no one cares about some Windows issue

and now someone is gonna chime in:

It actually does help with some variations of vsync, by giving the algorithm more frames to choose from. In some cases, a framerate of 3x the actual output frequency is recommended. Note there are also situations in which a framerate a fraction of an FPS above or below the output frequency is best.
The point of VRR is to shrink the granularity of the vsync window, below the usual vsync limitation of "an even divisor of the current refresh rate or bust". This eliminates the need to jump between 30, 60, and 120 FPS at 120Hz, instead seamlessly (lol) tracking the game's FPS as if vsync were off, but without tearing.
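A toy illustration of that quantization (idealized: constant render times, double buffering, no driver queueing), just to show the FPS ladder versus VRR tracking:

import math

REFRESH_HZ = 120.0
PERIOD = 1.0 / REFRESH_HZ

def fixed_vsync_fps(render_ms):
    # Double-buffered vsync on a fixed refresh: a frame that misses a vblank
    # waits for the next one, so the frame interval snaps up to a whole number
    # of refresh periods.
    interval = math.ceil((render_ms / 1000.0) / PERIOD) * PERIOD
    return 1.0 / interval

def vrr_fps(render_ms):
    # VRR (inside its supported range): the panel refreshes when the frame is ready.
    return 1000.0 / render_ms

for ms in (8.0, 9.5, 13.0, 17.0):
    print(ms, "ms ->", round(fixed_vsync_fps(ms)), "FPS fixed,", round(vrr_fps(ms), 1), "FPS VRR")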

There are also other advantages to VRR, such as sidestepping clunky and slow mode switching when watching video, giving flawless judder-free playback at the odd framerates a lot of content uses (e.g. 23.976, 29.97).
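To put numbers on the judder part, here's a rough cadence calculation for 23.976fps content on a plain fixed 60Hz panel (exact arithmetic via fractions; no real presentation API involved):

from fractions import Fraction

REFRESH = Fraction(60)            # fixed panel refresh (Hz)
CONTENT = Fraction(24000, 1001)   # 23.976fps film

def cadence(n_frames):
    # How many whole refreshes each film frame gets held for: the panel can only
    # swap on a vblank, so holds alternate between 3 and 2 refreshes (with
    # occasional drift), i.e. uneven on-screen durations = judder.
    holds, vblank = [], Fraction(0)
    for i in range(1, n_frames + 1):
        due = Fraction(i) / CONTENT            # when film frame i should take over
        n = 0
        while vblank < due:                    # hold until the first vblank past "due"
            vblank += Fraction(1) / REFRESH
            n += 1
        holds.append(n)
    return holds

print(cadence(12))                 # e.g. [3, 3, 2, 3, 2, ...]: 50ms and 33ms holds mixed
print(float(1000 / CONTENT))       # ~41.7ms: the constant per-frame time VRR could give you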


It has nothing to do with tearing; it merely reduces input lag. For this reason, it can also be combined with vsync.
Input lag isn't caused by vsync, but by double buffering. Vsync using correctly implemented triple buffering eliminates such latency:
anandtech.com/show/2794
Maybe there are some scenes with little motion, where higher detail is more important, along with other scenes where rapid motion and player response matter more than detail? Or perhaps the game is simply flaky, and hardware/drivers/OS/game engine that handle variable FPS elegantly minimize the impact of such bugs.

see


"b-but windows doesn't matter :(" And neither do you.

Having such a high variance of framerate precludes using the monitor in a strobed mode, which means you get motion blur, which means your precious 4K monitor is indistinguishable from 480p whenever the image is moving at all. If VRR worked with strobed modes, the usable refresh rate variance would be something like 4Hz (rough head calculation), not 100Hz. Also worth noting: with VRR you still need to cap the game's framerate below the monitor's max refresh rate or you get input lag.
The supposed benefit of being able to drop to 59FPS on a 60Hz monitor without quantizing to 30FPS sounds like bullshit. What I'd guess really happens most of the time is that the framerate drops to 40 or some shit (and then double-buffered vsync quantizes that to 30FPS, which is slightly worse), and the user throws his hands up: "OH NOES ITS THE DOUBLE BUFFER I READ ABOUT ON ANANDTECH!". I had an explanation of why VRR doesn't solve input lag any better than beam racing, but I forget it; I don't really think about VRR much since it has no appeal.
well, beam racing solves tearing, since the whole point is to do it in sync with the vsync signal. it's also the only way to get all 3: zero input lag, synchronized video, zero motion blur
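For anyone who hasn't seen it, a very rough sketch of what beam racing looks like (current_scanline, render_slice, and latest_input are hypothetical stand-ins for whatever your driver/engine actually exposes; this is the idea, not a working implementation):

import time

LINES = 1080
SLICES = 8
SLICE_H = LINES // SLICES

def race_one_refresh(current_scanline, render_slice, latest_input):
    # Chase the raster: render each horizontal slice of the front buffer just
    # before scanout reaches it, using the freshest input available.
    # Call this right at vblank so slice 0 goes out while the beam is still in
    # the blanking interval.
    for s in range(SLICES):
        slice_top = s * SLICE_H
        # Wait until the beam has entered the previous slice, leaving roughly
        # one slice of headroom before it reaches ours.
        while current_scanline() < slice_top - SLICE_H:
            time.sleep(0)   # yield; a real implementation would spin or use a raster interrupt
        render_slice(slice_top, SLICE_H, latest_input())
    # No tearing as long as every slice finishes before the beam catches it, and
    # input-to-photon latency is bounded by about one slice of scanout time plus
    # that slice's render time.

Finer slices mean fresher input but tighter render deadlines.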
Well, input lag is caused by "double buffered vsync" and by all forms of "triple buffered vsync". If we wanted to be correct, we'd only use the word "vsync" when talking about a pin on the VGA cable (and the corresponding signals that DVI, HDMI, and DP all have). Unfortunately these shitty hacks never got better names.
>anandtech.com/show/2794
Absolutely false. But that's a common meme thanks to shitty authorities like Anandtech. Triple buffered vsync can either be FIFO or the "proper" way that article refers to (IIRC). FIFO just means you get even more lag than double buffered vsync. "Proper" triple buffering just means you render new frames based on the most recent inputs as often as possible (100% resource utilization), and when the GPU is about to send a new frame to the monitor, it selects the newest of those rendered frames. I did the math before, and even at 300FPS (the rate of rendering new frames, each of which the GPU may or may not display) on 60Hz you still get something like half the lag of double buffered vsync.
Also, "proper" triple buffered vsync gives you judder, so it's still not a proper synchronization method. This isn't a tiny issue; it's just as bad as tearing. On a CRT or strobed-backlight LCD, judder causes double (or triple, or worse) images, and even on a normal LCD it still looks bad.
The entire logic of "proper" triple buffered vsync is: "Oh, we can't guarantee a solid 16.66ms render time, so we'll render frames as fast as possible (in the range of 300FPS to 1000FPS, still dropping to 20ms once in a while) to try to have a frame that just happens to have been rendered a tiny bit before the vsync signal arrives."
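To put rough numbers on that (a crude model: input is sampled when a frame starts rendering, latency is measured to the vblank where that frame goes out, scanout/display/input-device delays are ignored, and the 300FPS case gets a small arbitrary phase offset so the two clocks aren't perfectly locked):

REFRESH = 1.0 / 60.0

def double_buffered_latency_ms():
    # Swap blocks until vblank, the game samples input right after the previous
    # vblank, and the result is shown at the next one: input is a full refresh old.
    return REFRESH * 1000.0

def mailbox_latency_ms(render_fps, phase=0.0012, vblanks=2000):
    # "Proper" triple buffering: render back-to-back at render_fps; at each
    # vblank the GPU shows the newest frame that has already finished.
    ft = 1.0 / render_fps
    total = 0.0
    for v in range(1, vblanks + 1):
        t = v * REFRESH
        done = int((t - phase) / ft)            # frames fully rendered before this vblank
        newest_start = phase + (done - 1) * ft  # when that frame's input was sampled
        total += t - newest_start
    return total / vblanks * 1000.0

print(double_buffered_latency_ms())     # ~16.7ms
print(mailbox_latency_ms(300.0))        # a few ms: rendering at 300FPS buys most of it back

Under these assumptions it comes out closer to a third than half of the double-buffered figure, but the point stands either way: you burn roughly 5x the GPU work to claw back some latency, and the judder problem is untouched.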
Not in any game where the camera is tied to the mouse.
you won't see any of that detail since you're using VRR, which means motion blur. if the camera moves in the slightest, the quality drops to 480p. modern LCD applications can only display static images at high resolution, nothing else
you're right, VRR exists to minimize the impact of shitty software, and to attach an extra $200 fee to new meme monitors lol

BTW, this means that by using "proper" triple buffered vsync, you're admitting that you will still sometimes take longer than a frame (20ms vs 16.66ms) to render, so you will still get input lag sometimes. Otherwise, you'd be using beam racing (since it does everything 100% perfectly if you can guarantee you'll meet the framerate). A similar story applies to VRR, IIRC.