Display technologies

Why do manufacturers keep trying to improve a stale, cul-de-sac technology (IPS) when we know it's a dead end? I remember buying multiple IPS monitors (ASUS, Dell, and Samsung -- supposedly companies with rigorous QC) a few years ago; they invariably had glow, backlight bleed, and ghosting issues. However, I was recently impressed by a cheap AMVA+ monitor. This BenQ monitor has inky blacks and rich colors; it's _actually_ comparable to my CRT -- the first time I've ever said that about an LCD. It's bizarre that I keep reading that VA panels have color smearing and poor response times; they once did, but contemporary panels are excellent. Why does IPS/PLS continue to be the technology manufacturers invest in? They could have put that money into other promising (now dead) technologies, like SED, FED, plasma, and laser. Let's hope MicroLED and OLED hit the market soon.

Attached: Color.jpg (470x538, 80.24K)

Other urls found in this thread:

flatpanelshd.com/news.php?id=1465304750&subaction=showfull
tftcentral.co.uk/reviews/benq_gw2450hm.htm
wiki.archlinux.org/index.php/font_configuration#Pixel_alignment

MicroLED or quantum dots, please. OLED has those well-known durability issues -- same reason I wouldn't bother with plasma now, even though the picture quality is astonishing.

The concept of patents needs to fucking die for man to reach salvation.

Attached: angry emma with bow.PNG (924x850, 1.74M)

a.k.a. built-in obsolescence

Attached: 91040c5eb986a471ba085f08349dbeb0d69d17c36829b76895a71ff338e6ec1a.jpg (837x1024, 38.75K)

I wish 4:3 CRTs had been developed further.

Well, CRTs are RIP, LCDs are dogshit, and plasma burns out faster than LCD while being super expensive to boot. QLED at least has the same lifetime for all the pixels, so it won't have OLED's problem of blue subpixels inexplicably burning out significantly faster than all the others.

Not anymore, as of the latest chemistries that entered production around last year. OLED should now beat CRTs and fluorescently backlit LCDs in lifespan:
flatpanelshd.com/news.php?id=1465304750&subaction=showfull

Are retinal projectors a meme?

I have a BenQ GW2450HM (AMVA) with a 2500:1 static contrast ratio (measured by TFT Central: tftcentral.co.uk/reviews/benq_gw2450hm.htm) and it still looks washed out compared to my CRTs (though maybe the CRTs are just exaggerating the color -- I have no experience in color, so I couldn't say for sure). It also has terrible overdrive artifacts, or ghosting when you disable overdrive, and terrible persistence-of-vision artifacts. Have you ever played a 2D platformer on an LCD and a CRT side by side (or even watched a video of them)? The moment there is any movement on an LCD, the effective resolution drops to below 640x480.

This BenQ also has what looks like a translucent layer of dirt over the entire screen that's especially noticeable on light colors, like text on a white background. It's far worse than on most other LCDs I've seen. This layer is more disruptive than the small amount of noise caused by using an analog connection to a pre-DVI CRT. People say this is caused by the anti-glare coating, but I couldn't say for sure. It's definitely there on my other LCDs, just not as bad. It also takes 15 seconds to switch resolutions, multiple seconds to open the menu, and 1 second to switch between menu options using the arrow buttons. (About 10 more issues omitted to keep this post short.) It's interesting to note that I have about 20 LCDs from before 2007 and 6 CRTs, and none of them have the menu latency or huge mode-switching delays of this BenQ GW2450HM. I'd guess it's caused by HDCP and the other modern cancer they put in the display controller.

Blur caused by persistence of vision is so bad that pixel response times have not mattered for TN panels since around 2004. People who compare LCDs on a pixel-response basis are missing the point. Persistence of vision, which affects all LCDs (aside from ULMB, ELMB, etc., which nobody uses), causes a giant trail of blur, whereas the pixel response these guys are talking about causes maybe one frame of blur, or a smear that lasts half a frame. The reason you can't read scrolling text on any modern machine is that LCDs don't flash their backlights (also because they don't use vsync, but even then it would still be mostly readable).
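To put rough numbers on that (my own back-of-the-envelope sketch, not from this thread, using the common rule of thumb that perceived blur width ≈ eye-tracking speed × visible persistence; the speed and flash length below are assumptions):

# Back-of-the-envelope sketch with assumed numbers, not measurements:
# on a sample-and-hold LCD a frame stays lit for the whole frame time,
# so an eye tracking motion smears it by roughly
#   blur_px = speed_px_per_s * persistence_s

speed = 960  # px/s, a moderate pan/scroll speed

cases = [
    ("60 Hz sample-and-hold",         1 / 60),    # full-frame persistence
    ("144 Hz sample-and-hold",        1 / 144),
    ("strobed backlight, 2 ms flash", 0.002),
    ("extra smear from slow pixels",  0.5 / 60),  # ~half a frame, as above
]
for label, persistence in cases:
    print(f"{label:32s} ~{speed * persistence:4.1f} px of blur")

At 60 Hz the hold-time blur alone is ~16 px; no pixel-response improvement can get you under that, only a higher refresh rate or strobing can.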


b-b but muh patents are required for innovation!

That's strange -- I doubt they've made any huge strides since AMVA (and I could have just won the panel lottery), but the monitor I have is honestly excellent. I don't notice inverse ghosting when using overdrive; there is still some (tolerable) motion blur, but any LCD will have that. I run a dual-head setup (CRT + BenQ) and I tried the UFO test on both monitors concurrently -- the CRT is obviously superior, but with AMA (overdrive) set to Premium, the BenQ looks great. What I'm trying to say is that the blacks and motion clarity are impressive for an LCD. There are some things CRTs will always be useful for, and one of those is playing old games at low resolutions. Something else I notice is that you can crank the contrast on a CRT to max and _actually_ increase the contrast; with LCDs you just get more luminance. If I remember correctly, BenQ was spun off from Acer, so I'm guessing their QC is sloppy.

That's like saying milk or mice have built-in obsolescence because they spoil or die.

Haven't all of the manufacturers given up on plasma shit?

Attached: f95959832.jpg (255x231, 10.04K)

What I'm trying to say is that TN (not AMVA) LCDs already maxed out motion clarity in 2004. To get more clarity you need to raise the framerate or strobe the backlight; pixel response time no longer helps. But lol, even 144Hz LCDs don't have anywhere near the motion clarity of a CRT.

I suggest you turn on clone mode and play a 2D platformer with vsync (otherwise you will see double images and other odd effects on the CRT), and compare the LCD and CRT. You should see a huge difference.

I see. Backlight strobing is something I'm curious about; it's what those "blur reduction" settings on "gaming" monitors use, correct? However (from what I surmise), this setting doesn't actually ameliorate the motion blur; it just appears to reduce it. Let me know if I'm wrong.

Or it's an aftermarket mod you can install on some monitors. Check out Blur Busters for all the information on this subject you'd ever want to read.

And a reminder for people here to make sure their pixel layout is correctly set.

wiki.archlinux.org/index.php/font_configuration#Pixel_alignment

Attached: project480-rear-690x539.jpg (690x539, 241K)

Yes, backlight strobing is what all these new "motion blur reduction" features like ULMB and ELMB do.
It removes the motion blur caused by persistence of vision.
When an object is moving across the screen, the LCD will display it like this:

o
    o
        o

The object position is updated every 16.66 ms, but your eye moves more smoothly than that. You predict where the object should be between frames and track that with your eye, so your eye position (marked *) actually looks like this:

o*
o  *
o    *
o      *
        o

But the problem is that the object is still there as your eye moves to the right, so the object smears across your retina -- just as if you held out your hand and stared at it while someone farther away ran from your left to your right.

Strobed backlights just flash on briefly once per frame. So it looks like this:
o*
  *
    *
      *
        o

CRTs only have a few lines lit up at any time, which gives the same effect.

So really, there was never any blur in the first place; it just appeared that there was. And strobing undoes that apparent blur.
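A toy simulation of those diagrams (my own sketch with assumed speeds, nothing from the posts above): step the object once per frame, move the eye smoothly, and measure how far the lit object slides across the retina.

FRAME = 1 / 60  # s, one refresh at 60 Hz
SPEED = 600     # px/s, object and tracking eye both move at this speed

def retinal_smear(lit_time: float, steps: int = 1000) -> float:
    """Distance (px) the lit object sweeps across the retina in one frame."""
    positions = []
    for i in range(steps):
        t = i / steps * FRAME
        if t < lit_time:                 # object is currently emitting light
            object_px = 0.0              # held at its frame-start position
            eye_px = SPEED * t           # eye keeps tracking smoothly
            positions.append(object_px - eye_px)
    return max(positions) - min(positions)

print(f"sample-and-hold: ~{retinal_smear(FRAME):.1f} px smear per frame")
print(f"2 ms strobe:     ~{retinal_smear(0.002):.1f} px smear per frame")

Sample-and-hold smears the object by a full frame's worth of eye travel (~10 px here); a short strobe cuts that down to the flash duration, which is the whole trick.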

The only real blur is caused by slow pixel response, which can be captured by a still camera with a short exposure time, unlike the persistence-of-vision blur described above.
Also interesting: with strobed backlights, some LCDs wait until the pixel transitions are complete and then strobe the backlight, so pixel response time no longer matters (except maybe for the pixels at the bottom of the screen). However, this introduces lag, though I'm not sure it's perceptible. I've never tried a strobed-backlight LCD since the stores don't have them set up properly.

That looks amazing

I'm using a Samshit S5 with Lineage OS and the OLED panel on it still looks as crisp and bright as the day I opened the box. What are you worried about, burn-in? Set a screen saver or turn the thing off when you aren't using it.

Attached: 1512446263896.png (645x729, 61.58K)

Yeah, except the pixel response times aren't optimized for this, and you get latency from waiting until the end of the frame to display it (which is 16.66 ms, but maybe feels like 8 ms in practice), so maybe it's not so good.

Some of my family's phones: Samsung S5, Samsung Note 2, Motorola Nexus 6 -- and ALL OF THEM have noticeable burn-in and yellowing.

4K 240Hz when?

Not any time soon, since your average game can't even hold 60fps on low graphics.

(cont) Also, the only other thing with animation -- movies -- runs at 24/25 fps, and strobing can't help there, because frames are duplicated to bring them up to 60+ Hz and that causes double images and other artifacts. Panning images are one application strobing does help with, though: suddenly, when viewing satellite imagery, you no longer have to stop for a second every time you want to see some fine detail.
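A quick sketch of that duplication arithmetic (mine, not the poster's; the function and window length are made up for illustration). On a strobed display, every extra flash of the same source frame lands on a different spot of your tracking retina, so the repeat counts below translate directly into the number of ghost copies you see:

def duplication_pattern(content_fps: int, refresh_hz: int, seconds: float = 0.2):
    """Repeat count per source frame when low-fps content is shown at refresh_hz."""
    refreshes = int(refresh_hz * seconds)
    # Source frame index visible at each refresh instant.
    shown = [r * content_fps // refresh_hz for r in range(refreshes)]
    # How many consecutive refreshes re-flash each source frame.
    return [shown.count(f) for f in dict.fromkeys(shown)]

print(duplication_pattern(25, 60))   # [3, 2, 3, 2, 2]    -> double/triple images
print(duplication_pattern(30, 60))   # [2, 2, 2, 2, 2, 2] -> double images
print(duplication_pattern(30, 120))  # [4, 4, 4, 4, 4, 4] -> quadruple images

The 25 fps case is the ugly one: the ghost count alternates between two and three, which is why strobing looks so bad on frame-duplicated film.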

Reee

After 90 Hz, you hit diminishing returns quickly.

t. VR early adopter

It's not the hardware holding back fps in games, it's gamers. They buy flashy games and will accept 30fps, so devs pack in as much flash as they can until they hit 30. This is what happens when things become accessible to the lowest common denominator: they become the target market and their bad taste drags everything down.

A lot of gaymers these days demand 120Hz and the same framerate or higher because they've been marketed to. Then they see a benchmark where the game ran fine in the best-case scenario, when in reality the game runs slow as shit half the time. And the in-game FPS counters lie, or fail to capture that every second frame takes over 20 ms to render. They're also fine with a 20 Hz tick rate and with weapon switching and similar actions requiring a round trip to the server ("just get a 5ms connection bruh").

3600 Hz would be the best refresh rate for TV and movies.
Silent films were 18 fps.
Sound films were 24 fps.
PAL was 25 fps, 50 fields/s scanned.
NTSC was 30 fps, 60 fields/s scanned.
The Hobbit films, and possibly more in the future, were 48 fps.
Some shitty internet videos are 15 fps.
3600 is the LCM of all these numbers (1800 would cover everything except the 48 fps entry).
No more interpolation necessary between formats.
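The LCM claim is easy to sanity-check (a two-liner of my own; math.lcm needs Python 3.9+):

import math

# Frame rates listed above: internet video, silent film, sound film,
# PAL, NTSC, and HFR (The Hobbit), plus the 50/60 field rates.
rates = [15, 18, 24, 25, 30, 48, 50, 60]

print(math.lcm(*rates))                          # 3600
print(math.lcm(*(r for r in rates if r != 48)))  # 1800 if you drop 48 fps

Every rate on the list divides 3600 evenly, so each source frame maps to a whole number of refreshes and no interpolation is needed.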

ITT

When you view 30fps content at 60Hz with frame duplication, if there is no motion blur (which means strobing [LCD] or scanning [CRT] is used), you'll see double images. At 120Hz frames are quadrupled and you'll see quadruple images. Also, I don't think we'll see processors fast enough for really high framerates any time soon, especially for games.

WHY IS TERRY A. DAVIS'S DEATH BEING CENSORED?