Apple - FULL ON GREED MODE

See this piece of shit that got unveiled at WWDC? The monitor next to the new Mac Pro cheesegrater.

That monitor costs $4999. It's impressive, to be sure. 6K and all.

That monitor comes without a fucking stand.

THE STAND COSTS $999 EXTRA

youtube.com/watch?v=PPiEpSMkzoo (Linus TT)

Attached: fuck you.jpg (720x480, 48.75K)

Other urls found in this thread:

youtube.com/watch?v=Bf1GjCaYzYg
bhphotovideo.com/c/product/1015238-REG/sony_pvma250_25_professional_oled.html
mafiadoc.com/electrofluidic-displays-university-of-cincinnati_59e70dd51723ddacaa177628.html
proav.co.uk/sony-bvm-x300-30-inch-v-2-0-4k-oled-master-monitor
rtings.com/tv/reviews/lg/c9-oled
youtube.com/watch?v=Vhh_GeBPOhs
rtings.com/tv/tests/inputs/input-lag
rtings.com/monitor/tests/inputs/input-lag
ignisinnovation.com/technology/
narcoticnews.com/drug-prices/cocaine/
pubs.acs.org/doi/10.1021/acsnano.8b07938
twitter.com/AnonBabble

Attached: 1558634861236-0.jpg (400x400, 54.13K)

Mildly unrelated, but I'll pay fucking $10000 if I can have a low-latency 2560x1440 monitor with deep blacks, accurate colors throughout the range, and no fucking seams in the middle of the screen that become obvious on low-contrast backgrounds. It literally does not exist, yet I stare at a monitor 90% of my waking time.

But that's not all, because I'll add another $5000 on top if the monitor is also 120Hz+ and doesn't compromise on the other things. I want to kill myself just for the off chance that I'll reincarnate into an alternate universe where SED technology didn't get patent-trolled to death.

Attached: 4k40hma3.png (275x277, 37.11K)

Just wait for MicroLED, or whatever comes next, since OLED's burn-in is really killing it. I too hope for a god-tier display; slightly-off color accuracy isn't actually a huge deal as long as you can correct it with a colorimeter, which you should be using anyway. I heard e-ink was too expensive to improve upon, what a shame really.

The stand is ridiculous, but the screen itself, when you compare it to similarly spec'd, factory-calibrated, high-color-accuracy displays used for high-end imaging and shit, is actually priced very reasonably. Apple isn't targeting this product at regular content creators or gamers; Apple's target audience here is the high-end content creation industry, and most importantly it's meant as a reference display for applications that need a high degree of color accuracy and high resolution. It's not made for you.

Attached: 97ddda5c31f72e9137769e79ab35a2dc36d9037d80a54d394132e7d781381a3a.jpg (480x621, 22.6K)

Fuck, I mean content consumers.

Gamers are consumers. HAHA. Lifeless bandwagoners

They could've achieved the same thing by increasing the base price by $1k and making it a BTO option to remove the stand for -$1k. The price would still have been ridiculous but you wouldn't have the insane situation where you're buying a display that is unusable out of the box.

Jobs may have been a delusional hippie but at least he had fucking common sense about this sort of thing.

E-Ink is final.
It has really high latency and only two colors, but it swaps them physically, so the image remains even when the device is off.
The only thing that costs energy is updating the display. It's totally unfit for anything but reading.
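To make that concrete, here's a toy model (made-up numbers, not any real e-ink driver API) of why a bistable panel only costs energy when it updates:

```python
# Toy model of a bistable (e-ink-style) panel: pixels hold their state
# physically, so energy is only spent flipping pixels during an update.
# The constant and frame representation are invented for illustration.

ENERGY_PER_FLIP_UJ = 0.5  # assumed energy to flip one pixel, in microjoules

def update_cost(old_frame, new_frame):
    """Energy for one refresh: proportional to the number of pixels that change."""
    flips = sum(1 for a, b in zip(old_frame, new_frame) if a != b)
    return flips * ENERGY_PER_FLIP_UJ

page = [0] * 1000
print(update_cost(page, page))                   # 0.0 -- a static page costs nothing to keep on screen
print(update_cost(page, [1 - p for p in page]))  # 500.0 -- every change is paid for, so video is a poor fit
```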

Oh shit, the high-quality craftsmanship of Chinese children that you can only get from Apple! And it's only $10000, I have to buy it!!!!!!111

I'm pretty sure you'll be able to get third-party monitor stands for that exact monitor for less than $100.
(checked)
I have a 1440p 144Hz IPS monitor and it's pretty comfy.
Good luck waiting till your eyes get old and don't see the difference anymore.

You can't because it doesn't come with a VESA adapter and uses a proprietary magnetic connection. You would need to buy the $200 adapter from Apple and then a third party stand too. And that's beside the point.

this

After owning several of them I've decided to never touch one ever again. Every single one I've used has had either gray blacks or eye-searing dark colors due to some kind of IPS light bleed. Watching a dark movie or playing a dark video game, holy fuck, my eyes hurt trying to look at it.

I have one with a VA panel and I stopped using it because, although it has incredibly high contrast and deep blacks, basically anything from 20% gray and below has complete shit color accuracy, and very dark grays are anywhere between green and brown depending on what they feel like being at any given time; gradients especially have disgustingly obvious jumps from one color to another. That's after I fixed the monitor settings, which just clipped all those blacks into pure black by default, and calibrated it with a Spyder.

It has very low latency, which combined with the refresh rate makes it feel very nice to use. Going back to my current color-balance-focused 60Hz monitor felt like the framerate tanked to 30fps due to some heavy botnet running, and the mouse feels like it's dragging behind by like 1/10 of a second. But that's just the difference in feel between a higher refresh rate with low latency and without.

I've seen exactly one normal LCD monitor in my lifetime that actually had dark blacks and that was on one of my old laptops. It had alright colors too, not sure what was up with that display.

I can't wait. The absolute shit state of computer monitors is so fucking obvious to me but nobody else seems to think so.

That has never stopped China.

Hey, I think so too.
That's a tradeoff. I like it a lot better than my old TN. And I always turn the brightness down as far as is acceptable.
For my IPS I set it to 20%. Light -> energy -> burden on the eyes.
Yes, it would be better if dark spots emitted less light and bright ones more, but to me it's the best I can get.

I have an OLED with perfect blacks, low latency, and a really beautiful picture. Also it'll do 4K @ 120Hz. It's a 65" TV though.

GET OUT!

Ever hear of electrowetting? It gets you paper-like displays (think Kindle DX), except they're fast and in color. I spend all day reading shit in front of a computer or coding and it's fucking with my eyesight. I want a paper-like display, one that reflects ambient light instead of emitting it. Fucking patent trolls and market cartels.

youtube.com/watch?v=Bf1GjCaYzYg

Attached: maxresdefault[3].jpg (1280x720, 89.46K)

Knock yourself out:
bhphotovideo.com/c/product/1015238-REG/sony_pvma250_25_professional_oled.html


There are also a number of other e-paper technologies capable of much higher framerates and lower latency, such as the University of Cincinnati's electrofluidic displays, whose framerate is largely dependent on pixel size due to their hydraulic nature; they're capable of well over 30 FPS, probably well over a hundred:
mafiadoc.com/electrofluidic-displays-university-of-cincinnati_59e70dd51723ddacaa177628.html
Also, of course, there are a variety of full-color e-ink technologies.


>TV
Yeah, the killer problem with consumer OLED, much like plasma before it, is its lag-free kilohertz-FPS-capable panel being tied to a junky consumer electronics LCD TV support chipset that's not even on par with PC gaymur LCDs, let alone the DACs that drove CRTs.

Attached: kom.jpg (176x184, 40.66K)

Oops, sorry:
proav.co.uk/sony-bvm-x300-30-inch-v-2-0-4k-oled-master-monitor

The 2019 LG models come with a high-end chipset, and as a result have some of the lowest latency of any modern TV, beating a lot of monitors too. Still not as good as CRT.

...

I'm willing to pay a lot, but I have my limits as well.

rtings.com/tv/reviews/lg/c9-oled

Does anyone really feel like they gain anything after 720p?

LMAO, because Apple fanboys will actually pay this for something you could make in shop class. Just make it from 2 pieces of aluminum of a suitable thickness, fillet-welded with a TIG welder, and drill the three holes for the mounting screws. Even if you have zero skills to do this, your local machine shop would make it for you for $200.

I wouldn't recommend anything lower than 1080p. I had a 900p monitor for years. It was shit.

We'll have LEFET, so cry me a river.

People who do more meaningful things than tinkering in a terminal screen.

Like what, editing your shitty gameplay to post on youtube? Or do you work for a magazine agency and feel highly important developing the covers?

No, Apple needs to cater to developers too and not just Hollywood. Apple is falling behind on ML because they can't offer their fanboys a tower that can be loaded up with Nvidia GPUs. All the ML code is being developed on Linux, and if developers are using Linux they are not using Apple.

Apple only needs to cater to Apple developers, i.e. iOS devs. ML is doing just fine on their own platforms with their own dedicated ML hardware. For Apple that's fine. Developers will always buy the latest Mac Pros to develop for the newest iShits.

This is the correct answer
youtube.com/watch?v=Vhh_GeBPOhs

How much coke do you think he did before going on stage that night?

Or µLED, or laser projectors, or QD...

Nothing will change until LCD is put in the only place it ever belonged

Attached: only trash.jpg (499x497, 82.1K)

LCDs have no set lifespan, unlike most of the display technologies that want to replace them. Even if the backlight dies, an LCD screen can still be used forever. As long as this is the case, LCD will still reign supreme.

Ok, you don't know what LEFET is
Ok, you don't know what you're talking about.

µLED has an identical lifespan to LCD, because LCD backlights use the exact same decades-old inorganic LED technology. Lasers, including the latest and most compact VCSELs, are also proven to have lifespans vastly exceeding any LCD backlight. QD, when working devices are invented, should have a roughly identical lifespan to silicon LED, because it is based on the same technology. Practically nobody ever replaces the backlight on an LCD, so the lifespan of the panel is irrelevant.

Even OLED's "to half brightness" lifespan for its infamous blue subpixels is now over 100k hours, vastly exceeding any realistic length of ownership for a monitor at 8 hours/day, so the continued burn-in problems with OLED come down entirely to the lack of per-subpixel wear leveling in the controller.
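Quick sanity check of that math, taking the 100k-hour figure and 8 hours/day at face value (illustrative Python, figures as claimed in this post):

```python
# Back-of-the-envelope check of the lifespan claim above.
half_brightness_hours = 100_000   # claimed blue-subpixel half-brightness life
hours_per_day = 8                 # monitor duty cycle assumed in the post

years = half_brightness_hours / (hours_per_day * 365)
print(f"~{years:.0f} years to half brightness")   # ~34 years
```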


>LEFET
Has identical emission and decay characteristics to LED, because the substrate and operational principles used are identical. The only difference is its more compact layout.
>LCD
Was always trash. Every competing technology (FED/SED, LED/LEP, PDP, eInk, etc.) started out technically ahead of LCD across the board, but as R&D resources were repeatedly redirected to incremental improvements for LCD instead of investment in fundamentally superior technologies, that neglect allowed LCD to pull ahead.

What is the issue here and how are digital display drivers in any way comparable to analog CRTs?

What if I don't want display controllers doing wear-leveling nonsense behind my back? What if I just want a display to just display shit?

After using four OLED phones, all hand-me-downs (I'm a poorfag), I can tell you all of them had noticeable yellowing. This sort of shitty durability is the reason why I would never buy an OLED monitor.

The chipsets that connect a digital display's HDMI/DP/DVI port to its panel tend to be far, far laggier in TVs than PC monitors. This becomes especially galling when comparing a $100 PC monitor with a $5000 TV. For instance, compare these numbers:
rtings.com/tv/tests/inputs/input-lag
With these numbers:
rtings.com/monitor/tests/inputs/input-lag
And you'll notice that most TVs have minimum input lag in the 30ms range, outside game mode rising to over 120ms, and even the "best" TVs suffering from over 10ms latency. Whereas even the worst PC monitor in the list is barely over 40ms, and the best gayman monitors are under 6ms. All at 60Hz.

This is exacerbated by the fact that TVs are basically never capable of refresh rates higher than 60Hz, imposing even greater input lag.
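To put those numbers in perspective, here's a rough conversion of the lag figures above into frames of delay at 60Hz (illustrative Python, figures taken straight from this post):

```python
# Express the input-lag figures above as frames of delay at 60 Hz.
frame_time_ms = 1000 / 60   # ~16.7 ms per frame at 60 Hz

for label, lag_ms in [("typical TV, game mode", 30),
                      ("typical TV, outside game mode", 120),
                      ("best TVs", 10),
                      ("worst PC monitor listed", 40),
                      ("best gaming monitors", 6)]:
    print(f"{label}: {lag_ms / frame_time_ms:.1f} frames behind")
```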

Because LCD (and before it CRT) monopolized the PC market, all other technologies have tended to exist entirely in the TV market, and as such have been shackled with hand-me-down chipsets intended for LCD TVs, even when (as in the case of OLED, plasma, and DLP video projectors) the underlying technology is capable of far faster pixel switching than LCD, which would be excellent for low-latency, high-FPS PC applications.

Because in order to go from the digital information generated by a computer to the analog levels needed to drive the pixels in a flat panel, a DAC very similar to those which drove CRTs is used. CRT DACs exhibited pixel lag in the nanosecond range, which indicates that the chipsets designed primarily for LCDs are simply held to much looser engineering constraints.

The type of technology I'm talking about is not used in any consumer OLED display products I'm aware of. That is: continually recording the wear of individual subpixels (in addition to an initial calibration for manufacturing defects), permanently keeping this information in accumulators throughout the panel's lifespan, and compensating the brightness of every subpixel in every frame against this information in realtime.
There is no reason such a system would have to add latency or otherwise be externally noticeable to the end user, unlike image orbiting or whatever.

Here's a good example:
ignisinnovation.com/technology/
Notice that it completely eliminates factory unevenness, TFT aging issues, and (given sufficient OLED lifespan headroom) burn-in. It's really pretty pathetic that something like this didn't become standard issue back when plasma was on the market.
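For anyone wondering what that would actually look like in a controller, here's a minimal sketch of the idea; this is not Ignis's implementation, and the linear decay model and the constant are made up purely for illustration:

```python
# Toy sketch of per-subpixel wear compensation as described above: keep a
# persistent wear accumulator for every subpixel and boost its drive level
# each frame to cancel the luminance it has lost.

LOSS_PER_UNIT_WEAR = 1e-9   # assumed fractional luminance loss per unit of accumulated wear

class WearCompensator:
    def __init__(self, num_subpixels):
        # Persistent accumulators, one per subpixel, kept for the panel's lifetime.
        self.wear = [0.0] * num_subpixels

    def process_frame(self, drive_levels):
        """Scale each subpixel's drive (0..1) so output matches an unworn panel."""
        out = []
        for i, level in enumerate(drive_levels):
            remaining = 1.0 - self.wear[i] * LOSS_PER_UNIT_WEAR
            out.append(min(1.0, level / remaining))  # compensate, up to available headroom
            self.wear[i] += level                    # brighter pixels age faster
        return out

# Usage: a static logo ages its subpixels faster, but the controller quietly
# raises their drive so no burn-in is visible until the headroom runs out.
panel = WearCompensator(num_subpixels=4)
for _ in range(100_000):
    compensated = panel.process_frame([1.0, 1.0, 0.1, 0.1])
```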

Attached: max-life-3.jpg (1280x720 76.19 KB, 5.07M)

Planned obsolescence.

About as much as you can buy for the price of an official Apple™ monitor stand.

About 1.2oz
narcoticnews.com/drug-prices/cocaine/

Seems correct for a guy of Ballmer's size who's probably built up a tolerance all the way from the DOS dominance days.

You don't know what you're talking about, stop trying to sound smart.

Explain to me how this represents radical changes in operational principle

Attached: nverpix_white_186392647.jpg (640x360, 31.1K)

First, it's not organic. Second, it's not dependent on diodes; the mechanics of a diode and a transistor are worlds apart. Third, it uses QD technology. Fourth, it doesn't even need an LCD. Fifth, it uses a vertical architecture completely different from what we have today. Sixth, it relies on metals, not carbon chains. Seventh, it uses SSG. Eighth, go read scientific papers, not bullshit journo articles or Wikipedia.
I recommend ACS Nano.
pubs.acs.org/doi/10.1021/acsnano.8b07938

I just saw your image now. God, you're more of an idiot than I thought.
Do you really believe that this nVerpix device is a LEFET? Are you high on drugs or something? That's an OLET.

>it's not organic
Never said it necessarily was otherwise. Just like LEDs, there are a variety of organic and inorganic designs competing.
>it uses a vertical architecture completely different from what we have today
That is in fact the entire thing.
>it uses QD technology
That's not emissive, it's just using QD as a filter, like any "quantum TV" you'd see in a big box store. True emissive QD is pure SF at the moment; nobody has managed to concoct a substance that emits visible-spectrum light when driven electrically, let alone lab test a complete working device as simple as a single monochrome pixel.
>it doesn't even need an LCD
None of the technologies we're discussing do.

Look, I agree LETs would be a noticeable bump up over LEDs in performance, but they're far from the only variation on LEDs, or even the most radical.

I'm not sure, even 480p feels like too much.

Ray tracing is only usable at 1080p, so 1080p is probably good enough.