Do we need a higher resolution than 640x480? Why? What is the ideal resolution?

Attached: 44e26525b09cf5ba43ae9d0a859c8d130cff533abf01fa9e32b15939eab9a699-tech.jpg (1803x1335, 205.85K)

I think we need to go back to /g/ my friend XD come follow me I'm going there right now.

...

>>>/g/

Not enough room for all of my huge 32*32 icons with just 640*480 pixels of space.

The /g/ay community

Higher resolution - or more precisely, higher pixel density - is very important for making reading on a screen comfortable, and for image editing.

Attached: Pixel-Density-Comparison-of-Displays-with-Different-Resolution.gif (1079x1600 211.33 KB, 699.1K)

Is there more of that comic?

Why is Chrome-chan so fucking damn sexy mmm

She looks good and performs well, but she's got a habit of spying and reporting everything you do back to the advertisers.

One that matches or exceeds what the human eye is able to resolve.
This has to be balanced of course by what available technology is able to provide while also not crippling performance, using too much power or becoming too expensive.

Attached: Bueno goblin.png (769x329, 272.22K)

I was talking about drawing rather than the actual browser desu. I use Ungoogled Chromium usually.

1080p for laptops, desktops, TVs. 720p for phones.

baka
ungoogled Chromium is usually behind IceCat in terms of security patches

720p for screens of 5 inches or less, 1080p (more if wider than 16:9) from 5.1 inches to 21 inches, then 2k (more if wider than 16:9) from 21.1 inches up to 69 inches, and to finish, 4k for really big useless shit.

Is he trying to fondle Chrome-chan's boobs and ass? Isn't that sexual harassment?

It's not sexual molestation. He's trying to rape Chrome-chan.

1024x768 is pretty decent. it could be higher though. stuff way past 1080p is probably placebo. hard to tell since all media you can get is compressed to shit. pretty much all existing 1080p media is so badly compressed that even on a 1080p or lower screen, a 4k video looks noticeably better

americuck

Why does pixel density matter for readability? Does it matter for books? CRTs blurred text and no one ever complained (they complained about vibrating and other problems with low-quality CRTs, and about flicker - a configuration problem). Only after the switch to LCD, when text looked retarded, did people start complaining that it looked bad (possibly not even a result of pixel density but of the fill factor instead), so everyone started memeing ClearType. Now what you have is just a bunch of fonts with artificial blurring. Also, the color fringing you see on lots of LCD setups from broken subpixel rendering is very unreadable.
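For what it's worth, the idea behind ClearType/FreeType-style subpixel rendering is mechanical enough to sketch. This is a minimal illustration assuming an RGB-striped panel; the "glyph" and the function are made up for the demo, not anyone's actual rasterizer:

```python
# Sketch of RGB subpixel rendering: rasterize at 3x horizontal resolution,
# then map each triple of coverage samples onto the R, G, B stripes of one
# output pixel. Real implementations also low-pass filter the triples to
# trade sharpness against the color fringing complained about above.
import numpy as np

def subpixel_render(coverage_3x: np.ndarray) -> np.ndarray:
    """coverage_3x: (h, 3*w) array of 0..1 coverage at triple horizontal res.
    Returns an (h, w, 3) image where each channel is driven by the subpixel
    that physically sits under it on an RGB-striped LCD."""
    h, w3 = coverage_3x.shape
    assert w3 % 3 == 0
    rgb = coverage_3x.reshape(h, w3 // 3, 3)
    # Dark text on a white background: full coverage turns the subpixel off.
    return 1.0 - rgb

# A crude vertical stem, one "subpixel" (1/3 pixel) wide:
glyph = np.zeros((4, 9))
glyph[:, 4] = 1.0
print(subpixel_render(glyph)[0])  # middle pixel loses only its green stripe
```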

1080p

blurred fonts are a linux thing

The higher the pixel density, the sharper the transition between the font and the background color. This sharp transition makes it much easier to recognize the shape of each letter, which means the brain doesn't have to invest so much energy into interpreting the words. That's the reason for increasing the pixel density of the screen.

Yes, that's why we use text printing techniques that vastly surpass the quality of image printing techniques.
Grab a newspaper and use a magnifying glass to look at the letters and at the images, the difference is massive.

Attached: ZUtA Pocket Printer.gif (450x254, 835.68K)

Is it just me, or is the 244 ppi easier to read?

Attached: What am I reading_Clue_Winnie the pooh_Confused_Evidence_Facts_Trick_fine print.jpg (515x388, 58.52K)

The 244 ppi sample has slightly thicker lines than the 498 ppi one, which might make it easier to read despite being more jagged.

It's shit. I actually want to see pixels if I get close enough to the screen.


No, they're a windows thing.

There's no way that thing is worth how much someone paid for it.

it seems easier to read until you notice how squashed in the 2nd character is

Unless you really need higher resolution for image editing, high-quality content consumption, or you're managing various windows on one monitor, 720p is fine. 1080p is noticeably sharper, but for anything relatively normal, either one of those should do. All these stupidly high resolution displays---4K monitors you're sat two feet from, for example---are gimmicky normalnigger feed.

en.wikipedia.org/wiki/Subpixel_rendering#FreeType
Dude. I'm not saying Windows' font rendering is worth using the OS for, since fonts aren't THAT valuable to me, but Windows definitely has better font rendering.
When will they finally do that? I want that.

I speak from experience. When I change my resolution on windows it all looks blurry as fuck. When I do it on Linux everything is sharp and readable.

No, kek. Maybe you have misunderstood what sharp means? Because if I compare Windows font rendering to freetype, Windows looks sharper.
It might be different if I compiled freetype myself with the flag for colored subpixel rendering enabled, but when I download a standard Linux or BSD (whatever) ISO and install that, it's simply not the case.

I don't know what you mean. I never use any OS at a resolution different from the screen's native one.
You're not supposed to. If I want to see something in low resolution I just drag the window on a low res monitor.

The opposite of blurry and difficult to read.
I'm talking about literally any normie distro, like Mint and Ubuntu.


Lel, fuck off. The reality is windows doesn't handle upscaling your resolution at all while Linux does.

what the fuck are you talking about

Yes but does it matter once you go past 1 pixel per centimeter?

non-argument. fag trying to justify his 4000xWhatever resolution (and the resultant $2000 machine to drive it) detected

No you don't. That's not criteria for a good screen. Maybe you do for some odd reason but it's irrelevant.

look at the link from the shit you pasted: freetype.org/patents.html
whatever the fuck you're talking about sounds wrong. also if it's merely subpixel rendering, that's been in Linux for ages

If you change your LCD's resolution nothing will look sharp. I don't think it's even possible for the computer to compensate for the loss of resolution because the LCD is typically using an algorithm that combines a weighted average of 4+ neighbouring input pixels for a single output pixel.
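A sketch of the kind of weighted-average scaling described above, assuming plain bilinear interpolation (actual monitor scaler firmware varies and isn't public):

```python
# Each output pixel is a weighted average of the (up to) 4 nearest input
# pixels. Running an LCD below native resolution pushes every frame through
# something like this, which is why non-native modes look soft.
import numpy as np

def bilinear_resize(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    return ((1 - wy) * (1 - wx) * img[np.ix_(y0, x0)]
          + (1 - wy) * wx       * img[np.ix_(y0, x1)]
          + wy       * (1 - wx) * img[np.ix_(y1, x0)]
          + wy       * wx       * img[np.ix_(y1, x1)])

# A 1-pixel-wide line smears across two output pixels and loses contrast:
img = np.zeros((8, 8)); img[:, 4] = 1.0
print(bilinear_resize(img, 7, 7)[0])   # ~[0, 0, 0, 0.5, 0.33, 0, 0]
```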

Scaling on an LCD will look bad no matter what you do.

Now look at all the people in this thread saying 720p is fine for desktops. The point is that every time there is a leap in visual technology we have to adjust to it. I remember how jarring 720p to 1080p was on a desktop. 1080p to 4k seems even more jarring, but I'm sure I could get used to it.

The ideal resolution is whatever is most common on the market, because devs in the year 2019 are still too retarded to make sure their UI scales properly and you will have to squint/zoom constantly with a very high res screen.


Another way to make reading more comfortable is to use h'white writing systems instead of chinkrunes.

You must have been really fucking gay to use 720p. I went from 1024x768 to 1280x1024 before being forced to 1080p by (((market forces))).

If the standard resolution for today's processing power were 480p, most gaming companies probably would have gone full speed ahead with Stadia, since hosting the render farms would be much more trivial and the bandwidth required wouldn't be as high.

There's always the problem of latency and input lag, but they already ignore that.

Wait, aren't both of those technically 720p?

Holy shit, I apparently know nothing about display standards. I've never heard of any of these terms for resolutions that were commonplace back in the day. Who ever actually used legit 720p?

AFAIK 720p and 1080p specifically mean 1680x720 and 1920x1080.

However, if you think about it, monitor resolutions require two numbers to specify; the vertical alone is meaningless. *p is only a thing in TVs because those are specialized displays that always have the same aspect ratio. But once tube-less TVs became a thing, the industry started pushing monitors in the same format. We are forced to tolerate an utterly shitty standard in computing because... it'll be convenient if you want to watch TV on your computer?

Goes to show how completely cucked we are as consumers.

HAPAS ARE SUPERIOR TO WHITES

I think you mean *i is the TV thing. The *p and *i suffixes refer to progressive and interlaced scan, which isn't strictly a TV matter, since it applies to emulated games on PC as well. It has to do with displaying objects in motion: *i produces artifacts on monitors and certain TVs, while *p provides a more fluid image than *i when objects are moving.

Holy shit, the Masonic influence here is glowing.

Where are all these shills coming from?

Whatcha saying schlomo?

These are our enemies. Why are we supporting them?

This is why Zig Forums needs IDs

Judensheim pls go

Masons, Masons everywhere...

I'm seeing a lot of bot posts spammed around here. Be careful anons

where the fuck did you get a 720p monitor? there was no 720p era, it was marketed as a "cheap"/"value" alternative to 1080p, while monitors were already much higher than 1080p before 1080p even became a thing. AKA: save $5 to buy this thing that's twice as bad

and the input lag of remote rendering is far worse than anything you'd get with a screen, even the pathological 16.66ms of the Dell 2001FP, Dell 2007WFP, etc back in 2007

720p is 1280x720 you fucking absolute nignogs. it's a marketing term, and you saying random shit doesn't change what the marketers are intending to say. and yes, 1280x1024 was available and commonplace a decade before "HD" became a thing. yall either trolling or underage

"i" means interlaced period. "p" means progressive.
motion issues are just one issue of interlaced mode. and all LCDs have motion issues as well, regardless of progressive or interlaced. even still pictures in interlaced mode have artifacts, because each field is toggled off and on constantly (on both LCD and CRT). "field" is a real term, look it up if you want
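A toy illustration of the field business, assuming nothing beyond numpy (purely for demonstration; real interlaced video also involves timing and field order flags):

```python
# An interlaced stream sends each frame as two fields (even scanlines, then
# odd scanlines); progressive sends whole frames. Weaving two fields captured
# at different moments is what produces combing artifacts on moving objects.
import numpy as np

def split_fields(frame: np.ndarray):
    return frame[0::2], frame[1::2]            # even field, odd field

def weave(even: np.ndarray, odd: np.ndarray) -> np.ndarray:
    frame = np.empty((even.shape[0] + odd.shape[0],) + even.shape[1:],
                     dtype=even.dtype)
    frame[0::2], frame[1::2] = even, odd
    return frame

frame_t0 = np.zeros((6, 6)); frame_t0[:, 1] = 1   # object at x=1
frame_t1 = np.zeros((6, 6)); frame_t1[:, 4] = 1   # object has moved to x=4
even, _ = split_fields(frame_t0)
_, odd = split_fields(frame_t1)
print(weave(even, odd))   # alternating lines disagree: classic combing
```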

the best resolution is the tor browser resolution
speaking of tor, could fellow anons post the resolution that the tor browser uses? im not trying to track a tor user who uses my website

yes

1280*720

...

Why was this weird 5:4 resolution a thing? why no 1280x960 or 1366x1024?

Ahh, well that's OK then. For a minute I was upset but then I'm fine. Chrome-TAN is plainly for rapes, not gropes.

2103x1183, 2351x1323 and 3325x1871 are the only acceptable resolutions if you dont use one of these resolutions you're an idiot

those are both 4:3, i think i've seen CRTs with these modes listed in their EDID

...

ultrawides - beware of ultrawide marketing

Attached: 1557136870022.png (311x568, 14.71K)

Source?

1.7776838546069315300084530853762
1.7770219198790627362055933484505
1.7771245323356493853554249064671

Attached: 1557137437852.jpg (1000x1333, 629.7K)
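If those decimals look random: they appear to be the width/height ratios of the resolutions posted above, all within a fraction of a percent of 16:9 (about 1.7778). A quick check, assuming that's what they are:

```python
# Width/height for the three "only acceptable" resolutions posted above.
for w, h in [(2103, 1183), (2351, 1323), (3325, 1871)]:
    print(f"{w}x{h}: {w / h:.13f}")
# 2103x1183: 1.7776838546069
# 2351x1323: 1.7770219198791
# 3325x1871: 1.7771245323356   (16/9 = 1.7777...)
```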

Of course we need it. I refuse to watch big delicious anime tiddies in anything less than 4k.

And 10 bit, I hope?

Because downscaling is a bitch and video resolutions keep getting bigger.

webtoons.com/en/challenge/internet-explorer/bonus-article-13/viewer?episode_no=46&title_no=219164&webtoon-platform-redirect=true
imgur.com/gallery/XEEzVY1
i.imgur.com/EJ5Y8VK.jpg

Anime isn't made in 4k. Some of it still isn't even made in 1080p.

It can't be!

Attached: 1557167282510.png (857x478, 778.39K)

/thread
A high resolution/density is beneficial for the end user. It makes everything clearer, sharper and easier/more pleasant to work with.
Anyone who deliberately chooses to use a lower resolution for any reason other than potentially lower power usage is really just being a contrarian nu-luddite.

hmmmm

No faggot, you pick the one that covers all your use cases. If you can't tell the difference between 9999 pixels wide and 99999 wide, you would go for the former because it's cheaper to implement and cheaper to drive and more feasible.

Scalers

Attached: märchen mädchen upscale.jpg (1920x2160, 1.51M)

I actually went back from my small 4k Dell HiDPI screen to an older 1600x1200 screen. I always went with the next big thing (tm) when it came to screens, and all it did was help other people make money. Yes, outline fonts are sharper on such dense screens and they do look great, but that's one upside amongst tons of downsides. First of all, widescreen makes no sense for computing. You usually end up with reams of empty space that might as well not be there. I got by with a tiling window manager, but those had their own collection of problems regarding software compatibility and pure looks (which I find important psychologically). Lots of the programs I used couldn't handle 4k well either, and I doubt some of them ever will. Neither could the computer the screen was connected to. I don't care about newer offerings because they're usually terrible bloatware.

I also use bitmap fonts everywhere, outline fonts are for niggers anyways, so nothing is blurry where it matters. I went even a step further and ripped my system fonts directly from a Windows 95 ISO (MS Sans Serif); it's a non-monospaced and incredibly readable font that comes in several sizes. Somehow in the last 10-20 years people forgot that bitmapped fonts are a thing. There's literally nothing sharper or more readable, and the resolution of the screen doesn't even matter. Yes, they don't scale as easily, but that's a non-problem if you only want something that works on your screen. I also don't give a toss about foreign charsets, because I speak god's language, and people who don't don't belong in front of a computer anyways and never produce anything of value.

My 4k Dell also had a retarded problem where it sometimes wouldn't come out of power saving mode, or wouldn't enter it. Sometimes it would also get stuck and the OSD would stop working, and since that retarded monitor didn't even have an old-school power switch (just a software-controlled one that'd get stuck too), you had to actually pull the plug to reset it. A known problem that somehow survived several screen generations. The firmware in that thing has literally one job and they managed to screw it up. Pure niggerware and a great example of where tech is heading.

Seeing that 1600x1200 screens keep going up in price on eBay, I don't think I'm the only one feeling that way.

...

1920x1080 is the minimum for viewing textbook PDFs

If you have 20/20 vision, that means you can resolve one arcminute of detail (although this is merely for parallel lines, rather than vernier offsets, for which human visual acuity is nearly 8x sharper). For video displays, the largest single screen that could legitimately be called a PC monitor (~30") at the closest comfortable viewing distance (~1.5') could certainly go up to 8k resolution without exceeding a 20/20 viewer's ability to distinguish individual pixels:
stari.co/tv-monitor-viewing-distance-calculator

Attached: 720vs1080-625x1000.png (547x461, 14.43K)
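The linked calculator boils down to simple trigonometry. Here's a back-of-envelope sketch using the plain one-arcminute (20/20) criterion; the function and defaults are illustrative, and the vernier caveat above is exactly why stricter criteria push the answer toward 8k and beyond:

```python
# Max resolution a viewer can still resolve, given screen diagonal and
# viewing distance (both in inches) and an acuity criterion in arcminutes
# per pixel. One arcminute per pixel is the classic 20/20 figure.
import math

def max_resolvable(diag_in: float, dist_in: float, ar=(16, 9),
                   arcmin_per_px: float = 1.0):
    pitch = dist_in * math.tan(math.radians(arcmin_per_px / 60))  # inches/px
    w = diag_in * ar[0] / math.hypot(*ar)   # physical width
    h = diag_in * ar[1] / math.hypot(*ar)   # physical height
    return round(w / pitch), round(h / pitch), round(1 / pitch)  # px_w, px_h, ppi

print(max_resolvable(30, 18))                      # ~4994x2809 @ ~191 ppi
print(max_resolvable(30, 18, arcmin_per_px=0.5))   # halve the criterion,
                                                   # everything roughly doubles
```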

eInk is best for reading. I have a 13" eInk screen and it's been a fucking feast for the eyes for writing text and coding. You'd not think so, but you actually get used to the latency and the colorlessness rather quickly, and while the latency is very noticeable, it eventually just doesn't factor in anymore. (For text; these screens are obviously not made for FPSes or even mouse operation.) Sadly the hardware is complete chinese shite. There's only one offering on the market. The other one with the same screen is a seriously outdated Android tablet that's also chinese garbage and obviously full-on botnet. Sad that eInk doesn't quite make it. A 14" 4:3 eInk computer screen would be a dream. Throw in a serial or USB port to make it usable as a dumb terminal and I'd cum buckets.

E-ink based standalone monitors are an idea that's missing from today's tech market. If I weren't already occupied, that would be a market opportunity I'd love to get into.

I wouldn't mind it for reading pdfs and manga but I would be lost without muh syntax highlighting.

In video games there's a concept called "LOD" (level of detail) for a given distance. LODs aren't the same as render distance; what LODs are for is to prevent too much visual noise in the distance (seen as shimmering). This happens because the screen resolution is not high enough to properly resolve a high detail density when you see highly detailed geometry in the distance. For modern 3D games, the answer to this thread's question is simply "when the shimmering stops at high LOD distances", or when anti-aliasing is no longer needed, since aliasing itself is essentially caused by the 3D models being too detailed for the screen resolution to properly resolve.

Attached: Real_vs_emulation.png (3832x1080, 5.39M)
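A hypothetical sketch of the LOD selection described above (all names and thresholds are invented for illustration): pick a coarser mesh once the object's projected on-screen size drops low enough that the full mesh would only alias into shimmer.

```python
# Distance-based LOD selection. Raising screen_h_px pushes every switch-over
# further away, matching the point that LOD distances are tuned for a
# target resolution.
import math

def projected_px(object_size_m, distance_m, fov_deg, screen_h_px):
    # On-screen height in pixels of an object after perspective projection.
    angular = 2 * math.atan(object_size_m / (2 * distance_m))
    return angular / math.radians(fov_deg) * screen_h_px

def pick_lod(object_size_m, distance_m, fov_deg=60, screen_h_px=1080,
             thresholds=(400, 120, 30)):   # px sizes where we step down
    px = projected_px(object_size_m, distance_m, fov_deg, screen_h_px)
    for lod, t in enumerate(thresholds):
        if px >= t:
            return lod                      # 0 = full detail
    return len(thresholds)                  # coarsest stand-in

# A 10 m building: full mesh up close, coarse stand-in far away.
for d in (20, 100, 500, 2000):
    print(d, "m ->", pick_lod(10, d))
```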

I mean aliasing in general

Little-known fact: there are eInk screens that can actually display up to 4096 colors. Together with dithering, that could make a pretty usable color display, and if carefully designed, it would be more than fine for every chinese cartoon in color you can think of. The sad thing is simply that nobody is really interested, because eInk screens are fucking expensive. eInk Inc. holds all the patents and basically has a monopoly on the technology, which makes it very unattractive for many companies, basically chokes innovation in that field to death, and makes it all but certain that eInk will never seriously go anywhere. Also, public interest isn't high, because you can't sell them to 14 year old fortnite players as xxxtreme gam3r screenz, and normal screens are "good enough" for office work.

You can order eInk screens directly from the eInk Inc. website and design something yourself. It's a lot of work (eInk isn't easy to design around) and it isn't cheap either. There were also a few kickstarter projects, but they've been more or less all garbage. It is a huge pity.
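Dithering down to those 4096 colors (4 bits per channel) is straightforward. A minimal ordered-dithering sketch for one channel, purely illustrative (a real eInk driver also has to deal with the panel's waveform tables):

```python
# Ordered (Bayer) dithering to 4-bit-per-channel output. Adding the tiled
# threshold pattern before quantizing makes neighboring pixels round in
# different directions, so the average color is preserved.
import numpy as np

BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def dither_to_4bit(channel: np.ndarray) -> np.ndarray:
    """channel: float image in [0, 1]. Returns 4-bit levels 0..15."""
    h, w = channel.shape
    threshold = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.clip(np.floor(channel * 15 + threshold), 0, 15).astype(np.uint8)

gradient = np.tile(np.linspace(0, 1, 32), (4, 1))
print(dither_to_4bit(gradient))   # smooth ramp becomes a dithered 16-level ramp
```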

Aliasing isn't actually caused by excessive detail in the signal being sampled per se, but by the unnatural regularity of the sampling pattern used, most often a perfectly square grid. You're right that eliminating high-frequency elements from the input, such as through LoD or mipmaps, is one way to combat aliasing, but even without multisampling, a more random sampling pattern will greatly reduce aliasing as well. This is why truly random sampling, such as the grains of dye on a piece of photographic film or the foveal cells in the human retina, does not suffer from aliasing.

Furthermore, in a system that suffers from aliasing, use of multisampling or other AA with a higher-than-output-resolution signal won't just look more pleasing to the eye due to a lack of aliasing artifacts; it will transmit a greater amount of information. A hardware equivalent of this was running a CRT above its physical phosphor pitch, which blended adjacent pixels together.


There are also a number of other e-paper technologies capable of much higher framerates, such as the University of Cincinnati's electrofluidic displays, whose framerate largely depends on pixel size and which can manage well over 30 FPS, probably well over a hundred:
mafiadoc.com/electrofluidic-displays-university-of-cincinnati_59e70dd51723ddacaa177628.html

Attached: rods & cones distribution.jpg (1024x768 39.95 KB, 54.28K)

So basically what you're saying is the way to solve this is to fundamentally change how subpixels are displayed.

That has been one solution, yes. But injecting randomness (or sufficiently good pseudorandomness) into the sampling stage between the original and the displayed signal, particularly as resolution increases, is sufficient to pretty much eliminate aliasing artifacts.

Of course, for artificially generated signals such as CGI or vidya, using lower quality assets in areas that don't contribute a meaningful difference, so computational resources can be concentrated where they matter, is also important to keep your PC from exploding.
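A toy demonstration of that randomness claim, assuming nothing beyond numpy: sample a frequency far above Nyquist on a regular grid and the error comes out as a coherent low-frequency alias; jitter the same samples and the same error comes out as structureless noise.

```python
# Regular-grid vs jittered sampling of an undersampled signal.
import numpy as np

rng = np.random.default_rng(0)
signal = lambda x: np.sin(2 * np.pi * 97 * x)   # 97 cycles across [0, 1)

n = 100                                   # only 100 samples: badly undersampled
grid = np.arange(n) / n
jitter = grid + rng.uniform(0, 1 / n, n)  # same density, random offsets per cell

alias = signal(grid)     # a clean ~3-cycle wave: a coherent false pattern
noise = signal(jitter)   # same error energy, but incoherent noise

# Lag-1 autocorrelation: near 1 for the grid alias, near 0 for the jitter.
print("grid:  ", np.corrcoef(alias[:-1], alias[1:])[0, 1])
print("jitter:", np.corrcoef(noise[:-1], noise[1:])[0, 1])
```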

No one knows, user. It's just hype.

archive.is/hlezu

64x1

ye. LCD controls are some of the worst software ever made
does your 1600x1200 have the switch? NEC detected

False, assuming you mean a 24" 1080p or higher LCD. Anyway, you should read with big fonts, or you're just straining your eyes by forcing them to focus all the time. So I'm not sure how pixel density even applies to reading.

Actually, the latency is unbearable, but LCDs are so bad it would be worth it. I've been reading on a high-end CRT for the last year and it seems less harsh on the eyes than an LCD.

pajeet pls

so you're talking about temporal aliasing/moire? because that still happens on any game with any "anti-aliasing" features enabled. it also happens in digital video. get in CS:GO in de_dust and walk towards/away from buildings and watch how the edges constantly change as you get closer/further
[6 captchas were solved to make this post]

No, I'm not. Temporal aliasing happens when you move closer to or farther from an edge in a scene; LOD shimmering is completely different. It's not something you typically encounter in most games, since most games (I'm talking 90 percent of them) use LOD management. If you disable LODs, though, and look at a distant object with complex geometry, you'll know what I mean. It's caused by a detail density too high to properly resolve at a given resolution. It has to do with how the scene is rasterized to a given resolution, or rather how it fails to be. That's why most LODs are set based on a target resolution.
You can't really see it in a still image, but look at the distant mountain (forgive the high contrast, it's just to help illustrate). The level of detail is maximized thanks to the power of emulation; all that visual noise would typically show up as shimmering in real time unless your resolution were high enough. It's more related to conventional aliasing.

Attached: Screenshot (9).png (1920x1080, 3.22M)

now this is the final redpill

16:9 is good because you can put your monitor on its side and have a near golden ratio screen.
Anything else is a meme.

most

Attached: abf04ead3ac2ea990d97fcfc11a453b167d5ba0d9975c688c2fba1941b40dc2d.jpg (616x578, 32.58K)

You don't know what you're talking about. Lurk more.

Sounds like you're talking about what I meant by "temporal aliasing/moire". Not sure what it's actually called, since I looked it up and found nothing. If you downscale a 2D image with bilinear interpolation (or any of several other scaling algorithms) and use that to do an animated zoom-out, you'll see annoying noise everywhere, especially on lines. But while the image is still, it will look "fine".

Is this what you're talking about?
youtube.com/watch?v=nHG131WPRFE
When I first saw this shit 10 years ago I thought my hardware was broken.

The video is from old.reddit.com/r/pcmasterrace/comments/433c5x/can_someone_explain_lod_bias_texture_filtering/ and this nigger is saying games used to solve the problem with trilinear filtering. I will have to try more games. I've yet to see a single animation from {image viewers,games,videos} that solves this problem.

I assume you're not talking about this "z-fighting" shit:
youtube.com/watch?v=XjHt-4Z6PwI at 42 seconds
en.wikipedia.org/wiki/File:ZfightingCB.png

[9 tab switches and 5 captchas were done to make this post]
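For reference, the trilinear/mipmap fix that reddit thread is pointing at is easy to sketch: prefilter the image into a pyramid of box-averaged levels, then blend the two levels bracketing the current zoom instead of bilinearly resampling the full-resolution image every frame. That prefiltering is what kills the shimmer during an animated zoom-out. Everything below is a simplified, single-channel illustration:

```python
# Build a mipmap pyramid and pick the levels for a given zoom factor.
import numpy as np

def build_mipmaps(img: np.ndarray):
    levels = [img.astype(float)]
    while min(levels[-1].shape) > 1:
        a = levels[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2
        a = a[:h, :w]
        levels.append((a[0::2, 0::2] + a[0::2, 1::2] +
                       a[1::2, 0::2] + a[1::2, 1::2]) / 4)  # 2x2 box filter
    return levels

def levels_for_scale(levels, scale: float):
    """scale < 1 means minification; returns the two bracketing mip levels
    and the blend factor t the caller lerps samples with (trilinear)."""
    lod = max(0.0, -np.log2(scale))            # e.g. scale 0.25 -> lod 2
    lo = min(int(lod), len(levels) - 1)
    hi = min(lo + 1, len(levels) - 1)
    return levels[lo], levels[hi], lod - lo

mips = build_mipmaps(np.eye(64))
print([m.shape for m in mips])                 # (64,64) down to (1,1)
```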