Remember when Intel chips had pictures of phrenology diagrams on their box? What did they mean by this?

Attached: phrenologybust2.jpg (750x848 19.07 KB, 137.32K)

Other urls found in this thread:

en.wikipedia.org/wiki/Intel_Active_Management_Technology#Known_vulnerabilities_and_exploits
en.wikipedia.org/wiki/AMD_Platform_Security_Processor
iswc.pw/
iswc.pw/files.html
web.archive.org/web/20181221094023/http://iovo.io/

Probably because a marketing department thought it was "cool."

because it perpetuates the incorrect idea that expensive CPU = galaxy brain

neckbeards with $2,000 gaming PCs who only use their computers for entertainment get convinced that they're intellectuals for spending time having fun

gimme a fucking break

I knew way too many dudes in undergrad CS who "needed a good PC for computer science" and then flunked out due to playing League of Legends or Overwatch instead of doing their fucking homework

yeah i hate people too

Cyber Phrenology. That Cult State / Butterfly War user's on about it from time to time.

Survival of the nerdest.

it's obviously equating buying overpriced products with drilling holes in your brain, they think you're the patient there

gaymers are the worst

Why aren't you using your time to study mathematics, computer science, electrical engineering, and physics?

Mandela effect

Gaymers are truly the worst.
It's like the art community: there's this guy with the whole Adobe suite, ZBrush, Intuos displays, and a complete set of actual art materials, but he always gets destroyed on /ic/ for being trash, and meanwhile there are anons who just use the cheapest Wacom or a TXX-series ThinkPad that does the job and are pixiv rockstars.


That's what they get for playing that game.

this, cyber phrenology is real

"Gamers" are mentally ill people. Seriously.

gimp, blender. faggots don't git gud just because their tools are shiny

What's VSync for then, faggots?
I have a 144Hz IPS 1440p Acer meme monitor for 330€ and it doesn't have motion blur. It's way better than expected. Really like it.
Faggots here warned me about an unevenly lit panel, but mine is a 2018 build and is evenly lit.
People even complained in the comments on websites where you could buy it that the white was yellow, because the default setting was "warm" and not normal or custom. Retards can't even check the monitor settings.

Probably

Yes. Identifying with vidya is gay as fuck. It's like saying you're an electrician because you connected a lamp to the wall.

Are you talking about those expensive end-of-era >=full-HD high-framerate CRTs I always see faggots on /v/ going on about?
Or are you referring to the average TV?

Imagine being this delusional lmao.


Only a complete freetard would pretend that blender and gimp are anywhere close.
The former is actually good if a bit rough for newcomers, the latter is garbage through and through.

recommend something

I'm trying to learn blender atm so if you got better software I'll torrent it and tell everyone how great it is.

VSync removes tearing but supposedly increases lag. I've never noticed it though. It also means a 60Hz monitor can only show 60/30/20/15/etc FPS. I prefer that tbh, I'd rather have a steady 30 than shifting between 60, 45 and 30.
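
Rough sketch of why you only get those divisor rates, assuming the classic double-buffered model where a late frame just waits for the next vblank (numbers are only illustrative):

/* Sketch: with classic double-buffered vsync, a frame that misses a vblank
 * waits for the next one, so the effective rate snaps to refresh/1, /2, /3...
 * (60, 30, 20, 15 on a 60Hz panel). Illustrative model, not any real driver. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double refresh_hz = 60.0;
    const double vblank_ms = 1000.0 / refresh_hz;          /* 16.66ms */
    const double render_ms[] = { 10.0, 17.0, 25.0, 40.0 }; /* hypothetical frame times */

    for (int i = 0; i < 4; i++) {
        int periods = (int)ceil(render_ms[i] / vblank_ms); /* whole refreshes used */
        printf("render %5.1f ms -> new image every %d vblanks -> %5.1f FPS\n",
               render_ms[i], periods, refresh_hz / periods);
    }
    return 0;
}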

Imagine being this delusional lmao.

Let's get JUICY with it, gamers!

Attached: 20181225_111008.jpg (900x1600 521.16 KB, 431.58K)

Adobe CS6 masterrace tbh famalam.

It can also display 56fps if the game can't keep up. It just ensures that you don't get partial picture updates, because that looks ugly.
Fewer updates obviously means fewer new images, "more lag" etc.
G-Sync and FreeSync are monitor features which allow a dynamic refresh rate on the monitor. The monitor can then slow down and refresh at 56Hz.
That results in no doubled output pictures, which is the same number of frames but is supposed to feel smoother.
Have never tried it, but totally overrated imo.
Just the newest bullshit to sell to retards. Technically it makes sense, but practically the only purpose of it is smoother display of BADLY OPTIMIZED vidya.
Not to mention that the effect gets way less noticeable on a higher-framerate monitor like 144Hz or 200Hz, even at 56fps, because the time gaps to the next output picture get smaller.
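
For scale, here's the "gaps get smaller" argument as plain refresh-period arithmetic (a sketch, nothing monitor-specific):

/* Sketch: worst-case extra wait when a frame just misses a vblank,
 * i.e. one full refresh period at each panel rate. */
#include <stdio.h>

int main(void) {
    const double panel_hz[] = { 60.0, 144.0, 200.0 };
    for (int i = 0; i < 3; i++)
        printf("%5.0f Hz panel: a missed vblank costs up to %.2f ms extra\n",
               panel_hz[i], 1000.0 / panel_hz[i]);
    return 0;
}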

I usually leave it on, never noticed any lag except for one game

This debate is pointless without reaction games where milliseconds matter. I used to play StepMania at a very high level and in that particular game you can feel almost down to individual milliseconds if something is off. It's actually quite horrible because you essentially want a hard real-time system but can never have it. CS:GO at a professional level is much the same way. Your ability to perform knee-jerk reactions isn't limited so much by your obscene reaction time being maybe 60-90ms and the frame changing every 16.66ms but rather by being able to observe an effect and your effect on the system within a very consistent time-frame. Gaining milliseconds in the jumping-off-point might be the thing that lets you win.
Vsync works by having the render context (either the GL swapbuffers or D3D/DXGI swapbuffer present call) block until a frame fence is hit. It makes no guarantees about tearing if you're not rendering faster than the vsync rate. You can (usually) combine it with double buffering to get consistent lack of tearing when your render rate is below your vsync rate.
The problem this has is displaying old data. If you're rendering in 1.6ms you're waiting 15ms to display that frame, which means your frame is 15ms out of date. This is why for professional multiplayer reaction games it's usually advisable to turn vsync off and run at insane over-capacity. You can guarantee that you will render AT LEAST one frame within the vsync time, but should you be so fortunate as to render more: the NEWEST frame will be displayed, and if you turn the overcapacity dial up to 11 you might get something only 1 or 2ms out of date.
Gsync/freesync synchronizes the monitor to the output rather than the other way around. This is absolutely fantastic if your render rate is lower than your maximum vsync rate, because it means a frame can be displayed immediately upon completion, ensuring it's 0ms old every time (barring the hopefully constant frame display overhead). If you're running above it then we're back to the same problem as vsync with double buffering.
Ironically the best place for gsync/freesync to be used is the one place it's not used and shows no interest in using it: video playback. You can eliminate the need for changing display refresh rates and pullups/pulldowns/etc. from retard rates like 29.97 which fucks your audio synchronization to hell and back, and just display frames at the correct time, always. This would even allow playback of variable frame-rate video which might be amazing for compression.
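
For reference, a minimal sketch of where that blocking sits in a GL app; GLFW is assumed here purely for illustration, and the swap interval (1/0) is the vsync on/off switch:

/* Minimal GLFW/OpenGL sketch (illustrative): with swap interval 1,
 * glfwSwapBuffers() stalls until the next vblank; with 0 it returns as
 * soon as the swap is queued, which is the "insane over-capacity" mode. */
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit())
        return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "vsync demo", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    glfwSwapInterval(1);                /* 1 = vsync on, 0 = vsync off */

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);   /* render the frame here */
        glfwSwapBuffers(win);           /* this is the call that blocks on the fence */
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}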

Yes it does. Just play any 2D side scroller and try to look at any object while your player is moving. Learn what motion blur actually is. I tried 144Hz LCDs and the motion blur is still bad. You need a much higher refresh rate to eliminate it in practice unless you use a CRT or strobed-backlight LCD. Also the "fast" LCDs still have shitty smearing/ghosting artifacts, none of which CRTs had aside from white transitions.
The internet is absolutely useless for getting useful information on a display purchase. This is the reason I haven't bought a new LCD since 2010 or so, I just don't trust any of that shit. All the new LCDs I try in store or at friends' houses are shit too. Yes you can adjust to correct the yellow tint in the monitor settings or on the software side, but then there still may be not enough color resolution or huge inaccuracy in other parts of the spectrum. Look at a site like TFT Central that benchmarks color accuracy.
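
The blur itself is just eye-tracking speed times hold time. A sketch with made-up numbers (the standard sample-and-hold estimate, not a measurement of any particular panel):

/* Sketch: an eye tracking an object moving at `speed` px/s across a
 * sample-and-hold display sees roughly speed * hold_time pixels of smear.
 * The speed and hold times below are illustrative only. */
#include <stdio.h>

int main(void) {
    const double speed_px_s = 960.0;              /* hypothetical scroll speed */
    const double hold_ms[]  = { 16.7, 6.9, 1.0 }; /* 60Hz, 144Hz, ~1ms strobe */
    const char  *label[]    = { "60Hz hold", "144Hz hold", "1ms strobe" };

    for (int i = 0; i < 3; i++)
        printf("%-11s -> ~%4.1f px of smear\n",
               label[i], speed_px_s * hold_ms[i] / 1000.0);
    return 0;
}

So 144Hz cuts the smear but doesn't kill it, which is why only CRTs and strobed backlights look clean in motion.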

Most computer CRTs (not just end-of-era ones) had better color than most current high-end LCDs. High resolution is mostly a meme, as are high refresh rates; the standard for CRTs though is 85Hz, to remove visible flicker. I have a few of the highest-resolution CRTs (2048x1536@80Hz) but also some with a simple 1024x768@85Hz, which still looks better than most LCDs. On my high-end CRTs I usually just run at 1280x1024 or 1600x1200. I don't think the picture is stable or accurate at the max resolutions.

>>>/reddit/
also, a few years ago you would have said the same thing about G-Sync, back when it wasn't yet accepted in the market.
see every forum post ever from back then:

Do you play FPS? Double-buffered vsync (what most games use, and when they don't, they use something even worse) adds 16.66ms of input lag, which is hugely noticeable and makes it impossible to play an FPS competitively. The real solution, beam racing, works on any monitor ever made (but no game does it). VRR is unnecessary and doesn't even make sense conceptually. Yes, beam racing requires a hard real-time constraint, but games are meant to be hard real-time anyway, and the fact that they fail at that makes them shit.
It won't be a steady 30, it will toggle between 60 and 30, etc. Showing 30FPS on a 60Hz monitor will cause double images, except when there's so much motion blur that you can't see the double image (possible on LCD). But 60FPS shouldn't be hard to maintain... the real problem is that you'll have input lag no matter what.

That's not the reason vsync causes lag. Vsync causes lag because of the shitty double-buffering algo, which is easier for shit gamedevs to implement. Play an old game and it will almost never drop below 60FPS and you'll still see input lag. Turn off vsync and cap at 60FPS and you'll suddenly feel like your hand is a lot lighter.
A single doubled frame at 60FPS will look like a stutter and you will think the game is badly optimized.
This is correct.
Stutter from a doubled frame would probably still be bad at those rates (never tried it). But again, the real issue is lag. The framerate shouldn't be dropping below the refresh rate; that's just a meme for gamedevs to justify their shitcode, which will drop below 60FPS on even the most high-end system that exists, at the lowest graphics settings. I ran vsync at 120Hz on my CRT (8.33ms of lag due to double-buffered vsync) and it was still enough to give me a disadvantage in FPS. It still has that feel like your mouse is stuck in molasses. I run all games with vsync off.
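
A minimal sketch of that "vsync off, cap at 60FPS" loop, assuming POSIX clock_gettime/nanosleep (real engines usually spin out the last fraction of a millisecond instead of trusting the scheduler):

/* Crude 60FPS cap with vsync off: draw, then sleep out the rest of the
 * frame budget. POSIX-only sketch, not lifted from any actual game. */
#define _POSIX_C_SOURCE 199309L
#include <stdint.h>
#include <time.h>

static int64_t now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

void run_capped(void (*render_frame)(void)) {    /* render_frame is hypothetical */
    const int64_t budget_ns = 1000000000LL / 60; /* 16.66ms per frame */
    int64_t next = now_ns();
    for (;;) {
        render_frame();                 /* sample input + draw */
        next += budget_ns;
        int64_t remain = next - now_ns();
        if (remain > 0) {
            struct timespec ts = { remain / 1000000000LL, remain % 1000000000LL };
            nanosleep(&ts, NULL);       /* burn the leftover budget, no vblank wait */
        } else {
            next = now_ns();            /* running behind: don't try to catch up */
        }
    }
}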

Those aren't graphically intensive, so you can run them at full framerate anyway.
And when you're playing at full framerate, VSync doesn't make a difference.
Stop pretending. They surely have machines good enough to play fucking CS:GO at full framerate.
Like that matters on my 144Hz monitor, when all videos are 60Hz, 30Hz, 25Hz or something in between.
However, the file size increase from frames saying nothing changed on the screen is not as big as you think it is.
No problem with that. You don't watch it on a 30Hz monitor, so you shouldn't even be able to see that.
No. Ever considered that your media player is trash?
Even VLC has its problems. For example: at low framerates VLC just spergs out and displays garbage.
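
For the video-rates point, here's how the common rates actually divide into a fixed 144Hz refresh (just arithmetic, no player involved; uneven ratios are the pulldown judder mentioned earlier):

/* Sketch: refreshes per video frame on a fixed 144Hz panel. A non-integer
 * ratio means some frames are held longer than others, i.e. judder. */
#include <stdio.h>

int main(void) {
    const double panel_hz = 144.0;
    const double video_fps[] = { 23.976, 24.0, 25.0, 29.97, 30.0, 60.0 };

    for (int i = 0; i < 6; i++) {
        double held = panel_hz / video_fps[i];
        printf("%7.3f fps video -> %.3f refreshes per frame %s\n",
               video_fps[i], held,
               held == (double)(int)held ? "(even)" : "(uneven -> judder)");
    }
    return 0;
}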

You're right, there is a reason it exists and it has some slight advantages, but look at the price, and you also need a matching card.
People with integrated graphics can't even use it to begin with, so your movie-watching club is already out.

What I'm saying.
Of course, if you turn VSync off the game will run as fast as it can, and if the rate with VSync off is way higher than your monitor's rate, the game loop will calculate physics and input etc. way more often. But you may get screen tearing, and if your game is well programmed it shouldn't be way more reactive.
The problem you describe isn't even solved by FreeSync or G-Sync.
Not buying into it. Git gud.

Studying is for faggots. To master anything you need to slam your head against it repeatedly until the universe blesses you with the one attempt that finally works, at which point you can then laugh at those faggots still nose-deep in their books, for you have outwitted them.

Double buffered vsync at 60Hz is so bad that even moving the mouse around is painful.
You're describing OpenGL's triple buffering, right? Normally a monitor will show pieces of each frame stacked on top of each other when vsync is off. OpenGL's triple buffering isn't used much because it still has too much lag in practice; it really needs above 300FPS to get a low enough latency. It also has judder.
huh. I was about to say you're wrong, but now that I think about it, VRR can't possibly have less lag than vsync at max refresh rate, no matter how it works (I've never used VRR). How has no one noticed this? Most gaymer LCDs have been 120Hz for years, so it should be easily noticeable (8.33ms lag).
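
Rough model of why that path wants silly framerates: with a triple-buffer / mailbox swap, whatever is shown at a vblank is at most one render period old (ignoring scanout). A sketch:

/* Sketch: staleness of the newest completed frame at each vblank when
 * rendering at various rates. Rough model only, scanout time ignored. */
#include <stdio.h>

int main(void) {
    const double render_fps[] = { 60.0, 120.0, 300.0, 1000.0 };
    for (int i = 0; i < 4; i++)
        printf("%6.0f FPS render -> shown frame is 0..%.2f ms stale at vblank\n",
               render_fps[i], 1000.0 / render_fps[i]);
    return 0;
}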

False, and most games have crappy framerates that vary from 0 to 100+. Even CS:S and CS:GO gave me moderately bad framerates on maps like cs_assault, no matter what settings, across multiple machines.
He's talking about input lag caused by vsync. With full framerate you still have 16.66ms of unnecessary input lag. Go into CS:GO and enable any form of vsync; it will immediately feel like the mouse is unresponsive.

You can easily feel the difference between 120FPS cap and 120Hz double buffered vsync. I've tested it multiple times. Just moving the mouse around for 5 seconds is enough to tell. If you can't tell that the mouse is slower, _YOU_ need to """Git Gud""". How can being able to feel latency because you're not slow enough to not notice it, make you bad? It's the other way around. That doesn't even make sense: I said 120Hz vsync feels (and should be) laggier than 60FPS cap without vsync. How can that have anything to do with me being good or not? (Note I use a simple USB mouse and keyboard and it's possible the lag is only noticeable because of the lag they add).
Actually you're right. I never thought of how VRR works until now, it doesn't solve input lag. Sounds even more useless than what I previously thought.
This is how double buffered vsync works at 60Hz (approximately, since in reality, games use a PC timer which is slightly slower or faster than the GPU, and vblank is handwaved):
Oh look, at 00.00 user input was sampled and it took until 16.66 before a single pixel was told to change. Contrast this to a 60FPS cap without vsync:
In this case the latency varies from 0-16ms across the screen (not necessarily top to bottom), whereas with double buffered vsync it ranges from 16.66ms-33.32ms from top to bottom.
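
Spelled out, those two timelines come down to this (a sketch, assuming input is sampled when rendering starts and the panel scans top to bottom over one refresh period):

/* Sketch of the two cases above on a 60Hz panel. Assumes input sampled at
 * render start and a top-to-bottom scanout taking one refresh period. */
#include <stdio.h>

int main(void) {
    const double t = 1000.0 / 60.0;   /* 16.66ms refresh period */

    /* Double-buffered vsync: input at 0.00, the frame waits for the vblank
     * at 16.66, then scans out until 33.33. */
    printf("vsync on : pixels change %.2f ms (top) to %.2f ms (bottom) after input\n",
           t, 2.0 * t);

    /* Vsync off + 60FPS cap: the fresh frame joins the scanout mid-refresh
     * (tear line), so the updated region is 0 to 16.66 ms old. */
    printf("vsync off: pixels change %.2f to %.2f ms after input\n", 0.0, t);
    return 0;
}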

All that spying -- the "bugs" in the CPUs they were just starting to inject (what, you thought Meltdown and Spectre were actual code bugs?) so that the nsa couldn't just watch you but actually deduce, from the stream of data the CPUs were processing, exactly what sort of terrorist / threat-to-the-illuminati you were... and some of the techies, horrified that Intel was captive to the nsa, suggested phrenology images to the marketing people -- who just thought "yeah, cooooool images" -- as a way of warning the world as well as mocking the nsa's belief that "data = personality".


But, in reality, probably this

Attached: is-nsa-spying-on-me.jpg (425x301, 31.93K)

And off the tracks we go ...

Attached: derailment-cause.jpg (333x500, 83.25K)

Wait a minute, with VRR you can just cap the framerate below the max refresh rate and that should eliminate all vsync-induced input lag. So no, VRR _does_ give you vsync without input lag. As long as the software implements this correctly it should work.
Consider an example with a 100Hz monitor (with an instant vblank interval for simplicity) and the game rendering every 11ms:
But now you have the problem that the periodic timer will drift with respect to the GPU. Maybe it gives you its own timer? Or it could just measure how off it is by watching vsync pulses.
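
A sketch of that 100Hz / 11ms example, comparing when each frame actually reaches the glass with fixed-refresh vsync versus VRR (idealised instant vblank, as above):

/* Sketch: game finishes a frame every 11ms. With a fixed 100Hz refresh the
 * frame waits for the next 10ms vblank (and around frame 11 a vblank has to
 * show a repeat, i.e. judder); with VRR it is displayed the moment it is done. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double frame_ms  = 11.0;   /* render time per frame */
    const double vblank_ms = 10.0;   /* fixed 100Hz refresh */

    for (int n = 1; n <= 12; n++) {
        double ready = n * frame_ms;
        double fixed = ceil(ready / vblank_ms) * vblank_ms; /* next vblank */
        double vrr   = ready;                               /* shown immediately */
        printf("frame %2d ready %6.1f ms | fixed vsync shows it at %6.1f ms | VRR at %6.1f ms\n",
               n, ready, fixed, vrr);
    }
    return 0;
}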

you know how I know you're not a software developer? because you think bugs don't happen
software development is complicated as fuck, and when project managers tell you to cut corners to meet some shitty deadline, you end up cutting corners and it really shows
you faggots have no idea how buggy most code is, it's not a fucking conspiracy, it's a miracle anything even works at all

software is only "hard" because of unix/webdev braindamage. your deadlines are unreasonable because of consumerism brain damage

DO NOT BUY INTEL OR AMD CPUs. THEY ARE BOTNET
en.wikipedia.org/wiki/Intel_Active_Management_Technology#Known_vulnerabilities_and_exploits
en.wikipedia.org/wiki/AMD_Platform_Security_Processor
Intel and AMD CPUs have a secondary processor that is outside of your operating system's control. This chip can access your storage, mouse, keyboard and network; it can remotely send your files or keystrokes, and it can also remotely destroy your PC. Installing Linux won't deactivate it. The only solution is to not buy and not use those CPUs.
Also, Intel is israeli jewish company, if you support them financially you support the white genocide and jewish supremacy.

Implying I wasn't in favor of it immediately.
Motion blur is an issue, but tearing is infinitely worse.

There are only two major companies for desktop CPUs.

Actually, VIA, DM&P (Vortex86), RDC (EmKore), Cyrix (MediaGX) and other companies manufacture CPUs for desktops and laptops, but they are very slow, so only a few people use them.

What about RISC-V?

vs framerate~=refreshrate cap:
luckily you don't need VRR. beam racing solves everything, it just requires devs to stop being niggers. in some cases you can even do it without any dev intervention. you can synchronize your monitor by tweaking the pixel clock until it matches the game's framerate and adjusting the phase such that vblank happens right before the game sends a new frame. i've done this for Jazz Jackrabbit in Dosbox and it works. VRR literally exists because devs are retards, and it's mutually exclusive with removing motion blur. beam racing solves synchronization with no latency penalty, as well as allowing motion blur. what would be more useful is a limited form of VRR where framerate variations are small enough to not cause luminance changes, so it can still be used on a strobed display

as well as allowing [removal of] motion blur
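
The pixel-clock tweak above is just this relationship; the mode totals and rates below are hypothetical, not from any real modeline:

/* Sketch: for a fixed mode, refresh = pixel_clock / (htotal * vtotal),
 * so nudging the pixel clock retargets the refresh to the game's framerate.
 * All numbers here are made up for illustration. */
#include <stdio.h>

int main(void) {
    const double htotal  = 1344.0;    /* active + blanking, horizontal */
    const double vtotal  = 806.0;     /* active + blanking, vertical */
    const double game_hz = 70.086;    /* e.g. an old DOS game's fixed rate */

    printf("need ~%.3f MHz pixel clock for %.3f Hz\n",
           htotal * vtotal * game_hz / 1e6, game_hz);

    /* And the other direction: what a given clock produces. */
    const double clock_hz = 76.0e6;
    printf("%.1f MHz -> %.3f Hz refresh\n",
           clock_hz / 1e6, clock_hz / (htotal * vtotal));
    return 0;
}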

Okay guise, I've been wondering why the net is so bloated today. Here's an example:

iswc.pw/

like why the fuck does this use 40% of my cpu

plus the design is so fucking hipster trash

is this the state of modern web development??

Attached: burger.jpg (630x446, 109.72K)

so what? if both are botnet then don't buy either

who needs speed? for what purpose?
to run pajeet javascript "software"? Properly designed and written software can run fine on a slow CPU.
decades ago, people had 50-300MHz x86 CPUs, yet they could:
-do office work (writing a book or document, spreadsheets, presentations, etc)
-play computer games (3-dimensional too)
-view photos and videos
-edit photos and videos
-browse the web (when websites were made in HTML instead of javascript)
-do many more things

iswc.pw/files.html
lol what is this gay shit
well it looks like some NEET shit, so no. a better example is twitter or pinterest. there are even better examples but i can't name any off-hand; i come across them all the time

It's called sarcasm, dude.
look it up

Attached: cool-story-bro-deadpool.jpg (530x327, 41.03K)

ok here I found a good example
web.archive.org/web/20181221094023/http://iovo.io/

k
also
are you autistic and did you also just wake up from a coma?

What kind of shit monitor are you using?
Even my shit monitor isn't that bad, and it's literally something decommissioned from work.

If you search "porcelain human head" on Amazon, those things come up first. They probably wanted one of those on the box, bought one, and took a photo, then digitally wiped the other crap there (if they didn't physically buff it off before).
You'd have to be retarded to think that Intel's art department (which is probably being fed peanuts for pay) thought about any deeper meaning. Hell, there's a very good chance that they had no idea what phrenology even is.

That's not how it works. It has nothing to do with the quality of the monitor. Sample-and-hold LCDs all have this problem, and even at 144Hz (blur decreases with higher refresh rate) it's still bad.
You probably scroll very slowly. Learn what motion blur actually is before talking about it. I have 30 LCDs and they all have this issue. Newer LCDs have strobed-backlight options which remove motion blur, but I've yet to test one.