People still call the coprocessors that graphics rendering gets offloaded to "Graphics Processing Units", even though they've had fully programmable shader models for ages...

Why are we still doing this? The term GPU comes from a time when everything used fixed-function rendering pipelines. Modern GPUs are fully programmable. Even calling the programs designed to take advantage of modern GPU execution units "shaders" is a misnomer, because even in graphics they're being used for a lot more than just shading now.

We should start calling GPUs "manycore coprocessing units" ("MCUs") instead. That would honestly be a more fitting name. It would certainly be less braindead than that absolute abomination of a buzzword "GPGPU". Seriously, who the fuck thought that was a good idea?

Attached: shoulder_shrug.jpg (4288x2848, 3.78M)

They can be called "GP"GPU when they start publishing full specs on how to program them.

OpenACC
NVIDIA CUDA
OpenMP

Attached: wizard2309823.jpg (259x194, 9.06K)
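For anyone who hasn't actually touched these: here's roughly what GPU offload looks like through OpenMP target directives. Just a sketch assuming a compiler built with offload support; the function name, array names and sizes are made up, and if no device is present it simply runs on the host.

```c
/* Minimal OpenMP 4.5+ offload sketch: a SAXPY loop pushed to the GPU.
 * Illustrative only -- names here are not from the thread. */
#include <stdio.h>

void saxpy(int n, float a, const float *x, float *y)
{
    /* map() clauses copy the inputs to device memory and bring y back */
    #pragma omp target teams distribute parallel for \
            map(to: x[0:n]) map(tofrom: y[0:n])
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    enum { N = 1 << 20 };
    static float x[N], y[N];
    for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy(N, 3.0f, x, y);
    printf("y[0] = %f\n", y[0]);   /* expect 5.0 */
    return 0;
}
```

The point being: it's a plain C loop plus a pragma, nothing graphics-flavoured about it.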

Literally answered your own question. Shit thread.

GPUs have taken on a more general-purpose role since the rise of machine learning and shit

Also OpenCL and DirectCompute
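The device side of OpenCL is just a C dialect, for what it's worth. The kernel for the same kind of scaled vector add is about this much; sketch only, made-up names, and the host boilerplate (clCreateBuffer, clEnqueueNDRangeKernel and friends) is left out.

```c
/* OpenCL C device code: one work-item per array element. */
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    size_t i = get_global_id(0);   /* this work-item's index in the global range */
    y[i] = a * x[i] + y[i];
}
```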

Also why do we call computers "computers" when they do more than compute stuff?

such as?

GPUs are also distinguished by their focus on vector operations, so "manycore vector coprocessor" might be the best description. Another distinction is that GPUs are about the only application where VLIW significantly outperformed RISC or CISC.
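To make the "vector" part concrete: OpenCL C (same dialect as above) exposes explicit short-vector types, so per-lane math is written directly. Again just an illustrative sketch, not from any real codebase.

```c
/* float4 arithmetic is component-wise: one line scales four lanes. */
__kernel void scale4(__global float4 *v, const float s)
{
    size_t i = get_global_id(0);
    v[i] = v[i] * s;   /* vector * scalar, applied to all four components */
}
```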


Storage and retrieval?

...

MVC? Sounds like the most appropriate name for them. Now we just have to autistically start calling them that every time the latest GTX or Polaris cards come up, and hopefully it sticks with the normies eventually.

Hoping this becomes reality. How to meme this into existence?

Holy fuck you people are autistic.

out

member PPUs?

I member.

Attached: mbry.jpg (500x500, 4.06K)

guise my x86 abacus stopped working

The name PPU was appropriate though since it could literally only do picture processing

I was referring to physics processing units.

Didn't only Ageia (the PhysX guys) make them? And then Nvidia bought them out really quick and killed off the tech?

Nvidia never killed off the tech. PhysX still lives inside every new GTX card and driver. The PPU is integrated into the die, and while few games ship dedicated PhysX libraries anymore, the drivers still use the integrated PPU to offload general physics computation from the CPU.

+Vulkan compute shaders

*SPIR-V

Because they're opaque, undocumented piles of shit that do fuck all else.

These are APIs exposing features, but they don't actually document what the GPU is doing, how and in what order, or provide lower level access to its internals.

Well guess what. Some do!
developer.amd.com/resources/developer-guides-manuals/

Be autistic about it, but don't overdo the copypasta pls
hopes that mememagic exists

No GPU has been VLIW since the Radeon HD 6xxx days.

OpenCL is dead, thanks to a concerted effort by Apple and Nvidia.

There is no "integrated PPU", it's all done in the shader cores.

...

Because for the average user the GPU is doing fuck all besides graphics. GPU offloading hasn't really caught on outside some very specialised domains, and GPU-accelerated programs are still a very rare breed, unless they're doing graphics of course.
There's still a lot of fixed-function hardware in a modern GPU. Rasterisation, texture sampling and filtering, ROPs...
This is pure DirectX terminology. OpenGL has called them "programs" from the start: you link shader objects into a program object. Of course, since normies buy GPUs purely for DirectX games, DirectX terminology is what vendors end up using to describe their HW in marketing materials.
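For reference, that "program" terminology is baked into the GL API itself: you compile shader objects and link them into a program object. Rough sketch, assuming an OpenGL 2.0+ context and a loader like GLEW for the entry points; the helper name and the missing error handling are mine.

```c
#include <GL/glew.h>   /* assumes GLEW (or a similar loader) provides GL 2.0+ declarations */

/* Compile two shaders and link them into the thing GL calls a "program". */
GLuint build_program(const char *vs_src, const char *fs_src)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_src, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fs_src, NULL);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();   /* the "program" object */
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    glDeleteShader(vs);                /* the linked program keeps what it needs */
    glDeleteShader(fs);
    return prog;
}
```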

There's still a PhysX controller that uses the cores though

Isn't writing directly on the metal for individual GPU dice a fucking terrible idea in principle? Having a better idea of how a badly written driver might be fucking up is fine, but shipping binaries optimized for particular GPUs absolutely isn't.


Pretty sure even the likes of Excel are exploiting GPGPU for parallel compute.

...

...

Pretty sure only LibreOffice Calc is doing that, and then only for very large sheets a normie will never encounter.