Why are we still doing this? The term GPU comes from a time when everything used fixed-function rendering pipelines. Modern GPUs are fully programmable. Even calling the programs designed to take advantage of modern GPU execution units "shaders" is a misnomer, because even in the context of graphics they're being used for more than just shading shit now.
We should start calling GPUs "manycore coprocessing units" (MCUs) instead. That would be a more fitting name, honestly. It would certainly be less idiotic than that absolute abomination of a buzzword, "GPGPU". I mean, seriously, who the fuck thought that was a good idea?
Literally answered your own question. Shit thread.
Easton Mitchell
GPUs have taken on a more general-purpose role since the rise of machine learning and shit.
Caleb Roberts
Also OpenCL and DirectCompute
Dylan Bailey
Also, why do we call computers "computers" when they do more than compute stuff?
Elijah Nguyen
such as?
Xavier Rivera
GPUs are also distinguished by their focus on vector operations, so "manycore vector coprocessor" might be the best description. Another distinction: GPUs are the only application where VLIW significantly outperformed RISC or CISC.
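For anyone who hasn't written one: here's a minimal sketch of what that "manycore vector coprocessor" model looks like in practice, as an OpenCL C kernel (host-side setup omitted; the kernel name and arguments are just made up for illustration). You write a scalar body, one logical thread per element, and the hardware packs those work-items into its vector lanes.

```c
/* saxpy.cl -- illustrative OpenCL C kernel (host setup not shown).
 * Each work-item handles one element; the GPU runs thousands of these
 * in lockstep across its SIMD/vector lanes. */
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    size_t i = get_global_id(0);   /* index of the element this work-item owns */
    y[i] = a * x[i] + y[i];        /* plain scalar code, vectorised by the hardware */
}
```

Note that nothing in there has anything to do with shading.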
Storage and retrieval?
Charles Long
MVC? Sounds like the most appropriate name for them. Now we just have to autistically start calling them that every time the latest GTX or Polaris cards come up, and hopefully it sticks with the normies eventually.
Austin Walker
Hoping this becomes reality. How to meme this into existence?
The name PPU was appropriate though since it could literally only do picture processing
Cameron Adams
I was referring to physics processing units.
Owen Hill
Didn't only Ageia (the PhysX company) make them? And then Nvidia bought them out really quickly and killed off the tech?
Aiden Price
Nvidia never killed off the tech. PhysX still lives inside every new GTX card and driver. The PPU is integrated into the die, and while few games ship dedicated PhysX libraries anymore, the drivers still use the integrated PPU to offload general physics computation from the CPU.
Tyler Davis
+Vulkan compute shaders
Zachary Phillips
*SPIR-V
Samuel Morales
Because they're opaque, undocumented piles of shit that do fuck all else.
Cameron Wright
These are APIs exposing features, but they don't actually document what the GPU is doing, how, and in what order, or provide lower-level access to its internals.
Be autistic about it, but don't overdo the copypasta, pls. Here's hoping that meme magic exists.
Michael Anderson
No modern GPU has been VLIW since the Radeon HD 6xxx days.
Liam Rogers
OpenCL is dead, thanks to a concerted effort by Apple and Nvidia.
Jose Evans
There is no "integrated PPU", it's all done in the shader cores.
Andrew Reyes
Because for an average user the GPU is doing fuck all besides graphics. GPU offloading hasn't really caught on outside some very specialised domains, and GPU-accelerated programs are still a very rare breed, unless they're doing graphics, of course.

There's also still lots of fixed-function hardware in a modern GPU: rasterisation, texture sampling and filtering, ROPs...

As for the "shader" naming, that's pure DirectX nonsense. OpenGL has called them "programs" since the very beginning. Of course, since normies are buying GPUs purely for DirectX gaymes, DirectX terminology is what vendors end up using to describe their hardware in marketing materials.
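For reference, a rough sketch of the GL naming I mean (assuming a current GL context, a loader like GLEW providing the GL 2.0+ entry points, and GLSL sources in vertex_src/fragment_src, all names made up here): the object you actually link and bind is a "program"; "shader" only names the per-stage source objects attached to it.

```c
#include <GL/glew.h>   /* assumes glewInit() has been called with a current context */

/* Illustrative only: build a GL "program" from two shader stages.
 * Error checking omitted for brevity. */
GLuint make_program(const char *vertex_src, const char *fragment_src)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(vs, 1, &vertex_src, NULL);
    glShaderSource(fs, 1, &fragment_src, NULL);
    glCompileShader(vs);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();   /* the object GL itself calls a "program" */
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    glDeleteShader(vs);                /* flagged for deletion once no longer attached */
    glDeleteShader(fs);
    return prog;                       /* bind with glUseProgram(prog) before drawing */
}
```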
Hunter Carter
There's still a PhysX controller that uses the shader cores, though.
Austin Howard
Isn't writing directly on the metal for individual GPU dice a fucking terrible idea in principle? Having a better idea of how a badly written driver might be fucking up is fine, but shipping binaries optimized for particular GPUs absolutely isn't.
Pretty sure even the likes of Excel are exploiting GPGPU for parallel compute.
Gabriel Sanchez
Pretty sure only LibreOffice Calc is doing that, and then only for very large sheets that a normie will never encounter.