End of Codec Wars?

What you said has no value until the encoder is optimized. I mean, come on, the bitstream spec was frozen only 4 days ago.

x264 is not that good, it's just the best because of compatibility.
4K is not a joke; human eyes can resolve detail up to roughly 7K IIRC, so 8K IS a joke and always will be.
But yeah, I'll admit the official encodes suck compared to actual torrent pros like Grym or RARBG, who can keep quality while cutting size by 17%. Depending on the movie it starts to fall apart below that, but I've seen good-looking 1080p BDRips at 4 GB; they're usually light on special effects, or cartoons.
x265 is slightly better, though not as good as advertised; maybe 33% savings tops (commands below if you want to test that yourself).
vp8 is a joke. It looks like ass even if you match the source bitrate or go above it, which makes me paranoid about AV1.
vp9 I haven't tried; I'm poor and I'm afraid my shitty PC will implode.
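If anyone wants to test those claims instead of trusting my eyes, a quick way is to encode the same source with each encoder and compare file sizes. A rough sketch with stock ffmpeg; source.mkv and the CRF values are just placeholders, and note that the x265 and vp9 CRF scales don't map 1:1 onto x264's:

  # x264 baseline for comparison
  ffmpeg -i source.mkv -c:v libx264 -crf 20 -preset slow -c:a copy test_x264.mkv
  # x265, CRF bumped a couple of points since its scale runs lower
  ffmpeg -i source.mkv -c:v libx265 -crf 22 -preset slow -c:a copy test_x265.mkv
  # vp9 constant quality mode needs -b:v 0; single pass so your PC survives
  ffmpeg -i source.mkv -c:v libvpx-vp9 -crf 32 -b:v 0 -c:a libopus test_vp9.webm

Then compare sizes and eyeball the grainy scenes; that's where cheap encodes fall apart first.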

The real fucking joke here is Level 6.2 for h.264. Who thought of that? Or did they make it for h.265 and then realize they could reuse it for 264?

I honestly don't know why everything doesn't just use ffmpeg as its backend for playing video/audio. It's good, and it just works. Every unix-like OS tries to do its own snowflake thing when they could all just use ffmpeg already.
This is where the real idiocy lies. If the ASICs were more versatile they could be adapted to a multitude of standards and do hybrid software/hardware decoding, which would be better overall.
What really happens is they either enforce a monopoly or they're only useful for 2-3 years, after which either software has caught up or everyone has moved to a new level/profile/whatever the ASIC can't touch, and it becomes dead silicon.
Hardware decoding ASICs are the graphics card version of "have a CPU with a trillion CISC instructions nobody ever uses and that you can't do away with because you need to maintain backwards compatibility for the nobodies"
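You can see how narrow that fixed-function path is on your own machine: ffmpeg lists the hwaccel methods it was built with, and you can force a decode through one of them. A sketch, assuming an ordinary ffmpeg build and an NVIDIA card; swap nvdec for vaapi, d3d11va, etc. on other hardware:

  # list the available hardware decode paths
  ffmpeg -hwaccels
  # push a decode through the ASIC and throw the frames away (pure decode test)
  ffmpeg -hwaccel nvdec -i input.mkv -f null -

If the codec/profile isn't in the silicon, the second command falls back to software decoding, which is exactly the obsolescence problem above.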


how about encoders?

Encoders are a different story, they're actually useful.
There's a capture card market, but there isn't a decoder card market. Encoding is usually too heavy to do well in real time, but the hardware encoders in GPUs nowadays can even do lossless video with decent compression, and they benefit from being integrated into the GPU.
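For the lossless part specifically: on NVIDIA that's NVENC, and it's one flag away in ffmpeg. A sketch, assuming an ffmpeg build with NVENC enabled; on newer builds that use the p1-p7 presets the spelling is -tune lossless instead:

  ffmpeg -i input.mkv -c:v h264_nvenc -preset lossless -c:a copy capture_lossless.mkv

Intel (h264_qsv) and AMD (h264_amf) expose their blocks the same way, with their own quirks.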

Well, I use Microshit's Windows, and while it's true that ffmpeg can do pretty much everything you need, it's really annoying to use; it's better to have three GUIs (XMedia Recode, MKVToolNix and HandBrake). It might seem like more of a hassle at first, but it's easier to spot and correct user mistakes.
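To be fair, the scary part of the CLI is mostly the encoding options; the bread-and-butter jobs are short. E.g. a remux, the kind of thing you'd open MKVToolNix for, with made-up filenames:

  ffmpeg -i input.avi -c copy -map 0 output.mkv

No re-encode, so it finishes in seconds and can't hurt quality.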

You mean workstation GPUs?

The whole reason h.264 was created was to produce better results for low-bitrate video. Any other debatable improvement is just a bonus. H.265 at least adds something meaningful for high bitrates: HDR and 8K support. But h.264 is only good for collecting royalty fees when the video is used commercially.
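And low bitrate is exactly where two-pass encoding earns its keep, since the encoder can budget bits across the whole file instead of guessing scene by scene. A sketch with an arbitrary 500k target; use NUL instead of /dev/null on Windows:

  # pass 1: analyze only, no audio, output discarded
  ffmpeg -y -i input.mkv -c:v libx264 -b:v 500k -preset slow -pass 1 -an -f null /dev/null
  # pass 2: the real encode, using the stats from pass 1
  ffmpeg -i input.mkv -c:v libx264 -b:v 500k -preset slow -pass 2 -c:a aac -b:a 96k output.mp4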

SVG is a piece of shit format.

ffmpeg is also a library for encoding and decoding.
A lot of video players like mpv and VLC use it as a backend, and the user never needs to touch the terminal program. What I'm saying is that instead we have whatever backwards way codecs work on Android, garbage like GStreamer, and developers manually implementing every single codec they want their program to support.
When "one thing for everything" is a really fucking dumb idea, like with systemd, it happens anyway; but when there's one true answer in a sea of trash, as with ffmpeg, it doesn't become the norm. It's a fucked-up world.
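The library point is easy to demo: anything linking libavformat/libavcodec gets probing and decoding of basically every container and codec for free. That's all ffprobe is, a thin shell over the same libraries the players use. Hypothetical file name:

  ffprobe -v error -show_entries stream=codec_name,codec_type,width,height -of default=noprint_wrappers=1 input.mkv

That one library surface is what Android, GStreamer and the roll-your-own crowd keep reimplementing.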

No, I mean every AMD/Intel/nVidia GPU from the last 5 years.
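Easy to check what blocks your particular card exposes, assuming your ffmpeg was built against the vendor SDKs (swap grep for findstr on Windows):

  ffmpeg -hide_banner -encoders | grep -E "nvenc|qsv|amf|vaapi"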