Ooftel

Let's take a moment to remember the good times shall we?
My last decent i7 was a 920 that OC'd to 4GHz. I miss it, but this makes me glad I went Ryzen 2700X
WTF Intel? The i9-9900K is a 5GHz housefire that's double the price for, I bet, not even 10% more performance

Attached: oi_vey.png (645x403, 195.38K)

...

No thanks, if you're on the internet you're already pozzed, ME/PSP free CPU or not.

Enjoy those ALU clusters. You could have at least recommended a Phenom II X6.

Intel fugged up and doesn't have a new architecture ready, so they're just pushing their old one until it melts down. All engineering effort shifted to the ME and spying.

I wonder how bad the 2020 14nm stuff will be?
A 6GHz 28-core meltdown?

How much efficiency would you get if you hooked up your water cooling system to a generator to feed back the recovered energy?

Not hot enough for a meaningful thermal gradient. The only efficient way to recover it is if you live somewhere cold enough to benefit from a space heater.

Attached: weather-uk-vs-canada-vs-austra-4550.png (757x1146, 951.87K)

Ausfag here, 20-30 is perfect
40 is OK, but we get 45-50 here, it's fucked

You probably meant FPU. There's one FPU per 2 cores, but each core still has its own 2 ALUs.

Yes, and that's why it lacked floating-point performance: the cores share the cache, FPU, and frontend.

Not even with a low boiling point liquid shooting a steam jet into a small turbine?

That sort of thing only really works when there's a wide gradient between source and sink, even if the absolute amount of energy is tiny. A liquid-cooling loop, however, usually sits just 10°C above room temperature.
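
To put numbers on that: the Carnot limit bounds how much of the waste heat any engine could possibly recover, and with a loop only ~10°C over ambient it's tiny. A quick sketch (the 35°C loop / 25°C room figures are assumptions for illustration):

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical maximum fraction of heat convertible to work,
    for a heat engine running between two temperatures (in Celsius)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# a water loop ~10C above a 25C room
eta = carnot_efficiency(35.0, 25.0)
print(f"{eta:.1%}")  # ~3.2% -- and that's the theoretical ceiling
```

Real thermoelectric or small-turbine setups get only a fraction of that ceiling, so you'd be recovering a percent or two of the heat at best.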

Attached: 76096_1000x1000.jpg (1000x1000, 96.61K)

neato

About 30%. If you use solid state thermal generators you can recover enough energy to power your faggy LEDs.

Guess what, floating-point performance is less important than integer now that GPUs exist. Honestly, I rock an 8350 and I've never been limited by it (well, except in x265, since the 8350 doesn't have AVX2), and the only thing worrying me a bit is RPCS3.

Guess what that CPU sucks ass
T. Ryzen

Floating-point performance is pretty goddamn important for modern-day vidya. People stopped using integer math for video games over a decade ago. For no good reason, I must say, but still.

Because most devs are lazy fucks and use the physics equations 1:1, without ever rescaling them to spit out milli-something or micro-whatever so they'd work with integers.
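
For the curious, here's a toy sketch of that rescaling idea: store positions and velocities as integer micrometers and the physics tick never accumulates float rounding drift. The tick rate and units below are my own assumptions, not anything from a real engine:

```python
# Toy fixed-point physics: every quantity is an integer count of micrometers.
SCALE = 1_000_000           # 1 meter = 1,000,000 um
TICKS_PER_SECOND = 100      # assumed fixed tick rate (10 ms per tick)

# gravity 9.8 m/s^2, converted to "velocity change per tick" where velocity
# is stored as um-moved-per-tick: dv = a * dt^2 = 9.8 * 0.01^2 m = 980 um
GRAVITY_PER_TICK = -(98 * SCALE) // (10 * TICKS_PER_SECOND ** 2)  # -980

def step(pos_um: int, vel_um_per_tick: int) -> tuple[int, int]:
    """Semi-implicit Euler in pure integer math: exact and reproducible."""
    vel_um_per_tick += GRAVITY_PER_TICK
    pos_um += vel_um_per_tick
    return pos_um, vel_um_per_tick

pos, vel = 0, 0
for _ in range(TICKS_PER_SECOND):   # one second of free fall
    pos, vel = step(pos, vel)
print(pos, vel)  # -4949000 um (~ -4.95 m), -98000 um/tick (~ -9.8 m/s)
```

The results are bit-exact on every machine, which is also why lockstep multiplayer engines liked fixed-point before floats took over.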

This, intel cpus have been doodoo for at least 12 years now.

because devs don't understand computers and think they can just use doubles for everything

...

you don't need a housefire to reach 5GHz with Intel's cores, just a moderately competent cooler. I bet that part will have a double-digit TDP.
...yet barely more expensive than yesteryear's 4GHz quad-cores. Thanks, AMD!
It'll get much more than that from clock speed difference alone (I'd expect it to sustain >4.4GHz under realistic non-AVX all-core loads), not even counting the IPC advantage over Ryzen. Then again, it's more expensive than a 12-core 'ripper.
All in all, Intel's gonna squeeze, and dumb buyers are gonna pay. This price-gouging shitshow has been happening ever since the first Pentium was a thing, and probably even earlier.
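
Back-of-the-envelope for that performance claim; the clock and IPC figures below are illustrative guesses, not benchmarks:

```python
def relative_perf(clock_a: float, ipc_a: float,
                  clock_b: float, ipc_b: float) -> float:
    """Crude single-thread performance ratio: clock speed times IPC."""
    return (clock_a * ipc_a) / (clock_b * ipc_b)

# hypothetical numbers: 9900K sustaining 4.7GHz all-core with a ~5% IPC edge,
# versus a 2700X at 4.0GHz
ratio = relative_perf(4.7, 1.05, 4.0, 1.0)
print(f"{ratio:.2f}x")  # ~1.23x
```

Which is why "not even 10% more performance" undersells it on per-core speed, even if the price still doesn't make sense next to a 12-core Threadripper.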

PCs were stupid expensive until the early 90s, when the price for a very average machine dropped to about $1500-3k, so similar to what we have now

Laptops were stupid expensive in the early 00s too. Saw an advert for an Inspiron 4100 with the proof of sale from 2001: 5800 guilders back then, or $3050 in today's money, and that for a middle-of-the-line Pentium III model with an XGA screen!

Apparently the Precision M40 I have was even more expensive, at around 10k guilders

Mmm tell me about it.
The PC home market is shrinking as well; mobiles now own most of the gaming media and general application use, since nobody can be fucked putting their phone down to use a PC anymore.
That said, even PC tech is slowing: we've reached the logical ceiling of what current 7-1nm silicon can do, and AMD went MCM early with CPUs; the next step will be GPUs, and now with RTX it's gonna be a fun couple of years.
Can't be fucked buying a new PC every year again, though

Depends on what you mean by "PC". If it was 32-bit (Mac, Amiga) or a high-end 16-bit (386), you could easily blow $2k-$8k on a complete system, but many 8-bit (C64, TRS-80) and some 16-bit platforms could easily be had for $50-$500 from the 80s into the early 90s.

Aside from the extremely stagnant nature of many lower-end platforms (the Apple IIe and C64 were both in continuous production and active support for over a decade), this was largely due to the fact that an interlaced SDTV's composite/RF input, simple audiocassette/floppy storage, and sometimes even an electric typewriter for a keyboard/printer in the early days, could be used as PC accessories.

Once a dedicated high-res p-scan PC monitor and dedicated HDD became mandatory in the mid-90s, the price floor catapulted north of $1k until the advent of eMachines/Gateway/etc. in the late-90s, and didn't truly drop until netbook/nettops hit the market in the mid-00s.

...

I swear, hipsters are going to start defending NetBurst and Itanium some day.

Do you have any argument, anyway?

NetBurst and Itanium didn't even have niche use cases. They were inferior in every way.

Bulldozer was a turd, no doubt, but Piledriver had a few niche uses where it was a very good value, like compiling tons of software, rendering, transcoding video. You definitely got what you paid for, and single thread sucked, but at least it tried to offer something unique for the price.

NetBurst was just "we'll do 10GHz in 5 years", and Itanium is just another "Intel can't make a successful product besides an x86 CPU" situation.

Name batter botnetless CPU.

user...

Every piece of turd can be "a very good value" as long as it's useful for something and the price is low enough. That Intel has a habit of pricing its turds as if they're made of gold is a completely separate matter from their turdiness.

Well, to be absolutely honest, the last two iterations were made very power-thrifty in lower frequency ranges, making them very capable mobile parts. AMD managed to squeeze four cores with decent (>2.5GHz) sustained frequencies into "ultrabook" ~15W thermal envelope, where Intel had only

I'm a pacifist, I don't batter CPUs. Especially with names.

What the fuck is Intel even doing nowadays? Next to me I have a Northwood Pentium 4 with a copper-cored stock cooler that's just loosely sitting on the CPU, held on only by gravity, and at idle with a 1.6V vcore it only runs at 27°C/81°F

Meanwhile, my main computer with a Haswell i5 overclocked to 4.1GHz and a Corsair water cooler idles at 41°C, and that's with only 1.2V. How did everything go so wrong that a CPU known for running hot runs cooler than one ten years newer?

AMD isn't any better with voltages

But it still uses fluxless solder instead of gay cum under the spreader.

AMD is annoying in that they always try to hide their idle power numbers, and when they do make something with potentially low idle power, they make it vendor-only and pretty much unavailable to buy anyway

I really like my Ryzen 2400G; it works great in Linux, but the idle power draw is still 25-30 watts. I might have considered one of those V1000-based SoCs if they had actually wanted to sell them to anyone besides poker-machine makers (and if they had low idle power draw)

If your idle power is that high, then something's wrong with your idle states. Does your CPU drop frequencies at all?

i think it's just the chipset

The cpu only goes as low as 1400-1500mhz on the 2400G as well

This is measured from the wall socket too, not from some reading inside the operating system that you can't trust

If you're measuring at the wall socket, you have to take into account that your PSU is most likely operating at some ridiculously low efficiency at idle loads. Even the 80 Plus Titanium rating only specifies efficiency starting at 10% load.
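
Concretely, low-load inefficiency inflates the wall reading; a sketch (the 60% figure is an assumed worst-case idle efficiency, not a measurement of any particular PSU):

```python
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """Watts pulled from the socket for a given DC load at a given
    PSU conversion efficiency (0 < efficiency <= 1)."""
    return dc_load_w / efficiency

# a 25 W DC idle load on a PSU that's only ~60% efficient down there
print(f"{wall_draw_w(25.0, 0.60):.1f} W at the wall")  # ~41.7 W
```

So a chunk of those 25-30 "idle" watts may be conversion loss in the PSU rather than the SoC itself.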

This is why I'll probably never buy a desktop PC again. I currently use a Sandy Bridge laptop that idles below 10 watts, and that's including the screen. There's no way a desktop PC can achieve this, but I'd love to be proven wrong.

Did they just axe the i3?