What's the point of this? Is this just marketing bullshit to attract normalfags?

Attached: Capture.PNG (711x455, 242.67K)

it's thinly veiled gay interracial symbolism

yes

Wireless charging doesn't work worth a shit as is, and now they're using something with that limited a power supply for it? Dumb as hell, bullshit marketing gimmick. I also wonder about heat dissipation with this.

We've arrived at 7nm with the Snapdragon 855.
Moore's law is as good as over for mobile, and will be for desktop soon this year with Zen 2.
Companies need an unending supply of gimmicks to keep phones marketable now, especially since phones have reached the commodity threshold.

how would those ARM chips perform with proper cooling? they must be really limited by the passive cooling in those bricks

I hope this is true, because if it is it may mean that they'll stop making Retardware and Defective-by-Design shit and people will have to learn to actually use their computers rather than throw them away every 3 years to play with "new" features.

They'll start using quantum computers for phones with thin clients as the next big gimmick.

can't you just make them bigger? if they can make that tiny CPU, then a die 20cm in every direction could have many more cores than these current things

sir please, apply to MIT right nw, pls , even , u may get honorary degree,,,

that reminds me of this

Attached: docking.jpg (474x355, 16.49K)

it's probably not easy, but it's something they will have to consider eventually. those new AMD CPUs are already much bigger than most have been

breddy hot tbh

god dammit

Attached: Awful-vifYelSTlMo.webm (640x360, 1.41M)

I'm not sure you understand how a CPU works. It's not like RAM where you have a bunch of replicated components and can just expand it. Simply making it bigger doesn't make any sense at all. There's a reason they've been focusing on shrinking the lithography instead of making CPUs physically bigger.

oops

Intel will experiment with 3D stacking for the CPU, just like HBM memory does.
But this too has a limit, especially considering the size constraints of phones and the amount of space it would take on the motherboard - everything would have to become big and fat, contrary to the small, slim, light trend we're having.
We would basically reverse the downscaling we've done over the decades and return to room-sized computers.

Unless they discover something. They've known this moment would come for a long time, maybe two decades or more, and have been researching ever since.
Sadly, they're hopeless and have nothing.
Quantum computing is a meme, btw.

Oops, forgot the link.
arstechnica.com/gadgets/2018/12/intel-introduces-foveros-3d-die-stacking-for-more-than-just-memory/

And what's the problem?
That the faggots dream of all software being written in Java will never come true?
I'm actually quite happy with this.

We're already almost there. What local processing do most phones do these days?

We're already moving in that direction. Pic related was just announced.

Attached: energizerPhone.jpeg (728x485, 38.87K)

It doesn't have a fucking headphone jack. Pathetic.

Attached: 720d2f196c612a0202202ead2bdbee43e5cf369f1ec0cf1cf00d1f354f34604d.webm (800x450, 5.6M)

The fastest ARM cores perform similarly to Intel's Core m chips given the same power constraints.


You're fucking retarded if you think the size of the phone is what constrains 3D silicon.

This is exactly what multicore and SMT were supposed to do. Problem is, Intel and the rest of the hardware industry have been trying to force-feed multithreading down the throats of stodgy programmers for over a decade now, without any real success. Much like the slooooow process since the early 90s of forcing programmers to make use of vector coprocessors (or, more recently, GPGPU offload), 90% of software is going to remain completely useless on a highly multicore system for the foreseeable future.
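A rough Python sketch of Amdahl's law shows why those extra cores go to waste; the 90%-serial figure is illustrative, not a measurement of anything:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup when only part of a
    program can run in parallel across n_cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A program that is 90% serial barely benefits from more cores:
for cores in (2, 4, 16, 64):
    print(cores, round(amdahl_speedup(0.10, cores), 3))
```

Even with 64 cores, a 90%-serial program gets barely a 1.1x speedup, which is the whole problem with untouched legacy software on multicore chips.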

Attached: Childreneatbroccoli.jpg (400x892 27.38 KB, 17.36K)

Can't wait for bag phones to become a thing again.

Stop doxxing yourself you kinky woman. Don't you have a Monday class to prepare for? ;^)

Moore's law can continue with efficient programming. Look at what was possible with a mere 64 KB and realize the amount of resource waste going on now.

(in terms of task efficiency gains)

You're the retarded one for not understanding that I was talking about supersizing things in general.

Yes, we could say that hardware Moore's law has ended, but I truly hope we'll be entering an era of constant software optimization.

I have high hopes for Google Fuchsia, botnet aside, to kickstart a new wave of operating systems.

Also, Zen 2 is only partially 7nm, so we'll have to wait until next year's Zen 3, which will probably be fully 7nm. Next year promises DDR5, AV1 hardware acceleration, and the new AM5 socket as well.

Rape that girl!

I have my doubts that a 20-year trend of bloat is just going to go away.

It does for safety reasons. In case the circuitry fucks up due to some moisture it won't fry your brains out.

Can't we just write graphical operating systems in pure assembly, like what Terry did?
Also, I hope we don't get into WebAssembly anytime soon.

Soon

Attached: ky38_manpack_on_man.jpg (397x500, 92.02K)

They are going to have fun with heat dissipation.
The upper CPU would be fine, but the lower layer won't have anywhere to remove heat correctly.

That's a futile battle. GCC is a top tier assembly koder.

We have been doing that for over 50 years. It's called compiling.

That's just another way of saying install gentoo

Talking about optimizing the crap out of current hardware is pointless when most "programmers" don't get simple things like "it's not necessary to redraw the screen hundreds of times a second when nothing is changing".

Yeah, pretty retarded. Even if wireless weren't wasteful it isn't like it's hard to find someone with a charger.

Moore's law is about how much shit you can put on a chip, not how much it can do, so no amount of "efficient programming" will change it.

What for? Is there some part of your OS you feel is too slow due to the CPU?

No, it is about the price of transistors.

Lol, this is why I stopped using a smartphone a long time ago; they're getting worse and worse.

anyone care to shoop the brick as this phone?

Attached: protester-throwing-brick-©-Isopix-REX.jpg (700x455, 118.47K)

How do you communicate with your friends?

I'm going to pretend for a moment that you are actually retarded and aren't just being facetious for giggles.
You can talk to them in person. You can text and call on a basic phone. Send an e-mail. Social media via computer. Please tell me you aren't so retarded you think a smartphone actually makes it all that much easier to stay in touch.

No, the reason you can fit more transistors is technological, not economical.

This. I don't know what those tards are talking about. The whole Moore thing was:
for the same area of silicon, you could fit double the number of transistors every ~2 years, because they kept shrinking to about half their previous size, and each shrink enabled building even smaller ones.
But there's a physical limit to that.

This is correct. You can double the number of transistors in a given area every 18 months or so. It's slipped closer to 2 years.
But this has never meant "double" the performance. There are many other design factors involved.
Voltage
Clock
Arrangement
That last one is big. Dedicating chunks of space to dedicated circuits for calculating specific things is a trade-off with the others
Every CPU is a series of trade-offs
But over time we've been able to design both faster circuits at the same clock, circuits with lower power draw, and circuits that can clock higher
Modern CPUs are combination of all these technologies
But they all rely on 2D lithography to build them
And transistors can only be built so small
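The doubling arithmetic above fits in a couple of lines; the starting count and doubling period here are illustrative, not a claim about any real chip:

```python
def transistors(start_count, years, doubling_period_years=2.0):
    """Project transistor count under idealized Moore's-law doubling."""
    return start_count * 2 ** (years / doubling_period_years)

# Starting from 1 billion transistors with a 2-year doubling period,
# 10 years of doubling gives 32 billion:
print(transistors(1e9, 10))
```

Note this only projects the transistor count; as the post above says, it has never meant "double" the performance.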

Even if we could have a transistor be the size of a single atom, there's a limit there. And realistically you need "walls" that are many atoms thick.
We'd need to tame quantum mechanics (in at least a limited form) to go any smaller
Even then, how easily can we manufacture hardware like this?
Not very

Intel has a lot of intelligent people working in their labs and they are doing the best they can to continue performance gains
But I for one am looking forward to the return of large, bulky machines and concerns over performance
If every developer today put more effort into optimizing their code then we'd probably save billions on electric bills each year, cumulatively; so many cycles are wasted brute-forcing a problem with languages that have massive overhead
Of course, maybe (((they))) will just insist we use phones as ultra-thin clients (literally) and offload all processing to server farms where we won't have to see ugly, bulky hardware.
And the masses will hand their phones in for a glorified Remote Desktop session so that Apple and Google can just stream their device to them

Attached: 19c85ed0e5ae028fbee278903b0bf7cbab6dab1cb173fb576bdb1e433df85456.jpg (400x300, 25.1K)

I wasn't commenting about the amount of transistors on a chip. I was correcting you that Moore's Law is about the price of the transistors and not about the amount of transistors on the chip.

Wrong, that is something different. Moore's Law is about the price of transistors.

/thread
/board/
universe

No it isn't. Go look it up, takes 5 seconds.

Okay, looks like I was mistaken. It seems both are called Moore's Law: more transistors on the chip, and those transistors becoming cheaper.

I can top that.

Attached: silicon injections.webm (864x480, 1.91M)

I am all at once confused, disgusted, and curious.

Attached: 1509324482933.jpg (1148x746, 147.3K)

Why do you have that saved?

Not enough people know about Silicon Injections. Pretty hot, tbh.

Attached: panting.gif (450x250, 1.97M)

Putting aside whether quantum is a joke or not, this actually is what is happening with quantum computers. Have you seen the IBM Q System 1? It's absolutely ridiculous. Dumb though, even slight vibrations render it unusable, and they're dumping a fuck ton of money into hiring design firms to make it look pretty. Retard priorities.

could become a cute mating ritual

I'm disgusted. It's called a silicone injection, not silicon you illiterate faggot.

He has the freedom to do so, that's why you redditfag.

You need to kys.

Why not just use nanoscale vacuum-channel transistors?
They're fast as fuck and much more heat+overvoltage resistant than semiconductors.

do you even know how vacuum tubes work?
electric fields, large amounts of dissipated heat, etc etc

He's not talking about traditional vacuum tubes though. He's talking about vacuum channel transistors which are their own beast entirely.

Why?

The charging dock for the 3DS seemed fine. It just used springs to push the pins into designated connectors when the console was placed on the dock. Why not use something like that instead of buzzword technology?

I guess in an emergency, if your friend has full power and you're about to die, you can transfer, and both have half and be able to contact each other. But other than that it seems counterproductive in a retarded way. "Drain your phone to charge theirs!" I can understand people using it in the first scenario on their own, but to tout it as a feature up front seems silly.

It's for people with friends and a social life, user. You don't have to worry yourself.

t. fresh off the boat from cuckchan
not even reddit is this retarded
now leave and promptly kys

Take your meds.

Top yourself.