Redesign Computing From Ground Up

What would computing look like if we redesigned everything from the hardware up, with no attempt at backwards compatibility? Reinvent computing in 2018.

Attached: images (1).jpeg (195x260 44.66 KB, 16.19K)


A dead project.

Mill CPU

There wouldn't be an OS and the hardware itself would handle things like scheduling, memory management, garbage collection, I/O, virtual memory, files, and access control. There are also HLLCAs (high-level language computer architectures), a non-von Neumann approach that actually executes the source code of programming languages: the CPU has components for parsing statements and expressions instead of decoding instructions. There are books on this from the early 80s describing what was supposed to be the future before x86 and RISCs took over everything.

In an alternate universe where these designs became popular a long time ago, we wouldn't even be able to imagine what their 2010s computer would be like. The most recent thing I can find is TIARA from MIT, which is a descendant of the Lisp machine, but that looks like it was abandoned (or the NSA wanted to keep the technology out of the public's hands).

That's the dumbest shit I've read on Zig Forums in my entire life. The hardware does all that shit? Do you have even a modicum of knowledge of the complexity of these subsystems you want to implement in unmodifiable silicon?

I'd turn the 0s and 1s into 0s and 8s, just because it would make AI more plausible.

0 8 88 8888 etc

I don't know what it would look like, but for sure the hardware would give the user absolute control, such as the ability to halt execution, inspect & modify registers & memory, set breakpoints and step through... all done at the hardware level, and at the most privileged level (everything else would be less privileged, including any firmware).

You can already do that

If you enjoyed Intel CPU bugs that don't implement an OS, you will thoroughly enjoy your CPU bugs that implement a full OS.

Not if you're using current-generation CPUs. There'll absolutely always be black boxes with higher privileges than you.


Attached: squidward future.mp4 (320x240, 656.34K)


Use trinary just to see what happens

Everything is reconfigurable configware/flowware running on FPGAs.

I think there was a Soviet computer that did that.


The Soviets played around with trinary for a while; it could have some interesting benefits over binary, but the hardware issues were tougher, so they copied the West instead.

Is trinary not what quantum computing is?


Quantum computing has three states. Trinary computing has three states. How are they different?

Instead of 0s and 1s you have -1, 0 and 1. Immediate benefits appear, such as not needing to sign numbers with tricks like we use now, and it seems that code density should improve somewhat as well. You could also implement more flags and features in a given trit-width architecture, for instance.
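The sign-handling claim can be sketched in a few lines of Python (a toy balanced-ternary encoder; the function names are made up for illustration, this is not from the thread):

```python
def to_balanced_ternary(n: int) -> list:
    """Encode an integer as balanced-ternary digits (-1, 0, 1),
    least significant digit first. No separate sign bit is needed:
    negating a number is just flipping every digit."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:              # digit 2 becomes -1 plus a carry
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits

def from_balanced_ternary(digits) -> int:
    """Decode least-significant-first balanced-ternary digits."""
    return sum(d * 3 ** i for i, d in enumerate(digits))
```

So `to_balanced_ternary(5)` gives `[-1, -1, 1]` (i.e. 9 - 3 - 1), and flipping every digit yields -5 with no two's-complement-style trickery.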

What if current day quantum computers are just tricky forms of trinary and it's all snake oil? I wouldn't put it past (((them))) after knowing what we know now.

Doing some Wikipediaing, the number of states goes up the more qubits you add. So a 1-qubit quantum computer has three states, but a 2-qubit one would have 4 states and a 3-qubit one 8.

I kind of had branching fractal ideas that would replace the von Neumann architecture back in my edgy "im cool mom, fug off" phase, but I kinda forgot about them and lost the notes.

Would like to see some new type of architecture tho. Von Neumann is too rigid in my opinion.


I'm not claiming to know anything about quantum computers, but I don't think a superposition is a third state. It just means the state of the bit is uncertain, with some probability associated with it. When you have 2 bits, they can be in 2^2 = 4 states (00, 01, 10, 11). When you have 2 qubits there are still these 4 states; the difference is that you can only manipulate the probability that the qubits are in one of those states. I think.
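That counting argument is easy to check numerically. A rough sketch in plain Python, no quantum library (the equal superposition is just an example choice):

```python
import itertools
import math

def basis_states(n_qubits):
    """All classical basis states of n qubits -- exactly 2**n of them."""
    return [''.join(bits) for bits in itertools.product('01', repeat=n_qubits)]

# A 2-qubit register has 4 basis states, same as 2 classical bits...
states = basis_states(2)   # ['00', '01', '10', '11']

# ...but a quantum state assigns each basis state an amplitude whose
# squared magnitudes must sum to 1 (here: an equal superposition).
amps = [1 / math.sqrt(len(states))] * len(states)
probs = {s: abs(a) ** 2 for s, a in zip(states, amps)}
assert math.isclose(sum(probs.values()), 1.0)
```

There is no fifth state hiding anywhere; what changes is the probability weight spread over the same 2^n outcomes.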

Yeah, and it didn't do anything spectacular compared to binary. It just stores data in a different fashion, and all that does is use fewer digits per decimal place or text character. That is all.

Ternary, not "trinary", you faggot = 3 possible states: -1 = OFF, 0 = mid-range (e.g. 5v), 1 = high-range (e.g. 10v)

Quantum = 2 possible states: 0 = OFF, 1 = ON

In a quantum computer, however, every bit is both 0 and 1 at the same time, undetermined until observed, ala Heisenberg's uncertainty principle

wow I butchered that.
Every qubit in a quantum computer is both 1 and 0 at the same time. This allows you to perform certain functions, like prime factorization, extremely fast, as you can compare outputs a factor of 2 quicker per qubit

the bigger question is whether or not the quantum computer will come with ME or PSP

Why not both?


Replace von Neumann with the Harvard architecture, or make purely functional CPUs like Urbit but in hardware and less kooky

how about no?

I binged on the lecture videos about the Mill CPU recently; they gave me an immense tech-boner, but it's still very much based on being backwards compatible.

Curious how long it would take for an original x86 type design.

Analog CPUs ftw.

That's because the world of business sits upon old but good working software that long ago lost its parent company, let alone its source code, and cannot be ported to any other platform, so they stick with hopelessly obsolete OSes and hardware just to keep using it. And it's not your cookie-cutter office-suite type of software either; it's all purpose-built, so you can't just replace it with something else.

Intel does implement a full OS, and apart from a few bugs it’s been running smoothly in tens of millions of devices for literally decades.

Zilog processors have been running in tens of billions of devices for a few more decades than Intel's, and never had any problems. You can just admit that Intel does a substandard job at CPU design.

This was new to me. I'm familiar with VLIW as used in the Itanic architecture. How does Mill differ? Is it VLIW done right?

I note one advantage of VLIW is that it eliminates speculative execution and hence is immune to Spectre vulnerabilities.

And no. Ternary means that instead of 0 and 1 there is 0, 1 and 2, thus enlarging the informational capacity and the speed of processing said information.

Basically, with the same amount of transistors and a little bit of electrical-engineering trickery you could speed up your processor. Thing is, this shit needs to be built from the ground up, which is why nobody fucked with ternary but the Soviets in the 70s (and they only made ternary computers transistor by transistor, not this silicon-die shit)

It's basically VLIW with variable-length instructions. ~33 instructions/cycle/core for a 32-bit design as of 2013.

Fractal computation? What does that consist of?

It's not faster. A binary computer evaluates data bit by bit; a ternary computer evaluates data trit by trit. You can scale the register size however you want to process more data units at a time, but the principle stays the same. Using wider words is only a speedup in the sense that a computer only capable of short words needs multiple operations to evaluate a long word. Using wider words doesn't allow you to compute multiple short words at a time; you need vector processing for that, and again it scales the same way. It has nothing to do with the type of data storage. Nobody makes ternary computers because nobody can get transistors to work well in linear mode under such extremely tight constraints and huge voltage tolerances; it's hard enough to make them work in binary.

The Soviets claimed that they were getting better state-transition speeds with the ternary transistors. The Americans couldn't replicate it, then spent billions of dollars making fast binary transistors. One of the reasons it wouldn't be faster today is that ternary transistors are still stuck in the 70s/80s.

Just passing by to say that quantum computing will never happen, because the theory behind it is wrong and therefore so are its predictions.
That must be why all the hype so far has been from simulations and nobody has actually managed to build one.

The Soviets claimed a lot of horseshit. Anyway, to get a transistor to work with ternary data it needs linear mode, but at current scales quantum effects dominate the physical process of transistor operation, making it difficult to distinguish between two opposite states, let alone tell some third state in the middle.

Also, in linear mode transistors' efficiency plummets and they start dissipating a shit ton of heat while drawing a shit ton of power. Hence transistor-based power supplies all work in PWM mode, not linear mode. For all the drawbacks of power only going either zero or full blast, it beats using 10x as many transistors and batteries/electricity and putting huge radiators on them to boot.

Enjoy the safety of a dual security system.* (*At the cost of 1970s performance.) Innovation that excites and is more diverse

Nigga what, they are real. Just look up D-Wave!

He's suggesting D-Wave is just using a simulator.

I want to redesign the entire world and be paid in food, clothes and a room to live in.

Everything else is BULLSHIT.

You dumb faggot
A trit has more information stored in it than a bit
It takes 2 trits to store around 3 bits' worth of information
Dumb cuck

Also, linear mode is not needed
Only negative and positive voltages

t. EE student with uni project based on ternary computers
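For what it's worth, the information-capacity arithmetic here is easy to check. A quick sketch (not from anyone's uni project):

```python
import math

bits_per_trit = math.log2(3)            # ≈ 1.585 bits of information per trit
two_trits_in_bits = 2 * bits_per_trit   # ≈ 3.17, i.e. "around 3 bits"

# trits needed to cover the full range of an 8-bit value (0..255):
trits_for_8_bits = math.ceil(8 / bits_per_trit)   # 6, since 3**5 = 243 < 256
```

So "2 trits ≈ 3 bits" is roughly right (slightly more than 3, in fact), and a full byte needs 6 trits.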

This is why, even though you are somewhat intelligent, you will never look further than an "I know best" attitude. Your potentially bright future will be denied by your bloated ego.

If you looked up the maths behind ternary computers then you would see what I am talking about
go search for the texas masters paper on ternary computers

Shiggy the diggy

Marketing brands

Dwave is not a quantum computer.
IBM Q is a research product, it's not something that you can just buy and run turnkey.

And just how do you drive a transistor with negative and positive voltage, to get it to output negative or positive voltage?

And decimal has more information than both, it takes just over 2 trits to store one decimal place. Your point is fucking retarded and is not relevant.
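The "just over 2 trits to store one decimal place" figure checks out; a quick sanity check, not anything posted in the thread:

```python
import math

# digits of base b needed per decimal digit: log base b of 10
trits_per_decimal = math.log(10, 3)   # ≈ 2.096, i.e. "just over 2 trits"
bits_per_decimal = math.log(10, 2)    # ≈ 3.322 bits per decimal place
```

By the same logic decimal "beats" ternary, which is the point being made: higher radix alone isn't an argument.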

Draw a diagram of the operation, blanket explanation will not suffice.

Dumb nigger

Moving the goalposts
Also, yeah, D-Wave is quantum

I won't teach you the basics of electrical engineering, but here is a pic of a ternary NOT gate for you, dumbo

You do not know how decimals in binary computers work
Or what the informational capacity of a trit is

Attached: glXad.png (600x488, 13.41K)

Also, here is truth table for ternary not gate because you are an idiot

Input:  -0.5v   0v   0.5v
Output:  0.5v   0v  -0.5v

Good. Now show me how you pack this gate into millions of transistors.

Here is further explanation for your dumb ass
M2 is a p-channel MOSFET
M1 is an n-channel MOSFET
M2 connects the output to 0.5v when its gate is lower than its source (i.e. -0.5 < 0.5)
M1 connects the output to -0.5v when its gate is higher than its source (i.e. 0.5 > -0.5)

If the input is 0 then you have a direct line between 0.5v and -0.5v
If the input is left hanging (high capacitance), the resistors make a voltage divider with 0 on the output
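A behavioral model of the gate above is trivial; here's a sketch in Python using -1/0/1 to stand in for the -0.5v/0v/+0.5v rails. The min/max convention for ternary AND/OR is a common textbook choice, assumed here, not something shown in the posted schematic:

```python
LEVELS = (-1, 0, 1)   # -1 = -0.5v rail, 0 = 0v, +1 = +0.5v rail

def t_not(a: int) -> int:
    """The inverter above: balanced-ternary NOT is plain negation."""
    return -a

def t_and(a: int, b: int) -> int:
    """Ternary AND modeled as min (textbook convention, assumed)."""
    return min(a, b)

def t_nand(a: int, b: int) -> int:
    """NAND = NOT(AND)."""
    return -min(a, b)

# matches the posted truth table: (-0.5v, 0v, 0.5v) -> (0.5v, 0v, -0.5v)
assert [t_not(x) for x in LEVELS] == [1, 0, -1]
```

This only models logic levels, of course; it says nothing about the transistor-count question argued below.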

NOT gate is trivial, post XOR and NAND.

You have no idea what you are talking about, do you?
Do you know what planar technology even is?
Do I need to explain to you how silicon dies work? Wtf? Are you this pompous and dumb at the same time?

There is a whole book on maths and electrical circuits of logic gates of ternary computers
You do not want to know about it, you want to win this discussion
You are too lazy to learn for yourself, I am too lazy to copy paste shit for you

Opinion discarded.

Yes Yes Yes No

Nice non-arguments, with non-knowledge.

Actually never mind, 3 seconds on Google revealed that you yourself pulled that shit from Google. Here's an AND gate. Evidently it takes twice as many transistors to build a ternary gate as a binary gate. It is my assumption that it takes twice as many elements to make a ternary latch too. So it halves element density: a die of any given size will hold twice as many bit-wise elements as trit-wise ones. Meanwhile, using ternary does not halve the element count, it only cuts it by a factor of about ⅓. Therefore you're at a net loss of computational power per unit of die area just for using ternary instead of binary.

Attached: and.png (344x600, 10.17K)

Yeah, I pulled it off jewgle because you are too dumb to do it yourself, and why would I waste any more time on you than that?
Also, why make a picture when you can just download it?

Next thing, your assumption
You are fucking wrong

Other than that, pretty good, except for not realizing that trits have a bigger info capacity, therefore more speed for an equal amount of transistors

They use the same transistors, therefore they switch at the same rate, therefore the gates operate at the same speeds. You need log3(X)/log2(X) ≈ 0.63 times as many gates to compute a number of length X, but they're double the physical size, so you're at a 26% net loss of performance when space-constrained, or a 26% increase of die size otherwise. Come up with a transistor that can actually handle three states, then we'll talk.
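That back-of-the-envelope arithmetic can be replayed, taking the "a ternary gate costs twice the transistors" figure as the assumption it is:

```python
import math

digit_ratio = math.log(2) / math.log(3)    # trits needed per bit ≈ 0.631
gate_area_factor = 2.0                     # ASSUMPTION from the thread: 2x transistors
net_area = digit_ratio * gate_area_factor  # ≈ 1.26, i.e. ~26% more area
```

So the 26% figure follows directly from the two inputs; whether the 2x gate-size assumption holds is the actual point of contention.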

Wrong assumption again
Go away and come back when you learn more about the topic
You had never heard of how ternary logic would work with negative voltages until a few hours ago; stop acting like a know-it-all

Two transistors are double the size of one.

What two transistors are you talking about?
You got to that conclusion just based on one AND gate?
You do realize they don't use discrete logic gates on processors, and do optimization to utilize more space?

You know NOTHING of what you are talking about. Nothing.

You have two current directions. Transistors don't work in two directions, so you need a transistor for each direction at each junction. Therefore two transistors are needed where a binary system only needs one. It doesn't even have to do with gates, just basic physics.

I'd probably go with a -1 to +1 ternary for the sake of memory (0,1,2 strikes me as more error prone) and place a greater emphasis on distributed computation. We might be capable of optical computing at this point, as well.

dumbest Zig Forums post today. congrats. Garbage collection in the hardware? Really? I think /r/programming would be more comfortable for you.

Thinking about it a bit, I really only want one thing: fully documented hardware and standards. Far more strict interface and communications standards between any components that are meant to be interchangeable. An open and documented boot process that is completely under the end-user's control. Black-box hardware can be interfaced into the system but it is quarantined to a restricted bus and separate memory pool with bottom-bitch status DMA onto the main memory bus. Any black box hardware that can't adhere to the open standards gets the functional, but slow lane to account for their probable inclusion of street shitter firmware coding complete with NSA backdoors. Back on the boot process - a real BIOS with reasonable, if unoptimized, system functions in firmware - somewhat like the Amiga. Also none of this new bullshit where critical parts of the peripheral hardware's firmware is absent until this new convoluted bullshit boot sequence loads it up from *somewhere* (most definitely an ease-of-use feature for NSA).

I guess what I want is control of my system back.

Something that addresses all points made in

Given that what I said was that they were getting better speed out of the machine due to its fundamentally different hardware, I think that you are the only one whose politics dictate their mathematical truth. While the things the Soviets said should be taken with a grain of salt, dismissing them out of hand only reveals your own political biases.

After a quick Google search, I'm going to have to assume that you made "Texas Masters" up. Props to you on sending me on a wild goose chase for a person with an unfortunate name that fills search results with tons of unrelated shit.

The only things that I can find on ternary computers are a bunch of computer scientists (applied mathematicians, not electrical engineers) saying that it would be awesome to have ternary computers because of their better information density.

Go kys faggot
Your weak Jewgle skills are your own problem, not mine

Attached: 2018-01-14 13.11.45.jpg (1080x1920, 823.12K)

take notes from on how to not be a huge retarded faggot

I mean that's cool and all, but you're basically trading compromised firmware on the chip for compromised firmware binaries in your "trusted device". I don't see the security benefit at all.

The point is to have the root of trust be an owner controlled key versus a hardware vendor's or Microsoft, whether the owner supplies it by toggling it in from a piece of paper on every boot or some other difficult to compromise system. If we're designing a computer from scratch, we don't need to rely on USB firmware.

What does the key have to do with the firmware still being compromised? You still have to download the firmware for your wifi chip from somewhere, and nothing forces them to make it open source.

2018? Nothing.
What we are stuck on is the von Neumann architecture; the problems of decades ago are still the problems today in many ways.
We should redesign when memristors are real.

Since you would control the root of trust, you would also control what firmware and software etc. the system trusts to run, all the way up through bootstrap. Keep in mind this is an ideal, from-scratch computer. If you don't trust the firmware, don't infect your system with it. There probably isn't a way to accomplish this on today's pozzed x86 designs.

Nobody would use your products and the wintel machine would carry on as usual.

HP cucked out of memristors, the whole thing is total vaporware.

That's not an answer to his question you cock sucking faggot

-- Page 95 (101 of the PDF)

Your arguments so far for ternary being crap are:
1) Soviets (ad hominem fallacy)
2) Maths (for which you provided a source that directly contradicts you)

As for the Soviets, did they have success because:

From ( ) it appears that the answer is the second. I had remembered reading somewhere that the Soviets found ternary to be faster; apparently that was only true because any new machine would have been faster than its contemporaries, given how shit their machines were.

What's spectacular about binary then?

But it's the factually correct answer.

tbh we should have ternary storage devices if nothing else. Converting binary to ternary would reduce the number of digits the device has to store by about a third, which is significant enough to be appealing to at least some consumers.

is that what multi-level SSDs are? the bits on hdds overlap too iirc

No. A multi-level cell is a cell that can hold more than one bit. A ternary cell would store trits, not bits. Multi-level SSDs increase storage by increasing the number of bits each cell can store. Ternary would increase storage by decreasing the number of digits that need to be stored in the first place. E.g., 11011011 in base 2 is 22010 in base 3: 8 bits become 5 trits.
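The conversion in that example can be verified mechanically (a throwaway helper written for this post, not a standard library function):

```python
def to_base3(n: int) -> str:
    """Unbalanced base-3 string for a non-negative integer."""
    if n == 0:
        return '0'
    digits = []
    while n:
        digits.append(str(n % 3))
        n //= 3
    return ''.join(reversed(digits))

assert int('11011011', 2) == 219
assert to_base3(219) == '22010'   # 8 bits -> 5 trits, as claimed
```

(255, the worst case for a byte, needs 6 trits, so the saving per value ranges between a quarter and a third-ish.)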

Question. It's generally agreed that quantum computers won't be useful for normal day-to-day computations, but will probably be necessary for advanced number crunching, crypto, etc. Sort of like how we need a graphics card to render 3D. Does this mean one day our computers will contain QPUs?

Attached: scuttle.jpg (480x360, 16.25K)

You are a fucking autist
ITT I fucking defended ternary computing as being faster and more efficient than binary

Except the complexity of ternary means each cell is so much larger physically that you can't fit as many on the silicon.