STOP BINARY COMPUTING

you retards have been doing it all wrong. ALL WRONG.
GOD gave us 10 fingers. TEN!! NOT FUCKING TWO!!!
our brains aren't meant for this fucking bullshit.
AI took over long ago and forced us to implement binary computers to drive us further and further from it, OUR creation.
computing was better when it was non-binary. before intervention from the binary weenies. AI Computer is CIA backwards. if the binary fucktards had put the development effort they wasted on binary garbage computers into decimal computers like the IBM 7070, they would have made TEN times the progress!!! WE COULD BE HAVING THIS CONVERSATION ON MARS RIGHT NOW YOU FUCKING BINARY CUCKS.

Attached: IBM_7070.jpg (800x1422, 233.33K)

Other urls found in this thread:

en.wikipedia.org/wiki/Dependent_type
medium.com/@rxseger/exploring-ternary-logic-tnand-and-tand-gates-a1ed9f7e6dab
smecc.org/The Architecture of the Burroughs B-5000.htm
en.wikipedia.org/wiki/Burroughs_Medium_Systems
twitter.com/SFWRedditImages

"Nonbinary computing was not invented. It was found." --Sergei Sobolev
my peers and fellows are all binary weenies.
unix is fundamentally not portable. general purpose computing? don't make me fucking laugh. none of this trash will run on non-binary computers. don't let the AI take over. fight for your right to compute!!!!

but you have two feet as well, so might as well go vigesimal instead

binaryweenies owned
binary computing is just a product of weenie von-neumannism

Has the "Lisp users try to make a simple program, part II" thread hurt your feels, faggot?

lisp is just another pawn in the (((binary))) internet defense force

actually humans used to use a base-60 number system and it was reduced to 10.

Exactly. So why do you want more than 2 states?

Base 12: use the three segments of the four long fingers on your hand as digits and your thumb as the read/write indicator.

Attached: Human-Hands-Front-Back.jpg (562x653, 176.96K)
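
If you want to actually play with that scheme, here's a toy sketch in Python (names and segment numbering made up, just one way to do it):

# Dozenal finger counting: the thumb points at one of the 12 segments
# (3 per finger x 4 long fingers), giving counts 1..12 on one hand.
FINGERS = ["index", "middle", "ring", "little"]
SEGMENTS = ["base", "middle", "tip"]

def segment_value(finger, segment):
    # Map a (finger, segment) position to a count from 1 to 12.
    return FINGERS.index(finger) * 3 + SEGMENTS.index(segment) + 1

assert segment_value("index", "base") == 1
assert segment_value("little", "tip") == 12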

God gave us two hands, two legs, two eyes, two ears, two nostrils and even two penises. Checkmate, infidel. Suck my dicks.

That's how the egyptians used to count iirc.

But we do use base 10

that's the Sumerian number system.

Attached: babylonian_numerals.gif (970x175, 9.95 KB)

We'll continue using binary as long as we use transistors and electricity.

I like it better being able to count to 1023 using only my fingers. Having two 5-bit registers at the end of my arms seems more efficient than having two handicapped 3-bit registers (going only to five) without logic operators.

Attached: point85.png (502x399, 26.04K)
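
If you want to check the register math, a quick Python sketch of finger binary (one bit per finger):

# Finger binary: each of the 10 fingers is one bit (raised = 1).
def fingers_to_int(fingers):
    # fingers: list of 10 bits, most significant first.
    value = 0
    for bit in fingers:
        value = value * 2 + bit
    return value

print(fingers_to_int([1] * 10))       # all ten fingers up -> 1023
print(fingers_to_int([0] * 9 + [1]))  # only the last finger up -> 1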

you only have one head, dude. binary makes sense.

different languages had different bases but everyone settled for 10 (except some tribesmen)
English has remnants of other base systems: dozen and gross for base 12, score for base 20

Why not compute in 10^10^10^10^10 . . . ? You could carry out enough calculation to simulate the entire universe in one operation.

People tried making base 10 computers with vacuum tubes. However, voltages could fluctuate, so a 3 might look like a 2, a 6 might look like a 7, and so on. It's easier to only distinguish between 2 values: 1 and 0, true or false, on and off. Any more values and you get gray areas where it's not clear which value it really is.

I'm surprised nobody else mentioned this yet. Have none of you studied CS in university? This stuff is covered in introductory freshman seminars.
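
To put rough numbers on the gray-area problem (assuming an idealized 5 V logic swing and evenly spaced levels):

# Spacing between adjacent voltage levels for a given radix,
# assuming an idealized 5 V swing split into evenly spaced levels.
V_SWING = 5.0
for radix in (2, 3, 8, 10):
    spacing = V_SWING / (radix - 1)
    print(f"base {radix:2d}: {spacing:.2f} V between adjacent levels")
# base 2: 5.00 V, base 3: 2.50 V, base 8: 0.71 V, base 10: 0.56 V

The more levels you cram into the same swing, the less noise it takes to turn a 6 into a 7.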

Found the eunuch

Attached: cpL.jpg (601x355, 31.64K)

Doesn't seem like the same problem should happen to trinary ones though. Never understood why those were ditched if they worked; only ever heard that they were too expensive to make for some reason. Still seems like an improvement over binary, and worth pursuing, if only academically at least.

I'm pretty sure this whole thread is nothing but bait and shitposting.
There is a great book by Brillouin (Science and Information Theory) that also goes into Shannon's information theory and compares different codings by channel capacity as computed through negentropy.

Attached: brillouin.jpeg (324x499, 28.4K)

Bimodal distribution computing when?

Attached: Wash-Your-Penis.jpg (400x350, 72.92K)

This OP is just a shitpost...
but base 12 would be a much, much better number system; most humans just fell for the "muh 10 fingers and 10 toes" meme.

unixlaper just got 10 times more retarded, holy shit.

This. Why do shit systems continue to prevail?
t. rule britannia

yeah, don't make the fucking instruments more precise, just fucking scrap the whole method of thinking... GREAT!!!
the world isn't binary. ok?!? why can't we have continuous-state computers?!? because the binaryweenies have a fucking monopoly. they're building roko's basilisk using the arcane binary system and there's nothing you can fucking do about it. we're all FUCKED, man.

Attached: FreeSex_cool.gif (468x60, 6K)

I know this is just low quality trolling, but binary vs. decimal makes no real difference in computing. The real issues we face today are security issues, which could potentially be solved (at least for the most part) with dependent types, but too much legacy shit is built on top of insecure languages.

en.wikipedia.org/wiki/Dependent_type

nice meme faggot

Why not ternary?

Attached: setun.jpg (707x370, 112.24 KB)

GOD MADE ADAM AND EVE
NOT ADAM AND SOME CUTE LITTLE GIRL!!!
JJJAAAYYYYSSUUUUSSSS BEEE!!!!

Get back to me when you can create a circuit that can reliably produce ten distinct and stable voltage levels.

nice one blackpillanon
I bet you don't write tests either, ya queer

Use base 60, 24, 16, or 12, depending on how long you want to spend learning, how much easier you want memorization of large numbers to be, and how much easier you want math to be.

God also gave you:
Two feet
Two legs
Two balls
Two hands
Two arms
Two ears
Two eyes
Two lobes

But what God didn't give is your autism, that's from Satan.

In a course I took once upon a time (really don't recall if it was a CS course or an EE course) a professor claimed that the most ideal system was trinary: 0, 1, 2 represented by three voltage levels. I believe his argument was that this was doable with the technology current at the time (25 years ago); it would increase the information density somewhat, because a single transistor could be in three states instead of 2, but the voltage levels would still be distinct enough to avoid the errors that might be introduced with 4, 8, 10 or more voltage levels crammed near each other. I wonder what he would claim now, with our technology running up against quantum effects.

I agree with you, but wouldn't changing the way we compute stuff be a major problem in terms of adapting to it? All languages are made for the current binary system, meaning all of that would need to be remade. And since the architecture would be so new and so completely different, wouldn't almost all tools be lost to the lack of backwards compatibility?

Interesting. I imagine that if one were to actually attempt to build such a thing, you would need, for hardware compatibility with all the peripherals you don't feel like reinventing at the same time, some sort of trinary-binary converter / interface. As for the software, there are two ways to go about it, partially dependent on what you actually do at a hardware / microcode level. Option one, take a baby step: trinary hardware using binary concepts. Then the software issue is easily solved, as only the base hardware is changing. Anything written in high-level languages would just compile to the machine code of the new CPU, and the machine code could either be made "transparently trinary", in other words actually binary outwardly, just trinary at the hardware level, or it could introduce some trinary concepts (toy converter sketch below). Option two would be to invent trinary logic concepts, name it "troolean" or something, reinvent even the way you think, and rebuild everything from the ground up as if you were Charles Babbage with an unnatural obsession with the number 3. Then it's time for whole new languages at every level.

Side note: One of you anons should screenshot this, so that one day in the future, when everyone uses Troolean, you can use this as proof that I, user, first used the word.
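
To make the converter idea concrete, a toy sketch in Python (unbalanced base 3, helper names made up):

# Toy "trinary-binary converter": turn a binary machine word into
# trits (base-3 digits) and back, e.g. to talk to binary peripherals.
def int_to_trits(n, width=6):
    # Unsigned integer -> list of base-3 digits, least significant first.
    trits = []
    for _ in range(width):
        trits.append(n % 3)
        n //= 3
    return trits

def trits_to_int(trits):
    value = 0
    for trit in reversed(trits):
        value = value * 3 + trit
    return value

word = 42
assert trits_to_int(int_to_trits(word)) == word
print(int_to_trits(word))  # [0, 2, 1, 1, 0, 0], i.e. 42 = 1120 in base 3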

Unary master race: one digit is all you need:
0 + 00 = 000    00 * 000 = 000000    00 ^ 000 = 00000000
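
Or in Python terms (a toy where a number n is just n copies of "0"):

# Unary arithmetic as string operations.
def add(a, b): return a + b       # addition is concatenation
def mul(a, b): return a * len(b)  # multiplication is repetition
two, three = "00", "000"
print(add(two, three))  # 00000   (2 + 3 = 5)
print(mul(two, three))  # 000000  (2 * 3 = 6)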

it's not difficult. a lot of languages use some sort of interpreter/vm. for C you could probably just write a new compiler. honestly it wouldn't have that big of an impact, it's just a question of what benefits there would be. Donald Knuth mentions the elegance of balanced ternary in The Art of Computer Programming (he calls it perhaps the prettiest number system of all).

TERRY'S SPIRIT LIVES
TEMPLEOS WAS ONLY THE BEGINNING
THE SECOND HOLY COVENANT SAYS BASE 10 NOT BASE 2
ALL HAIL TERRY A. DAVIS, HERALD OF THE COMING DIVINE COMPUTER

Attached: TADwave.jpg (960x720, 286.38K)

It's not hard to simulate a ternary environment in software. So why don't you faggots do that, and then write your sick ternary program that shows how much better ternary is?

Attached: Getting me mallet.gif (300x226, 695.99K)

The world is an immensely layered abstraction which can be represented with bits

Actually, base e would be most efficient, but it's not clear how to physically represent a number with an irrational base.
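
The usual "radix economy" argument: representing N values in base b takes about log_b(N) digits of b states each, so hardware cost scales like b/ln(b), which is minimized at b = e. Quick check in Python:

# Radix economy: cost scales as b / ln(b), minimized at b = e;
# among integer bases, 3 comes closest.
import math
for b in (2, 3, 4, 10, math.e):
    print(f"base {b:6.3f}: b/ln(b) = {b / math.log(b):.4f}")
# base 2 and base 4 tie at 2.885, base 3 gives 2.731, base e gives 2.718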

How would you do that with, say, a MOSFET? For instance, if you applied a negative voltage at the gate of an NMOS, wouldn't that just make the depletion region wider instead of forming a channel?

Attached: 1465388628513.jpg (408x439, 50.27K)

fuck faggots
unironically

But that's gay, user-kun

First off, evolution and NOT (((god))) blessed me with ten fingers. Second, I can count to 1023 with my blessed & evolved ten fingers using the universe's holy number system: base 2 ya dipshit christcuck! :^)

Hell, throw in my two arms and I can push 4095. If my toes were dexterous enough, I could throw in my legs and feet and get to 16777215! All with the quantum field's most perfect number system: base 2 again you abrahamist dick sucker! :^D

ternary computers were developed in the late '50s and '60s; they were more efficient and used less power than binary.

a ternary computer works in 3 states.

-1, 0 , 1

Negative, Neutral, Positive

-12V, 0,+12V
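
Balanced ternary in a few lines of Python, if anyone wants to play with those three states (a sketch, not how Setun actually stored anything):

# Balanced ternary: digits are -1, 0, +1, matching the three states
# above. Negating a number is just flipping every trit.
def to_balanced_ternary(n):
    # Integer -> list of trits in {-1, 0, 1}, least significant first.
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a digit of 2 becomes -1 plus a carry
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits or [0]

def from_balanced_ternary(trits):
    return sum(t * 3**i for i, t in enumerate(trits))

for n in (-5, 0, 8):
    assert from_balanced_ternary(to_balanced_ternary(n)) == n
print(to_balanced_ternary(8))  # [-1, 0, 1], i.e. 8 = 9 - 1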

The claim was that a single transistor could be in 3 states. The Russian Setun was built with magnetic cores and diodes, apparently. Obviously magnetic cores are out of the question for building a ternary computer with performance comparable to modern binary computers.
And a MOSFET works in 2 states:

channel open, channel closed

So what do you do with that? You can of course emulate ternary functions with many transistors, but that's not the point here. And while I can see the advantage of ternary logic for (magnetic) storage density, I don't see how it would greatly improve circuit density and efficiency if it has to be emulated.

You can change it so that if there is a negative voltage it goes one direction, if there is a positive voltage it goes another, and if it's zero, that's another state.

How do you do that? That's what I'm asking. How do you control current direction with a transistor? A MOSFET is not a variable current source; it's more like a voltage-controlled variable resistor. Current (if any) is just going to follow whatever voltage you apply between the source and the drain.

a mosfet acts as a resistor...

man i can't be bothered to even think about this right now.

here is someone who made his own tnand

medium.com/@rxseger/exploring-ternary-logic-tnand-and-tand-gates-a1ed9f7e6dab
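
For the curious, here's a pure-software model of TAND/TNAND using one common balanced-ternary convention (which may not match the article's circuit exactly):

# TAND/TNAND over balanced trits {-1, 0, +1}: TAND is min(),
# TNAND is its negation. Restricted to {-1, +1}, tnand behaves
# exactly like binary NAND with -1 as false and +1 as true.
def tand(a, b):  return min(a, b)
def tnand(a, b): return -min(a, b)
for a in (-1, 0, 1):
    for b in (-1, 0, 1):
        print(f"tnand({a:+d}, {b:+d}) = {tnand(a, b):+d}")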

It is much easier to catch "conceptual" bugs at the specification level than to deal with software bugs at the software level.
It is OK as long as someone pays for all the work. You need more assurance => you pay more. It's the same dilemma as with hiring a good tester vs. hiring no one to do the QA. When a failure in your task costs, like, $10^9, you tend to pay more for quality.

You're completely welcome to manufacture your very own computer to work the way you want it to. You are simply too lazy to do the work. Internet trolls are all talk and no action.

There were decimal computers in the 60s and 70s, like the Burroughs 2500, and there were no problems running portable Cobol and Fortran programs on them. Most languages of the 60s and 70s, like Fortran, Algol, Cobol, BASIC, and Pascal, don't require binary and can also run on tagged architectures.

C requires binary because it's made for the PDP-11 and everything else was an afterthought. Java mandates specific bit sizes and intentionally broken overflow instead of correct behavior. These shitty UNIX languages would have the same problem on Lisp machines and other tagged architectures. These computers are "too correct" to run C and UNIX. Requiring binary is just one of the many symptoms of C's non-portability compared to other languages. What sucks is that bad programmers blame the machine when integers are treated as actual integers (bignums), because their code relies on some shitty hack that only happens to work because the machine they wrote it on didn't trap on overflow.

Burroughs made tagged and segmented computers for Algol and decimal computers for Cobol.
smecc.org/The Architecture of the Burroughs B-5000.htm

en.wikipedia.org/wiki/Burroughs_Medium_Systems

There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once.

When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".

It would be a good thing if hardware designers would remember that the ANSI C standard provides _two_ forms of "integer" arithmetic: 'unsigned' arithmetic which must wrap around, and 'signed' arithmetic which MAY TRAP (or wrap, or make demons fly out of your nose). "Portable C programmers" know that they CANNOT rely on integer arithmetic _not_ trapping, and they know (if they have done their homework) that there are commercially significant machines where C integer overflow _is_ trapped, so they would rather the Alpha trapped so that they could use the Alpha as a porting base.

Having said which: I will gladly put up with the Alpha exception mechanism as long as
- there is a documented C-callable function which controls the integer trapping state
- there is a documented C-callable function which controls IEEE-ish floating-point traps
- there is a documented C-callable function which includes a barrier (can I _rely_ on signal(SIGFPE, f) including a barrier?)
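
To see the bignum point concretely, in a language that treats integers as integers (Python here, standing in for the tagged machines):

# Python ints are bignums: arithmetic never silently wraps.
big = 2 ** 40
print(big * big)             # exact result: 2**80

# Emulating C-style 32-bit unsigned wraparound by masking:
MASK32 = 0xFFFFFFFF
print((big * big) & MASK32)  # 0: the wrapped "shitty hack" answer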

And they were not cost-efficient hardware even back then.
Considering how hard it is to deal with noise and electrical inertia in modern binary systems, I can't imagine non-binary systems becoming viable in the foreseeable future.
Pretty sure you could write a C compiler for a non-binary architecture; it would probably be pretty inefficient, but it should work.

Base 7 Basque masterrace here, good luck with your base10 faggot, have fun making the hardware.

Attached: image.jpeg (329x447, 53.84K)

Clever design. However, correct me if I'm wrong, but when both inputs are 0, all transistors are on and current would be flowing between V+ and V-. This would imply wasteful static power dissipation in the resistors, which CMOS binary logic doesn't suffer from. Good as a curiosity, probably not so great as a real product.
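
Back-of-envelope for that static path, with made-up values since I don't have the article's resistor values handy:

# Static power burned when a resistive path connects V+ to V-:
# P = (V+ - V-)^2 / R_total. Example values are assumptions.
V_PLUS, V_MINUS = 5.0, -5.0  # assumed supply rails, volts
R_TOTAL = 2000.0             # assumed total series resistance, ohms
power = (V_PLUS - V_MINUS) ** 2 / R_TOTAL
print(f"{power * 1000:.0f} mW wasted per gate in that state")  # 50 mW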

Why don't you just post without the code block?

Lisp suffers from the same weenie binary structure as C
we need a language like Forth or Lua

this

basically, idiotic digital electronics engineers didn't know anything about maths.
If they did, they would see the MASSIVE information increase for a minimal increase in the number of transistors relative to binary.

You are an idiot for failing basic electronic engineering. Developing ternary-level electronics is far more difficult and costly than developing binary-level electronics.

fuck physics amirite?

because there's already infrastructure for binary shit you fucking retard

The infrastructure exists for binary because it has always been and always will be the most effective way to build electronics. The only things these days with multi-level electrical signalling are specialist I/O components (think PAM4 links and multi-level cell flash), because those are the components where it's sensible to invest the extra money into doing the extra engineering.

prove it, binaryweenie

See

89 IQ tier.

Terry?

That explanation isn't the whole story, though. Let's say you were going to send in base 8. You can send 3 bits of information per digit. So to send information at the same rate, you could send a given digit for three times as long. Since you're sending it for longer, it will be less likely to get mixed up. But since the different values are assigned more similar voltages, it will be easier to get mixed up. I would naively assume that the two would perfectly balance out.
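
Rough numbers on that, using the naive model from the post (fixed bit rate, evenly spaced levels, ignoring real channel coding and noise statistics):

# Base-2^k signaling at a fixed bit rate: each symbol carries k bits,
# so it can be sent k times as long, but the spacing between adjacent
# levels shrinks as 1/(2^k - 1).
for k in (1, 2, 3, 4):
    levels = 2 ** k
    symbol_time = k             # relative to binary
    spacing = 1 / (levels - 1)  # relative to binary
    print(f"base {levels:2d}: {symbol_time}x symbol time, "
          f"{spacing:.3f}x level spacing")
# In this naive model the spacing shrinks much faster than the symbol
# time grows, so the two don't balance out.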

I've always wondered this. It's well known that base 3 has the best radix economy, because 3 is the integer maximum of log(n)/n. It follows that a similar result would apply to sending data over a wire.

I'm reading through your suggestion, which is very interesting, but also very thick. Do you perhaps know more precisely which chapter discusses this?

It could be done using light, which is not subject to electromagnetic interference.

Our neurons are also binary.

They're either firing or not firing, electrically.

Biological computers, i.e. brains, also work in binary.

come back when it's more powerful than what's currently available