OK so hear me out.

Our modern transistors work by etching a specific series of shapes onto silicon, and the patterns of those shapes make transistors. It is a uniform pattern. However, there is no particular reason why we have to use 1-bit transistors. For example, we could use a ternary design.

What I propose is that you give an AI complete control of how the chip is etched, reward the AI when the etching produces the proper output, and punish it when the output is improper. Also reward the AI when the output gets faster, uses less power, etc. And I propose to you that the AI will not resort to transistors. Depending on the chip you are trying to design, the AI will choose an amorphous blob/shape that is optimized precisely to move the electrons how you want, without any logic gates, and it will be supra ultra low power

Attached: 1552159085030.jpg (600x800 899.9 KB, 29.57K)


quantum computers?

You're retarded.

You beat OP in retardation, congrats.

Why not skip the definite states and go all analog?

based

This is what I am saying. I think this is what the AI would do.

Wow

Attached: Screenshot_20190705-034129.png (720x1280, 171.58K)

Go back or lurk for 2 years.
The reason they don't put "AI" in charge of designing chips is the same reason they don't put it in charge of programming software. Humans do a much better job.

/thread

It shows you don't know shit about electronics. Ignoring base current for BJTs, transistors dissipate power only when not being fully on or off. A perfect conductor doesn't dissipate energy, nor does a perfect insulator. So your analog computer would waste loads of electricity that would show up as heat.
Plus the whole reason people developed digital stuff was because when you're using analog signals, every step introduces distortion and noise. The data in your memory would become meaningless in minutes, and there's no way to regenerate it.
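To put rough numbers on the dissipation point, here's a minimal sketch (Python, with made-up example values) of static power P = V * I across a device: driven hard on, driven hard off, and parked in between like an analog stage would be.

# Static dissipation P = V * I across the device; the numbers are toy values for illustration.
supply = 1.0          # volts
load_current = 1e-3   # amps drawn when the path conducts

cases = {
    "fully on (voltage across device ~0)": (0.02, load_current),     # ~20 mV drop
    "fully off (leakage only)":            (supply, 1e-9),           # ~1 nA leakage
    "analog midpoint (half the rail)":     (supply / 2, load_current / 2),
}

for name, (v_drop, i) in cases.items():
    print(f"{name}: {v_drop * i * 1e6:.3f} microwatts")

With these assumed values the in-between case burns orders of magnitude more power than either rail, which is the argument in a nutshell.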

Do you think 20th century science picked digital instead of analog based on a coin toss?
>>>/g/
It's where you belong. Unless /sarcasm

You're assuming that an analog computer would be implementing discrete algorithms like a transistor machine. Analog systems have a lower representational gap between the domain and the circuitry. Transistor systems break everything down and rebuild them with layers upon layers of abstraction, and still have to approximate continuous systems e.g. floating point numbers. The main reason digital systems took off is because they were easier to manufacture, but that gap has closed today.

blog.degruyter.com/algorithms-suck-analog-computers-future/
spectrum.ieee.org/computing/hardware/not-your-fathers-analog-computer

Why not just put the computers on the eagles?

Attached: frodo.jpg (1592x1594, 155.29K)

Technology is magic!
Just put everything in the magic hat, and then a super computer working faster than light appears!

Don't be too hard on the little guy. Modern schools teach that insanely complex code, such as DNA, arrives magically over time.

Computers would actually become insanely better over time (at copying themselves) by themselves if they sometimes copied themselves over by themselves. But they don't, so the theory of evolution doesn't apply to them.

That's circular logic, sweetie.

Transistors are first and foremost analog devices. Digital systems are generally more expensive than analog systems because they require more transistors. Transistor radios are analog systems and they were one of the easiest transistorized devices to manufacture.
You need a one-to-many mapping from data to physical states because then even if a source of noise changes your physical state, the resulting state falls within the same data value. It's impossible to weed out sources of thermal noise because as the temperature delta increases, the efficiency of heat engines declines, preventing you from reaching absolute zero.
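Here's a minimal sketch of that one-to-many mapping (hypothetical voltages and noise level): thresholding snaps a noisy digital level back to a clean rail every step, while an analog value just keeps whatever error it picked up.

import random

random.seed(0)
V_HIGH, V_LOW, THRESHOLD = 1.0, 0.0, 0.5   # idealized logic levels

def regenerate(v):
    # Many physical states map onto one data value: anything above the
    # threshold reads back as a clean 1, anything below as a clean 0.
    return V_HIGH if v > THRESHOLD else V_LOW

digital = V_HIGH   # stores the bit 1
analog = 0.731     # stores the value 0.731 directly

for _ in range(1000):
    noise = random.gauss(0, 0.05)
    digital = regenerate(digital + noise)   # snapped back to a rail each step
    analog += noise                         # the error just accumulates

print("digital value after 1000 steps:", digital)
print("analog value after 1000 steps:", round(analog, 3))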

What you quoted does not constitute a complete syntactically valid sentence, and thus does not hold any definite meaning by itself.

Oh cool, someone who is even more autistic than me. No problem, here you go:
> would be better over time (at copying themselves) by themselves if they sometimes copied themselves over by themselves.

Attached: 1db97fe7dd5d2ef0efc9fc0597ce9d074205bf85d711a924dd59c790b82f62f7.jpg (549x563, 42.8K)

How is it circular reasoning? Objects can sometimes copy themselves by themselves, yet be bad at it compared to other objects which copy themselves by themselves (more often than the first kind, since they're better at it).

Completely 100% totally wrong in every respect.

No, but that's how lots of things were done. That or malicious intent.

Well, if your intent was to explain the origin, then it is classic circular reasoning, because the conclusion was included as a precondition, e.g., the existence of a self-replicating machine begets self-replicating machines.

If your intent was to explain how a simple self-replicating machine can become a complex one, then you are begging the question. In other words, your precondition is in greater need of support than your conclusion. How did the first self-replicating machine come into existence? In fact, there is no answer for that, despite what some evolutionary biologist has told you. The fruit of a worldview that teaches that time itself is an architect is the fallacious and magical thinking demonstrated in this thread.

What is circular about that, exactly?

Thoroughly explained here

We're not talking about biology here. Evolutionary learning algorithms are a fairly well proven computer programming technique.

In either case we are not talking about biology. Materialistic origin of life is purely about synthetic organic chemistry: a prebiological state. That is why any answer which relies on an existing biological process is inherently circular.

For what problem domain? Do you understand computability and complexity? A pathetically small subset of functions are computable.

Then why are you bringing it up?
See, this is actually finally addressing the issue. It only took you about a half-dozen posts to get to the actual technological aspects of what this guy is proposing. Next time try starting out with this instead of misusing phrases you've heard before to sound cool.

There are already highly optimised mathematical algorithms for this topic. Fucking AItards.

I brought it up because it is symptomatic of a worldview in which complex information systems arise magically over time.
The OP was silly and of no substance. I'd rather talk about why people think that coherent and complex systems spontaneously appear (they don't).

The post I responded to was ridiculing the theory of evolution by comparing biological organisms with technology, implying that if what they teach in schools about the theory of evolution was correct, then computers would also come to have "insanely complex code" all by themselves without human intervention. I then explained to him why you wouldn't expect to see that happening even if the theory of evolution was true, because of differences between biological organisms and computers.
He questioned whether we could get "insanely complex code, such as DNA" through natural evolution, not whether we could get any (self replicating) object at all. That's a different question.
However, there's a possible explanation for that question too, which doesn't involve an intelligent entity actively introducing self-replicating agents into the world (other than making the pre-conditions for such a thing true, like the laws of physics being true, a geomagnetic planet with the right combination of chemical elements at the right distance from a star, enough water to form oceans, etc.). Certain oils form hollow spheres in water. Those spheres grow when more oil molecules are added to them, eventually splitting into two spheres. That is a self-replicating object. Then some of the molecules inside the sphere (such as amino acids) could've joined to make a compound chemical substance which catalyzed the reaction that created it, and could be used as a backbone to which certain functional groups could attach, in some cases furthering self-replication by making certain things more likely, such as the absorption of the oil molecules for its boundary, and thus making it grow faster, or the absorption of other spheres floating in the water, incorporating the chemicals in those spheres into the sphere which absorbed them. The existence of that self-replicating object gives you the pre-condition for evolution to take place.
The reason this is hard to replicate in a laboratory is that this process (the origin of self-replicating organisms and eventual evolution into organisms resembling modern life forms) is driven by very infrequent events. If the Earth is anything to go by, it takes on the order of a few billion years, and that is with a sample size of the whole Earth's oceans. As you reduce the total number of events, the probability of a specific event happening goes down accordingly. So if you're trying to replicate this in a flask in a lab somewhere, it might easily take you trillions of years before you see anything recognizable as "life" growing in it. And that is if you manage to keep your experimental setup free of pre-existing life forms, since those will likely eat any of the building-block molecules that made life possible in the first place, preventing new organisms from spontaneously arising.
As I explained above it was not. But I think it should be pretty obvious my conclusion was that highly complex and effective self-replicating objects could arise spontaneously not by magic, but by following the laws of probability, with the pre-condition that there are highly simple and ineffective self replicating objects, like the ones I mentioned above, not that the existence of any old self replicating machine could be made possible by the existence of any old self replicating machine.
You have some kind of misconception. Begging the question means assuming your conclusion as a pre-condition, which is the same thing as circular reasoning.
Maybe, good thing I answered it above.
Wrong, as I'm sure you'll recognize when you see this post.

It's not magical thinking. Random mutations filtered by an effectiveness metric do work for designing things, they just work very badly and are outperformed by a human almost 100% of the time; that's why life took billions of years to arise on Earth. But they do work to some extent. You might want to look into "genetic algorithms", as these methods are generally called (there's a minimal sketch after the list below). The actual problem is people in this thread are ignorant, and don't realize that:
1. There's a reason digital computers are more popular than analog computers, and it's not because transistors are cheap and some other mythical device that would make up an analog computer is more expensive (lol)
2. Design work is only like 30% of the work when building chips, so even if you put an algorithm in charge of making up the mask designs, you'll still run into insane costs for producing each wafer
3. The maximum speed of a CPU is barely affected by the designs of the masks, and is mainly a question of what process and what kinds of machines were used to process the silicon
And lastly, but most importantly
4. Current "artificial intelligence" algorithms are barely intelligent at all, and a trained human outperforms them in almost all real-world tasks, although he may not be as fast for some of them.
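Since I mentioned genetic algorithms, here's the promised minimal sketch (the toy target and parameters are mine, nothing to do with real chip design): random mutation plus a fitness filter slowly matches a target bit pattern.

import random

random.seed(1)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def fitness(candidate):
    # Effectiveness metric: how many bits match the desired output.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Random mutation: flip each bit with a small probability.
    return [b ^ 1 if random.random() < rate else b for b in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]
for generation in range(200):
    # Keep the fittest half, refill the population with mutated copies of survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]
    if fitness(population[0]) == len(TARGET):
        break

best = max(population, key=fitness)
print("best fitness after", generation + 1, "generations:", fitness(best), "/", len(TARGET))

It works, but notice how dumb it is: no understanding of the problem, just mutation and selection, which is exactly why it scales so badly compared to a human designer.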

>There are already highly optimised mathematical algorithms for this topic. Fucking AItards.
Wow, you sound so progressive and smart! Are you my gender studies teacher? Lol he says we should kill all men
Anyway, I'm sure you can tell me what those "highly optimised mathematical algorithms" are, right?
Hard mode: no OPC or HDL synthesis software

Read a book and work in the industry. No spoonfeed.

Nah, gimme something that figures out 1M+ tiny triode tubes on a silicon chip.

THEN, THEN we got somthin boys!

Are you doubling down on the larping? Wow! So now you're pretending to work in the industry too, yet can't name even one of those supposedly "highly optimised mathematical algorithms", huh?
Let me be clear. I don't have a fab in my garage nor do I plan to send a CV to Intel anytime soon. So why the fuck would I need to be spoonfed about some vague claims of "highly optimised mathematical algorithms"?
The reason I asked you to name one of those supposed algorithms was to prove you're a pretentious yet totally ignorant retard who should go back to >>>Zig Forums, not because I expected you to say anything of value, since I doubt you have ever done that once in your entire life.

Too bad, mostly proprietary software/hardware utilises them since they're trade secrets. You can rummage through your favourite "free" and "open source" compiler to find them. You're not entitled to be spoonfed.

Spoonfed? With the diarrhea that keeps coming out of your mouth? Oh god no. Please no. Please make it stop.

I've had a similar thought only in regards to assembly

why not brute force assembly code until you get the desired output for a given input.

About 3 seconds later down the thought path I realized it would take a billion years.
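For scale, here's a minimal sketch of that brute-force idea over a made-up three-instruction toy machine (nothing like real assembly): enumerate every program up to a given length and keep the first one whose outputs match the spec. The candidate count grows as 3**length, which is the billion-years problem in miniature.

from itertools import product

# Toy "ISA": each instruction transforms a single integer accumulator.
INSTRUCTIONS = {
    "inc": lambda x: x + 1,
    "dbl": lambda x: x * 2,
    "neg": lambda x: -x,
}

def run(program, value):
    for op in program:
        value = INSTRUCTIONS[op](value)
    return value

def brute_force(spec, max_len):
    # spec: list of (input, expected_output) pairs the program must satisfy.
    tried = 0
    for length in range(1, max_len + 1):
        for program in product(INSTRUCTIONS, repeat=length):
            tried += 1
            if all(run(program, i) == o for i, o in spec):
                return program, tried
    return None, tried

program, tried = brute_force([(1, 4), (3, 8)], max_len=6)
print("found:", program, "after", tried, "candidates")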

I read an article where some people did this, if I remember correctly the optimal design it gave was fragile and incomprehensible and relied on electrical side effects. It was pretty interesting, don't know if it's been researched much more.

I think that was done on a FPGA.

I supplied current to a pile of sand and made a hyperintelligent fulgurite. I'm not smart enough to communicate with it but I think it controls spacetime.

That's a greedy algorithm and greedy algorithms cannot solve NP complete problems deterministically.

metal gear?

a bomb?

Sometimes we make mistakes.

Digital computers are much easier to generalize; that's the only reason they're more widespread than analog ones. Sadly, very little research has been put into generalizing analog computers AFAIK.

Electronics isn't magic. There's no possible amount of research that will allow you to keep the ratio of electrons diffused in the plates of a capacitor (used for memory) constant. There's no such thing as a perfect insulator, especially when your device is a few nanometers thick and gated by a MOSFET.
Any regeneration scheme will be subject to thermal noise and other sources of interference.
There are other analog memories such as storage tubes but they degrade just the same in seconds or minutes.
Ignoring base current for BJTs, transistors dissipate power only when not being fully on or off. A perfect conductor doesn't dissipate energy, nor does a perfect insulator. So your analog computer would waste loads of electricity that would show up as heat.
The whole reason people developed digital stuff was because when you're using analog signals, every step introduces distortion and noise. The data in your memory would become meaningless in minutes, and there's no way to regenerate it.
You need a one-to-many mapping from data to physical states because then even if a source of noise changes your physical state, the resulting state falls within the same data value. It's impossible to weed out sources of thermal noise because as the temperature delta increases, the efficiency of heat engines declines, preventing you from reaching absolute zero.
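To put a number on how fast an analog charge store drifts, here's a minimal sketch of the standard RC discharge V(t) = V0 * exp(-t / (R * C)), with assumed leakage values rather than measurements from any real memory cell.

import math

V0 = 1.0          # initial stored voltage, volts
C = 30e-15        # assumed cell capacitance, about 30 femtofarads
R_leak = 1e12     # assumed leakage resistance, 1 teraohm
tau = R_leak * C  # time constant, about 30 ms with these numbers

for t in (1e-3, 10e-3, 100e-3, 1.0):
    v = V0 * math.exp(-t / tau)
    print(f"after {t * 1e3:6.0f} ms: {v:.4f} V")

# A digital cell gets refreshed back to a full rail before the drift matters;
# an analog level has no reference to be restored to, so the information is gone.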

Interesting. Maybe on Hacker News you'll find a more rewarding audience. Thanks anyways.

You sound lost. What are you doing here? Everybody knows Hacker News is nothing but a bunch of wannabe millionaire latte-sipping soyboys.

fucking
the rope hypothesis

...

Humans haven't been designing chips for like 10-20 years now. There are like 2 billion transistors in a CPU. Ain't nobody got time for dat. Everything is designed by computers and tweaked in VHDL or something similar.

AIs have been put to work designing FPGA configurations to do certain computing tasks. They actually work and consume less energy than human designs, and they work in a way that humans can't understand or reverse engineer. This shit is coming whether we create it or another country does.

After that, the bounds of ai are probably going to be highly unpredictable. I figure programmers will become obsolete in my lifetime as a minimum, maybe even self awareness is on the table.

Yes, so? There are billions of bytes in some operating system too, yet nobody seriously believes they're designed by computers.
I don't believe you. Proof?

what is it with all you abusers tbh?

He's right. Transistors are not binary nor digital, they have all sorts of messy states and non-linear curves. A system is only digital when you've built it with the right combinations of transistors to create a logic gate. Even the simplest logic gates have to be made with 2-3 transistors in combination. Only then does the behavior reliably become "binary", but that's only at the system level. Each individual transistor is still analog.
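A minimal sketch of "binary only at the system level": model each transistor as an idealized switch (which a real transistor only approximates) and a two-input CMOS NAND takes four of them.

def cmos_nand(a, b):
    # Idealized switch model of a 2-input CMOS NAND gate (4 transistors).
    # pMOS pull-up network: two devices in parallel, each conducts when its
    # gate input is 0, connecting the output to the supply rail (logic 1).
    pull_up = (a == 0) or (b == 0)
    # nMOS pull-down network: two devices in series, both must conduct
    # (gate input 1) to connect the output to ground (logic 0).
    pull_down = (a == 1) and (b == 1)
    assert pull_up != pull_down   # complementary networks: exactly one conducts
    return 1 if pull_up else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", cmos_nand(a, b))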

You're speaking of the implementation of the transistor, not the function it serves in a computer chip. The voltage threshold level only matters insofar as it causes the transistor to be ON or OFF in the context of a digital circuit.

The poster I replied to seems confused about the role of a transistor and refers to hybrid digital-analog as just "analog" on the basis of the underlying implementation of the transistor. This is a category error.

You are confusing "binary" with "logic system". Binary just means ON or OFF, which is what the transistor alone provides. Logic gates implement logical operations on top of that.

If we're going to play that game, then a simple copper wire alone provides binary, since you can have a high voltage (binary 1, ON) or a low voltage (binary 0, OFF).
In reality, a transistor (in general, including FETs) provides you an output voltage or current more or less linearly proportional to the input voltage or current, from a certain cutoff value up to a certain saturation value and according to configuration and device type. A transistor is not a relay. It doesn't have an "on" or "off" state. Even the cutoff and saturation regions are just conceptual, in reality there are still voltage and current variations in those regions, they just become smaller compared to the swings in the linear region. And that's without considering the weird configurations you can get, negative resistance regions, common base for better frequency response, etc.
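For anyone following along, here's a minimal first-order sketch of that for a BJT common-emitter stage (textbook idealization, the component values are assumptions): the collector current tracks the input roughly linearly between the cutoff and saturation ends, where it flattens out.

def bjt_collector_current(v_in, beta=100.0, v_be_on=0.7, r_base=10e3,
                          v_cc=5.0, r_collector=1e3):
    # First-order common-emitter model with assumed component values.
    i_b = max(0.0, (v_in - v_be_on) / r_base)   # cutoff below roughly 0.7 V
    i_c = beta * i_b                            # active region: roughly linear
    i_c_sat = v_cc / r_collector                # saturation: limited by the load
    return min(i_c, i_c_sat)

for v in (0.0, 0.5, 0.8, 1.0, 1.2, 1.5, 3.0):
    print(f"Vin = {v:.1f} V -> Ic = {bjt_collector_current(v) * 1000:.2f} mA")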

I am the poster you replied to. I am perfectly aware of all common roles for a transistor. I wasn't referring to "hybrid digital-analog", whatever you meant by that, as "just analog" or as anything. I was referring to transistors.
There is such a thing as digitalness, depending on how clustered the physical states of a device are given all possible external influences under a certain regime of operation. For example, a gun is more digital than a sword. A relay is more digital than a transistor. A switch is more digital than a potentiometer. A vacuum tube is more analog than a transistor, since a transistor saturates and cuts off more sharply than the vacuum tube.
I didn't refer to transistors as analog because of their underlying implementation, unless your idea of a transistor is any three legged electronic device.
I refer to transistors as mainly analog because of their inherent physical characteristics, if by transistor we understand BJT or MOSFET transistors (not thyristors or triacs, since they would probably be more digital than analog).
It's not a category error. As much as arbitrary physical objects can be classified to belong to one of two groups, transistors are decidedly analog.
In most formal logical systems it makes no sense to categorize a device according to the role it's serving at any given moment. In most formal logical systems, statements should be true or false regardless of the moment they're evaluated at. So a statement stating that an object belongs to a certain category should be true or false independently of the moment the statement is uttered.
More pragmatically, in your post you made it clear you thought transistors could only be used to implement digital computers, and not analog computers. My post, rather than being a formal classification, was meant to let you know transistors work in an analog domain and are perfectly capable of being used to implement analog computers.
Not all transistor machines implement discrete algorithms.
This much is true.
This contradicts your previous sentence. Some transistor systems are analog systems, thus some transistor systems have a lower representational gap between the domain and the circuitry, which contradicts this statement (break everything down and rebuild them with layers upon layers of abstraction).
This is just wrong. The first electronic computers (firing solution computers, I/Q modulators, flight simulators, radar storage tubes, etc.) were analog.
What you mean is that analog computers are harder to manufacture with the same flexibility as digital computers. If you think the gap is closing and analog systems with the same amount of flexibility as digital systems are becoming easier to manufacture, show me how much it would cost to manufacture an analog computer that could browse the web, and we will have a measure of how true your claim that the gap is closing is.

You have repeatedly widened or narrowed context throughout this conversation, and it's getting rather tiresome. We are talking about digital computation, and in that context, the electrical characteristics of a transistor are not at all relevant.

Another category error. Logical predicates do not contain any details at all about the devices which implement them. They are abstracted away from that, operating according to digital interfaces.

Attached: predicate.png (294x172, 7.54K)

Attached: predicate.png (783x458, 35.96K)

But everything is already analog. They just don't use tubes and shit anymore, but even the modern things are still analog.

OP's Q is a sign of complete fagatronic breakdown in the cerebral leprechaun due to a blunt force unicorn-trauma to the dome.

If you want a better system... Engineers from AMD & Intel have repeatedly stated that better and far faster systems could be made (10+ GHz) with larger transistors (with lower leakage, however) without p-code, completely in parallel... That would, unfortunately, lead to 10,000+ watt systems the size of 1940s stand-up radios, and the general consensus is that people don't want that.

It would also help if mankind moved away from x86 as it is highly bloated and inefficient. At this point it takes 18+ cycles just to do a basic add op and in some cases 30+ cycles to do a register store...

Most of all... I can't stand it when dipshits talk like game journos as it's like arguing cars without knowing the difference between a 4 and 6 cylinder and not knowing what a spark plug is...

move to what? some non existing hipster thing?

Well, considering that there are 30 or more bloc types, it would make sense to use a structure similar to ARM or the Harvard Model, where you have a 24-or-more-bit status register and do inline math like the SS58 or 360. So instead of one 8 or 16-bit add taking 18+ cycles, it always takes 3 cycles. That way, when you have a 3 GHz system, the difference would be getting 3-5 times the performance on the same number of cycles. Right now it takes many commands to multiply 2 numbers, but the SS58 unit automatically provided inline add, subtract, multiply, divide, sqr, pow whenever numbers were put into registers A and B, so you get all the compound math in 4 cycles every time. It would be a huge help if comparative logic was handled in the SR the same way without having to burn 30+ cycles to know if one number is larger than the other. Another big help would be having machines that are actually 64-bit without that mode-switching crap. It takes 20 cycles to flip the mode, then you lose 3/4 of your registers or their power...

Look, all I'm sayin is that the original MIT model from the early 1960s used to make the first micro (DSKY for the AGC) is tired and has run its course. What's really sad is MIT updated that spec multiple times. We are still using 2nd and 3rd gen blocs on the 99 millionth gen of Intel and AMD chips. I'm just suggesting they use a design that's a little newer than 1960 is all. AMD is the only one of the big 3 attempting it. It's just hard to break away from x86 because it isn't easy for career folks to change operational paradigms at 50 or 60.

if it's so easy to make them much better then why would they choose to do it so badly instead

no modern workload outside of i'm a tard writing python wants one really fast core instead of eight cores half that speed

No I didn't. On this thread we are comparing analog computation to digital computation.
It's pretty clear you thought transistors could only be used to build digital systems:
I just corrected your misconceptions.
Yes it is. If you thought transistors could only be used for "discrete algorithms" and building systems "with layers upon layers of abstraction", then correcting that misconception is largely relevant.

The DSKY didn't have any microcontroller, it used relays, logic gates and a few transistors. Micros weren't invented yet.

Wrong. Generally you can do one or more operations per clock cycle. See gmplib.org/~tege/x86-timing.pdf

If you read the links in my post, you'd not only see that that is false, but you'd also learn what a hybrid analog-digital computer is.
This is a meaningless word salad, and you didn't quote or respond to the meat of what I said.
You've already demonstrated that you don't know the difference between "binary" and "logic", saying that a transistor is not binary until it implements logic gates. Basic, basic, basic stuff here. This entire series of posts you've made is just obfuscating around that.

I'm both shocked and surprised that somebody yielded a knowledgeable response, but you subversively fail to mention that that only applies if the compiler can correctly render the perfect sequence of ops out of a ridiculously human-incomprehensible pool of 1600+ codes, depending on what supposed 'x86-compatible' platform you use.
Great Job! tool...

That's not true. These numbers aren't the theoretical maximum performance. The way these numbers were measured is that the guy put each instruction by itself inside a loop and measured how much time it took to do a given amount of iterations, minus the time it takes to run the same amount of iterations alone with no instructions inside the loop.
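Roughly the same measurement idea, sketched in Python instead of assembly (so it measures interpreter overhead rather than CPU instruction latency; it's only here to illustrate the loop-minus-empty-loop method the PDF describes).

import time

ITERATIONS = 1_000_000

def time_loop(body):
    # Time a loop running the given body, in seconds.
    start = time.perf_counter()
    acc = 1
    for _ in range(ITERATIONS):
        acc = body(acc)
    return time.perf_counter() - start

empty = time_loop(lambda acc: acc)             # loop overhead alone
with_op = time_loop(lambda acc: acc * 3 + 1)   # loop plus the operation under test

per_op = (with_op - empty) / ITERATIONS
print(f"estimated cost per operation: {per_op * 1e9:.1f} ns")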

...

It wouldn't be optimized for speed, the number of transistors would be reduced. And AI doesn't create shit, it only follows instructions; no matter how "intelligent" it seems, it still needs input.

The fundamental digital element in computers is not the transistor. It’s the logic gate. Logic gates are digital/binary and are made of transistors. The transistors themselves are not inherently binary; they are arranged in such a way that the gates they comprise will be.

You dumb ass... The DSKY was the first microcomputer. It was a joint venture with titans of the industry like IBM and Raytheon. The solid state nature of its design is the basis of the modern microcontroller that came in 74. It had GPIO and was directly interfaced with sensors whose voltages were decoded to 11-bit binary through a series of ADCs. It would then control other pieces of equipment on its serial bus... That is pretty spot on to what the function of a microcontroller is designed for, is it not?

That's not true either. Transistors are sequentially fed into one another to create various pools of logic that are chained together to form blocs like the ALU, for example. The different layers of this 'pyramid' are the different activation echelons of the system itself. It takes one cycle to fully actuate various levels, like a series of tumblers being driven by a key in a lock. A 64-bit number in serial actually consumes 64 cycles. There is no variable mystery to how many cycles an instruction takes. The issue of variability started with MMX when P-code crossed a threshold of complexity in duality with the solid state blocs in the unit. P-code BTW is like a traffic system used to re-chain pools in the processor to reduce size by rearranging blocs at run time, you dumb bitch.

You're thinking of the AGC. The DSKY was just the user interface for the AGC.

That was very good user. I wish more people actually knew more about actual computer history/science. You sir get a +1 from me.

I just get frustrates sometimes with these couch-niggers that think just because they can 'use' a smartphone and make adds and defs in python they are somehow computer scientists.

I would have considered what you said as possibly not a meaningless word salad if you didn't refer to solid-state anything. CPUs have been solid state since the invention of the transistor, so of course the Pentium MMX is solid state. You just said it because it sounds cool, just like probably the rest of your post.
So unless you can provide a reference talking about p-codes in the context of CPU microarchitecture, I think you're the dumb bitch bro.

Work on spelling before revolutionary science, nerd

You know Enstein didn't learn how to tie his shoes until he was 12 years old. You ever think of that?

All things pointing at Einstein being a nerd are fabrications made to make NEETs feel better. Einstein was actually a total chad.

This is a simple regular language which can occur in nature, e.g. AAA, ABABA, AAABBB. It does not have the grammatical power to express the code of DNA, or anything even close to that.
I'll let a renowned chemist explain to you exactly why that is a ridiculous notion:
youtube.com/watch?v=r4sP1E1Jd_Y

What do you mean by "this"? I talked about multiple things in that quote.
Define what you mean by this.
I didn't say DNA was expressed by any of the elements I mentioned in my post. You don't even need RNA to have the most basic forms of what could be considered life. There could be multiple steps of building up more complex molecules from the origin of life to the modern DNA machinery; it doesn't need to be expressed by the most primitive life forms.
His whole argument is based on the fact that so far, in about a century of modern chemistry, humanity hasn't been able to synthesize life from the basic organic molecules. But he's begging the question. There's no law that says men should be able to perform in decades what has taken at least billions of years in nature.
He also says that we haven't been able to generate the molecules seen in modern life in pre-biotic conditions. But that's another fallacy too. He's assuming all the molecules in modern life should've arisen from the pre-biotic environment. But nobody is claiming that. These molecules originated inside the simpler life forms that did originate from the pre-biotic environment.
He says carbohydrates would decompose more rapidly than they are formed unless you put them in a freezer. But that's not necessarily true, since in modern life carbohydrates are plentiful in organisms. Maybe the earlier life forms didn't need carbohydrates, or maybe the period between spontaneous formation and caramelization was enough to sustain their existence. And then at some point one of these primitive life forms, through random mutations, gained the ability to catalyze the synthesis of carbohydrates so the average number of carbohydrate molecules in solution would increase to a more useful level.
He says "you run out of material". You don't run out of material, that's nonsense. The synthesis of these organic molecules in the environment would be an on-going process, not a one-time event, so even if each individual molecule decomposes, there would be a steady amount of these molecules floating around to be absorbed by cells to perform useful functions. You don't need each molecule to last millions of years. To pretend anyone is claiming each individual molecule needs to last millions of years, now THAT is ridiculous.
And the chemicals which were synthesized inside the cells, catalyzed by some pre-existing chemicals, well, those could keep being produced as long as the cells keep replicating.
He says "time doesn't solve the problem" because life hasn't been synthesized since the first research in the 50s. Well, you don't know that. Have you run the experiment for billions of years in a lab flask the size of all the oceans on Earth combined? Well, there you go. You have a long way to go before you can claim time doesn't solve the issue.
Since he's repeating himself by now, I think this is enough.

Correct me if I'm wrong here
Transistors "turn on" at a certain voltage. Taking MOSFETs into consideration, they have three on states. One (linear) acts basically like a switch, which is used for digital ICs; one is called subthreshold, where the properties are different from the last mode, saturation; both of these modes are usually used for analog shit.
The problem is these modes are very variable when compared to the triode mode, with a lot of considerations like swing (how much your signal can oscillate in order to not take the transistor out of saturation) and speed, among other things I can barely remember.
In my not very informed opinion I think photonic ICs would suit "analog" or multi-bit transistors better, considering you can just increase/decrease the intensity of light without worrying about much (again, I don't know much about photonic crystals) using constructive or destructive interference, but photonic crystals are apparently not yet scalable to the level of transistors, are fairly complex, and I have no idea how you'd implement bitwise ANDs/ORs/shifts with them or make PGAs out of them.
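To pin the three regions down, here's the standard first-order square-law model in a minimal sketch; real devices add subthreshold conduction, channel-length modulation and so on, and the parameter values here are placeholders.

def mosfet_drain_current(v_gs, v_ds, v_th=0.5, k=2e-3):
    # First-order square-law model; k lumps mobility, oxide capacitance and W/L.
    v_ov = v_gs - v_th                  # overdrive voltage
    if v_ov <= 0:
        return 0.0                      # cutoff (ignoring subthreshold leakage)
    if v_ds < v_ov:
        # triode/linear region: behaves like a voltage-controlled resistor
        return k * (v_ov * v_ds - v_ds ** 2 / 2)
    # saturation: current set mostly by v_gs, the region used for analog gain
    return 0.5 * k * v_ov ** 2

for v_gs in (0.3, 0.8, 1.2):
    for v_ds in (0.1, 0.5, 1.5):
        i_d = mosfet_drain_current(v_gs, v_ds)
        print(f"Vgs={v_gs:.1f} V, Vds={v_ds:.1f} V -> Id={i_d * 1000:.3f} mA")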

Strawman.
Strawman.
Strawman.
Not even the simplest cell has been created, much less an evolution from those cells to any other form demonstrated. Fabrication.

Stopped reading your post there.

Forgot to respond to this part. There is no continuity between RE and higher-level languages. That is, no amount of AAABBB or permutations thereof would rise to a higher level of expressivity beyond the hard boundaries of what a regular language can express. This is why you cannot just rely on some poorly defined extrapolation.

Attached: RE.png (696x402, 65.14K)

maybe digital was a necessary step because it makes the machines easier to understand, and now that we are getting close to true AI, we can use intelligence beyond our own to design analogue systems that surpass digital systems

All but one section of most processors are solid state. What I was attempting to say was that with MMX and a few instructions from prior units, the mechanical/internal issues caused by predictive branching have more to do with the excessive quantity of overly specialized sections of modern processors that are above the base x86 line.
In a modern CPU there may be as many as 50-90 different blocs for adding. Some are for outcomes that are whole-number only, some are for outcomes that are decimal and negative only, etc. The rail is set up to yield to the bloc that returns first, which often causes extremely unpredictable cycle costs. You can add two randomly generated numbers, both whole and decimal, and do so 250 times and end up with cycle costs of 10-50 almost randomly.

One of the better approaches that MIT and Harvard have recommended is a system that uses patterned escalations.
By reserving the numerical space above 99, data/code execution would be as follows. Say the code for add is 0xFE. In this example, to add in ASM would look like this:
FE1224 -> 36
Overflow -> 0
To multiply you would say.
FEFE1224 -> 33
Overflow -> 1
By reserving the space above 99 the code/data read is reduced to simple comparative logic. The pools are sequentially latched in a strictly additive process.
They argued that in this way the various pools could be made significantly smaller with a much higher degree of Lego like flexibility. An escalation based system would provide the efficiency of an ARM with the parallel power of a GPU by adding routing instructions that are noticeably absent in modern architectures.
It didn't catch on due to a higher memory requirement at a time when 4-32k of memory was very expensive.
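I can't vouch for the scheme, but here's a toy decoder for the escalation idea as I read it (the 0xFE opcode comes from the post above; the 8-bit operands, the add-to-multiply escalation and the overflow flag are my guesses at the semantics).

def run_escalation(byte_stream):
    # Count leading 0xFE bytes: one means add, a repeated 0xFE escalates to multiply.
    ops = 0
    i = 0
    while i < len(byte_stream) and byte_stream[i] == 0xFE:
        ops += 1
        i += 1
    a, b = byte_stream[i], byte_stream[i + 1]
    result = a + b if ops == 1 else a * b
    return result & 0xFF, 1 if result > 0xFF else 0   # (8-bit result, overflow flag)

print(run_escalation([0xFE, 0x12, 0x24]))         # add: (0x36, 0)
print(run_escalation([0xFE, 0xFE, 0x12, 0x24]))   # multiply: low byte, overflow set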

If you really wanna know how badly some of these companies are ripping us off...

The x86 design has a really bad nexus that requires you to mode-switch a lot. Almost 1/3 of all operations require wasted cycles just to repeatedly put the processor back in 64-bit mode for math, at great expense, as the operations are still computed in 8-bit chunks...
A killer way to fix this is to make all operations ISO-compliant with the 64-bit float standard. Instead of using a code to flip the mode to 64-bit, load a, load b, add a and b, then store the result in x, and finally write x to some address.
They could just use a compound register that held the results of add, subtract, multiply, divide, pow, and sqrt for any numbers currently stored in mA and mB, and just use a code to select the result at index 1 to store at some address. The code would be reduced to: load a, load b, write result of subtract to some address. No flips, all math computed at strictly 2 cycles. No need for MMX, SSE, etc...
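Here's a toy model of that compound-register idea (entirely hypothetical; the register names and the result ordering are mine): loading A and B makes every result available at once, and the code just selects which one to store.

import math

class CompoundRegister:
    # Toy model: every arithmetic result is "available" as soon as A and B are loaded.
    RESULTS = ("add", "sub", "mul", "div", "pow", "sqrt_a")

    def __init__(self):
        self.a = 0.0
        self.b = 0.0

    def load(self, a, b):
        self.a, self.b = float(a), float(b)

    def select(self, index):
        # Pick one precomputed result by index instead of issuing separate ops.
        a, b = self.a, self.b
        values = (a + b, a - b, a * b,
                  a / b if b else float("inf"),
                  a ** b, math.sqrt(abs(a)))
        return self.RESULTS[index], values[index]

reg = CompoundRegister()
reg.load(12, 4)
print(reg.select(0))   # ('add', 16.0)
print(reg.select(2))   # ('mul', 48.0)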

Who’s Enstein, genius?

Please call them MLfags. AI is a far broader field than those fucking statisticians make it out to be.

The real future is Duotronics. Until that is fully implemented any improvements will be minor.