Alternate Zig Forums history thread

If Hurd had just been a clone of the SysV kernel instead of the cluster/micro/cloud clusterfuck that it's failed to be for the past 30 years, would we all be using pure GNU right now?

Attached: freedomquest.webm (500x281, 4.89M)

Other urls found in this thread:

mackido.com/Hardware/AnalogVsDigital.html
youtube.com/watch?v=-aOyAvbj2Fg
youtube.com/watch?v=thrx3SBEpL8
twitter.com/SFWRedditVideos

Yes

If a group of disgruntled Motorola engineers hadn't quit en masse to sell an integrated microprocessor at a fraction of what anyone else was charging at the time, would an alternative to rented dumb terminals and toll-service "cloud" storage/cycles even exist?

Attached: 6502 team 1975.png (703x515, 748.58K)

Yes

UNIX weenies are so brainwashed that their most interesting idea of alternate history is whether they're using one version of UNIX or another. This GNU wouldn't even be a microkernel, just another shitty UNIX clone. UNIX weenies cannot imagine that better things could exist, let alone that they actually did.

What about some more significant alternate history?

Suppose IBM used a different CPU or OS in the PC.

Suppose Lisp machine companies never went out of business and are still being built now that hardware constraints and bottlenecks are a lot different than they were in the 80s.

Suppose AT&T had standards and fired the UNIX team for panic, array decay, null-terminated strings, 00011 meaning 00009, and all that other bullshit, and decided to go with Multics.

Suppose the systems language on the most popular OSes was better than C and C++, so everyone made GUI programs instead of web pages.

Suppose operating systems were designed to reduce the amount of code applications need instead of following false assumptions that broken bullshit uses less code than doing it the right way.

These people are seemingly -incapable- of even believing that not only is better possible, but that better could have once existed in the world before driven out by worse. Well, perhaps they acknowledge that there might be room for some incidental clean-ups, but nothing that the boys at Bell Labs or Sun aren't about to deal with using C++ or Plan-9, or, alternately, that the sacred Founding Fathers hadn't expressed more perfectly in the original V7 writ (if only we paid more heed to the true, original strains of the unix creed!)

What I have today is the best. I don't believe anything better is possible because I don't have it working on my computer today.

The car I drive today is the best. I don't believe anything better is possible.

Why were LISP machines slower than regular PCs running their own LISP implementations?

You might want to look into the speed comparisons from back in the day between Apple's own Macs and Mac OS running on an Amiga under emulation. It would wake you up from your stupidity bubble.


Might just be faggots talking bullshit after a few beers, but it's interesting to ponder. It's hard to deny that Commodore were a step ahead of the game on many fronts when you see what the Amiga could do, and how cheap the machines were compared to the other options at the time.

What would happen if computers had been developed with several voltage levels to represent numbers? For example, five distinct levels per transistor to represent the digits 1 through 5.

Better cars exist today and they're available for sale right now. Show me the other OS I can install on my system right now.


That's fine for the past but it doesn't help me today. If better systems exist, then I want to install them on my computer.

What if Gary Kildall hadn't been out flying his plane, had signed the NDA, and CP/M had become the PC's operating system instead of Microsoft buying QDOS?
What if Jobs had licensed the Mac OS UI for use on the IBM PC instead of Microsoft writing Windows?
What sort of company would Microsoft be today (selling development tools and office suites)?

Thinking about it: it is strange that Unix is programmed as a multi-user environment yet given only the robustness of a single-user workstation.
In the end, though, we are probably stuck with Unix. Whatever the flavor, Stalin promoted it with the GNU project, and it would still be popular because it is free.

My favorite still has to be the steampunk one:
What if Charles Babbage had actually completed a single one of his designs for the analytical engine and jump-started the computing revolution in the nineteenth century? Would the Church-Turing thesis go by a different name, having been formulated earlier?

What if encryption had been a mandatory part of internet from the beginning?

They did this. They even had decimal.

Why did they stop using it?
Just because some nigger called Bool wanted to compute logic?
All I'm finding is the usual circular logic for braindead normalfags: "you can't use decimal because the computer stores it in binary", followed by the most retarded reasoning ever: "our current electronics don't allow having more than two values". I should throw a book about analog electronics at the next retard who tells me this.

It's impossible to miniaturize.

What if people realised that Tanenbaum was right back in 1992?


The UNIX design philosophy itself isn't bad for the most part; what is bad is systems based on UNIX that didn't also embrace the UNIX design philosophy.

Tanenbaum was right.

What if Alan Turing was killed in a German Air raid and Germany took over and destroyed any idea similar to a network of computers in its infancy so we wouldn't be having this gay conversation and be tending to our Aryan people's and land?

Computing is a cancer, a demon unleashed to enslave our peoples. Crush it, and let machination take over.

Attached: 3707ce78f866ae19276e67e0ef281a257d81436d12ebeaf8e98751c9492abd16.png (809x965, 1.21M)

It's far easier to produce electronics that deal with boolean values than electronics that deal with decimal values. Even ternary electronics is harder to produce than binary.

Tanenbaum was wrong as fuck. That whole design is seen as anachronistic today. HURD is a good example of the folly of trying to fight the hardware like that. KVM is an example of the right way. And "the future" is seamless kernels where there is no division between kernelspace and userspace, which is an even more thorough "you're a moron" thrown his way. No one even wants to be associated with those ideas anymore: Microsoft quietly stopped mentioning the NT "microkernel" (it never was one, they just liked saying it) and now only talks about Hyper-V.

Would UNIX be considered a meme had Commodore GPL'd AmigaOS just before crashing with no survivors?

I wasn't implying he was wrong.


I am going to ask for a massive [citation needed] on that one; given the growing concern for system security, such a concept is like a Dresden-firebombing-sized dumpster fire.

Except for all the groups giving huge amounts of money to Data61 and all the people and companies which are developing products using the seL4 microkernel.

That's because Microsoft is 170,000 monkeys typing on keyboards with maybe a few hundred competent engineers; they can't make such a thing work because it requires people who actually understand the entire system stack.

Because signals have to be debounced to correct for voltage overshoot/undershoot, which is obviously much easier and faster with a binary system. For this reason, analog (or non-binary digital) systems must be slower:
mackido.com/Hardware/AnalogVsDigital.html

Attached: digital.gif (325x168 2.36 KB, 2.44K)

Huh, you aren't just LARPing a leftist faggot, you actually are one. No more (you)s for you, go home.

You're all stuck in the past and none of you are working to make the tools better. Talk Talk Talk no Work! Wannabe techies. Wake the FUCK up people ...

youtube.com/watch?v=-aOyAvbj2Fg

It's a lot easier to just drive transistors into saturation than to worry about discrete levels, and accurately reproduce those levels throughout the IC.

Attached: 1502099933777.gif (464x381, 424.17K)

ever heard of audiophile electronics and "hifi"? accuracy and precision are exponentially expensive. you save a lot of money by limiting your precision to "on" and "off". it's not only the cheapest, but also the fastest.

You would need a 10 kW cooler on your CPU. Drive an output to the 1V level off a 5V supply and there's a ~4V drop across the transistor, which means 4V x current of heat losses. Which is also why modern CPUs don't run on 5 or 3.3V. It also complicates the design. What would a 5-voltage-level AND even look like? How do you even define AND for 5 values? Inputs equal at the same voltage level? That's just a shit-ton of comparators plus references, not practical or simple at all.

Attached: da6ca05288630372707e8efc1b77f2bc_XL.jpg (750x530, 300.14K)
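To make the "how do you even define AND for 5 values" question concrete, here's a toy Python sketch. The assumptions are mine, not anything from a real multi-valued logic part: levels sit at 0..4V in 1V steps, and AND is taken as min(a, b), the usual convention in multi-valued logic. Even just reading one of five levels off a wire already wants one comparator per threshold, versus a single comparator for binary.

# Toy model of 5-level logic; assumptions mine, purely illustrative.
LEVELS = [0.0, 1.0, 2.0, 3.0, 4.0]   # nominal voltages for digits 0..4
THRESHOLDS = [0.5, 1.5, 2.5, 3.5]    # one comparator per threshold

def quantize(volts):
    """Recover a digit from a noisy wire: count thresholds crossed."""
    return sum(1 for t in THRESHOLDS if volts > t)

def mv_and(a, b):
    """Multi-valued AND as min(a, b); reduces to boolean AND on {0, 1}."""
    return min(a, b)

# A binary wire needs one comparator and has roughly 2.5 V of noise margin;
# five levels already need four comparators and leave about 0.5 V per level.
print(quantize(2.37))               # wire meant to carry digit 2 -> 2
print(mv_and(quantize(2.37), 4))    # -> 2
print(mv_and(0, 3))                 # -> 0, same as boolean AND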

How dangerous would using an empty case as a barbecue be? Serious question.

Probably fine if you burn off all the paint and plastic first. I don't see anything else being potentially harmful. It's just ordinary aluminum/steel. If you decide to try making this, make sure it doesn't smell of anything weird before you put the food on.

What if BASIC wasn't basic?

We need a Zig Forums and /ck/ cross over board where we use old computer equipment to cook food. CPUs with small frying pans on them instead of heat sinks. Overclocked graphics cards used to make baked potatoes.

That would make one hell of an online video series. It's just too fucking cool.

Figures you're a lispy faggot.

Alternatively: What if BASIC was basic?

I am seconding this, because the "patch" for Meltdown and Spectre was more separation between kernel space and user space, not less. There's a reason the Linux patch had the tentative name FUCKWIT: "Forcefully Unmap Complete Kernel With Interrupt Trampolines".

< HURD is a good example of the folly of trying to fight the hardware like that.
That's bullshit. HURD is a failure because Stalin wanted another cathedral-style project and it never attracted more than a dozen developers at any time. The fact that it even boots, after 25+ years of development, is a testament to the design.


Ad Hominem, which is Latin for "Not An Argument".

youtube.com/watch?v=thrx3SBEpL8


I thought the processor material would explode due to the temperature changes. Basically, a processor heats up so fast that the thermal expansion of the materials, combined with their brittle nature, causes violent shattering.


What if BASIC didn't die, kept up with hardware, and didn't become Nu-OOP soy BASIC?

The major problem with Hurd has always been the hardware drivers. There has never been enough development going into the drivers. The rest of the system is already there, but the only drivers that exist are ancient ones, for ancient hardware, ported from an ancient version of Linux. There is currently an experimental system to support more modern hardware, and it's not ready by any means.

No, the issue is the driver model itself. The GNU/Linux driver model is fucking horrific, and rather than making a new one that fixed the issues, GNU adopted it for Hurd.

A proper microkernel design has the drivers in userspace; Hurd has them in the kernel (along with a few other things), which makes it a hybrid kernel, a nice way of saying it has the disadvantages of both the monolithic and microkernel approaches with none of the advantages.
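If the userspace-driver idea sounds abstract, here's a throwaway Python sketch of the shape of it. Purely illustrative: real microkernels like seL4 do this with capability-based IPC, not Python queues. The point is that the driver is just another process that owns its device, and everything else talks to it by message.

import multiprocessing as mp

def disk_driver(requests, replies):
    """A fake user-space disk driver: it alone touches the 'hardware' (a dict)."""
    fake_disk = {}
    for op, block, data in iter(requests.get, None):   # None = shut down
        if op == "write":
            fake_disk[block] = data
            replies.put(("ok", block))
        elif op == "read":
            replies.put(("data", fake_disk.get(block)))

if __name__ == "__main__":
    requests, replies = mp.Queue(), mp.Queue()
    mp.Process(target=disk_driver, args=(requests, replies)).start()
    # Applications only ever see messages; if the driver dies it can be
    # restarted without taking the kernel down. Keep the driver in the
    # kernel instead and you lose exactly that isolation.
    requests.put(("write", 7, b"hello"))
    print(replies.get())
    requests.put(("read", 7, None))
    print(replies.get())
    requests.put(None)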

You can run GW-BASIC or QBASIC in FreeDOS, and then peek/poke all you want. Sure, it's not in ROM, but it also wasn't on all 8-bit computers, so then you had to load MBASIC or whatever from floppy.

Attached: OsborneScreen.jpg (800x600, 177.26K)
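For anyone who never touched 8-bit BASIC: PEEK and POKE are just raw byte reads and writes at an address. A toy Python equivalent (a bytearray standing in for memory, purely for illustration) looks like this:

memory = bytearray(65536)   # a fake 64 KB address space, not real RAM

def poke(addr, value):
    """Write one byte, like BASIC's POKE addr, value."""
    memory[addr] = value & 0xFF

def peek(addr):
    """Read one byte back, like BASIC's PEEK(addr)."""
    return memory[addr]

poke(0xC000, 42)
print(peek(0xC000))   # -> 42; on a real 8-bit machine this address might be video RAM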

Can someone explain the lisp machine meme to me? I never understood it so I just kinda went along with it and avoided discussion about it.

Did you ever dick around with the python shell? Imagine that but lisp instead of python and it's your entire operating system instead of just a program.

A few Hurd drivers exist in the kernel for the sake of speed, so while it's not quite a strict microkernel system, it's close enough. The rest of the drivers are intended to live in userspace. The ancient drivers they have are ports of Linux drivers. What they're doing today is implementing a new driver model based on the concept of the rump kernel. The rump kernel itself has been around for a few years, so it's not a brand-new idea.

Is there any point to that whatsoever?

Not all of them. Only some drivers are in the kernel, and the butt-load that they outright stole from Linux 2.4-something live in a user-space server (IIRC). All of Hurd's driver issues are human-resource issues: they don't have any, and Stalin chose a community model in the project's early days that made it far less interesting to hack on than Linux. Which is actually hilarious given how abrasive Linus is to the community.


I'm talking about BASIC with OpenGL/DirectX/CUDA/the other CUDA calls.


I'll try to be Shittypedia here (will attempt to be objective): a Lisp machine is a machine that runs the programming language Lisp and runs it well. The language itself really wasn't the important part: the reason Lisp is the focus is that Lisp is an interactive language. You write a function, you run it, you see how that turned out. Lisp gives you a shorter turn-around time than a C compile, link, run, pray cycle, at the expense of slower code. Lisp also has a more tightly integrated debugger, and you debug in the language. It is nicer for the developer; it holds their hand and prevents them from making common mistakes. It also may allow, depending on the Lisp system, for the user to debug delivered code: they may not be stuck with shitty bugs in something they bought or were given.

However, this assumes a developer-user. The average Windows user / phone poster doesn't know and doesn't care how the software works and won't expend energy to fix its faults. For them, the Lisp machine is a bunch of extras that only make the machine harder to use (because they have to exit the debugger every time it pops up). Lisp machines are loved for the same reason some people prefer towers over laptops. A tower is for hobbyist computing enthusiasts: people who build and maintain their own stuff. A laptop is a product for someone who doesn't care. While you can modify Linux, it has nothing on the developer-centric environment of a Lisp machine. It's a shitty operating system programmed in a shitty language that frequently maims its developers (the language and the operating system).
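If the turn-around argument above sounds abstract, this is roughly what it feels like, shown in a Python session only because everyone has one handy (a real Lisp machine would drop you into a resumable debugger at the point of the error instead of unwinding): you hit a bug, redefine the function in place, and keep going without restarting anything.

>>> def area(w, h):
...     return w + h          # oops, should be a product
...
>>> area(3, 4)
7
>>> def area(w, h):           # redefine live, no compile/link/restart cycle
...     return w * h
...
>>> area(3, 4)
12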

Linus is actually very polite on the LKML. The problems occur when his trusted developers do something dumb, so he'll chew them out for not knowing better. He's very polite to beginners who want to learn about how and why things work.

...

I agree that Linux is definitely shit from quite a few points of view, but still, it's miles ahead of the other actually popular and usable OSs like Windows or Mac OS. I think that most people wank to the Unix environment because of this: it's not the *best* environment possible, but it's the best among the ones that currently exist and can actually work with hardware.

WHY IS TERRY A. DAVIS'S DEATH BEING CENSORED?

