/unixhate/ #1

Well, Zig Forums, is C and UNIX worship justified?
Did C and UNIX set back computer science by several decades?
What are the best alternatives to these two evils?
Also, general /unixhate/ thread
softwareengineering.stackexchange.com/questions/373385/was-the-c-programming-language-considered-a-low-level-language-when-it-came-out

Attached: The-Unix-Haters-Handbook-OpenLibra.jpg (750x361 202.04 KB, 119.3K)

C? Naw
UNIX? Yes in that it could be considered the first "modern" OS and all OS design thereafter owes itself to UNIX. But as far as *nix systems in general go? Naw

C and UNIX are amazing. In hindsight, the people whining about them back then failed to accomplish anything in 50 years so we can look at their complaints today as moronic bleating.

What's your opinion on C being responsible for hardware array bounds checking no longer being a thing?

Back to india, pajeet.

C was created to be low-level enough to write an OS in, and Unix was written in C. This allowed both to spread to other architectures and operating systems. You can run C on your toaster (and chances are that your toaster is running C code). Lisp is a higher-level language that required specialized hardware to get the performance necessary.
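To make that concrete, here's a minimal freestanding sketch (the register address is made up, not any real part's datasheet). The whole trick of C is that a hardware register is just a pointer:

```c
/* Hypothetical memory-mapped device register; address is invented. */
#include <stdint.h>

#define HEATER_CTRL ((volatile uint32_t *)0x40020000u)

void heater_on(void)  { *HEATER_CTRL |=  1u; }   /* set bit 0   */
void heater_off(void) { *HEATER_CTRL &= ~1u; }   /* clear bit 0 */
```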

You've never worked in embedded, have you?

Okay, we know our modern world is full of C and UNIX. But does that make them good?


Tbh my toaster has three resistors and two springs. Anything beyond that is bloat and probably buttnut.

Luddite

This needs a fancy touchscreen GUI with oversized icons and screen. It should always be connected to the internet and monitor you for "convenience" as well.

You're joking, but this is what IoT fags unironically want to do

UNIX is just a stripped-down Multics though, and some OSes did not do things the UNIX way after UNIX showed up. Lisp machines/their respective OSes, Smalltalk (it was possible to boot directly into Smalltalk), and Oberon are very different from UNIX. I don't see how Classic Mac OS, BeOS, Windows (before you could run GNU on it), DOS, CP/M, etc. were Unix-like either (unless you consider using C or C++ "Unix-like", which is a very broad way to define it). Arguably Windows is probably still mostly non-Unix-like internally, although I would not be surprised if they're repurposing some permissively licensed code here and there out of laziness.


It is possible to write an OS in other languages, such as Pascal, Ada, and Forth (or just an assembly language, but that's not portable), and of course there are C-like languages like C++ and Rust that can be used for that too.

In the 1980s, sure, and those were basically minicomputers, probably the most the Lisp machine companies could afford to customize back then. I think Moore's law has probably fixed this. You've got like what, 4-8 cores on consumer hardware (that a lot of programs don't use because multi-threading turns out to be hard in most languages), 64 bits, integrated GPUs usually, SATA, solid state drives, a lot more main memory, maybe even a ridiculously powerful discrete GPU (but drivers are hard too, so it's often wasted, and grorious C does not make that any easier), etc. You're telling me that you can't even have some Emacs-tier OS? Come on now.

Attached: interjection pepper.jpg (1600x900, 449.97K)

Okay Sham, how would you like an operating system to be designed? What design decisions would you make to avoid the mistakes of Unix? What do you want to see in an operating system?

Not him, but something like GuixSD seems to be moving in the right direction to me. It doesn't follow the FHS, doesn't use systemd, and is 100% free software.

...

Are packages dynamically linked?

A Modern Operating System™ shouldn't have everything as "a file". Sockets are not fucking files. At most, they are input/output streams. That is what programs should deal with: typed objects, with inheritance. InputStream, OutputStream, File, TCPSocket, etc., all the way up to "PNGImage" or "ODTDocument" (see the sketch at the end of this post). File extensions and magic bytes are an ugly hack and a poor joke that must go.
Drop the FHS. It sucks, it's a dinosaur from the era when the goal was to save as many keystrokes as possible. Understandable on a terminal connecting to a PDP-11, but not in current_year. /System, /Applications, /Storage, etc as GoboLinux does, is I believe the right way to go.
Drop editable text config files. There should be a database, managed by the OS. In fact, the whole filesystem should be a database. With version control.
Is your config stored in .config/, a dotfile in $HOME, or in /etc, or maybe /var/local/? Remove that shit. They're all stored in your application's config table. At most, you've got one system-wide config entry, and one config per user. The OS should provide a way for applications to get the config values directly, without any pain.
Some command like "$ config " would be available.
All libraries should also be stored in the OS database, instead of 30 different folders. Is it /lib? /usr/lib? /usr/local/lib? LOL you can't know because UNIX doesn't enforce that!
The OS should provide a standard logging interface, with standard levels of logging priorities. stderr vs. stdout? Not anymore. Now there's what would be a sort of stderr, stdwarn, stdlog, stddbg, etc.
Programs should be callable as functions from within other programs.
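Roughly, I mean something like this. A hypothetical sketch, in C of all things (ironic, I know), and every name in it is made up:

```c
/* "Everything is a typed object" instead of "everything is a file".
   A vtable as the first member gives single inheritance in plain C. */
#include <stddef.h>

typedef struct InputStream InputStream;
struct InputStream {
    size_t (*read)(InputStream *self, void *buf, size_t len);
    void   (*close)(InputStream *self);
};

/* A TCPSocket "is an" InputStream: it embeds one as its first
   member, so an InputStream* can point at it directly. */
typedef struct {
    InputStream base;
    int fd;
} TCPSocket;

/* Callers dispatch on the type; no file extensions or magic bytes. */
size_t pump(InputStream *in, void *buf, size_t len) {
    return in->read(in, buf, len);
}
```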

Yeah and it should be called something like the "Registry", right?
Although I do agree with renaming the FHS stuff. There shouldn't be a /bin, /sbin, /usr/bin, /usr/sbin, /usr/local/bin, and /usr/local/sbin.
It should just be /bin. That's where all binaries go. /bin. That's it. /etc should be renamed /conf or /config. /usr looks too much like user, which confuses new people. It should be /resources, or /res to save keystrokes (keep in mind that saving keystrokes is not just about dinosaur hardware; it's also more pleasant to type). /home can then be /usr or /user. macOS actually does call /home /Users, so it's not out of the question.

Well the idea is badly executed on Windows, but otherwise, yes, something like that.

If you're so keen on saving keystrokes, you should switch to a keyboard that doesn't require, for example, pressing shift AND 0 to get a parenthesis. Parentheses are used all the time in programming, and often in normal writing too. Same with the quotation mark, question mark, exclamation point, etc.

Personally, I suggest removing hierarchical-based filesystems altogether. They're nothing but a confusing mess.

Can you tell me more about GuixSD (and perhaps Nix OS)? All I know is that it has a functional package manager, which means that you can install and uninstall shit all you want without having everything crash down like a house of cards because dependencies are an eldritch abomination.

It basically handles everything for you. Each package has its own directory in /nix/store, named after the hash of the build process, where everything is stored (binaries, resources, etc). This means you can easily delete EVERYTHING associated with a package without leaving cruft in your system. Packages (and in fact, the whole system) are either installed via a main configuration file (/etc/nixos/configuration.nix) or on a per-user basis (with nix-env -i), or (here's what's really cool) in a kind of chroot development environment (nix-shell -p ). This makes it extremely easy to know what's on your system and what isn't. The development environment is of course isolated, and you don't have to install millions of libraries with apt-get and then forget about them.

Is your use case dozens of machines that need the exact same setup?

lol

when did x86 ever have hardware bounds checking for arrays? or the 6502. or the z80.

My lad, the 8086 is about six years younger than C, and the 6502 and Z80 postdate C as well. But even then, x86 has the BOUND instruction, although it is a subpar instruction, and more recently (we're talking the last 3 years), Intel has introduced MPX.
The Lisp machines also did have hardware support for array bounds checking.
stackoverflow.com/questions/40752436/do-any-cpus-have-hardware-support-for-bounds-checking

Zig Forums doesn't understand hardware bounds checking and thinks it's magic and free. In the bad old days of processor design, general-purpose instruction-level parallelism wasn't available and processors would instead provide higher-level constructs that allowed the processor to do work in parallel. That's what was being done with tagged memory, so enough information was available per instruction to do the bounds check at the same time. But ever since we got things like pipelining (and everything that came after), there was no reason to do that as it would just be less efficient. And the general purpose approach also allowed the compiler to elide checks when they aren't necessary. There's a misconception here that we've lost a technology to time instead of the same thing happening today in a more efficient way in type-safe languages.
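To spell out the "same thing in software" part, here's a sketch (checked_get is a toy helper of mine, not any real API): one compare and a well-predicted branch per access, and the optimizer drops it when the index is provably in range.

```c
/* Software bounds checking, the kind a type-safe language emits. */
#include <stdio.h>
#include <stdlib.h>

int checked_get(const int *a, size_t len, size_t i) {
    if (i >= len) {                       /* one CMP + branch */
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
        abort();
    }
    return a[i];
}

long sum(const int *a, size_t len) {
    long s = 0;
    for (size_t i = 0; i < len; i++)
        s += checked_get(a, len, i);      /* after inlining, i < len is
                                             provable, so the compiler
                                             can elide the check */
    return s;
}
```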

Is this a troll?

Attached: absolute state of jewtel in 4 CYE.png (750x390, 51.99K)

Well, there is no such thing as a free lunch, but if you're okay with storing the size of the array and using some extra space on the die, adding in bounds checking wouldn't have any performance cost.
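I.e. something like a fat pointer (toy sketch, names made up):

```c
/* Carry the length with the pointer, the way a bounds-checked ISA
   (or a safe language's slice type) would. */
#include <stdlib.h>

typedef struct {
    int   *ptr;
    size_t len;
} IntSlice;

int slice_get(IntSlice s, size_t i) {
    if (i >= s.len)   /* with the bounds on hand, hardware could run
                         this compare in parallel with the address
                         calculation */
        abort();
    return s.ptr[i];
}
```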

What does the age of the CPUs have to do with the argument? It's clearly evident that your entire argument is only meant to paint some kind of idea that lisp machines were better, but they failed and are hence failures. If these designs were so important and monumental, then their implementation would be present in modern hardware, but it isn't. It is entirely absent because they serve no use outside of academic pursuit. Stay mad lispcuck.

1) I mention C being responsible for hardware array bounds checking, a thing in the 60s/70s, disappearing
2) You say 80s architectures don't have array bounds checking
3) C was a very popular language by the time these architectures came to be
4) Thus it can be hypothesized that C was indeed responsible for that

Not at all. I mentioned Lisp machines because they happened to have hardware bounds checking via special parallel instructions.
You also seem to reason in the following terms
1) Popular -> Good
Therefore,
x) ¬Popular -> ¬Good
Not only is the logic incorrect, but the premise is an ad populum. Even if we accept the premise as correct, what actually follows is
x) ¬Good -> ¬Popular
which doesn't quite explain why Windows is so popular


I know it's not perfect, but that wasn't the point, user

This is by far the best thing you've ever done lispkike.

Holy shit when did I land in 2015?

You are really stretching it with your lies. C wasn't developed until the late 70's, and it didn't gain popularity until the mid to late 80's, hence its standardization in '89/'90. The 8086 was developed in the mid 70's. If hardware bounds checking of arrays were necessary, surely Intel would've included it in their flagship ISA. You completely ignore that with your lies and continue to lie and build your argument on lies. You've proved nothing. Your lisp machines failed for a reason; they were slow, single-purpose, and incapable of adapting to the needs of commercial and consumer demands. If they could, they would still exist and people would still be using them. They don't exist, they failed and are thus failures. Academics btfo again. Stay mad lispcuck.

I can easily come up with more than one purpose. How about these two:
1. expert system
2. computer graphics

Pipelining showed up in the late '70s and spread to everything during the '80s. That obsoleted wacky processors as that functionality could be done in software. There's no C conspiracy, those hardware techniques just became obsolete.

Is this why they're reimplementing them?

It's anyone's guess why Intel made MPX since it's slower than software bounds checking, less safe than software bounds checking, and they seem to have abandoned support for it. See . I assume it was so tech illiterate faggots like you would see it on a feature list and get hyped to buy all new Jewtel processors. It appears to have been purely a marketing stunt.

Pipelining has little to do with why bounds checking is no longer done in hardware. People have worked out, and successfully implemented in hardware, pipelines in which traps can be raised by any stage of the pipeline. The fact that a failed bounds check flushes the pipeline doesn't matter much, as you'd probably want to halt the program and drop into a debugger anyway, given that there's a problem.

That's an impressive amount of gibberish, user.

I will clarify any section that you didn't understand. To save us some time, I've expanded on each statement I made below.
I am saying that functionality from "wacky" processors needing to be done in software is not a problem with pipelining itself.
Depending on the architecture, traps / interrupts / whatever you want to call them can be raised in more than one location. For example, if you decode an instruction which doesn't exist, you could trap in the decode stage of the pipeline. The same architecture could also trap when an arithmetic operation overflows. This trap would obviously happen in a different part of the pipeline, not in the decode stage. Depending on the architecture, different things are done to handle this.
Upon trapping, you'll likely need to flush the pipeline (at least part of it). This means that you will take a performance hit. My statement here is that a performance hit for an out-of-bounds read / write does not matter, as it is an edge case that should never happen in correct software.

(is) (a) (faggot)
who could be behind this post?

Attached: c3820f14b30c66ca3cfa35bb06aaeadd31933e4947e5a12cf9a73af0ca64725e.png (940x766, 350.85K)

You all should really be deferent towards Xerox, because the world would have been *very* different without them -- the people behind UNIX, on the other hand, are nothing special. (Read "Dealers of Lightning" if you want to know more about Xerox PARC.) Xerox gave birth to the idea of a personal computer (that is to say, single user) -- the Xerox Alto booted from a hard disk and had a graphical screen, proportional fonts, a mouse-driven GUI, a WYSIWYG text editor, a drawing program, an IDE with overlapping windows, Ethernet, net booting, laser printing, a full office suite, and the first optical mouse.

In the book I mentioned, there's a story about Jobs meeting a manager at Symbolics; Jobs told this person to "demo" their machine. After the demo, Jobs said something along the lines of "This bitmapped display... wouldn't it be nice if it could scroll pixel by pixel?" -- they went to a file, changed some of the Lisp code, and it scrolled pixel by pixel. Jobs was blown away. When Jobs left, they still felt they hadn't shown him what they *actually* had (they were the vainglorious academic type, and regarded Jobs as a hobbyist). There's also a Symbolics chief engineer describing when "the spirit of Xerox PARC evaporated" -- that is a must-read.

We're not living in the future -- we're living in 1991 with slightly more lucid graphics.

Attached: Alto.gif (337x480, 87.94K)

lisp is for gaylords

Unix is bad. Unix it shit man, yes. Here, Here look at this
This is a quote, this is a real motherfucking block of words. Yeah man, this is a REEEEEEEEEEEEEEEEEEEEEEEAL QUOTE FUCK YEAH BABY. This is a line. This is another paragraph. Wow, really good. Nice paragraph. Yeah man. Words words words words words words words words words words words words words words words words. Words words words? Words words... words words words. And yeah, I forgot to tell you about unix.
UNIX IS SHIT
UNIX IS FUCKING SHIT
[giant ASCII-art banner, flattened by the archive: UNIX IS AN ABOMENASON]

And.......
Did i tell you about my
MOTHERFUCKING
LISP
MACHINES???????????????

LISPMACHINES NNGH YEAH BABY OH YEAH NNNNNNNNNNNNNNNNNNNNNGH I CAN'T FUCKING HANDLE IT NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNGH

Hm? What? Learning how to program? Discussion on good software design? Who needs that?
All weee neeeeed isssssss
LISP
MACHIENS
YEEEEEEEEEEEEEEEEEEEEEEEEEEAH

ayy lmao

Attached: 56fc41b1107fb58adcfa9e025f8a033a2157566c2994070f15310dc5297035be.jpg (659x506, 83.39K)

Literally no one but you spergs mentioned Lisp

That's a very interesting post, user, I will definitely check out this Xerox thing. I knew they were pioneers, but I never paid much attention to what they had actually achieved.

smalltalk is gay tho

Objectively wrong. You mention one that does, right in your next post.
Even if you don't consider that, the unix hater fag (and/or maybe similar others, like yourself) is shilling his lisp and lisp machines everywhere.

There are a lot of people over here who have no idea what they're talking about and instead go on to shill and take up "sides" of whatever they see fit, much like literal normalfags.

are you fags retarded?

I mentioned a specific HARDWARE feature of Lisp machines, nothing more.

Ah yeah, forgot about that


And that's why I said OBJECTIVELY.
And EVEN if you IGNORE that, you guys have been shitting up this board with your lisp machine shilling and your failure to come up with a design for a better system than unix, which was already done roughly 20-30 years ago by Bell Labs.

And I have seen you fags over on IRC. You guys know jack shit about programming aside from maybe fizzbuzz.
This will be the last (you) I'll give you.

Top kek, who's the midget behind this post? var-g?

Hang yourself.

Trips of truth

𝙵𝚘𝚞𝚗𝚍 𝚝𝚑𝚎 𝚄𝙽𝙸𝚇 𝚞𝚜𝚎𝚛𝚜. 𝚆𝚑𝚎𝚗 𝚊𝚛𝚎 𝚢𝚘𝚞 𝚐𝚘𝚒𝚗𝚐 𝚝𝚘 𝚜𝚝𝚊𝚛𝚝 𝚞𝚜𝚒𝚗𝚐 𝚊𝚗 𝚊𝚌𝚝𝚞𝚊𝚕 𝚐𝚘𝚘𝚍 𝚙𝚛𝚘𝚐𝚛𝚊𝚖𝚖𝚒𝚗𝚐 𝚕𝚊𝚗𝚐𝚞𝚊𝚐𝚎, 𝚕𝚒𝚔𝚎 𝙲++?

a-am I allowed to come back? uwu

Umm, so since one of you said to "start using an actual good programming language", that got me thinking. Is there any other language out there that has the same level of performance as C/C++? That seems to be the most common argument in favor of those languages: writing low level stuff that needs to be fast.
Is the Rust meme capable of that? I know people make fun of it a lot here, but people have already been trying to write an OS in it, and the shills say it's safer or something, so wouldn't that be a possible C replacement? Of course the community sounds like a bunch of controlling meanies, but is it as good as they say on a technical level?

Do any of you have any other possibilities? ^.^ I think it would be nice to see, as it seems that, whether we want to replace C/C++ or not, we can't do it without finding something that's as fast.

Attached: boy.jpg (736x1021, 87.52K)

Bullshit. UNIX weenies believe that all Multics innovations came from their "eunuch" OS because they don't know anything about Multics. The hierarchical file system and other parts of VMS, VME, Xerox workstations, and Lisp machines are based on Multics, not UNIX.

en.wikipedia.org/wiki/ICL_VME


There's an amazing lack of quality control.

Accomplishments don't disappear just because you don't know about them.


Modern hardware does have bounds checking. x86 has had segment limits and a BOUND instruction that can be used for bounds checking since the 80s. RISCs don't have it because the PDP-11 is not modern.

What I find disgusting about UNIX is that it has *never* grown any operating system extensions of its own, all the creative work is derived from VMS, Multics and the operating systems it killed.

If you want to remember the actual last time you edited those files, then keep your own damn database of dates and times, and stop bothering us Unix Wizards.

I thought this is what RCS is for.

I'm TA'ing an OS course this semester. The last lecture was an intro to Unix since all other operating systems were only imperfect and premature attempts to create Unix anyway. Some lecture highlights... An aside during a discussion of uid's and many a unix weenie's obsession with them: "A lot of people in the Unix world are weird." When asked if Ritchie et al regretted some other inconsistency in Unix metaphysics, "These guys probably don't care." Have another twinkie.

Some Andrew weenie, writing of Unix buffer-length bugs, says:
> The big ones are grep(1) and sort(1). Their "silent truncation" has introduced the most heinous of subtle bugs in shell script database programs. Bugs that don't show up until the system has been working perfectly for a long time, and when they do show up, their only clue might be that some inverted index doesn't have as many matches as were expected.
Unix encourages, by egregious example, the most irresponsible programming style imaginable. No error checking. No error messages. No conscience. If a student here turned in code like that, I'd flunk his ass.
Unix software comes as close to real software as Teenage Mutant Ninja Turtles comes to the classic Three Musketeers: a childish, vulgar, totally unsatisfying imitation.

how old is this book, mister lisp meanie? did they not have postgres back then?

Attached: postgresql_logo-555px.png (555x617, 45.99K)

>>938805
First of all, reported for tripfagging, avatarfagging, shitposting, and reddit spacing.

Second of all, C/C++ isn't really as fast as you think. If you need low level stuff, you need to use assembly. C is almost never used in embedded, you either use Forth on a dedicated stack-based processor, write assembly for some obscure processor, or use a PLC.

Rust is as fast as C, more secure, and actually fun to program in. Obviously it's a good replacement, it's just not tried and true yet.

A good possibility is also Java. It's plenty fast, easy to program in, runs everywhere, and there's a fuckton of jobs for it. Don't listen to the «haha le pajeet java user?? let me write fizzbuzz in c to show u how SMART i am» crowd, they're completely delusional and most likely 12 years old.

Attached: 800px-James_Joyce,_Ulysses,_1ed_2pr,_p240.jpg (800x1035, 178.39K)

You mean LARPers? Anyways Rust is better than C/C++ unless you want to target some obscure hardware for which LLVM doesn't have a backend.

Lol

Attached: DWnsr28U8AAAoPF.jpg (621x513, 35.69K)

Attached: not an argument.jpg (500x534, 39.03K)

hey friend !!!!!! :D xoxoxoxoxo
you are reeeeeeeeeealy cute !!!!! ^.^
you are reeealy a biiig klutz, aren't you? :P ;) ;) you can't even keep yourself in character when you speak!!!!!

Attached: CUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUTE.jpg (1280x720, 67.5K)

you're right. it's not an argument. faggot.

Attached: DeP1W9lXUAIear2.jpg (769x560, 34.63K)

Rust has nice ideas, but ultimately the syntax ruins it all to the point where C is still better. C++ rules them all, of course, and you won't even be able to close the gap when C++2x is out.

fucking larpers caring about curly braces and keywords instead of shit that matters like memory models.

good argument, koolfaggot

Why don't you write in brainfuck then?

You do realize brainfuck would not be better if you changed all the symbols to words and added parentheses, right? It would still be the useless shit that it is, only now more verbose. Do you even know what syntax is?

Thou understandest what is meant well, verily.
If there were to be a kind of Rust with every keyword replaced by a Unicode symbol, wouldest thou still want to use such a speech?

If size doesn't matter, why are you an incel?
LOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOL
get rekt, [k00l]n1gg3r

When was the last time thou hadst sex with a maiden? Verily, I believe it was never.

Dude, just filter the namefag.

This is common in certain languages used by major financial institutions. It is also very common in theorem proving languages. Shit like that is really not a big deal. The main annoyance is setting up your keyboard bindings to deal with it.

Thou hadst not replied to my enquiry. Wouldst thou use it? Dost thou think such a speech would have quick adoption in the craft?

I would not use Rust with or without it :^). I have used unicode in a few languages that require it though.

Have fun in prison once writing software in unsafe languages becomes illegal.

...

Rust is the only ethical language :^)

Attached: DZL2WcQXcAUyGJz.jpg (1200x600, 100.04K)

That's exactly how they do it in Plan 9. Unfortunately it will never be done in Linux because it'll piss off the sysadmins.

Cry more, Lispfag.

Is the BOUND instruction emitted by any modern compilers?

By C compilers, certainly not. And not by any other that I know of. According to the SO post I linked, it's simply too slow, and it works with a bounds pair loaded from memory, so it's essentially useless. CMP against maximum and minimum pointers stored in registers is faster.
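For comparison, the register-based check looks like this in C (toy function of mine; it assumes lo and hi delimit the same array, one past the end, since comparing pointers into different objects is undefined in ISO C):

```c
#include <stdlib.h>

int deref_checked(const int *p, const int *lo, const int *hi) {
    if (p < lo || p >= hi)   /* two CMPs on values the compiler keeps
                                in registers, vs. BOUND's bounds pair
                                sitting in memory */
        abort();
    return *p;
}
```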

It was never used as far as I know, and I did a lot of assembly and cracking in the early '90s. It was part of a large collection of garbage opcodes intended to coax Pascal compilers onto the platform, which Pascal compilers didn't end up using. It's been deprecated longer than most of you have been alive, and was effectively removed by amd64 in 2003.

Heh, good thing i don't use unix

UNIX weenies hate over 60 years of good ideas, like segmentation, dynamic linking, strings, bounds checking, overflow checking, usable error handling, and so on. If UNIX weenies say UNIX is good because "all OS design thereafter owes itself to UNIX" and I say the OS design that newer OSes were based on actually came from Multics, they change the subject. Now it doesn't matter who invented what or what influenced what, just popularity. When someone points out that Plan 9 is neither innovative nor popular, they say some bullshit about how the industry is against innovation, but also that everyone should be against any innovation that didn't come from AT&T, whether it's older than the PDP-11, from the 80s, or from the last few years. And you thought UNIX brain damage was metaphorical.

I don't see how being "professional" can help anything; anybody with a vaguely professional (ie non-twinkie-addled) attitude to producing robust software knows the emperor has no clothes. The problem is a generation of swine -- both programmers and marketeers -- whose comparative view of unix comes from the vale of MS-DOS and who are particularly susceptible to the superficial dogma of the unix cult. (They actually rather remind me of typical hyper-reactionary Soviet emigres.)

These people are seemingly -incapable- of even believing that not only is better possible, but that better could have once existed in the world before driven out by worse. Well, perhaps they acknowledge that there might be room for some incidental clean-ups, but nothing that the boys at Bell Labs or Sun aren't about to deal with using C++ or Plan-9, or, alternately, that the sacred Founding Fathers hadn't expressed more perfectly in the original V7 writ (if only we paid more heed to the true, original strains of the unix creed!)

Segmentation is still used today in the MMU.

Segmentation is practically unused for its original purpose today. It's now ghetto tagged memory on Intel for thread-local support and stack protection. Even back in the DOS days we tried to avoid using limited range selectors. VCPI allowed doing it the modern way and DPMI the backwards, archaic way with non-overlapping ranges.
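The thread-local part, for reference (GCC/Clang __thread; on x86-64 Linux/ELF this typically compiles down to a %fs-relative access, which is about all that's left of segmentation):

```c
#include <stdio.h>

/* Each thread gets its own copy; the segment base register survives
   as a cheap per-thread pointer. */
static __thread int per_thread_counter;

void bump(void) {
    per_thread_counter++;   /* typically a %fs-relative access */
    printf("counter = %d\n", per_thread_counter);
}
```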

Do you ever source your claims?

Yes he does! From a 9000-year-old outdated book, in the form of blockquotes!

I am the source, faggot.

Why would anyone trust an anonymous source for information? Get lost.

At least Plan 9 exists, works on modern hardware, and actually has autists dedicated enough to continue working on it. That's more than can be said for anything you'd like, with the best progress being on shit like some mostly dead LispOS projects and a sad Multics emulator running in a web browser.

Attached: DMmAPqbWkAAE69G.jpg (1852x1425, 183.28K)

What are you even mad about you little shit? What do you think isn't true? Do you think compiled code today is full of segment selectors? You could just fucking check rather than look like a fool.

You can buy OpenGenera for $5k and run it on top of GNU/Linux.

...

Plan 9 was a research dumpster fire where they intentionally gave every bad idea air to see if it was actually a bad idea or not. Spoiler: no good ideas were found. It's been the Edgy McEdgerson UNIX for several decades of contrarian kiddos.

Such as?