The Lisp anon was right. UNIX has completely stunted the growth of software, and its "philosophy" has led to hardware like x86 that operates like a clocked-up 8080. C is a garbage language that promotes hacks such as null-terminated strings, and its integer overflows get rebranded as features. Post enlightening non-shit software that isn't C or UNIX here
Death to UNIX
A good feature back when memory was limited and hardware was slow as molasses. It also gave rise to some very efficient algorithms for working with strings.
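To be concrete (a sketch of mine, not from the post above): the classic NUL-terminated copy loop needs no length bookkeeping at all, just one test per byte, which is exactly the kind of efficiency being claimed:

    #include <stdio.h>

    /* Classic NUL-terminated copy: no length to carry around, one test
       per byte. dst is assumed big enough, which is of course the part
       that goes wrong later. */
    static void copy(char *dst, const char *src) {
        while ((*dst++ = *src++) != '\0')
            ;
    }

    int main(void) {
        char buf[16];
        copy(buf, "hello");
        printf("%s\n", buf);
        return 0;
    }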
So tell me, what should happen when an integer operation results in a value that logically cannot be represented by the result's data type or the accumulator's native word size?
with Ada.Text_IO; use Ada.Text_IO;

procedure Numbers is
   type Digit is range 1 .. 9;
   type Hour  is mod 12;
   D : Digit := Digit'Last;
   H : Hour  := Hour'Last;
begin
   Put_Line ("max digit: " & Digit'Image (D)); -- 9
   Put_Line ("max hour : " & Hour'Image (H));  -- 11
   -- D := D + 1; -- fails with Constraint_Error
   -- Put_Line ("max+1 digit: " & Digit'Image (D));
   H := H + 1;
   Put_Line ("max+1 hour : " & Hour'Image (H)); -- 0
end Numbers;
ATS shows that A SUFFICIENTLY ADVANCED COMPILER-- er, type system-- can cope with null-terminated strings just fine.
Anyone taking seriously that academic garbage known as Lisp, a language that deconstructs the very concept of syntax, can never be right.
The problem isn't UNIX per se, but being "good enough". Which is always a curse.
Wrong. CISC comes from lithography improvements and uarch designers not knowing what to do with all that space other than moar cores and moar caches.
Yes, it's shit.
SIGNED integer overflow, and this UB exists only because not all hardware did the same thing and they didn't want to bloat the compiler.
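A minimal C sketch of the distinction (my example, not from the post): unsigned arithmetic is defined to wrap, while signed overflow is the undefined behavior in question:

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        unsigned int u = UINT_MAX;
        u = u + 1;  /* well-defined: unsigned arithmetic wraps to 0 */
        printf("unsigned wrap: %u\n", u);

        int s = INT_MAX;
        /* s = s + 1;  <- undefined behavior: hardware disagreed on
           wrap vs. trap vs. saturate, so the standard refuses to pick
           one rather than force a check everywhere. */
        printf("INT_MAX stays: %d\n", s);
        return 0;
    }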
Pretending to attack the philosophy while always arguing from historical baggage is just annoying at this point.
C++
There are some algorithms you can't use unless you know the length, like parallel searching for a substring. It's faster, eliminates buffer overflows, and requires less code. Strings do not have to be processed one byte at a time when you know the length.
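A minimal sketch of the counted-string alternative (the struct and names are hypothetical, mine): carrying the length lets you compute loop bounds up front and hand whole blocks to bulk routines like memcmp, instead of creeping byte by byte toward a terminator:

    #include <stdio.h>
    #include <string.h>
    #include <stddef.h>

    /* Hypothetical counted string: the length travels with the data. */
    struct lstr {
        size_t len;
        const char *data;
    };

    /* Substring search with known lengths: the loop bound is computed
       up front, so nothing scans for a terminator and nothing can run
       past either buffer. */
    static ptrdiff_t find(struct lstr hay, struct lstr needle) {
        if (needle.len == 0 || needle.len > hay.len)
            return -1;
        for (size_t i = 0; i + needle.len <= hay.len; i++)
            if (memcmp(hay.data + i, needle.data, needle.len) == 0)
                return (ptrdiff_t)i;
        return -1;
    }

    int main(void) {
        struct lstr h = { 11, "hello world" };
        struct lstr n = { 5, "world" };
        printf("found at %td\n", find(h, n));  /* prints 6 */
        return 0;
    }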
An exception or converting to a bignum, which is what happens on Lisp machines because of tagged memory.
C is more of a deconstruction of syntax than Lisp, like in that article "Creators Admit UNIX, C Hoax". Look at the for and switch statements, and at bullshit like array indexing being the same as pointer arithmetic. C++, Perl, and Rust derive their syntactic philosophy from C.
The only people who say UNIX is "good enough" are the weenies responding to criticism of UNIX. "Good enough" is their own euphemism for bad. They have a mentality like "yes UNIX sucks, but we're stuck with it, so you shouldn't be allowed to use anything better because then we wouldn't be stuck with it" which doesn't make any sense to me. UNIX Haters also know UNIX sucks, but know that better is possible and can exist again.
UNIX is not about minimalism, but weenies use minimalism as an excuse for why everything sucks. They say that doing something right would need more code, even though their own code has millions more lines than the operating systems that did it right decades ago.
No, CISC is good. What sucks about x86 is that it's a bunch of extensions and hacks tacked onto a 16-bit architecture. What really sucks about x86 is that the good parts like rings and segments are not used by UNIX and clones or Windows because of "lowest common denominator" RISC bullshit. Just like C weenies say that "other programmers" is the main reason to use C, their reason for not using their hardware properly is "other architectures" and it sucks.
That's why a lot of programming languages have a way to suppress the overflow check, like PL/I and Ada.
The historical baggage is part of the philosophy. The UNIX philosophy includes not rewriting code and not fixing mistakes. Plan 9 is based on modified UNIX source code even though it's supposed to be a new OS. This is the same philosophy that leads to Electron and node.js and using Github shit code "because it's free."
> There's nothing wrong with C as it was originally
> designed,
> ...

bullshite.

Since when is it acceptable for a language to incorporate two entirely diverse concepts such as setf and cadr into the same operator (=), the sole semantic distinction being that if you mean cadr and not setf, you have to bracket your variable with the characters that are used to represent swearing in cartoons? Or do you have to do that if you mean setf, not cadr? Sigh.

Wouldn't hurt to have an error handling hook, real memory allocation (and garbage collection) routines, real data types with machine independent sizes (and string data types that don't barf if you have a NUL in them), reasonable equality testing for all types of variables without having to call some heinous library routine like strncmp, and... and... and... Sheesh.

I've always loved the "elevator controller" paradigm, because C is well suited to programming embedded controllers and not much else. Not that I'd knowingly risk my life in an elevator that was controlled by a program written in C, mind you...
based
haha
Yep. If it wasn't Unix, we would be using some other "good enough" solution today. Give any mainframe OS a try: there are things like allocating the number of bytes for a file when creating it, so if your file grows too much you're fucked, and if you allocate too many bytes you're wasting valuable space. Not even to mention COBOL.
Maybe we would be using some kind of CP/M with full-blown support for file hierarchies instead of folders only ~2 levels deep.
We should be using something without the limitations of POSIX today, but it would take an incredible amount of time, energy, and money to rewrite mountains of software, so we stick with *nix.
By the way, how many threads are we going to have talking shit about anything Unix-related? There was yet another one shitposting about the Unix philosophy, as if having small programs working together were exclusive to a single OS.
Wrong. The UNIX philosophy says that you can write sucky software quickly and if needed you can rewrite it in the future. It's just that no one actually gets motivated to rewrite it.
People have been writing shitty software since long before Unix, long before electrical computers even. And they'll keep doing it even if you manage to "kill" Unix. The idea that people write shitty software only because of some academics from the 60s is laughable. Do you think most of the code-mill devs churning out Java shitware have even heard of UNIX?
The UNIX philosophy is a series of fringe guidelines for writing developer-friendly user-interface tools. But whenever something adjacent to UNIX doesn't work the way they expect, people start raging against the Unix philosophy. Whenever I get a BSOD, do you think I get mad at the "Windows philosophy"? Whenever Safari has to reload the page because "something went wrong", do you think I get mad at the "Apple philosophy"? No, because I'm not an idiot. I realize that software errors have a single source: devs that don't give enough fucks, because they aren't being paid enough, or have tight deadlines, or have management that only cares about how pretty the software is. You can write good software in pretty much any fucking language, and you can write trash software anywhere, no matter what guarantees of safety or correctness it purports to provide.
The primary factor that affects the amount of shitty code written in a language is the total amount of code written in that language. That's why the majority of shitty software, and shitty kernels, and shitty drivers, has been written in C. That's why every Lisp example I've ever seen has been a tower of shining purity. And that's why nothing will ever get any better, despite idiots like the OP jumping to the next new language or framework or development process or whatever the fuck every couple of years, hoping it will solve all their problems for them. The problem is you, OP, and you need to fix yourself before your software will improve.
The sagely Multicians said PL/I was "good enough," yet their custom compilers were slow as shit unless you restricted yourself to a tiny subset of the language. The language died for a good reason and it isn't muh weenies.
So instead of actually writing better software, you cry about it on the internet and hope some big corporation writes your dream OS for you. At this point Zig Forums and fucking /g/ have produced more successful programmers and software than the decades-old UNIX haters movement. Shit, you've even been outdone by a fucking schizophrenic who wrote a toy OS to talk to God.
So you're still in denial over why the past's memory-restricted computers (especially minicomputers) would skimp on memory-consuming features. Congratulations, you're the same as today's "just throw more hardware at it" crowd.
It's time to stop huffing glue when you catch yourself promoting PL/I and GNU over UNIX because the latter is bloated and has too much code.
Stop right there. Lisp is the concept of having no syntax, of putting everything, or as much as possible, in routines. Bad syntax has nothing to do with no syntax.
Nice argument you got there.
Nice strawman; Plan 9 is better than UNIX. The point is that the inertia around UNIX (well, Linux, since it tried to fix some of POSIX's bullshit) is what keeps everyone stuck on it.
The UNIX philosophy is documented by Doug McIlroy in the Bell System Technical Journal from 1978:

1. Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
2. Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
3. Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
4. Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.
The first point looks dangerously like minimalism to me. Oh wait, I'm responding to your argumentless assertions sincerely.
No, it's not.
And both of these languages need ridiculously huge compilers. Arguing about the past and dead archs like Alpha is pointless anyway. Everybody agrees C's way is outdated and was never good in the first place.
Argument?
See point 3 of the citation above, especially the "Don't hesitate to throw away the clumsy parts and rebuild them" part.
Proof and point?
Non sequitur.
Would you stop being just a giant bag of fallacies and dicks? I can't imagine you're just trolling; that would be sad.
It's not an issue of motivation, because it's not about implementation; it's about things being designed wrong, people coming to accept the way things are designed, and then not wanting anything to change because they'd have to rewrite tons of dependent software. Writing shit quickly is great for spreading your virus into untapped markets before others can; it's not good for making usable software.
Even though I'm making fun here, I think this is more descriptive than prescriptive.
What makes UNIX in the aggregate shit isn't its implementation; it's its design.
this is laughable, cross-post because I can't go through this twice:
I'm not the user you're responding to and I don't necessarily hold the same viewpoint (I don't really know what the user means by "weenies", nor the history of Plan 9), but at minimum the inner two remarks there are strawmen.
Not if you've been following the guy's posts for a while. The UNIX Haters movement is hilariously unproductive, and this UNIX-hater user has a long history of shilling certain hardware/software features which consume memory, while refusing to consider why memory-restricted environments might not implement them or why a portable-ish programming language might not be designed around said hardware features.
okay.
learn to read. I didn't say unix was perfect. I said the "unix philosophy" is way more limited than lisp fans give it credit for. none of the things you listed are endemic to unix; many are fixed in modern versions.
JSON grows in popularity with each passing day. It's also harder to parse than CSV.
duplicating all your data for each slight modification is a clear evil. Higher-level languages do it anyway. C is low-level; if you want your data duplicated, do it yourself.
modern versioned filesystems/snapshots are a royal waste of space on disk; better to duplicate the data you think you'll want. (The modern institution of the recycle bin/trash can doesn't protect against opening a file and blanking it, so it approaches uselessness.)
this is the most obvious example. Do you think devs actively think documentation is a bad thing? Or perchance do they not want to expend the effort writing it?
Yes, it would be nice to have perfect software that works exactly as we want it to. The fact that we don't isn't because of some malicious philosophy that devs hold above all else. It's because they have to make a tradeoff, either in perf or in dev time. The actual "philosophy" is small, reusable, composable tools, something you see in Lisp as much as in any other programming language.
Don't think it can get much worse OP
UNIX is only "good enough" as a euphemism for bad, like saying plumbing is "too much work" because shitting your pants is "good enough."
Mainframe files can have multiple contiguous extents and also span multiple disks. Many mainframes do have files that are preallocated and contiguous on the disk because it's faster and because files are random-access and more like a database. Finding the location of a record on disk can use multiplication and division instead of a file system. UNIX doesn't allow any of this. Instead, it pretends your disk is a tape drive.
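A sketch of the arithmetic being described (the numbers are assumed, not taken from any real mainframe): with fixed-length records in a contiguous, preallocated extent, locating record n is one multiplication, no file-system tree walk:

    #include <stdio.h>
    #include <stdint.h>

    #define RECORD_SIZE 128  /* hypothetical fixed record length, bytes */

    /* Byte offset of record n inside a contiguous, preallocated extent:
       pure arithmetic, no indirect blocks, no tree walk. */
    static uint64_t record_offset(uint64_t n) {
        return n * RECORD_SIZE;
    }

    int main(void) {
        /* With fseek(), any record is one multiplication away: this is
           the random-access, database-like style described above. */
        printf("record 42 starts at byte %llu\n",
               (unsigned long long)record_offset(42));
        return 0;
    }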
COBOL has integer overflow checks and record I/O, which makes it better than anything that came out of UNIX. The reason 60s computer scientists complained about COBOL was because their preferred languages were better, which is also true about BASIC. What sucks is that those languages in the 60s and 70s are still better than C in 2018.
Maybe, but probably we would be using something that takes full advantage of x86 segments and rings because only C and C++ have problems with segments. Without the brain-dead RISC philosophy, hardware research would be better and actually useful, so we might not be using x86 anymore either.
But that never happens because weenies say it's "good enough."
No, which is why I keep bringing up the PDP-11 when talking about why UNIX sucks. Some (but not all) of those decisions made sense given that hardware, but they no longer do. Null-terminated strings always sucked, but they were a PDP-11 convention.
C has a "deconstruction" of syntax, where all choices are arbitrary and meaningless. Is (i - 10)[a+2] to index an array enough of a "deconstruction" for you?
That's great, but UNIX doesn't actually follow any of those rules. If "Use tools in preference to unskilled help to lighten a programming task" was a rule, they wouldn't need 15,600 programmers for the Linux kernel. If "Don't hesitate to throw away the clumsy parts and rebuild them" was a rule, we wouldn't be stuck with misdesigns like /usr/bin because they ran out of disk space one time in the 70s.
The UNIX philosophy is separate programs that communicate using virtual tape devices (pipes). UNIX weenies shit on better ways of composing software like dynamic linking, packages/modules, and threads (e.g. tasks in Ada) for decades and now they want to take credit for "inventing" the whole idea of being modular and composable.
Mr. A is being hurt by a Unix bug, a bug accidentally introduced into the file system code when Unix was first written.

You see, a file is internally described by an inode entry, which contains 13 pointers to the disk blocks that constitute the file. The first 10 of these pointers behave normally, but a strange mutation occurred in some code left sitting overnight on a disk in a room where the air conditioning had failed. As a result of this mutation, the eleventh pointer came to point, not to a block of the file, but to a block containing a whole bunch more (like 256) pointers. Later on, replication of the erroneous code fragment due to gamma ray damage made things even worse: now the twelfth pointer points to a block of pointers to blocks of pointers, and the thirteenth to a pointer to blocks of pointers to blocks of pointers to blocks of pointers (or something like that).

A result of this bug was the cancerous growth in Unix file sizes. Remember that a cancer is uncontrolled growth, in the wrong place at the wrong time. In the intended scheme, no file could be more than 13 blocks long, and most of Unix was designed around that assumption. The mutations introduced the potential for growth way beyond this design parameter. Needless to say, nothing has worked quite right since. The X window system is probably the worst example of metastasized Unix code. I think it's safe to say that no piece of X would have managed to survive on a Unix system with only the original 13 block pointers.
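Joke aside, the indirect-block scheme being parodied is real. A back-of-the-envelope sketch using the quote's own numbers (10 direct pointers, then single/double/triple indirect, ~256 pointers per indirect block; the 512-byte block size is my assumption):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        const uint64_t block = 512;  /* bytes per block (assumed)   */
        const uint64_t ppb   = 256;  /* pointers per indirect block */

        uint64_t blocks = 10                /* direct pointers      */
                        + ppb               /* single indirect      */
                        + ppb * ppb         /* double indirect      */
                        + ppb * ppb * ppb;  /* triple indirect      */

        /* ~8.6 GB instead of the 13-block "intended scheme". */
        printf("max file size: %llu bytes\n",
               (unsigned long long)(blocks * block));
        return 0;
    }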
Could you read the stuff you're answering to? What you pointed out is neat but useless; why would you care about something like this, which you'll never have to write? Count the number of reserved keywords in Lisp, then in C; that is what "no syntax" means.
Are you finished with your tantrum? Nobody denies that UNIX is a ridiculously bad implementation of its philosophy, Plan 9 being way more consistent in this regard.
Looks like you're not finished pretending. Linux isn't UNIX, and 99% of Linux is drivers.
Yes, the UNIX FHS is braindead. But it's this way because it was standardized way too fast by stupid companies like DEC.
So why is this a bad thing when a pipe is just supposed to be written at one end and read at the other?
You're missing the point. The point of using plain bytes as the interface between tools is to be as language-agnostic as possible. Nothing prevents you from using whatever representation you like inside those tools.
I don't get the constant hateboners for Unix. It has its problems, but it is also what we are stuck with. You might as well suck it up and go on with your life. Common Lisp has all sorts of retarded baggage from being a Frankenstein Lisp stitched together from the corpses of a million dead Lisps, but Lisp programmers just get over it and move on to get the job done.
I said this before in another thread, but the argument was simply hand-waved aside. For the guys complaining about null-terminated strings in C: Plan 9, which uses the 9P protocol for everything in the system, doesn't have null-terminated strings. Quoting directly from intro(5):
Text strings in 9P messages are not NUL-terminated: n counts the bytes of UTF-8 data, which include no final zero byte. The NUL character is illegal in all text strings in 9P, and is therefore excluded from file names, user names, and so on.
man.cat-v.org
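A minimal sketch of decoding such a string (the code is mine, not from the man page): on the wire, a 9P string is a two-byte little-endian count n followed by n bytes of UTF-8, with no terminator anywhere:

    #include <stdio.h>
    #include <stdint.h>

    /* Decode one 9P-style string: returns the byte count n and points
       *s at the data. Assumes buf holds at least 2 + n bytes. */
    static uint16_t get_str(const unsigned char *buf, const unsigned char **s) {
        uint16_t n = (uint16_t)(buf[0] | (buf[1] << 8));  /* little-endian */
        *s = buf + 2;
        return n;
    }

    int main(void) {
        /* "hello" on the wire: 05 00 followed by the five bytes. */
        const unsigned char msg[] = { 0x05, 0x00, 'h', 'e', 'l', 'l', 'o' };
        const unsigned char *s;
        uint16_t n = get_str(msg, &s);
        printf("%.*s (%u bytes)\n", (int)n, (const char *)s, (unsigned)n);
        return 0;
    }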
I care because that's the kind of bullshit C compilers have to do. UNIX, C, and JavaScript are full of bullshit like that, which is why everything needs so many millions of lines of code. All of this bullshit has negative value for users and developers. They literally have to write extra code to make things worse for users.
It shouldn't be called the "UNIX" philosophy when it doesn't even apply to UNIX. Don't forget that UNIX was a registered trademark of AT&T in the 80s, so when they were talking about "UNIX® philosophy", they were advertising a product. A better name is the You-nixed philosophy.
UNIX weenies say that everything in UNIX is retarded but for some reason they still consider themselves UNIX fanatics. The only thing UNIX weenies say is good about UNIX is pipes, which suck. Even if pipes were good, that's only 5 of the hundreds of system calls.
Because it's a slow, bloated, fragile, and unproductive way to do IPC. It's slow because data structures have to be converted into bytes or text and it needs multiple context switches. It's bloated because of all the code needed to do this on both ends. It's fragile because you can't even change a single byte or add anything without breaking existing programs. It's unproductive because all that bullshit takes extra time to write and debug.
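To make that concrete, a minimal sketch (the wire format and names are invented for illustration): shipping even one struct through a pipe means hand-serializing it to an agreed byte layout on one end and decoding it on the other, and any change to that layout breaks every existing reader:

    #include <stdio.h>
    #include <stdint.h>
    #include <unistd.h>

    struct point { int32_t x, y; };

    /* Invented wire format: two little-endian 32-bit ints. This
       encode/decode pair is the per-program "bloat on both ends". */
    static void encode(unsigned char out[8], const struct point *p) {
        for (int i = 0; i < 4; i++) {
            out[i]     = (unsigned char)((uint32_t)p->x >> (8 * i));
            out[4 + i] = (unsigned char)((uint32_t)p->y >> (8 * i));
        }
    }

    static void decode(const unsigned char in[8], struct point *p) {
        uint32_t x = 0, y = 0;
        for (int i = 0; i < 4; i++) {
            x |= (uint32_t)in[i]     << (8 * i);
            y |= (uint32_t)in[4 + i] << (8 * i);
        }
        p->x = (int32_t)x;
        p->y = (int32_t)y;
    }

    int main(void) {
        int fd[2];
        if (pipe(fd) == -1)
            return 1;

        /* One process for brevity; in real use the two ends are separate
           programs, each paying its own copy of this code plus a kernel
           round trip per transfer. */
        struct point out_p = { 3, -7 }, in_p;
        unsigned char buf[8];

        encode(buf, &out_p);
        write(fd[1], buf, sizeof buf);
        read(fd[0], buf, sizeof buf);
        decode(buf, &in_p);

        printf("got (%d, %d)\n", in_p.x, in_p.y);
        return 0;
    }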
60s operating systems were able to combine code written in different languages into one program. The UNIX creators only cared about C, which is why even C++ is a second-class language with name mangling instead of fixing the linker to support it natively.
Except for the amount of code and the speed of the programs.
That's defeatism, which is a side effect of being exposed to UNIX. If you have used good software, you will know that better is possible and already exists.
The 9P protocol doesn't have null-terminated strings, but Plan 9's C does, and Plan 9 is written in C.
OK, please damage my brain a bit and explain to me why anyone would want to do this. In what way is it better than just:

    cc -c -DPLANES32 source.c -o source32.o

which is how one would do it on a rational system? Or even, if there's some You-nixed (i.e., bizarre and anti-functional) interaction between -c and -o:

    cc -c -DPLANES32 source.c
    mv source.o source32.o
    cc -c -DPLANES8 source.c

I get the feeling I'm on the edge of a breakthrough in my understanding of The Unix Way. Please give me another small brain hemorrhage and help me cross that threshold.
Lispers think they're smart but Forth does it all better. I LOL when lispers say Lisp has "no syntax".
based
Text is universal; with objects you would have to rewrite every program to adapt to a new class. And if your data has to go through the network, it would have to be serialized anyway.
Again, that's text.
Same would happen if the program had to deal with some kind of new class/method.
Change a single byte where? Assuming you're talking about the programs, then read again about adding new classes/methods: it would break everything related.
Read the post again. Plan 9 uses the 9P protocol for everything, so there are no null-terminated strings in the OS. The C language can have them, but the operating system doesn't need them to function. Notice the difference, or are you going to hand-wave the argument again?
"Text is universal" is meaningless nonsense. Text is still a series of structured bytes. Text has structure.
Nope. Text is just bytes interpreted without structure. It's a direct conversion:
0x0a = \n
0x61 = a
UTF-8 could be argued to have structure, and it's really primitive:
👌 = 0xf0 0x9f 0x91 0x8c
😂 = 0xf0 0x9f 0x98 0x82
ç = 0xc3 0xa7
The number of leading 1 bits in the first byte gives the total number of bytes in the sequence, and you can resynchronize from any position: only a lead byte begins with 0b11... (plain ASCII bytes begin with 0b0...), while continuation bytes all begin with 0b10....
This is still a far cry from the structures used by programming languages.
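Still, that little bit of structure is enough to do useful work; a minimal sketch (mine) that counts code points by skipping the 0b10... continuation bytes:

    #include <stdio.h>

    /* Counts code points by skipping continuation bytes, which all
       match 10xxxxxx; everything else starts a new code point. */
    static unsigned utf8_codepoints(const unsigned char *s) {
        unsigned n = 0;
        for (; *s; s++)
            if ((*s & 0xC0) != 0x80)  /* not a 0b10... continuation */
                n++;
        return n;
    }

    int main(void) {
        /* "abc" plus c-cedilla (0xc3 0xa7): five bytes, four code points. */
        printf("%u\n", utf8_codepoints((const unsigned char *)"abc\xc3\xa7"));
        return 0;
    }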
If that were true, one text editor would not be capable of reading a text file created by another, unrelated text editor, since text would be meaningless nonsense.
Pipes don't take text, though. They take whatever a file can contain, so what's your problem again?
No, it's being realistic. Are you going to moan and bitch forever, or are you going to be the change you want to see, or will you just suck it up and carry on with work? The first option is just sad, the second option is not something most people can do or afford, so this leaves only option 3.
We need an OS that cares about licensing, not conduct
We need an OS written in a functional programming language intended for systems programming.
We need an OS whose license dictates that any drivers for it must be open source.
We need an OS that first cares about security, then efficiency, then usability. This shall henceforth be called productivity.
We need an OS built for modern hardware, without the legacy baggage.
We need an OS that is provably correct in its specification.
We need an OS that supports developers willing to port the drivers needed to work on modern computers.
We need an OS that dumps x86-64.
lispers
Yeah? So write it then.
All that means is that text is well standardized. It's equivalent to saying that every audio editor being able to read a WAV file means WAVs are universal (or worse: every browser being able to read HTML means HTML is universal).
just use Scheme if you want a simple and beautiful language. Scheme and Common Lisp have very different philosophies.
but you can define your own domain-specific language very easily in Lisp. of course Lisp has syntax, but its syntax is very simple; every language has syntax to some extent.
don't be too content with current state of affairs.
they are right
their solutions are even worse
can't believe I'm saying this, but Cutler at Microsoft had the right ideas; what ended up happening there, of course, is also even worse