C weenies are trying to deflect by mentioning both C and Common Lisp, but what really sucks about C is that it restricts the hardware you are allowed to use. Tagged memory, useful hardware exception handling, hardware strings and arrays, and plenty of other technologies that have been around for decades would give huge advantages over the hardware you are using today, but C can't benefit from any of them.
Some reasons C sucks: pointer arithmetic, which prevents tagged memory and garbage collection; null-terminated strings, which slow every string operation down to one byte at a time (sketch below); lack of error handling, which prevents useful hardware exception handling; and a brain-dead linker plus "function pointers", which kill any hardware attempt to speed up nested scopes, generics, or OOP.
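Here's a minimal sketch of the first two points (my code, not any particular libc): strlen has no choice but to crawl byte by byte, and a pointer into the middle of an object is perfectly legal C, which is exactly what breaks tagged memory and precise GC.

    #include <stddef.h>
    #include <stdio.h>

    /* The only length information in a C string is the terminating NUL,
     * so counting its length means touching every byte, one at a time. */
    size_t crawl_strlen(const char *s) {
        const char *p = s;
        while (*p != '\0')   /* one byte per step; no skipping ahead */
            p++;
        return (size_t)(p - s);
    }

    int main(void) {
        char buf[] = "hello";
        /* Perfectly legal pointer arithmetic: an interior pointer into
         * the middle of an object. Hardware tags or a precise collector
         * would have to treat this bare address as a real reference. */
        char *mid = buf + 3;
        printf("len=%zu mid=%c\n", crawl_strlen(buf), *mid);
        return 0;
    }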
C compilers need millions of lines of code to be efficient, so a C compiler on any kind of hardware would take just as much work as C on a shitty RISC like RISC-V. Other languages like Lisp, Ada, PL/I, Fortran, Basic, and so on would see serious improvements in both quality of implementation and compiler complexity by running on a tagged architecture, but C can't, because anything that wasn't part of the PDP-11 is useless to C.

Even after all that code, C itself still sucks. Writing useful software (not FizzBuzz and K&R "exercises") in C is like trying to build a skyscraper out of toothpicks held together with shit. They need 15,600 "unskilled" "programmers" just to do a kernel, which sucks.

OS/360, written entirely in assembly and including compilers for five languages (Fortran, Cobol, Algol, PL/I, and RPG), took less time and fewer people, and that included creating the PL/I language and the System/360 hardware architecture. Multics was built by a much smaller group, and that included inventing the hierarchical file system and protection rings and adding segmented memory to the GE hardware. Linux is mostly a clone of UNIX, an OS that had already existed for 20 years, and despite that head start and decades of better operating systems, Linux still can't do system calls right (see the read() wrapper below), has an OOM killer, and all that other bullshit, and none of it is considered a problem.
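One concrete example of "can't do system calls right", assuming nothing beyond POSIX: a signal can abort read() with EINTR, and every program is expected to paper over it by hand. The wrapper name here is mine.

    #include <errno.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Boilerplate every careful UNIX program carries: retry a read()
     * that a signal aborted, because the kernel makes that the
     * caller's problem instead of finishing the call itself. */
    static ssize_t read_retry(int fd, void *buf, size_t len) {
        ssize_t n;
        do {
            n = read(fd, buf, len);
        } while (n == -1 && errno == EINTR);
        return n;
    }

    int main(void) {
        char buf[256];
        ssize_t n = read_retry(0, buf, sizeof buf);  /* fd 0 = stdin */
        if (n >= 0)
            printf("read %zd bytes\n", n);
        return 0;
    }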
That's defeatism. You're confusing "C is not compatible with other languages" with "we are stuck with C." C would need an FFI on Lisp machines too, because of brain-dead data structures like null-terminated strings (see the shim below), but you wouldn't be stuck with C. We'll only be "stuck with" it if the world IQ drops too low for anyone to ever create something better again (>>988677). That might be what UNIX weenies want.
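To make the FFI point concrete, here's a sketch of what any non-C runtime has to do at the boundary. host_string is a made-up stand-in for a length-tagged string the way a Lisp machine (or almost anything that isn't C) stores text; the copy-and-append-NUL dance is the unavoidable part.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical host-side descriptor: a length-tagged string.
     * Note the data is NOT NUL-terminated. */
    struct host_string {
        size_t len;
        const char *data;
    };

    /* The shim every FFI grows: C APIs only understand NUL-terminated
     * char*, so crossing the boundary means copying the bytes and
     * gluing a terminator on the end. */
    static char *to_c_string(const struct host_string *s) {
        char *buf = malloc(s->len + 1);
        if (buf == NULL)
            return NULL;
        memcpy(buf, s->data, s->len);
        buf[s->len] = '\0';
        return buf;
    }

    int main(void) {
        struct host_string hs = { 5, "hello, world" }; /* first 5 bytes */
        char *cs = to_c_string(&hs);
        if (cs != NULL) {
            puts(cs);  /* prints "hello" */
            free(cs);
        }
        return 0;
    }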
There is no difference between "the fundamental design flaws of Unix" and "the more incidental implementation bugs." If you had to write umpteen versions of a pathname parser because the system didn't do it for you, I bet you would make a number of "incidental implementation bugs." And it wouldn't be your fault. It is just not reasonable to expect that every programmer who comes along has the wherewithal to deal with aborted system calls, memory allocation, a language that allows (make that "proudly features!") both "==" and "=" in conditional tests, etc. The fault lies with the original designers.

I've been waiting for an appropriate time to use this quote. This is as good a time as ever:

    Programs are written to be executed by computers rather than read by humans. This complicates program comprehension, which plays a major role in software maintenance. Literate programming is an approach to improve program understanding by regarding programs as works of literature. The authors present a tool that supports literate programming in C++, based on a hypertext system.
    - abstract of an article in the Jan 1992 issue of the Journal of Object-Oriented Programming

The fundamental design flaw in Unix is the asinine belief that "programs are written to be executed by computers rather than read by humans." [Now that statement may be true in the statistical sense, in that it applies to most programs. But it is totally, absolutely wrong in the moral sense.]

That's why we have C -- a language designed to make every machine emulate a PDP-11. That's why we have a file system that forces every file to be viewed as a sequence of bytes (after all, that's what they are, right?). That's why "protocols" depend on byte-order.

They have never separated the program from the machine. It never entered their tiny, pocket-protectored, calculator-hanging-from-the-belt minds.
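And if the "==" versus "=" complaint sounds overblown, here's the classic trap. This is legal C that compiles (many compilers will warn nowadays, but only because the trap is that famous):

    #include <stdio.h>

    int main(void) {
        int err = 0;
        /* Assignment, not comparison: err becomes 1 and the branch
         * is always taken. Legal, silent C. */
        if (err = 1)
            printf("\"error\" branch taken, err = %d\n", err);
        /* What was meant: */
        if (err == 1)
            printf("comparison, as intended\n");
        return 0;
    }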