Why C?

If it weren't unusable, we'd be linking bits of Rust into C/C++ projects today and projects would gradually become more Rusty over time. That isn't happening at all because the interop story is trash, and few expect it to ever be usable. Mozilla is dogfooding it precisely because they know it isn't working out: they're trying to fix it by forcing themselves to feel the pain that others would have when mixing it in.
I do somewhat low-level networking for a living and I'd love to replace some very dangerous sections of my code with something safer - it's not like the desire isn't there.

You can trivially replace parts of a C codebase with ATS, which is much safer than Rust and doesn't have its problems. It's really good at doing exactly what you want to do.
... the only problem is that you have to sacrifice a school bus full of children to evil gods to be granted the ability to read its error messages.

Because it has been used, everything is optimized on the assumption that you will use it, so you use it because it's optimized, because it's assumed that you will use it. If there is a better way of doing things, build your own computer or forget about it. The stagnation won't go away until the industry is dead.

Yet another Rust shill thread, brought to you by our friendly paid Mozilla shill.

A more realistic problem is not remembering that someone demands to be addressed in a special way, then casually referencing that person without satisfying the demand you forgot, and then having to choose "be banished to the void" over "grovel and apologize" because you have too much self-respect for that.
Meanwhile, Rust has Communist propaganda in the official documentation.
Legacy projects and legacy knowledge.
It's widely known and not that big: it's much easier and faster to learn than any of the modern replacement candidates (maybe the Pascal variants had a chance; not anymore). That it's very hard to extend is a bonus: you never run into projects exhibiting their own shitty variant of the language.

To clarify a bit:
I am not, as much as one would believe, a Rust shill. It's a popular low-level language that has been in the tech news for reasons most people would know. You can use Ada or Pony or whatever language you want to use; I don't care. I use C more than I use Rust.
It was a mistake bringing politics to this board, obviously.

Actual Stuff:
Makes sense; that's what I thought it was.
The weird thing about C is that the low-level stuff is commendably uniform across platforms, but anything higher-level than memory management, math, and a few well-implemented libraries becomes platform-specific very quickly.
>CoCs allow degenerates to yell at me
Using C is not going to fix the snowflake problem.

C is used because it sucks. C requires more programmers, more bullshit "research", and more wasted time. Good languages need fewer programmers, can reuse decades of existing research, and save time. Since a C program needs more programmers just to produce a lower-quality result, there is more demand for C programmers, so more universities teach C. Since C is unable to use 60 years of research that works, there is more demand for bullshit "research" to make it safer, which never works because of "unrelated" C flaws, so the problems can be "researched" again. "If it worked, it wouldn't be research!"

A few examples of this bullshit are integer overflows, buffer overflows, and incompatibility, all of which were solved in the 60s but are still problems in C. They were already recognized as old problems with C in 1990, and it's not like nobody tried to fix them.

There are "guidelines" about how to check for overflow with addition and multiplication in C. In assembly, it's just checking a flag or comparing the high register to 0 after performing the operation, if the computer doesn't have a way to trap automatically. C needs about 8 branches (7 ifs, 1 &&) and 4 divisions (>>954468) just to check whether the result will be in range after multiplication. That's what professional "coding standards" recommend.

Most vulnerabilities, segfaults, and other errors in C come from overflowing the bounds of strings and arrays. All the bullshit like ASLR, heap canaries, and stack canaries was invented to put a band-aid on this one problem. It was solved around 60 years ago by having the compiler compare the index to the array bounds. Tagged architectures like Lisp machines have an array data type and bounds checking in hardware to make it faster and easier for programmers and compilers, and there are around 60 years of techniques for determining when it's safe to optimize away unnecessary bounds checks.
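To spell out what "having the compiler compare the index to the array bounds" amounts to, here is the check written by hand in C; the checked_array type and checked_get function are illustrative, not from any real library. A bounds-checking compiler emits exactly this comparison on every access and, with range analysis, removes it wherever it can prove the index is in range.

#include <stdio.h>
#include <stdlib.h>

/* A fat pointer: the length travels with the data, so every access can
 * be validated. C arrays decay to bare pointers and lose the length. */
typedef struct {
    size_t len;
    int   *data;
} checked_array;

int checked_get(const checked_array *a, size_t i)
{
    if (i >= a->len) {  /* the comparison C omits */
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, a->len);
        abort();        /* trap instead of silently corrupting memory */
    }
    return a->data[i];
}

int main(void)
{
    int storage[4] = {1, 2, 3, 4};
    checked_array a = { 4, storage };
    printf("%d\n", checked_get(&a, 3));  /* prints 4 */
    checked_get(&a, 4);                  /* traps instead of overflowing */
    return 0;
}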

A big feature of mainframes and workstations (not including UNIX and RISC) is compatibility between different languages. Multics and VMS have data descriptors so you can pass data between programs written in any language. Fortran and Pascal on Lisp machines share bignums, arrays, and GC with Lisp. Microkernels treat drivers as ordinary user programs, so drivers can be written in any language, including interpreted languages if they're fast enough. What really sucks is that AT&T shills did not want C to be compatible, because incompatibility kept people from moving away from C and UNIX.

C was "designed" in a way that these solutions don't work for it. The solutions work for other languages, like Lisp, Ada, Cobol, and BASIC, but they don't work for C, because C sucks. Instead of using decades of simple and proven solutions, the C "solution" is to throw more programmers at it and, if that doesn't work, blame the user or hardware.

Doesn't it give you that warm, confident feeling to know that things aren't Quite Right, but that you'll have to wait 'til you need something to know Just What? Life on the Edge. Get with the program -- remember -- 90% is good enough. If it's good enough for Ritchie, it's good enough for me!
"It's State of the Art!" "But it doesn't work!" "That IS the State of the Art!"
Alternatively: "If it worked, it wouldn't be research!"
The only problem is, outside of the demented heads of the Unix weenies, Unix is neither State of the Art nor research!

The Unix hater, ladies and gentlemen. Buffer overflows?! We solved those! By removing your ability to work with memory directly! We solved it, dammit!
Gosh, since you solved it so well, you're probably not at all interested in how dependently-typed languages can protect you from buffer overflows while still giving you direct access to memory, with no runtime bounds checks or other runtime costs.

See cve.mitre.org/cgi-bin/cvekey.cgi?keyword=buffer

C is popular because C is good. If it weren't good, it wouldn't be popular. Sounds like a case of sour grapes and pure butthurt.
