Why C?

Unironically why is the C language used other than legacy devices?

I've written C code before. I know what pointers are, you're not cool
We don't live in the '80s anymore. Bell Labs is shut down. Rust trades a few milliseconds of compile and run time for infinitely more secure and elegant code.
Example language, but is a decent alternative to C.
Don't you just love being totally unable to increase the size of a string without literally editing raw memory. 'Cuz I do.
What do you mean I can't write nazi propaganda in the official documentation. (((them))) at work

Besides the gatekeeping, is this a meme language? Talking about actual professional code writing, not the autistic-tier hobbyist shit.

Attached: OverCopyright.jpg (870x545, 19.14K)

Other urls found in this thread:

cve.mitre.org/cgi-bin/cvekey.cgi?keyword=buffer
archive.is/Z0PSm
en.wikipedia.org/wiki/VistA
debian-med.debian.net/tasks/his.fr.html
datasciencecentral.com/profiles/blogs/mumps-the-most-important-database-you-probably-never-heard-of
gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html
twitter.com/AnonBabble

Rust is a NIH Ada.

Like I said, just an example language. I agree that the small amount of time Ada takes to compile is fine because it has security features, though.

Hi leftypol.
I agree we don't live in Bell labs in the 80s.
rust is very elegant some would say infinitely elegant. Personally I am a fan of the infinite security. Though I must warn you that I am also not a shill.
Exactly. Code without conduct is like words without action. Conduct first, then we can code.
I think at best we can say there is room for it be and not be but also both or neither after all who is to say what is means.

C is dominant in embedded for a couple of reasons. A) Momentum: once a product is in production, "If it ain't broke don't fix it" rules the day. B) The same reasons that got it onto these machines back in the day when they were first coming into existence: it's a very simple language and the compiled code could fit in the very tiny storage on these devices. As for why engineers continued using it even after the devices grew in size, see A above. As for other domains, C simply is not dominant, and for good reasons. Apart from the dominant OSs (all legacy, old code, see above + accidental ABIs), C isn't on very many large codebases. Java, C++, and (shudder) .Net have much larger representation there.

Faggots not welcome, GTFO.

C works on everything, it's very close to the hardware and so much so it can even be used as an intermediate language by other languages, it can be reduced to no startup time and overhead at all for use in smol areas like old MBR-style boot sectors, and it can usually be coaxed into producing the exact assembly a programmer wants without having to sacrifice portability. It's an excellent language to sit right above the assembly level and provide a common ground for everything else. It's not used because it's entrenched, there's just no competing language that can fully replace it.
Rust currently can't replace C as it takes nearly 10 minutes to compile even small (for a systems language) projects like servo in debug mode. Productivity and scaling are measured by the length of the build-run-test cycle, and that's 100x longer than what I deal with daily on a project 10x larger in C. It's also unusual to do development and testing of C programs with different compiler options than will be used for the release, as you might not notice bugs that only happen in the released version, so that comparison should really be against optimized Rust, which takes another 3x the time.
The language's other issues aren't worth talking about as it's already excluded by just not being usable at all.
Ever consider that you're just not at the level of systems work and see everything as more complicated than it is?
They excluded all the angry "fuck you" greybeards from the process yet those were the systems guys who were supposed to be their target audience. And the result was they made a tool that didn't fit into their development model at all and bombed on release.

Attached: meme.png (373x395, 21.37K)

C works on mass-market hardware that targets C as a language (it's in the fucking design goals).
Try compiling C to an 8008. Or even an 8080.
All languages can be used as intermediaries by all other languages. This is what it means for a language to be Turing-complete.
Any language can do this.
And any language can do this just as well as C does (and you're full of shit).
You have a poor definition of excellence.

< there's just no competing language that can fully replace it

>>>Zig Forums

From "Max Stirner - His Life and His Works" By Mackay

Attached: good catlblacks.jpg (1012x1433, 271.52K)

What possible relevance is dead hardware to this thread? Why would it matter?
Do you completely not understand that the point of doing so in this context is to generate machine code?
You can boot Java?

Stick to communism, it has a better chance of working out than you do.

Sunk costs. Millions of lines of C code power billions of computing devices. The Windows kernel is written in C. The Linux kernel is written in C (which means that every Android device fundamentally runs on C). The Mac OS and iOS kernels are written in C. Tons of useful libraries have been written in C. Many useful tools have grown up around C programming. Many more people can program in C than in your boutique lang-of-the-week.

Billions (or tens or hundreds of billions) of dollars worth of engineering effort have gone into C programs in the last 40 years. Rust is a pickaxe chipping away at a glacier.

If it wasn't unusable we'd be linking in bits of Rust code to C/C++ projects today and projects would generally become more Rusty over time. That's not happening at all because it's trash and few expect it to ever be usable. Mozilla is dogfooding it because they know it's not working out. They're trying to fix it by forcing themselves to feel the pain that others would have when mixing it in.
I do somewhat low level networking for a living and I'd love to replace some very dangerous sections of my code with something safer - it's not like the desire's not there.

you can trivially replace parts of a C codebase with ATS, which is much safer than Rust and doesn't have its problems. It's really good at doing exactly what you want to do.
... the only problem is that you have to sacrifice a schoolbus full of children to evil gods, to be granted the ability to read its error messages

Because it has been used, everything is optimized assuming that you will use it, so you use it because it's optimized, because it's assumed that you will use it. If there is a better way of doing things, make your own computer or forget about it. Stagnation won't go away until the industry is dead.

KILLYOURSELF RUSTFAG
yet another shill thread brought to you by our friendly mozilla paid kike shill.

a more realistic problem is not remembering that some pre-suicidal moron demands that you address it in a special way, then casually referencing that person without satisfying that demand that you didn't remember, and then having to choose "be banished to the void" over "grovel and apologize" because your sperm count is too high for that shit.
meanwhile, Rust has Communist propaganda in the official documentation
legacy projects and legacy knowledge.
it's widely known, and not that big -- it's much easier and faster to learn than any modern replacement candidates (maybe Pascal variants had a chance. not anymore). that it's very hard to extend is a bonus: you never run into projects exhibiting their own shitty variant of the language

To clarify a bit:
I am not, as much as one would believe, a rust shill. It's a popular low-level language. That was in the tech news sites for various reasons that most people would know. You can use ada or pony or whatever language you want to use. I don't care. I use C more than I use Rust.
It was a mistake bringing politics to this board, obviously.

Actual Stuff:
Makes sense, what I thought it was
The weird thing about C is that the low-level stuff is commendably uniform cross-platform, but anything higher-level than memory management, math, and a few well-implemented libraries becomes platform-specific quickly.
>Coc's allow (((degenerates))) to yell at me
Using C is not going to fix the snowflake problem

C is used because it sucks. C requires more programmers, more bullshit "research", and more wasted time. Good languages need less programmers, can reuse decades of existing research, and save time. Since a C program needs more programmers just to produce a lower quality result, there is more demand for C programmers, so more universities teach C. Since C is unable to use 60 years of research that works, there is more demand for bullshit "research" to make it safer, which never works because of "unrelated" C flaws, so the problems can be "researched" again. "If it worked, it wouldn't be research!"

A few examples of this bullshit are integer overflows, buffer overflows, and incompatibility, which were all solved in the 60s but are still problems with C. These were known to be old problems with C in 1990 and it's not like nobody tried to fix them.

There are "guidelines" about how to check for overflow with addition and multiplication in C. In assembly, it's just checking a flag or comparing the high register to 0 after performing the operation, if the computer doesn't have a way to trap automatically. C needs about 8 branches (7 ifs, 1 &&) and 4 divisions (>>954468) just to check whether the result will be in range after multiplication. That's what professional "coding standards" recommend.

Most vulnerabilities, segfaults, and other errors in C are due to overflowing the bounds of strings and arrays. All this bullshit like ASLR, heap canaries, and stack canaries were invented to put a band-aid on this one problem. This was solved around 60 years ago by having the compiler compare the index to the array bounds. Tagged architectures like Lisp machines have an array data type and bounds checking in hardware to make it faster and easier for programmers and compilers. There are also around 60 years of techniques to determine when it's possible to optimize away unnecessary bounds checks.
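For illustration, a minimal C sketch of that idea (the struct and function names here are invented): carry the length with the array and route every access through a comparison, which is roughly the check a bounds-checking compiler or tagged hardware performs for you automatically.

#include <stddef.h>
#include <stdlib.h>

/* An array that knows its own length. */
struct int_array {
    size_t len;
    int   *data;
};

/* The comparison a bounds-checking compiler would insert for you. */
int checked_get(const struct int_array *a, size_t i) {
    if (i >= a->len)
        abort();   /* trap instead of silently reading out of bounds */
    return a->data[i];
}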

A big feature of mainframes and workstations (not including UNIX and RISC) is compatibility between different languages. Multics and VMS have data descriptors so you can pass data between programs written in any language. Fortran and Pascal on Lisp machines share bignums, arrays, and GC with Lisp. Microkernels treat drivers as ordinary user programs, so drivers can be written in any language including interpreted languages if they're fast enough. What really sucks is that AT&T shills did not want C to be compatible because it kept people from moving away from C and UNIX.

C was "designed" in a way that these solutions don't work for it. The solutions work for other languages, like Lisp, Ada, Cobol, and BASIC, but they don't work for C, because C sucks. Instead of using decades of simple and proven solutions, the C "solution" is to throw more programmers at it and, if that doesn't work, blame the user or hardware.

Doesn't it give you that warm, confident feeling to know that things aren't Quite Right, but that you'll have to wait 'til you need something to know Just What? Life on the Edge. Get with the program -- remember -- 90% is good enough. If it's good enough for Ritchie, it's good enough for me!
"It's State of the Art!" "But it doesn't work!" "That IS the State of the Art!"
Alternatively: "If it worked, it wouldn't be research!"
The only problem is, outside of the demented heads of the Unix weenies, Unix is neither State of the Art nor research!

the unix hater, ladies and gentlemen. Buffer overflows?! We solved those! By removing your ability to directly work with memory! We solved it, damnit!
gosh I guess since you solved it so well, you're probably not at all interested in how dependently-typed languages can protect you from buffer overflows while still giving you direct access to memory and with no runtime bounds checks or other runtime costs, at all.

See cve.mitre.org/cgi-bin/cvekey.cgi?keyword=buffer

C is popular because C is good. If it wasn't good it wouldn't be popular. Sounds like a case of sour grapes and pure butthurt.

Attached: francis e dec.webm (640x480, 4.95M)

They don't even have enough money to pay their developers.

rather, you didn't solve the problem. You had an expensive alternative which had its own problems, and in many cases people preferred to manage the first problem.

Lmao get a life. Or actually, try to write something serious in Rust.

it's only a zero-cost abstraction if your brain is free :)

Dude calm your asshole, just cus someone sticks Stirner on a fucking meme doesn't make them Zig Forums brainlet

Attached: b21dc90b2f5b04f4126444a13f37054beac8313f01b0b8c6a23614bafa0e4d8e-b.png (506x506, 321.46K)

Shiggy diggy. 10K CVEs with the word buffer in them say otherwise.
Not that I'm suggesting we move to lisp. Any language that doesn't make buffer overflows trivial would do.

it's a super-optimized, close-to-machine-code, barebones language written for computers that were 1000 times slower than today's
I say it's quite useful

Stirner memes are dogshit and haven't caught on outside Zig Forums for good reason.

Attached: smug vivian.jpg (300x300, 28.37K)

C programmer here. There are no usable alternatives. You can complain all you want, but solutions would be more useful. The closest anyone came to replacing C was with Ada but it was a warzone, required 10x the manpower, and had its own problems. Supposedly it's not nearly as shit as it used to be but the community around it is completely dead now so I dunno.

Except they literally have just not on Zig Forums. The only reason I learnt about Stirner was Telegram meme channels.

Attached: 7e223371f9171068.png (511x714, 442.08K)

I think the first time I saw them was on /lit/, pre-Zig Forums

Attached: image.png (2048x1536, 97.55K)

Well, maybe it's time to dust off Ada again, if that is even required. It is still the only non-meme safety oriented language with an actual track record that has performance comparable to C/C++.
The only problem in the past, as far as I can tell, was the lack of compilers, but that is solved by the gcc and Adacore compilers.
Looking at Ada2012, I can't see anything that would require 10x the manpower. And any extra time spent on typing will pay for itself every time somebody looks at the code.

rust?

Attached: rusted.png (764x383, 48.03K)

1) C works because it has a solid core (symbol ratio in syntax, reach of the POSIX stdlib, simplicity of the language) with a shit coating and historical baggage. It's still better than the opposite of having a shit core with gold plating.
2) C has no replacements. C++ has tons, but not C. If you can't understand the difference between C and C++, kill yourself.

Ehh I'd want an all-clear from a programming community on that one. It was full of thread safety issues and memory leaks last I tried. Most of the safety features you guys think were being used were being avoided because of the verbosity required.

C needs replacement

I should also note that getting good performance in Ada is all about disabling safety checks; only then is it comparable to C/C++. Otherwise it's very slow by default.
As an example, this is a function that adds two integers:
0000000000000000 :
   0:  48 63 c6              movslq %esi,%rax
   3:  48 63 d7              movslq %edi,%rdx
   6:  48 01 c2              add    %rax,%rdx
   9:  b8 00 00 00 80        mov    $0x80000000,%eax
   e:  48 01 c2              add    %rax,%rdx
  11:  b8 ff ff ff ff        mov    $0xffffffff,%eax
  16:  48 39 c2              cmp    %rax,%rdx
  19:  77 04                 ja     1f
  1b:  8d 04 37              lea    (%rdi,%rsi,1),%eax
  1e:  c3                    retq
  1f:  48 8d 3d 00 00 00 00  lea    0x0(%rip),%rdi    # 26
  26:  48 83 ec 08           sub    $0x8,%rsp
  2a:  be 04 00 00 00        mov    $0x4,%esi
  2f:  31 c0                 xor    %eax,%eax
  31:  e8 00 00 00 00        callq  36
Kinda huge for what should be 3 instructions isn't it? It's because every add and subtract is now autistically checked for overflow by default and it also now has to deal with raising errors.

Interesting, but it sounds unlikely. I'm gonna need a source on that. Unless your source is people making basic mistakes like not cleaning the heap.

Someone on Zig Forums actually understands that the languages themselves don't provide any inherent safety or security features, and that it is the compilers/interpreters that do all of those checks behind the scenes, and that a language is only secure by trusting the programmers to make secure compilers/interpreters. Is this /bizarrotech/?

One of the reasons I write in C is because I like C. Another reason is the simple binary interface C libraries have, which allow them to be used by any programming language out there. Another reason is access to hardware and system calls, which isn't automatic in most other languages and even when it's available it's strange due to pointers wrapped in objects rather than first class objects that are part of the language.

Also, it depends on the complexity of the program. I usually write very specific programs in C, and orchestrate them from Python, Lua or something.
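As a hypothetical illustration of that "simple binary interface" (file and function names invented here): build something like this as a shared object, e.g. cc -shared -fPIC mylib.c -o libmylib.so, and the exported symbol can be called from Python's ctypes, Lua's FFI, or anything else that speaks the C calling convention.

/* mylib.c */
#include <stddef.h>

/* Plain function, plain types: nothing but the C ABI is involved,
   which is why nearly every language can bind to it directly. */
double mean(const double *xs, size_t n) {
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += xs[i];
    return n ? sum / (double)n : 0.0;
}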

That's bullshit and you know it. You can't shift every responsibility to the compiler. Language features determine what the compiler (and programmer) have to work with.

Trolling used to be a art.

Have a (you)
* an art

...

#include "share/atspre_staload.hats"implement main0(argc, argv) = println!(argv[0], "'s first arg: ", argv[1]);fails with a compile time error: typechecking has failed: there are some unsolved constraints. After Satan rewards your sacrifices with his power, you'll be able to read the rest of the error message, which plainly points to the '1' of 'argv[1]' and says for you to do that requires argc to be greater than one, and you haven't proven that.
so what do you do? easiest way is just to branch off the value of argc:#include "share/atspre_staload.hats"implement main0(argc, argv) = if argc > 1 then println!(argv[0], "'s first arg: ", argv[1]) else println!("usage: ", argv[0], " ")the compiler doesn't add any checks. and although you're adding an explicit check here, that's not really what the compiler asked for. It just asked you to prove you aren't going out of bounds. Which means this is OK:#include "share/atspre_staload.hats"implement main0(argc, argv) = if argc > 3 then ( println!("first: ", argv[1]); println!("second: " ,argv[2]); println!("third: ", argv[3]); ) else println!("usage: ", argv[0], " ")and how many checks does it have? three of them, around each of those argv[] lookups? nope. just the one check. compiler can answer simple linear questions like "if argc must be >3 at this point, is it is also > 2 like it needs to be for this lookup?"

My problem with these compiler reasonings and proofs is that we have no idea they're happening. We have no idea what the compiler's thinking. You might add a check for overflow and assume the compiler will produce code that checks the number, but it might reason that overflow is undefined and thus can't happen, and thus the check is unnecessary and gets removed. We have no idea this happened and end up shipping unsafe code that isn't doing a check you explicitly wrote into your code.

So don't go overboard with these compiler proofs -- they have their limits. The compiler is implementing the standard, and nobody really cares about the standard. People care about reasonable C code producing reasonable assembly code.
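A minimal sketch of the kind of surprise being described (function names made up): the first "check" relies on signed overflow, which is undefined behavior, so an optimizing compiler is allowed to assume it can never fire and delete the branch; the second form compares before any overflow can happen, so it survives.

#include <limits.h>

/* Looks like an overflow check, but x + 1 overflowing is UB,
   so the compiler may assume x + 1 > x and remove the branch. */
int next_index(int x) {
    if (x + 1 < x)
        return -1;
    return x + 1;
}

/* The comparison happens before any overflow, so it stays. */
int next_index_safe(int x) {
    if (x == INT_MAX)
        return -1;
    return x + 1;
}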

You've written a hello world. Good for you. C's elegance can only be understood through experience.
Try writing fast code in python
Need i say more
You keep using that word...
Documentation should be decentralized, just like the implementation.

that never happens in ATS. The theorem-proving is purely a compile-time, static analysis matter. It's not like Ada; there's no runtime code that might be added to support some guarantee, and ATS also doesn't remove 'useless' runtime code based on static knowledge. (if you compile to C, gcc might do some bogus dead code elimination, but it won't do that based on your ATS proofs, at least since gcc doesn't know about them.)
it's very easy to lie to ATS though: anything you assert in an FFI declaration is going to be assumed to be completely true, and you can also introduce assertions directly -- you can brainwash ATS into believing that x < 0 after a 'val x = 5'. Example:

#include "share/atspre_staload.hats"

extern fun returns_zero(): int(0) = "ext#"
%{
int returns_zero(void) { return 5; }
%}

fun no_positives_please{zneg:int | zneg < 1}(n: int(zneg)): void =
  println!("This will never be greater than zero: ", n)

implement main0() =
  no_positives_please(returns_zero())

output:

This will never be greater than zero: 5

I could offer a well thought out response, but why waste time on a Rastfag? I'll settle for calling you a nigger, you stupid nigger.

Rustfags are such a sad story. They disconnect themselves from the adults and end up producing a toy language. Adults in charge at mozilla are horrified and try to force them to see it's a toy language by making it an obstacle to build furryfux. Rust children continue playing in the ball pit instead and write lots of useless toy libraries that no one outside of their ball pit has any interest in using. They don't see the infinite compile times because as children time seems infinite.

I don't know this ATS language so I will refrain from opining.

Because it demonstrates that the "C is close to the hardware" meme is just that. It is difficult compiling C to this dead hardware because C isn't close to the hardware, the (modern) hardware supports C.
If the language has a compiler that generates machine code, then any language is suitable for this. You're trying to defend C with plain ignorance about how the tools work.
Yes you can. There is no reason that Java must be compiled to the JVM. You can generate native code with it, GCC did so for a few years. Then again, there are (wait for it) processors that execute Java bytecode natively (I heard a gasp). There are actually computers for which Java isn't a virtual machine.


The thing is, the Ada language specification requires those checks and specifies how they can be removed. C doesn't require them and most compilers have no way to add them. That Ada was used for defending human lives and C for exchanging porn speaks to the quality of each language.


Ada was full of thread safety issues because people tried to recreate pthreads with rendezvous.


Elegance:
/* Removes the first item from the list. */
void *removeFirst(void **head) {
    void **c;
    c = *head;
    if (*head != NULL) {
        *head = *c;
        *c = NULL;
    }
    return c;
}
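In case it isn't obvious why that works: it assumes each node's first field is the next pointer, so a node can be treated as a void **. A hypothetical usage sketch (struct and variable names invented, and assuming the removeFirst above is in scope):

#include <stdio.h>

struct node {
    struct node *next;   /* must be the first member */
    int          value;
};

int main(void) {
    struct node b = { NULL, 2 };
    struct node a = { &b, 1 };
    void *head = &a;

    struct node *first = removeFirst(&head);        /* unlinks a */
    printf("%d\n", first->value);                   /* prints 1 */
    printf("%d\n", ((struct node *)head)->value);   /* prints 2 */
    return 0;
}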

...

I should learn Ada someday. Seems comfy. I remember reading an article about static memory allocation with Ada and how it's reliable and predictable; I can't seem to find that article.

Maybe you're unaware but those industries switched back to C a long time ago.

How exactly is that germane? Why do people here fall back to calling you a pinko/apparatchik if you blow them the fuck out? You're probably someone that prattles on about how "Zig Forums is not for politics" too. Idiot.

Yes, and it's causing them all kinds of problems. The reason they relinquished the mandate is because they could not find people who still wrote Ada. *Highly* critical systems still use Ada (the fly-by-wire controller on the Boeing, hospital/hospice embedded systems, etc).

The avionics industry does seem to be moving to C, but I hear it's not going well. The F-35 as a whole is some kind of joke NuFighter thing...

Oh come on, can't they make people learn the language? I simply don't understand what's so hard about this. Programmers should be better than whatever language they got through school with. Are we going to have aircraft programmed in Javascript if somehow all the C programmers die out and babby generation refuses to learn it?

Ada, like Erlang (and the L word) has good documentation--that is, if you know where to look. The problem is, contemporary programmers are used to consulting Stack Overflow/blogs/trawling the Web for "documentation" and then piecing it all together. With Ada, you just have PDFs.

I dunno how to feel about "contemporary programmers". I was born in the 90s, so technically a millennial, right? Yet I have no problem with this. I don't have any problem reading the C standards, or Javascript's ECMA standard. Most languages out there aren't standardized, and I read their reference implementation's source code instead.

I think people are just lazy and want cookbooks on how to solve whatever problems they're having without having to understand anything too deeply. This is the reason I stopped posting on Stackoverflow.

What? One of the first things I do in my makefiles is to set up bin/debug/ and bin/release/ targets. So do you release your software with -g enabled? If not, how do you debug? Do you always wait for -O3? Have you ever needed pgo? I can barely comprehend not having at least 1 debug build, are you sure this is what normal people do?

One of the F-35's problems is that it runs Javascript in Internet Explorer. I've read that the bases that fly the 35 have to get exemptions from DoD network security mandates due to how the flight software communicates status information to the ground controller.

How do you do this? How do you manage these targets?

.... What.

You'd be surprised. The other day, I met a tenderfoot programmer, must have been 16 or so. He only knew Javascript--I, being a codger in my 20s, told him to try Erlang, and I told him where he could find documentation. He then responded that he'd only really learned Javascript through YouTube and blogs. I know that he's still a kid, and he may regret that in the future, but I was still surprised. And yes, I know that "Kids these days..." is a sentiment held in almost every society throughout history.

It's not a language choice issue, it's that they can't hire any new blood because they can't compete with Silicon Valley. Would you want to work a government job on a locked-down PC using software that is 30 years out of date for certification and conformance reasons, and be routinely drug tested, and get no shares or chance to make it big, and not be allowed to wag your dick on social media? Only the rejects would take that offer, so you get a plane designed by an army of rejects.

Spotted the commie.

And I spotted the libertard. Your playground ideology is a total laugh; at least reds can talk about how living under the Soviet Union was still better than helotry under the Tsar. State industrialization turned the Soviet Union from a slum into a global superpower with far higher literacy rates, life expectancies, and standards of living. (Because I brought up the Soviet Union, you're probably going to call me a pinko again. So, to be clear, I'm not ignoring their egregious shortcomings.)

Attached: libertard.jpg (598x398, 21.02K)

Well, in my country government jobs are highly sought after because they're highly paid and it's essentially impossible to be fired. The best programmers and IT professionals end up in the government, the rest go work in some startup where they make good yet inherently unstable money. Lots of people tend to favor the stability of government jobs.

I dunno how America works. Don't they subcontract the development of military aircraft to private companies? I mean, this seems to be the case everywhere in the world.


What I hate about these people is how they seem to be allergic to learning. Learning new languages means learning new ways to solve problems, new ways to think about stuff. "Oh I just learned JS on youtube" and how does that justify not learning about Erlang or anything else for that matter?

Actually, you'd be surprised. The government doesn't like to run software with known vulnerabilities, and 30 year old software has a lot of known vulnerabilities. There's an interesting tug-of-war between "we know the software has issues" and "we don't know what we don't know".

I have never been drug tested beyond my initial hiring.

To be fair, government work is reject work: you go into government because you couldn't make it in the corporate world. The only non-rejects are either ex-military or lazy bastards who didn't want to move. One of my government bosses ran a submarine (La Jolla) into a fishing boat.

Many of my coworkers see their work as a stepping stone into Silicon Valley. So, they want to use the languages and "technologies" that are hot there.

Government contracting doesn't pay as well, however it is difficult to be fired. And there's more stability in contracting, despite the US government being fickle. As far as I know, no contractor has ever been told "if your badge fails to work tomorrow, you've been let go", which is how a local company did a mass layoff once.

MSVC hurt you. Don't do that.
Kinda, yes. We used detached debugging symbols. The binaries as shipped have no debug info but it matches the detached stuff so I can pull up a core dump made by a release binary and have full debugging info without exposing sensitive info. At runtime it's exactly the same.
I only use O3 on small pieces of code that need it to vectorize (e.g. a bundled copy of SFMT) as it has a tendency to make things worse if used everywhere. Everything else is always O2.
It's too difficult to use and use well for most things.
Yep, at least where reliability is needed and there's a lot of process in place to achieve it.

OP you're such a faggot there is nothing wrong with C or C++ get the fuck out

Attached: what the fuck man.jpg (1095x1195, 343.11K)

Yeah, I was the last time I contracted on a government job and had to learn how to use MUMPS. That's a 50 year old system they're still using in healthcare and even inter-office communication felt like I was launching a nuclear missile. They currently plan to have replaced it by 2025.
They've all been trying to reduce how much they're required to do it because of the braindrain. The FBI on the issue 4 years ago:
archive.is/Z0PSm
That's where I'm at. Fun tales of briefly contracting for Northrop Grumman..

Well, Zig Forums is an offshoot of /lit/.

That's interesting. Can you post more details about this system? Google gives me articles about parotitis.

It's a pretty tangled mess. Where to start would be looking into VistA:
en.wikipedia.org/wiki/VistA
It's mostly all in MUMPS. MailMan is their 1950s style nuclear launch "email" program.
MUMPS is open sores now out of desperation, see here:
debian-med.debian.net/tasks/his.fr.html
MUMPS itself is a tangled mess, too. An overview with absolutely no hint of shilling:
datasciencecentral.com/profiles/blogs/mumps-the-most-important-database-you-probably-never-heard-of

Attached: 1280px-VA_VISTA_Architecture.png (1280x895, 467.57K)

Is this some shit used by the Veteran's Administration? If so I'm not surprised it's worthless garbage.

Lol Stirner busting spooks hundreds of years after his death.

I was into Erlang for a while, a few years ago. Erlang was interesting in that it had really good documentation -- exhaustive, very clear (not Haskell "here's a bunch of types. That's all you want, right?"), well written, with interlinked reference and tutorial sections. I'd never seen such a well-documented language since Perl.
But it was unpopular so you couldn't Google shit.
And I about lost my shit with people who stopped there. HOW ABOUT, INSTEAD OF FUCKING SEARCHING THE ENTIRE WEB FOR KEYWORDS, YOU GO TO THE-FUCKING-SUBJECT-AT-HAND DOT COM AND THEN CLICK ON "WHAT I WANT TO KNOW" YOU STUPID MOTHERFUCKER

more like 15 years ago come to think of it...

Because real niggaz hack c, you sorry ass bitch.

It means that you're using all the same optimization and control flags, except -g and -s. That's because these flags control the assembly output (the -O-class flags especially), and if you have them set up differently for the debug and release versions, they can (and will) produce different assembly. The repercussion is that you may have obscure, weird bugs that work themselves out with one specific assembly flag set but manifest under others. So if you only test against the debug flag set, you're going to miss the bugs that will manifest in the release flag set.

Generally, you only use debug build for step by step execution using a debugger. You do final [unit] testing using release build.

Oh, you missed out on the fun. The contractor has developed a major mission & loadout control component as a web app. Sure enough it works like shit. The airplane firmware is presumably C++ but it's fucking dogshit just as well; the British F-35 couldn't even use missiles or bombs yet. Also its tail fins crack if it flies supersonic, so in a real-life scenario it can't even break the sound barrier due to safety limits.

user, they have trouble making their programmers poo in the loo.

They're generally very rare, though. In my experience, the only bugs that have popped out due to different debug and release flags have been either threading issues: actual logic errors in the code that get exposed by some section of code running faster or slower; or static initialization order issues: actual logic errors in the code that get exposed by some variable being initialized before or after another, because C++ doesn't define that order (and you can force the order, but most pajeets don't know how).

I've seen similar. I've had to tell younger people, hey, if you're getting an error on that function, instead of clicking random tutorial videos for 40 minutes, you can just look that function up in the docs. The docs will tell you how to use that thing you're trying to use. And the response is like, oh yeah good idea.

...

>What do you mean I can't write nazi propaganda in the official documentation. (((them))) at work
Your post was only retarded and naive until it went full retard right here. Ironically a CoC is closer to something like a nazi idea than whatever it "protects" against. Go suck a CoC.

...

What the fuck? How many years of development has it been?

However many until you stop funding it.

The system for the self scheduling of maintenance also presumably runs in javascript. Part of the overall F35 package is a system where if the plane detects a fault during flight it sends a message to the hangar to order the required parts and schedules a repair crew.


Too many, the F35 is an extreme example of scope creep which has resulted in a plane which doesn't do any one thing well but does everything poorly. Throughout its development the various branches of the US military decided they needed a new plane, every time this happened the requirements changed and expanded slightly.

Because I don't like change. I already settled with C and I treat all your arguments against it as personal attacks, because if they were valid it would mean I wasted, and am still wasting, time programming in this shitty, unsafe, miserable language instead of D or Rust or god forbid FreePascal; it would mean I had to justify to myself this wasted time, but I wouldn't be able to, therefore I wouldn't be able to look in the mirror next morning, I would totally collapse mentally in days and I would kill myself within a week.

This is what lispfags really believe.

that's funny user

Something like

# Special flags for debugging mode
ifeq ($(DEBUG),true)
    BUILD_MODE=debug
    CFLAGS+=-g -O0 -Wall -Werror
else
    BUILD_MODE=release
    CFLAGS+=-O3 --static
endif
Then later

all: bin/$(BUILD_MODE)/program

And I just use `make DEBUG=true` when building. I usually have build/debug and build/release too for differentiating the .o files and letting make figure out what needs to be updated.

Guilty as charged


Really? Why? I always thought O3 was like O2, but with more optimizations applied. Is there somewhere I can read more about what specific optimizations get applied at each level?

I try to make my makefiles cross platform, so usually windows needs some special stuff, like .dll s in the /bin/ folder, and various -mwindows or -lmswsock flags

The soy is strong on this one.

Attached: rust.png (501x504 72.95 KB, 18.04K)

gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html

O3 is like O2 but with sketchy things that don't always help enabled. Sadly, they stuck autovectorization in there and keep changing what's necessary to enable it at O2, so older code designed around it still needs O3. Today, everyone has given up on the idea of compilers ever being able to do it well, so people use the non-standard vector extensions that let you be explicit.
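For what it's worth, a minimal sketch of the "be explicit" approach using the GCC/Clang vector extensions (the typedef and function names are made up), rather than hoping the -O3 autovectorizer notices a loop: the add below is a vector add by construction, not something the optimizer has to discover.

/* Four packed floats; +, -, * etc. work element-wise on this type. */
typedef float v4f __attribute__((vector_size(16)));

void add_vectors(const v4f *a, const v4f *b, v4f *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}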