I've been writing C for about two years now. I have a solid grasp on the fundamentals but I'm interested in:

I've been writing C for about two years now. I have a solid grasp on the fundamentals but I'm interested in:

- Reducing code re-use
- Writing military-grade safe applications

Are there any intermediate/advanced texts out there that explore generic programming and ultra-safe code?

Attached: shut the fuck up liberal.jpeg (464x328, 39.16K)


Rewrite it in Rust

what the fuck, is it some machine-generated text or what?
go fix that fucker.

Sorry I've been drinking. I didn't mean "reducing."
I mean writing more reusable, generic code.

That's literally what Ada was designed for. So I guess you want Ada.

Stop telling me to write in another language. I'm aware there are safer alternatives. I want to be a better C programmer.

Then you must use one of its dialects, the righteous HolyC.
It is meant to be used in Ring 0.

C isn't military grade, kid. They use Ada for avionics and other extremely critical applications for a good reason. If you want to use C, fine, but you're not gonna be working in that sector.

I don't literally mean military you cunt.
C is used all the time in essential embedded software. I want to learn to code to that mustn't-fail standard.

en.m.wikipedia.org/wiki/The_Power_of_10:_Rules_for_Developing_Safety-Critical_Code

I sadly must agree because it's true. Nuclear plant control programs are written in C, even though I would argue securing these places is even more important than military avionics, but whatever.

An article on the matter by some random guy who worked on nuclear plant control software:
hackernoon.com/so-you-think-you-know-c-8d4e2cd6f6a6

tl;dr memorize the spec to a T.

Also avoid globals,

>return (((((i >= i) > i)

I believe that's the point of the article, i.e. the results of the examples are ambiguous and will cause problems later in execution.

Make up your mind. You either want military-grade or you don't. Maybe you shouldn't drunk-post. And yeah you can do regular embedded stuff in C, although Forth is probably better. I'm not sure what you gain from C tbh.

Internet cred.

I'm sure reactors have different computers, OSes, and languages. It's not like they're all from the same era or made by the same people.

Reactors have stuff like Windows XP and C.

Isn't this undefined behaviour?

What's your point exactly? They also have stuff like IBM mainframes and DOS computers.

Where is this illusion coming from that they don't use C on military aircraft? They definitely do. At least at Sikorsky almost everything is written in C: control systems, everything. In some old things you are forced to write Ada, but those are exceptions to the rule.

Attached: use_rust2.jpg (840x562, 258.71K)

The easiest thing you can do to make your code safer is to switch to a compiler with better diagnostics and turn them all on. Try compiling your code with Clang using -Weverything and -Werror: -Weverything enables every diagnostic as a compile-time warning, and -Werror turns every warning into an error, forcing you to fix it before the code will compile. You might already do this, but I have to ask; most people don't bother with the tools the compiler provides and then complain that their code is buggy.

For reading material on good coding standards for high-reliability systems, MISRA (the Motor Industry Software Reliability Association) has a set of standards good enough that they have been adopted for military use in some cases; the only downside is that the documentation costs money.

If you want to learn by reading through code, then the seL4 microkernel is second to none in terms of high-reliability code. Developed for military applications, it's well documented, and every line of C in the kernel is backed by an average of 20 lines of formal verification. The seL4 developers have quite a bit of material on how they develop the codebase to get you started.

No clue why I inserted a ',' in there. Good thing it wasn't code meant for an airoplane.. airoplane; that's a cute word you know air-o-plane..

Anyway OP should look at CERT-C, MISRA-C, JPL, NASA and the books they reference. Nice online list of the mistakes to avoid is CWE 658.

As opposed to Forth and Ada?
It's pretty obvious that C isn't the most hipster or hardcore lang.

I already have my compiler as verbose as possible, yeah. Thanks for the actually useful info though.

Sage for somewhat off-topic, but the term "military grade" is a joke among people in the military.

So pajeet-tier code?

deserves an image tho

Attached: lego_grade.jpg (800x534, 151.86K)

Attached: soldiers_singing_military_grade.jpg (800x531, 130.05K)

why do those irishmen have ak47s?

The F-35 software is written in C++ though

Attached: 7f34299ca78de044361f7c073be60be13c858d47d65cc05c745c6f6512b6eb4b.jpg (611x960, 68.63K)

It depends on what you are doing; there are different standards for different things. Programming a helicopter has different standards to, let's say, a tank or a nuclear missile. A lot of the standards are hardware-based, AKA you need to know where your chip came from, what beach the sand was taken from to make the silicon, and a bunch of other shit. In your code you need full names for variables, so no int i or whatever. If you are using compiler optimization you need to go through the assembly and make sure the compiler didn't change the logic of your program. These are just a couple of the things you need to do. Look up some FAA standard shit and go from there. It's really not worth it and it's a big pain in the ass. Just write your code well and don't go by stupid FAA standards or whatever.

what standards does a nuclear tank have?

You could try C++, Cykalin.

Look at the flags, buddy boy.

No, it's just a convoluted way of writing 2*i + 2 with a side effect of doubling the loop index on every iteration.

Attached: doomed.gif (500x213, 930.51K)

And the F-35's software is a notorious clusterfuck.

autism, languages are tools.

My point is you're wrong; reactors don't have different computers, OSes, or languages. Industrial stuff uses Windows XP and shit langs like C or Visual Basic a lot.

That's not accurate, other than the C part. They may have computers that run XP, but those computers don't control anything; they are only used to visualize what's going on. If you notice, when new nuclear reactors are built (at least in the USA) the control center looks straight out of the 1980s: they use dials and buttons to control everything, since those won't fail.

You want some advice on low-level embedded "safe" software?

0) Mil-Grade is bullshit; you may like to read some specs for nuclear stations. Always go for hard-coded real time or simple interrupts.
1) Stop with this commie bullshit; you may appeal to some eastern-slav old fucks, but they still don't know everything and fail miserably most of the time.
2) MISRA-C requires you to buy it (but you may pirate it); ANSI C is usually enough.
3) Get some "splint"-like commercial bullshit that doesn't detect shit; it may or may not help you.
4) Write your own libraries for everything! Even sprintf and sscanf, if you consider yourself a "safety"-oriented boy.
5) Do not use famous and popular libraries, specifically glibc; they are shit.
I repeat:
DO NOT USE DEFAULT LIBRARIES
The reason behind this is simple: they may brick your microcontroller so badly that you won't be able to reboot it by software means or even by the watchdog timer.
6) There is no "reusable" code, really; every time you write something, you MUST rewrite it better next time. The only exception is some initialization routine; that may be reused across many of your different projects.
7) Make use of software daemon removal, soft reset, or hardware watchdog reboot.
8) Test it until you break it; usually it breaks when you overflow its buffers or (the much more common case) when you use default libraries.
9) QNX might sound good, but you must remember: the simpler your code is, the easier it can be debugged. Also those canadian faggots won't help you at all.
10) Do not go for autistic "ALL INTERRUPTS"; they too will break your system. Choose an optimal set of interrupt priority levels, put circular buffers where needed, and place checks in interrupts to verify whether the previous operation has completed.
11) If you are not sure your system can handle all the input, you MUST create a separate block specifically to feed the leftover data into.
12) Don't listen to autists sperging out: "use this flavour-of-the-month bullshit", or "some obscure one-programmer-per-continent VASTLY SUPERIOR programming language", or "assembler 4 life". You are programming in C because it's the "lingua franca" of modern embedded, and there are not that many embedded specialists, let alone embedded fortran/basic/assembler/forth/OOP autists.
13) Comment your code. Write some header documentation. Describe your functions: what each does, its inputs, its outputs.
14) Learn good coding practices, like naming your vars by their size: ui32FaggotSize, pui8InputBuffer[INPUT_BUFFER_SIZE].
15) Preprocessor directives. They may help you a lot, even when you use them in your own libraries (and I mean write your own libraries to be able to output hexadecimal, binary, char, float).
16) When you are ready to become a "MIL-Grade pro" (pun intended), send your code to your country's government code-review agency or "patent"-like bureau; even if your code looks stolen, you may remind them that you wrote it specifically for this hardware setup, which is found nowhere else.

Not in the mood for shitpost, but you may throw some questions at me.

Is all that advice coming from experience? Did you learn it from a series of helpful books? If so, which ones?

Why do you shitpost then say you are not in the mood?

Attached: bd6f9baa7f3ed6d388674a6903ac65c914309684f87239d71b5be6f5b9520a48.jpg (255x218, 19.57K)

They really don't. Ada programmers are rare.

Cert C secure standard.

Very secure there m8
Because why use stuff that's been relentlessly proven to work over the course of 30 years of glibc, when you can just write your own shit?
HAHAHAHAHA THIS NIGGA THINKS SYSTEMS HUNGARIAN IS A GOOD NAMING PRACTICE HAHAHAHA get a load of this LARPer

Hey OP, want another piece of good advice? Roll your own crypto.

Attached: 12471408221.jpg (610x662, 42.08K)

Don't assume that random shit that works on your desktop will work on weird-as-fuck embedded systems. If you had done literally any non-trivial embedded work you would know this. Just because it's C does not mean it runs everywhere.

C is the "hardcore" one because you show how macho you are by using it. As in, seatbelts? We don't need no stinking seatbelts! So then you pile all these other analysis tools on top, to make up for deficiencies in the language. And then you wonder why people look at you funny when you say you want to write critical software in C. Don't get me wrong, it can be done, but the effort you're expending is much greater than needed. Only an idiot would willingly choose this path. Even Theo doesn't have a good response to the question of why memory-safe languages aren't used more in OpenBSD (except that nobody has yet stepped up to the task). I bet you that 10 years from now he'll be singing a different tune, just like he changed his mind after the '90s about auditing alone being sufficient. But hey, it's your time to waste, no skin off my back (just so long as I'm not forced to depend on your machoware).

But Theo is right; the other alternatives are simply worse. Until Rust leaves its nodejs-style stdlib behind, it's not a possible replacement.

Don't forget to rewrite malloc too, because that worked so great for OpenSSL.

The main reason it's useless is that it does not run on the 20 different CPU architectures that C can. OpenBSD can't even begin to use it.

Large game companies do this all the time. Tons of custom allocator bullshit for different patterns of usage. Very useful.

And my point is that you should kill yourself already, tripfag.

Then use µlibc.

Large game companies can more or less afford to do this because they have the required resources. Our OP is seemingly alone, which kind of excludes that possibility.
But he can indeed integrate his program with custom malloc() implementations that you find around the web. Those will have been tested for years. That's why Free Software is so great.

I have no love for Rust, but that's the weirdest fucking anti-Rust argument I've ever heard.

>if it can't run on these other 18 architectures that constitute

It's not weird. These BSD systems run on several architectures. Your normie servers and desktops are not most computers. There are an ungodly number of different embedded systems in routers and whatnot. ARM and x86 are not the whole world.

And ignore all the other libraries, just like OP's point. Guess you did not read it.

???? My point is that you should use libraries that were already written and tested. It doesn't even have to be glibc or µlibc. "Reducing code re-use" doesn't really mean anything here, unless OP literally wants to reinvent the wheel, but that's pretty much incompatible with ultra-safe programs.

It's what you have to do with C all the time anyway.

i suggest rewrite it in rust

It's a usual practice, but there are tons of C libraries out there for just about everything.

He's right, and OpenBSD had problems even getting the Rust toolchain operational on ARM, which is why there's no Firefox port on ARM yet (or at least there wasn't last time I checked). So it's going to take much longer to get Rust working on the more oddball stuff like m88k.
That said, I'm wondering why Ada can't be used instead of Rust. It's been around long enough to work on everything. And it's not like it's hard to learn, either.


OpenSSL (one of the most critical libraries in use) was well-tested, but was hiding some nasty bugs anyway due to its overly complex implementation. Shouldn't have been this way, if the "many eyes" theory actually worked. We'll see how well that works for systemd...

They only got their shit together AFTER everyone started forking off their own implementations. They would release security patches that would randomly break ABI because of compile fuckups for unrelated things.

You can be sure OpenSSL would have had many, many more bugs should a single person have written it.

Unless it was 10 thousand lines instead of however the fuck many it was with a simpler protocol.

Well user, why aren't you writing your own TLS implementation? Apparently you're much more capable than teams of dedicated engineers across decades.

It IS undefined, because you have no guarantee which occurrence of i is going to be evaluated first (in the best case it's implementation-specific, i.e. non-portable).

Dedicated engineers can waste a fuck ton of time over decades when they are stuck bullshitting over bad standards combined with historic bloat.

Still waiting for that cross-platform TLS lib fam.

some of us have jobs you know

Apparently you don't, or at least not in software development, since you show a lack of understanding of its process.

Multiple small groups forked OpenSSL and cut out 90% of the bloated historical bullshit. They ended up with much simpler, more secure systems. Imagine what all those "dedicated engineers" at OpenSSL could have done if they had been willing to take the knife to it. Really though, I'm the one with the lack of understanding.

and it does not work.

theregister.co.uk/2018/01/30/f35_dote_report_software_snafus/

Go learn vhdl.

Actually test your code. Unlike 99.95% of the shit on github or your typical Linux application.

Write test setups for each function. Automate random noise input over full range of all input parameters. Test all branches of code, not just the typical code path. Be systematic about it.

All of this will force you to write the bare minimum of code with no more features than are absolutely required. This is good.

Use the bare minimum amount of libraries, or none at all if possible. Statically link. Link only what is required. Test it. If it fails, sorry, but you will be rewriting library code. This will happen. There will be bugs in your linked in library code.

Realize that you are spending 80% effort testing. Realize that you will still have bugs. Realize that other "high level" languages are actually worse for this, give you less control, and introduce exponentially more actual machine code that you will be unable to test. C may be dangerous, but the alternatives are placebos at best.

k

...

k

no

But C is a high-level language. Maybe you meant scripting languages, where you can't even directly control memory?
And you can't trust the C compiler, since so much is hand-waved and left up to the implementation. You're much better off coding directly in asm; that way at least there are no illusions about what the program is doing.

Not moving it man, you said it was piss easy to just write stuff, then proceeded to take the example of people who removed stuff.

This is not what happens in real life, Mr. LARP.

Implementing a 10k-line spec is way easier than implementing a million-line spec.

kys, you obnoxious tripfag

Your logical process is correct, it's your premises that are flawed. TDD doesn't make you end up requiring 100 lines of tests for every line of code written.

TDD requires implementing the same specification twice.

1) False
2) Hardly; your numbers are an exaggeration

Where's my Sneaky Pete can...


Ada is good stuff tbh fam.

doing allocation yourself isn't large-company-only territory bud

go back to {reddit,SO,HN} with this faggotry

Pilot here, looking to build my own plane. Currently experimenting various systems on drones. Avionics are extremely overpriced. I'm not a very good programmer, but I'm learning.

Modern airplanes don't use Linux, Windows, or any unix-based distro, because safety is the #1 priority at all times, even above data integrity. They have a real-time operating system, which basically means that when a process is started, it gets a certain amount of time to execute before the next process *must* take place. There are also a few other safety checks built directly into the silicon and coded into the custom kernel. Then they feed the CPU fake data for x amount of time (usually around 5k hours) to 'prove' the design won't burn out early or w/e. There are a few other differences from normal computers idk about that help airplane CPUs avoid freezing up or bugging out at inopportune moments. I can't speak to the specific language these things are programmed in, but the operating system and hardware itself run on a different principle than normal computers. Most airplanes also have non-RTOS computers onboard, for auxiliary bullshit like those audio jacks between seats and the flatscreens on newer jets.
I'd be willing to bet money that most fielded military equipment also uses a real-time operating system.

My project was using a raspberry pi to fly a drone over wifi. It worked well enough, but I sure as fuck wouldn't trust anything flight-critical to a $20 anything with 'made in china' on it.

All that said, avionics aren't going to kill you. 90% of airplane accidents are pilot error, and the other 10% everyone usually lives.
Also, don't fly an Asian/South American airline, ever.

You are talking about hard real time systems.
Linux and Windows are soft real time systems.

I remember back then, I had to make some I2C implementation on micro OS. It was pretty fun to do. I was told that it's the kind of stuff you had to deal with when making avionics. I never went into this field, but it would have been pretty interesting.

Remember to filter tripfags.
Scheme is Life is shamoanjac, part of the Krebs Kinder Klub.

Right right, in dynamic languages the tests are even worse. It's more like implementing it 3 times.

The only thing I want to say is that, unless you are working on the F-35 program, man-rated software is a process, not a product. It is as much about documenting what was done and why it was done as what actually occurs.

A bunch of uncommented, unreadable code is not a defensible showing in court that you, the company, did your due diligence in making the software safe for humans. You need that paper trail for litigation (because if you are making things that lives depend on, you will end up in litigation).

And which legal case are you citing here