Nvidia goes to Ada

blogs.nvidia.com/blog/2019/02/05/adacore-secure-autonomous-driving/?linkId=100000004938589

adacore.com/press/adacore-enhances-security-critical-firmware-with-nvidia

Attached: 300px-Ada_Mascot_with_slogan.png (300x325, 46.03K)

Other urls found in this thread:

archive.adaic.com/intro/ada-vs-c/cada_art.pdf
wiki.mozilla.org/Oxidation#Within_Firefox
hacks.mozilla.org/2017/08/inside-a-super-fast-css-engine-quantum-css-aka-stylo/
hacks.mozilla.org/2017/10/the-whole-web-at-maximum-fps-how-webrender-gets-rid-of-jank/
www2.seas.gwu.edu/~mfeldman/ada-project-summary.html
reddit.com/r/ada/comments/a62y4o/success_with_introducing_ada_to_three_college/
archive.fo/Qy7Ge
variety.com/2019/gaming/news/ubisoft-and-mozilla-announce-clever-commit-1203137446/
govinfo.gov/content/pkg/STATUTE-106/pdf/STATUTE-106-Pg1876.pdf
ocw.mit.edu/courses/aeronautics-and-astronautics/16-01-unified-engineering-i-ii-iii-iv-fall-2005-spring-2006/comps-programming/
ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html
discinterchange.com/TTvmsfiles.aspx
neilrieck.net/docs/openvms_notes_text_files.html
youtube.com/watch?v=i0BRn016EGI
okasaki.blogspot.com/2008/07/functional-programming-inada.html
groups.google.com/forum/#!topic/comp.lang.ada/RyjqQ2QOS1g
fsharpforfunandprofit.com/rop/
semanticscholar.org/paper/An-Experiment-in-Software-Prototyping-Productivity-Hudak-Jones/4029a3c5b19365ea3e0c453c4245eb184e038c75

YEAR OF ADA

Nothing has ever become better when there's a push to rewrite something that was previously C/C++ into some other meme language.

Attached: 973421bf23176071a2202228aa1a4b2e4bc011e2da2afdfa2ad37cd532e7013c.jpg (622x464, 67.7K)

...

Rust BTFO
How is that web browser engine coming along? Is it ready yet?

Maybe -- the Ada 2020 standard is coming along nicely and has some genuinely useful features: lock-free nonblocking containers, `parallel` blocks and for-loops, and a few new attributes like 'Reduce/'Parallel_Reduce, among others.
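A sketch of the proposed syntax, for flavor (still subject to change before the standard is finalized):
declare
   Data : array (1 .. 1_000) of Integer := (others => 1);
   Sum  : Integer;
begin
   parallel for I in Data'Range loop   -- iterations may be split across cores
      Data (I) := Data (I) * 2;
   end loop;
   Sum := Data'Reduce ("+", 0);        -- or 'Parallel_Reduce to parallelize the reduction
end;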


Patently false; and known for years -- archive.adaic.com/intro/ada-vs-c/cada_art.pdf & file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf

YEEEEAAAAAAAHHHHHH!
for once, they finally did something right.

How embarrassing, Tiger.

Sorry, got the local link instead of URL -- archive.adaic.com/intro/ada-vs-c/cada_art.pdf

>want to talk about a daemon I rewrote from C because the C was particularly shit. Fixed ten bugs just getting the thing going on our servers, because configuration options other than the author's were never tested

I don't know what's worse: the fact that you linked a URL from your hard drive, or the fact that it's a Windows file URL.

Yep, it is -- but I popped the correct link.


Agreed -- I remember being *SEVERELY* disappointed with CUDA because it was essentially manual insertion of GPU primitives, not significantly different from inline assembly, rather than using Ada and taking advantage of Task and Protected types/objects.

I mean, it would have been really nice to use an implementation-pragma like:
Pragma CUDA( Some_Task );

And have the compiler pop the task onto the GPU automatically, or issue a warning that it can't and use the normal CPU -- you also get the advantage of the programs being portable to other Ada compilers (though w/o the GPU support, obviously), AND when you get better GPU/compiler tech you can increase the capabilities w/o impacting extant code-bases.
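Something like this, say -- the task is ordinary Ada, but the pragma itself is entirely hypothetical:
task type Vector_Scale is   -- invented name, purely for illustration
   entry Go (Factor : in Float);
end Vector_Scale;
Pragma CUDA (Vector_Scale);   -- hypothetical: offload instances to the GPU,
                              -- or warn and fall back to the CPU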

*shrug*
At work I have a Windows box, a Solaris box, a VxWorks box and [in storage] a Mac.

Write in Forth, homos.

That's one thing that I really hate about C: it's *SO* fragile with those compiler-switches and build-environments. It really makes me appreciate Ada: I once took a non-trivial 30-year-old Ada program, written for a different compiler on a different architecture, and got it running with about a half-dozen changes (mostly because I was using an Ada 2012 compiler, and the source had used as identifiers what became new keywords).


I love Forth -- I'm thinking about using it as the backend/emitter for a compiler, just to leverage Forth's porting power when bootstrapping the compiler.

Attached: 2c4aa2ebc35d40cafcc68603eb407702bbdd2dc29efc87d06a9c48c4dffb311f.png (500x584, 123.73K)

Remember that Paki rape gang taximan wanting his victims to call him like that?

Yes, we know.


No, and I don't want to.

Attached: 1490905674746.jpg (1280x720, 195.49K)

No. Why would I be frustrated?

Usually programs get rewritten in C/C++ and become worse, like the F-35.

How does this change anything? Firmware isn't user-accessible anyway for NVIDIA GPUs etc., so it doesn't really matter. What really matters is whether performance will be affected on future cards. People won't use them because they are "safe" if it means a noticeable drop in output.

It is a meme language. It isn't even used in the industry for which it was designed. If that isn't meme, then meme on.

Gno


wiki.mozilla.org/Oxidation#Within_Firefox
It's coming along quite well. The last big thing was Stylo (out of Servo); the next is WebRender.
hacks.mozilla.org/2017/08/inside-a-super-fast-css-engine-quantum-css-aka-stylo/
hacks.mozilla.org/2017/10/the-whole-web-at-maximum-fps-how-webrender-gets-rid-of-jank/

Never really looked at Ada or SPARK before. It looks awesome. I wish I had been taught this instead of Java. I can't even imagine programming in a world with such compiler-enforced pre-/post-conditions.
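For flavor, the contract aspects look like this -- a minimal sketch of Ada 2012 pre/postconditions, which SPARK can prove statically (Clamp is an invented name):
subtype Percentage is Integer range 0 .. 100;
function Clamp (X : Integer) return Percentage
  with Post => (if X in Percentage then Clamp'Result = X);
function Clamp (X : Integer) return Percentage is
begin
   if X < 0 then
      return 0;
   elsif X > 100 then
      return 100;
   else
      return X;
   end if;
end Clamp;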

You're talking out of your ass:
www2.seas.gwu.edu/~mfeldman/ada-project-summary.html

Unless you're referring to the money-sink that is the F-35 and how they used C++ "because they couldn't find any Ada programmers" -- in which case there's a large unwritten caveat there: "at a price we wanted to pay."

Also, train-up time in Ada is pretty good:
reddit.com/r/ada/comments/a62y4o/success_with_introducing_ada_to_three_college/


It is!
That would actually be a really good question to ask on comp.lang.ada -- and the Ada 2020 standard is going to have some *NICE* stuff.

Attached: F35 - Forever.jpg (630x834, 22.86K)

lmao you hate unix because you're a fucking wangblows nigger

No, I hate Unix because of terrible design decisions.

I'd unironically suggest that, as a base system, VMS has a much better design than Unix.

You posted one source and expect that to apply to the whole industry, while the law stipulates that _any_ language can be used if it will save money for the project. The law even calls it a meme language.

And we have a good example of money saving right there. Instead of paying some Ada programmers N dollars a year for five years, you can pay four times the number of C++ programmers N/2 dollars a year for 50 years.

R U stupid? Most Linux distros and Windows are just Unix-like, and if they're Unix-compliant then that's because some corporate distro bought the label.
It doesn't say anything about the OS.

Mozilla seem to also partner up with Ubisoft recently.
Not entirely sure in which direction money is going.
[archive.fo/Qy7Ge] variety.com/2019/gaming/news/ubisoft-and-mozilla-announce-clever-commit-1203137446/

Do you mean OpenVMS? It's probably way worse than Haiku.

I didn't post a law.
Why are you making shit up user?


There is that, too.


You know, I wonder if they even put *that* much thought into it. At some places there's absolutely *ZERO* discussion on what to implement a new project with, and it seems to me like they go with what's popular precisely to avoid training.

Some of the mental gymnastics I've seen to avoid proper training in the corporate world are *astounding* -- akin to "yes, a safety class on power tools would take an hour; but rocks are free."

For those claiming Ada is best for systems programming: why were colleges apprehensive about providing courses for Ada programming after it was designed and standardized? It's a language from the '70s, but very few colleges actually teach it. Why? It's simply because it's not an efficient language for the needs that systems programmers have. It also shows how people are incapable of understanding that the problems with overflows and """"""safety"""""" are not intrinsic to software but to hardware, and anyone who argues to the contrary is either trying to goad argument or is ignorant.

Why do you say *probably*?
And, you should note I *am* talking about design, particularly the underlying design-philosophies, rather than the actual implementation or polish.

I dunno it's a mystery.

Source needed.

The language was standardized in the '80s -- 1983, to be precise -- and this was prior to any compiler existing, with bleeding-edge compiler technology incorporated into the standard as requirements.

Oh? What needs, in particular, do you have within systems programming that are not addressed?

lol nope. You hate Unix because you are a retarded wangblows nigger. Your bizarre self-justification is not something I take seriously.


lmao

Hm, interesting; using your astounding mental powers, tell me more about my own personal preferences.

Also, could you tell me about how, since they differ from yours they are automatically wrong?

Because I haven't tested it personally but multicore processors didn't even exist back then, so it probably doesn't have support for it.
Great, memory mapping. Something every OS has now.
Both shit. Windows and MacOSX both have a proper compositor now and Linux has Wayland coming with support from GTK and Qt.
One thing done well. It prevents time from overflowing.

Are you perhaps referring to the "Do one thing and do it well!" phrase? Cause everyone knows that's total bullshit and totally inefficient.

Are you one of those GNIggers that think every Linux distro is UNIX or are you one of those Macfags who think their OS is somehow based on Linux because Mac has the bash shell and other stolen components from BSD?

No, not merely features/implementations.

Things like:
* Common Language Environment
* Record Management System
and how they fit together and present an entire system to the user (or programmer).

Read it yourself
govinfo.gov/content/pkg/STATUTE-106/pdf/STATUTE-106-Pg1876.pdf
So not only does the cost influence Ada's status of being a meme language, any official can meme Ada out of a project simply because it will increase the unnecessary complexity of the project.
Meme language defined by law.

I didn't cite that.

Not a point.

Show me any ivy league college that teaches Ada as a required course in any CS or other degree program.
Low-latency and accuracy are two needs that Ada does not address.
So you are ignorant.

I did. Ada is a meme.

Solid argument. I bet this increases interoperability by a lot.
Seems like an integrated serializer or am I wrong?

Does MIT count? -- ocw.mit.edu/courses/aeronautics-and-astronautics/16-01-unified-engineering-i-ii-iii-iv-fall-2005-spring-2006/comps-programming/


Yes it does; look at the Real Time Annex, Numerics Annex, and the Systems Programming Annex. (D, G, & C, respectively)
ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html


Not *entirely*, but yes... RMS is more like being able to define on-disk database records. WRT text, you can have the system natively know about lines and "do the right thing" rather than having to screw around with line-endings (CR vs CRLF vs LF).

re: RMS
discinterchange.com/TTvmsfiles.aspx
neilrieck.net/docs/openvms_notes_text_files.html
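For a rough feel of the "file of typed records" idea in Ada terms -- Ada.Direct_IO is standard; Employee and the file name are made up:
with Ada.Direct_IO;
procedure Record_Demo is
   type Employee is record
      ID     : Integer;
      Salary : Float;
   end record;
   package Employee_IO is new Ada.Direct_IO (Employee);
   use Employee_IO;
   F : File_Type;
begin
   Create (F, Inout_File, "employees.dat");   -- a file of fixed-length typed records
   Write  (F, (ID => 1, Salary => 50_000.0));
   Close  (F);
end Record_Demo;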

The problem of """"""safety"""""" is fundamentally a fault of hardware and has nothing to do with software.

You're an idiot; safety has multiple facets.

Yes, our hardware is one area that could be much better (look at tagged architectures like the Burroughs) -- and indeed should be -- but saying that it has *nothing* to do with software is just stupid.

OH SHIT
Ada might become popular now, FUCK! My pajeet free life is over!

Yeah -- Ada does tend to be a good filter against pajeet.

A wangblows nigger who criticises Unix is like a gay man who talks about MGTOW.
Doubtful. All he can do is spout talking points in a vain attempt to get more cocks in his mouth.


oh no no no I know Unix is awful trash. That's why I use a totally different thing called Linux.

... yeah, "totally different".

How many Ada seminars have been created in India upon this news breaking?

Who knows.
But I doubt Pajeet would have the patience to learn anything -- they can barely handle call-center scripts, after all.

You want to know why colleges don't do a thing? The character of the people at the colleges is part of the explanation.

gay men live in civilization with the rest of us, and they occasionally come in conflict with women, and they're actively getting kicked out of the protected-class club. They'll be talking about the same shit.
will be: youtube.com/watch?v=i0BRn016EGI
female liberation and its consequences are the problem. The non-gay aspects of MGTOW are just people talking about libido management methods, marriage, and ... eh... PUAs being fags. It's pretty much all stuff a fag could get on board with.
Gays might be exempted from the first round of Bachelor taxes. That's about the pace of things.

Does it include functions as first class citizens? That may be the only thing that's pushing Ada down in my priority queue.

yeah. You can have HOFs, you can take pointers and pass them around. You can put them in variables. You can have closures as well. There are some limitations that you can run into due to subprogram lifetimes: your function pointer can't live longer than the function. In my code all this has meant is that I've moved 'local' function definitions off to a package.
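A minimal sketch (Apply and Square are invented names):
with Ada.Text_IO; use Ada.Text_IO;
procedure HOF_Demo is
   type Unary is access function (X : Integer) return Integer;
   function Apply (F : Unary; X : Integer) return Integer is
   begin
      return F (X);   -- call through the function pointer
   end Apply;
   function Square (X : Integer) return Integer is
   begin
      return X * X;
   end Square;
begin
   Put_Line (Integer'Image (Apply (Square'Access, 3)));   -- prints 9
end HOF_Demo;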

Yes[-ish].

You can pass a function as a parameter to a GENERIC, you can pass an access to a function as a subprogram parameter, and you can use RENAMES to give a name to a function call/result.

See also:
okasaki.blogspot.com/2008/07/functional-programming-inada.html
groups.google.com/forum/#!topic/comp.lang.ada/RyjqQ2QOS1g
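The generic route looks roughly like this (Twice and Combine are invented names):
generic
   type T is private;
   with function Combine (L, R : T) return T;   -- the function you "pass in"
function Twice (X : T) return T;
function Twice (X : T) return T is
begin
   return Combine (X, X);
end Twice;
-- e.g.  function Double is new Twice (Integer, Combine => "+");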

maybe he is using windows because he hates unix

Generic
   Type Element is private; -- Any non-limited constrained type.
   Type Index is (<>); -- Any discrete type.
   Type Vector is Array(Index range <>) of Element;
   with Procedure Swap(Left, Right : in out Element);
   with Function "<"(Left, Right : Element) return Boolean; -- assumed: the original post was cut off mid-line here

This is actually a pretty neat feature.

It has nothing to do with software. If the hardware provides the checks intrinsically, then any software check would be redundant, unnecessary bloat.

Are you stupid? You still have to encode the checks themselves into the hardware, and the best way we have of doing that is... software. Many checks in Ada actually can be elided and moved into CPU instructions as it stands. LARP harder next time.
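A trivial sketch: the range constraint below is a purely language-level notion that the compiler encodes -- or proves away -- with ordinary instructions:
subtype Percent is Integer range 0 .. 100;
function To_Percent (X : Integer) return Percent is
begin
   return X;   -- range check happens here, or is elided when provably in range
end To_Percent;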

And what hardware is doing this?
Heartbleed shows us there's not a lot of hardware that does, in common, everyday, consumer-grade usage.

Careful, you're going to summon the LISP machine poster.

That would actually be kind of cool... I was hoping to get my hands on a copy of the source for Symbolics Ada.

Forth is awesome. People recommended it a few times and I finally tried it. I really like the way that it was designed, and it's the only language that I can't find anything to complain about, and it's so flexible that I can't see why you would be unable to do anything with it, since you can just build it up to whatever level you think is required. It's easy to see why someone would argue that it's the ultimate language. The postfix notation is really nice once you get used to it as well, and it legitimately makes more sense than anything else. Unique as hell and comfy as fuck.

Still, I think Ada does look nice, but really verbose, and perhaps too close to natural language, which is inefficient, and kinda pointless since there is no need for a language to try that hard to be readable by people that haven't studied it. I would easily take it over Lisp (Smalltalk too, but again, really verbose). The parentheses make my head hurt and my autism doesn't find it aesthetically pleasing enough, even though that never bothers me. I like how it works, but I couldn't get used to that.

Just mention C and/or Unix. Maybe talk about how great Unix is even though it's not. This might even be enough, actually.

Forth breaks down on generic code, on system-portable code, and on troubleshooting. And its hell of "other people's code" is much worse than most languages. Even fairly common problems like stack errors are a bitch to fix, even after you have tooling for them.
And there are a few places where Forth as practiced is pointlessly pessimal on modern systems. Like CREATE DOES> , "the pearl of Forth", often requires unnecessary memory fetches.
Implementation quality is generally poor, with a lot of surprising variations in performance, to the point that you might need different parts of the same project to use different implementations.
I'm not one to say that popularity matters that much, but Forth is so unpopular that any serious project will be like mowing an abandoned lawn with tall grass without checking it over carefully--every other yard, you have to stop and pull a dead body or stolen statuary or evil spirits out of the lawn mower.
And if you think Zig Forums's full of LARPers, the Forth community'll make your head explode.

Well, I knew that all current implementations are kinda weak to begin with, but I was thinking about writing my own, because that's the kind of thing that I'm interested in. The actual language is what really matters, since my original idea was to use what I learned from other languages to design my own. I don't intend to rely on someone else's work and I have no interest in "communities" (pretty damn sure that they are all crap), so it doesn't matter too much. My interest is to pretend to be living in the golden age of computing and develop basic shit on my own from scratch, pretending that it's being done for the first time, so I can understand how it was done.

Also, I am incapable of teamwork and almost dislike forking, so I have no intent of relying on anyone else. I only care about optimizations. Readability is only important if it doesn't prevent that and makes my own life easier. The machine comes first, other humans come last. I would even say that if you can't read optimized code, then you probably shouldn't be reading it. In my case, it's just fun hobby shit so it will never matter. I'm just trying to learn a lot of shit and occasionally hear what other people have to say, and then use the information to improve my ideas.

It's actually *really* good for error-detection, especially when you're refactoring nested constructs.
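E.g. the named constructs -- a trivial sketch, with Grid and Target invented:
procedure Find_Demo is
   Grid   : array (1 .. 3, 1 .. 3) of Integer := (others => (others => 0));
   Target : constant Integer := 7;
begin
   Search : for I in Grid'Range (1) loop
      for J in Grid'Range (2) loop
         exit Search when Grid (I, J) = Target;   -- leaves both loops at once
      end loop;
   end loop Search;   -- a mismatched "end loop Search" simply won't compile
end Find_Demo;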

I flatly disagree; the readability for non-programmers is absolutely needed so that you can have domain-experts who aren't programmers understand it (say physicists or statisticians).

jesus christ, another group of tech nerds who couldn't produce a string length function or a webpage that does a database query without RCE vulns wrote an article about security, ep. #35723582375. +1 me on HN and twitter

You're not wrong -- perhaps the biggest problem/difficulty is that the industry standardized on C and C++ and has wasted untold millions trying to "fix C" or make a "better C" (Java, C#, Swift, Objective-C, etc.), and now has such emotional investment that it's like looking at a fully internalized Sunk Cost Fallacy.

Though it really is super-frustrating to be plagued by bugs that are easily detectable if not outright avoided.

wtf I love Windows and Ada now

Well, I guess assuming that anything about Ada is the way it is for a good reason wouldn't lead me to a very inaccurate understanding of the language. The language didn't find its niche for nothing.

Well, only for source code that these people are expected to read. Other than that it doesn't matter and it's better to prioritize performance and make it easier to write. At least in my case, I don't think it's even easier to read unless you just don't know the language. Personally, I don't like having to rely on completion, I would rather have a more minimal syntax, but both have their place, I suppose. That's one reason why I enjoyed writing Forth, and it's also one of my reasons for appreciating Lisp even though it makes my head explode (just like graph paper fucks with my head and my eyes because of the excessive amount of squares, Lisp does that with parenthesis). Still, I really like the way that Forth is structured. But if I had to pick a language that is currently in actual use, I think I would pick Ada.

Why? -- Even disregarding the need for technical non-programmers to understand it, code is read much more than written, so it makes sense to optimize for readability, no?

Go for it!
Most of the Ada programmers I've met/talked with are pretty friendly.

How long until they get kicked out for being filthy kikes only to sue Adacore for 6 million shekels in libel and copyright infringement damages?

Interesting, I have developed a "blindness" to both the squares on the graph paper as well as to the parentheses in Lisp. Of course I still see them, but my brain just ignores them.

Yes, by the computer. Computers will read it many more times than humans ever will, and optimizations will increase the program's longevity as well.

Someone that can't read optimized source code is clearly not competent enough to do anything with it. If it's unoptimized, then it's not worth using and therefore not worth writing. It's good to keep people that aren't good enough away from your source. Quality control.

Well, I don't socialize. But the fact that Ada is actually used for some serious shit makes it easier to respect. It's safe to say that it's better engineered than the languages that crap consumer software is written in. If it's used by the military, and in aviation, then it's definitely doing something right considering how unreliable a lot of the C and C++ software that I have used has been. Then again, I'm sure that the people working in those fields are more competent than average as well.

By the human, too. -- Optimization for the computer is a function/property of the translator; in theory, the source language has little impact here.


I think you're misunderstanding -- Ada was designed for readability, this is completely orthogonal from the optimizations of generated code. It also makes it easier to maintain; and in my opinion Ada code is a lot easier to "come back to" after X weeks/months.

I can't into Ada. I keep writing junk in C.
Am I closet pajeet?

Learn Pascal so you can wrap your head around the syntax.
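The basic shape is pure Pascal-family (trivial sketch):
with Ada.Text_IO;
procedure Hello is
begin
   Ada.Text_IO.Put_Line ("Hello from Ada");
end Hello;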

why can't you?

How are you trying to learn it? Have you read the Barnes book?

Dupix btfo, the thread. C and Unix fags, please don't kill yourselves.

Program a lot of NVIDIA firmware; do ya?

Learning C requires a healthy, preferably white brain that can be properly damaged. People that don't understand toilets probably don't have that.

Nobody can write correct and safe C code, it is impossible.

I'm a Haskell/F#/Scala dev; somebody sell me on Ada. How would I use/implement monads in it? For example, like in: fsharpforfunandprofit.com/rop/

You wouldn't. Haskell has
1. ML features, which are pretty cool, and which you can find in Ada.
2. a bunch of bad decisions
3. a bunch of features that only make sense in the context of #2
you should be well used to looking at other languages and saying "oh... weirdly, to use this I'll have to put aside all that weird stuff I learned in Haskell, like men putting away childhood toys."
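And if what you actually want is the Either/Result shape from that ROP link, the Ada-flavored stand-in is a discriminated record. A rough sketch, all names invented:
type Status is (Ok, Err);
type Int_Result (Kind : Status := Err) is record
   case Kind is
      when Ok  => Value   : Integer;
      when Err => Message : String (1 .. 32);
   end case;
end record;
function Parse (S : String) return Int_Result is
begin
   return (Kind => Ok, Value => Integer'Value (S));
exception
   when Constraint_Error =>
      return (Kind => Err, Message => (others => ' '));
end Parse;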

Contrary to popular belief, monads aren't just some workaround to perform I/O, they're practical design patterns that make code easier to reason about, as demonstrated by that user's "railway oriented programming" link.

If Ada claims "ease of maintenance" while not supporting such basic abstractions, then it's a waste of time to learn it when you can just use the equally rigorous safety of Haskell instead.

yeah, yeah, yeah. I ran out of patience for Haskeller cultist bullshit even before I started drinking to try and free up the completely wasted skill points investments that the language encourages. You wouldn't like Ada because it's readable, and 'remotely readable' is a bad code smell to a Haskeller.

Ada is supposed to be non-functional and low-level, so monads have to be thrown out the window by default

I have some bad news for you:
semanticscholar.org/paper/An-Experiment-in-Software-Prototyping-Productivity-Hudak-Jones/4029a3c5b19365ea3e0c453c4245eb184e038c75

Attached: 13-Figure4-1.png (1128x732, 25.65K)

I knew Ada was a meme language, but I didn't know it was intentionally trying to be useless, lmao

good joke though. Not going to waste my time digging into how they managed to arrive at Haskell being either of those things.

...

For shit like this I come here

Attached: 3238b7fa43e3a34519a5599003b3906e4050b4d1ab0f38dd6dfc582ae95a868c.jpg (400x400, 29.84K)