Nvidia goes to Ada

Austin Price

blogs.nvidia.com/blog/2019/02/05/adacore-secure-autonomous-driving/?linkId=100000004938589
To ensure that this vital software is secure, NVIDIA is working with AdaCore, a development and verification tool provider for safety and security critical software. By implementing the Ada and SPARK programming languages into certain firmware elements, we can reduce the potential for human error.

Both languages were designed with reliability, robustness and security in mind. Using them for programming can bring more efficiency to the process of verifying that code is free of bugs and vulnerabilities.

adacore.com/press/adacore-enhances-security-critical-firmware-with-nvidia
Some NVIDIA system-on-a-chip product lines will migrate to a new architecture using the RISC-V Instruction Set Architecture (ISA). Also, NVIDIA plans to upgrade select security-critical firmware software, rewriting it from C to Ada and SPARK. Both moves are intended to increase verification efficiencies to achieve compliance with the functional safety standard ISO-26262.

Attached: 300px-Ada-Mascot-with-slogan.png (46.03 KB, 300x325)

Other urls found in this thread:

archive.adaic.com/intro/ada-vs-c/cada_art.pdf
wiki.mozilla.org/Oxidation#Within_Firefox
hacks.mozilla.org/2017/08/inside-a-super-fast-css-engine-quantum-css-aka-stylo/
hacks.mozilla.org/2017/10/the-whole-web-at-maximum-fps-how-webrender-gets-rid-of-jank/
www2.seas.gwu.edu/~mfeldman/ada-project-summary.html
reddit.com/r/ada/comments/a62y4o/success_with_introducing_ada_to_three_college/
variety
govinfo.gov/content/pkg/STATUTE-106/pdf/STATUTE-106-Pg1876.pdf
ocw.mit.edu/courses/aeronautics-and-astronautics/16-01-unified-engineering-i-ii-iii-iv-fall-2005-spring-2006/comps-programming/
ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html
discinterchange.com/TTvmsfiles.aspx
neilrieck.net/docs/openvms_notes_text_files.html
youtube.com/watch?v=i0BRn016EGI
okasaki.blogspot.com/2008/07/functional-programming-inada.html
groups.google.com/forum/#!topic/comp.lang.ada/RyjqQ2QOS1g
fsharpforfunandprofit.com/rop/
semanticscholar.org/paper/An-Experiment-in-Software-Prototyping-Productivity-Hudak-Jones/4029a3c5b19365ea3e0c453c4245eb184e038c75
muen.codelabs.ch/
youtube.com/results?search_query=nightmare of C++
youtube.com/watch?v=9-_TLTdLGtc
people.eecs.berkeley.edu/~necula/cil/cil016.html
mega.nz/#F!DpAz2IgQ!nW7bPNnpJFk5CAV3ypiaHw
youtube.com/watch?v=dQw4w9WgXcQ

Zachary Baker

YEAR OF ADA

Jeremiah Scott

Nothing has ever become better when there's a push to rewrite something that was previously C/C++ into some other meme language.

Attached: 973421bf23176071a2202228aa1a4b2e4bc011e2da2afdfa2ad37cd532e7013c.jpg (67.7 KB, 622x464)

William Nelson

Ada
Meme language

Henry Sanders

Rust BTFO
How is that web browser engine coming along? Is it ready yet?

Samuel Flores

Maybe -- the Ada 2020 Standard is coming along nicely and has some really nice features: lock-free nonblocking containers, the `parallel` blocks/for-loops, a few new attributes like 'Reduce/'Parallel_Reduce, among others.
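For a taste, here's roughly what the reduction expressions look like (spelling per the later-finalized standard; the 2019-era drafts were still in flux, so treat the exact syntax as provisional):

```ada
with Ada.Text_IO;

procedure Reduce_Demo is
   -- Draft Ada 2020 reduction expression: fold "+" over 1..10 starting at 0.
   Total : constant Integer := [for I in 1 .. 10 => I]'Reduce ("+", 0);
begin
   Ada.Text_IO.Put_Line (Integer'Image (Total));  -- 55
end Reduce_Demo;
```

The draft also lets you write `'Parallel_Reduce` and `parallel for` loops to have the compiler split the work across cores, though compiler support lagged the paper standard.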

Nothing has ever become better when there's a push to rewrite something that was previously C/C++ into some other meme language.
Patently false; and known for years -- archive.adaic.com/intro/ada-vs-c/cada_art.pdf & file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf

James Myers

YEEEEAAAAAAAHHHHHH!
for once, they finally did something right.

Dominic Hill

file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf
How embarrassing, Tiger.

Brody Gutierrez

Sorry, got the local link instead of URL -- archive.adaic.com/intro/ada-vs-c/cada_art.pdf

Kayden Parker

want to talk about a daemon I rewrote from C because the C was particularly shit. Fixed ten bugs just getting the thing going on our servers, because configuration options other than the author's were never tested
will now have people calling me 'Tiger'

Connor Peterson

file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf
I don't know what's worse: the fact that you linked a URL from your hard drive, or the fact that it's a Windows file URL.

Kayden Hall

Yep, it is -- but I popped the correct link.

YEEEEAAAAAAAHHHHHH!
for once, they finally did something right.
Agreed -- I remember being *SEVERELY* disappointed with CUDA because it was essentially manual insertion of GPU primitives, not significantly different from inline assembly, rather than using Ada and taking advantage of Task and Protected types/objects.

I mean, it would have been really nice to use an implementation-pragma like:
Pragma CUDA( Some_Task );

And have the compiler pop the task onto the GPU automatically, or issue a warning that it can't and use the normal CPU -- you also get the advantage of the programs being portable to other Ada compilers (though w/o the GPU support, obviously) AND when you get better GPU/compiler tech you can increase the capabilities w/o impacting extant code-bases.
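To be clear, `Pragma CUDA` is purely hypothetical -- but the fallback idea works precisely because the RM requires an implementation to ignore (with a warning) any pragma name it doesn't recognize. A sketch of the wished-for usage:

```ada
-- Hypothetical sketch: "CUDA" is NOT a real pragma. An implementation that
-- doesn't recognize it just warns and ignores it (RM 2.8), so the task
-- falls back to running on the CPU like any other task.
procedure Demo is
   type Vector is array (Positive range <>) of Float;
   X : Vector (1 .. 1_000) := (others => 1.0);

   task Some_Task;
   pragma CUDA (Some_Task);  -- imagined: offload this task to the GPU if possible

   task body Some_Task is
   begin
      for I in X'Range loop
         X (I) := 2.0 * X (I);  -- data-parallel work, one independent element per step
      end loop;
   end Some_Task;
begin
   null;  -- Demo only returns after Some_Task terminates
end Demo;
```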

Cooper Powell

*shrug*
At work I have a Windows box, a Solaris box, a VxWorks box and [in storage] a Mac.

Noah Miller

writing in a language created by a kike
Write in Forth, homos.

Cooper Morgan

C was particularly shit. Fixed ten bugs just getting the thing going on our servers because other-than-author's configuration options were never tested
That's one thing that I really hate about C: it's *SO* fragile on those compiler-switches/build-environment. It really makes me appreciate when I (e.g.) took a non-trivial 30-year-old Ada program written for a different compiler on a different architecture and got it building with about a half-dozen changes (mostly because I was using an Ada 2012 compiler, and the source had used as an identifier what became new keywords).

I love Forth -- I am thinking about using it as the backend/emitter for a compiler, just to leverage Forth's porting power when bootstrapping the compiler.

Xavier White

tiger

Attached: 2c4aa2ebc35d40cafcc68603eb407702bbdd2dc29efc87d06a9c48c4dffb311f.png (123.73 KB, 500x584)

Blake Parker

tiger
Remember that Paki rape gang taximan wanting his victims to call him like that?

Jordan Anderson

Yes, we know.

Remember that Paki rape gang taximan wanting his victims to call him like that?
No, and I don't want to.

Tyler Jenkins

Attached: 1490905674746.jpg (195.49 KB, 1280x720)

Mason Peterson

No. Why would I be frustrated?

Jonathan Morgan

Usually programs get rewritten in C/C++ and become worse, like the F-35.

Sebastian Powell

How does this change anything? Firmware isn't user-accessible anyway for NVIDIA GPUs etc., so it doesn't really matter. What really matters is whether performance will be affected on future cards. People won't use them because they are "safe" if it means a noticeable drop in output.
It is a meme language. It isn't even used in the industry for which it was designed. If that isn't meme, then meme on.

Logan Wright

nvidia
Gno

wiki.mozilla.org/Oxidation#Within_Firefox
It's coming along quite well. The last big thing was servo, the next is WebRender.
hacks.mozilla.org/2017/08/inside-a-super-fast-css-engine-quantum-css-aka-stylo/
hacks.mozilla.org/2017/10/the-whole-web-at-maximum-fps-how-webrender-gets-rid-of-jank/

Nicholas Sanchez

Never really looked at Ada or SPARK before. It looks awesome. I wish I had been taught this instead of Java. I can't even imagine programming in a world with such compiler-enforced pre-/post-conditions.
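For reference, the contracts in question are the Ada 2012 `Pre`/`Post` aspects: checked at run time when assertions are enabled (e.g. GNAT's `-gnata`), and SPARK tooling can discharge them statically so the checks never even need to run. A minimal sketch:

```ada
-- Ada 2012 contract aspects: the caller must satisfy Pre, the body must
-- establish Post; a violation raises Assertion_Error when checks are on.
function Clamp (X, Lo, Hi : Integer) return Integer
  with Pre  => Lo <= Hi,
       Post => Clamp'Result in Lo .. Hi;

function Clamp (X, Lo, Hi : Integer) return Integer is
begin
   if X < Lo then
      return Lo;
   elsif X > Hi then
      return Hi;
   else
      return X;
   end if;
end Clamp;
```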

Robert Hill

It is a meme language. It isn't even used in the industry for which it was designed.

You're talking out of your ass:
www2.seas.gwu.edu/~mfeldman/ada-project-summary.html

Unless you're referring to the money-sink that is the F-35 and how they used C++ "because they couldn't find any Ada programmers" -- there's a large unwritten caveat there: "at a price we wanted to pay."

Also, train-up time in Ada is pretty good:
reddit.com/r/ada/comments/a62y4o/success_with_introducing_ada_to_three_college/

Never really looked at Ada or SPARK before. It looks awesome. I wish I had been taught this instead of Java. I can't even imagine programming in a world with such compiler-enforced pre-/post-conditions.
It is!
That would actually be a really good question to ask on comp.lang.ada -- and the Ada 2020 standard is going to have some *NICE* stuff.

Attached: F35---Forever.jpg (22.86 KB, 630x834)

Ryan Anderson

lmao you hate unix because you're a fucking wangblows nigger

Justin Anderson

No, I hate Unix because of terrible design decisions.

I'd unironically suggest that, as a base system, VMS has a much better design than Unix.

Colton Peterson

You posted one source and expect it to apply to the whole industry, while the law stipulates that _any_ language can be used if it will save money for the project. The law even calls it a meme language.

Brandon Adams

And we have a good example of money saving right there. Instead of paying some Ada programmers N dollars a year for five years, you can pay four times the number of C++ programmers N/2 dollars a year for 50 years.

Owen Rivera

Windows users are the autistic unix is bad faggots
R U stupid? Most Linux distros and Windows are just Unix-like, and if they are Unix-compliant then that's because some corporate distro bought the label.
It doesn't say anything about the OS.

Thomas Price

Mozilla seem to also partner up with Ubisoft recently.
Not entirely sure in which direction money is going.
[https://archive.fo/Qy7Ge] variety.com/2019/gaming/news/ubisoft-and-mozilla-announce-clever-commit-1203137446/

Henry Richardson

VMS
Do you mean OpenVMS? It's probably way worse than Haiku.

Robert Nelson

I didn't post a law.
Why are you making shit up user?

There is that, too.

And we have a good example of money saving right there. Instead of a paying some Ada programmers N dollars a year for five years, you can pay four times the number of C++ programmers N/2 dollars a year for 50 years.
You know, I wonder if they even put *that* much thought into it. At some places, there's absolutely *ZERO* discussion of what to implement a new project with, and it seems to me like they go with what's popular precisely to avoid training.

Some of the mental gymnastics I've seen to avoid proper training in the corporate world are *astounding* -- akin to "yes, a safety class on power tools would take an hour; but rocks are free."

Nathan Bennett

For those claiming ada is best for systems programming, why were colleges apprehensive about providing courses for ada programming after it was designed and standardized? It's a language from the 70's, but very few colleges actually teach it. Why? It's simply because it's not an efficient language for the needs that systems programmers have. It also shows how people are incapable of understanding that the problems with overflows and """"""safety"""""" are not intrinsic to software but to hardware, and anyone who argues the contrary is either trying to goad an argument or is ignorant.

Jonathan Jackson

Do you mean OpenVMS? It's probably way worse than Haiku.
Why do you say *probably*?
And, you should note I *am* talking about design, particularly the underlying design-philosophies, rather than the actual implementation or polish.

Henry Collins

why were a bunch of communists and hippies unwilling to promote the official language of capitalism and baby-killing?
I dunno it's a mystery.

Lincoln Bennett

For those claiming ada is best for systems programming, why were colleges apprehensive about providing courses for ada programming after it was designed and standardized?
Source needed.

It's a language from the 70's, but very few colleges actually teach it.
The language was standardized in the '80s, 1983 to be precise, and this was prior to any compiler existing; the standard baked bleeding-edge compiler technology in as requirements.

Why? It's simply because it's not an efficient language for the needs that systems programmers have.
Oh? What needs, in particular, do you have within systems programming that are not addressed?

Christopher Flores

lol nope. You hate Unix because you are a retarded wangblows nigger. Your bizarre self-justification is not something I take seriously.

#notall wangblows niggers
lmao

Jose Stewart

Hm, interesting; using your astounding mental powers, tell me more about my own personal preferences.

Also, could you tell me about how, since they differ from yours they are automatically wrong?

Nathaniel Sanchez

probably
Because I haven't tested it personally, but multicore processors didn't even exist back then, so it probably doesn't have support for them.
particularly the underlying design-philosophies
closed-source, proprietary
virtual memory system
Great, memory mapping. Something every OS has now.
Motif user interface (based on CDE) layered on top of OpenVMS's X11
Motif
X11
Both shit. Windows and MacOSX both have a proper compositor now and Linux has Wayland coming with support from GTK and Qt.
represents system time as the 64-bit number
One thing done well. It prevents time from overflowing.

Are you perhaps referring to the "Do one thing and do it well!" phrase? Because everyone knows that's total bullshit and totally inefficient.
Are you one of those GNIggers that think every Linux distro is UNIX or are you one of those Macfags who think their OS is somehow based on Linux because Mac has the bash shell and other stolen components from BSD?

Grayson Smith

No, not merely features/implementations.

Things like:
* Common Language Environment
* Record Management System
and how they fit together and present an entire system to the user (or programmer).

Nicholas Perry

Read it yourself
SEC. 9070. Notwithstanding any other provision of law, where cost effective, all Department of Defense software shall be written in the programming language Ada, in the absence of special exemption by an official designated by the Secretary of Defense.
govinfo.gov/content/pkg/STATUTE-106/pdf/STATUTE-106-Pg1876.pdf
So not only does the cost influence Ada's status as a meme language; any official can meme Ada out of a project simply because it will increase the unnecessary complexity of the project.
Meme language defined by law.

Mason Russell

I didn't cite that.

James Cox

Not a point.
Source needed.
Show me any ivy league college that teaches Ada as a required course in any CS or other degree program.
What needs, in particular, do you have within systems programming that are not addressed?
Low-latency and accuracy are two needs that Ada does not address.
completely ignores the fact that all problems with """"""safety"""""" are inherently a fault at the hardware level and have nothing to do with software
So you are ignorant.

Jeremiah Perez

I did. Ada is a meme.

Gabriel Johnson

Common Language Environment, a strictly defined standard that specifies calling conventions for functions and routines, including use of stacks, registers, etc., independent of programming language
Solid argument. I bet this increases interoperability by a lot.
Record Management System
Seems like an integrated serializer or am I wrong?

Lucas Powell

Show me any ivy league college that teaches Ada as a required course in any CS or other degree program.
Does MIT count? -- ocw.mit.edu/courses/aeronautics-and-astronautics/16-01-unified-engineering-i-ii-iii-iv-fall-2005-spring-2006/comps-programming/

Low-latency and accuracy are two needs that Ada does not address.
Yes it does; look at the Real Time Annex, Numerics Annex, and the Systems Programming Annex. (D, G, & C, respectively)
ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html
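As a concrete example of what Annex D buys you: drift-free periodic scheduling via `delay until` and `Ada.Real_Time` (the loop body here is a stand-in for real work):

```ada
with Ada.Real_Time; use Ada.Real_Time;

procedure Periodic_Demo is
   Period : constant Time_Span := Milliseconds (10);
   Next   : Time := Clock;
begin
   for Cycle in 1 .. 100 loop
      -- ... do the real-time work for this cycle here ...
      Next := Next + Period;
      delay until Next;  -- absolute delay: no cumulative drift, unlike "delay 0.010"
   end loop;
end Periodic_Demo;
```

The relative `delay` statement accumulates error each cycle (work time plus delay time); computing the next absolute release point and using `delay until` is the Annex-D-blessed idiom for low-jitter periodic tasks.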

Record Management System
Seems like an integrated serializer or am I wrong?
Not *entirely*, but yes... RMS is more like being able to define on-disk database records. WRT text, you can have the system natively know about lines and "do the right thing" rather than having to screw around with line-endings (CR vs CRLF vs LF).

Austin Foster

re: RMS
discinterchange.com/TTvmsfiles.aspx
neilrieck.net/docs/openvms_notes_text_files.html

Austin Cruz

The problem of """"""safety"""""" is fundamentally a fault of hardware and has nothing to do with software.

Owen Hill

You're an idiot; safety has multiple facets.

Yes, our hardware is one area that could be much better (look at the tagged architectures like the Burroughs) -- and indeed should be -- but saying that it has *nothing* to do with software is just stupid.

Christian Sanders

OH SHIT
Ada might become popular now, FUCK! My pajeet free life is over!

Ethan Richardson

Ada might become popular now, FUCK! My pajeet free life is over!
Yeah -- Ada does tend to be a good filter against pajeet.

Carter Watson

A wangblows nigger who criticises Unix is like a gay man who talks about MGTOW.
b-but he might have a point
Doubtful. All he can do is spout talking points in a vain attempt to get more cocks in his mouth.

oh no no no I know Unix is awful trash. That's why I use a totally different thing called Linux.

David Lopez

That's why I use a totally different thing called Linux.
... yeah, "totally different".

Jacob Campbell

How many Ada seminars have been created in India upon this news breaking?
Get ahead of the crowd and learn 100% from the Oracle of Ada for guaranteed success in our teachings of material to avoid learning the unnecessary in becoming quick Ada ready hireable tomorrow programmer.

Carter James

How many Ada seminars have been created in India upon this news breaking?
Who knows.
But I doubt Pajeet would have the patience to learn anything -- they can barely handle call-center scripts, after all.

Juan Brooks

Not a point
you want to know why colleges don't do a thing. The character of the people at the colleges is part of the explanation.

Nathan Jenkins

a gay man who talks about MGTOW.
gay men live in civilization with the rest of us, and they occasionally come in conflict with women, and they're actively getting kicked out of the protected-class club. They'll be talking about the same shit.
will be: youtube.com/watch?v=i0BRn016EGI
female liberation and its consequences are the problem. The non-gay aspects of MGTOW are just people talking about libido management methods, marriage, and ... eh... PUAs being fags. It's pretty much all stuff a fag could get on board with.
Gays might be exempted from the first round of Bachelor taxes. That's about the pace of things.

Robert Rivera

Does it include functions as first class citizens? That may be the only thing that's pushing Ada down in my priority queue.

Jacob Sanchez

yeah. You can have HOFs, you can take pointers and pass them around. You can put them in variables. You can have closures as well. There are some limitations that you can run into due to subprogram lifetimes: your function pointer can't live longer than the function. In my code all this has meant is that I've moved 'local' function definitions off to a package.
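A minimal sketch of the access-to-subprogram flavor (Ada 2005 anonymous access-to-function parameters, which allow passing nested functions downward; all names here are illustrative):

```ada
-- Downward closure: Apply accepts any Integer -> Integer function,
-- including one nested in the caller that captures a local variable.
procedure Demo is
   function Apply (F : access function (X : Integer) return Integer;
                   N : Integer) return Integer is
   begin
      return F (N);
   end Apply;

   Base : constant Integer := 10;

   function Add_Base (X : Integer) return Integer is (X + Base);  -- closes over Base

   R : Integer;
begin
   R := Apply (Add_Base'Access, 32);  -- R = 42
end Demo;
```

The accessibility rules mentioned above are exactly why this is an *anonymous* access parameter: a named access-to-subprogram type at library level couldn't legally hold `Add_Base'Access`, because the nested function doesn't live long enough.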

Brandon Smith

Yes[-ish].

You can pass a function as a parameter to a GENERIC, you can pass an access to a function as a subprogram parameter, and you can use RENAMES to give a name to a function call/result.

See also:
okasaki.blogspot.com/2008/07/functional-programming-inada.html
groups.google.com/forum/#!topic/comp.lang.ada/RyjqQ2QOS1g

Jeremiah Gomez

maybe he is using windows because he hates unix

Brody Anderson

Does it include functions as first class citizens? That may be the only thing that's pushing Ada down in my priority queue.

You can pass a function as a parameter to a GENERIC


Generic
   Type Element is private; -- Any non-limited definite type.
   Type Index is (<>); -- Any discrete type.
   Type Vector is Array(Index range <>) of Element;
   with Procedure Swap(Left, Right : in out Element);
   with Function "<" (Left, Right : Element) return Boolean is <>;
Procedure Generic_Sort( Object : in out Vector );
--...
Procedure Generic_Sort( Object : in out Vector ) is
Begin
   -- Exchange sort, ascending.
   Outer : For O in Object'Range loop
      exit Outer when O = Object'Last; -- avoid taking Index'Succ past the end
      Inner : For I in Index'Succ(O)..Object'Last loop
         if Object(I) < Object(O) then
            Swap( Object(O), Object(I) );
         end if;
      end loop Inner;
   end loop Outer;
End Generic_Sort;

Then you could say have a "count_swap" that increments a counter as a side-effect and pass that in as the actual for "Swap" in the instantiation.
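Spelled out, such an instantiation might look like this (sketch only; it assumes the `Generic_Sort` spec from the post above is visible, and the type/names are illustrative):

```ada
type Int_Vector is array (Positive range <>) of Integer;

Swap_Count : Natural := 0;

-- A swap with a side-effect: tally how many exchanges the sort performs.
procedure Count_Swap (Left, Right : in out Integer) is
   T : constant Integer := Left;
begin
   Left  := Right;
   Right := T;
   Swap_Count := Swap_Count + 1;
end Count_Swap;

procedure Sort_Ints is new Generic_Sort
  (Element => Integer,
   Index   => Positive,
   Vector  => Int_Vector,
   Swap    => Count_Swap);  -- "<" defaults to the visible Integer "<" via "is <>"
```

After `Sort_Ints (V);` the vector is sorted and `Swap_Count` tells you how many exchanges it took -- all without touching the generic's source.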

Adrian James

This is actually a pretty neat feature.

Aaron Moore

It has nothing to do with software. If the hardware provides the checks intrinsically, then any software check would be redundant and unnecessary, bloat.

Aaron Hill

Are you stupid? You still have to encode the checks themselves into the hardware, and the best way we have of doing that is... software. Many checks in Ada actually can be elided and moved into CPU instructions as it stands. LARP harder next time.
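E.g., a constrained integer type is precisely a check you declare in software and the compiler lowers to instructions -- and elides wherever it can prove the value is in range:

```ada
-- The assignment below gets a compiler-inserted range check (a compare and
-- branch, or a trapping instruction on some targets); assignments the
-- compiler can prove safe, like the initializer, get no check at all.
procedure Range_Check_Demo is
   type Percent is range 0 .. 100;
   P : Percent := 50;
begin
   P := P * 3;  -- 150 is out of range: raises Constraint_Error here
exception
   when Constraint_Error =>
      null;  -- the violation is caught, not silently wrapped as in C
end Range_Check_Demo;
```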

Michael Peterson

If the hardware provides the checks intrinsically,
And what hardware is doing this?
Heartbleed shows us there's not a lot that does in common, everyday, consumer-grade usage.

Justin Ross

what hardware is doing this?
Careful, you're going to summon the LISP machine poster.

Isaac Evans

Careful, you're going to summon the LISP machine poster.
That would actually be kind of cool... I was hoping to get my hands on a copy of the source for Symbolics Ada.

Oliver Bennett

Forth is awesome. People recommended it a few times and I finally tried it. I really like the way it was designed; it's the only language I can't find anything to complain about, and it's so flexible that I can't see why you'd be unable to do anything with it, since you can just build it up to whatever level you think is required. It's easy to see why someone would argue that it's the ultimate language. The postfix notation is really nice once you get used to it, and it legitimately makes more sense than anything else. Unique as hell and comfy as fuck.

Still, I think Ada does look nice, but really verbose, and perhaps too close to natural language, which is inefficient, and kinda pointless since there's no need for a language to try that hard to be readable by people who haven't studied it. I would easily take it over Lisp (Smalltalk too, but again, really verbose). The parentheses make my head hurt and my autism doesn't find them aesthetically pleasing enough, even though that never bothers me. I like how it works, but I couldn't get used to that.

Just mention C and/or Unix. Maybe talk about how great Unix is even though it's not. This might even be enough, actually.

Parker Diaz

Forth breaks down on generic code, on system-portable code, and on troubleshooting. And its hell of "other people's code" is much worse than most languages. Even fairly common problems like stack errors are a bitch to fix, even after you have tooling for them.
And there are a few places where Forth as practiced is pointlessly pessimal on modern systems. Like CREATE DOES> , "the pearl of Forth", often requires unnecessary memory fetches.
Implementation quality is generally poor, with a lot of surprising variations in performance, to the point that you might want different parts of the same project to use different implementations.
I'm not one to say that popularity matters that much, but Forth is so unpopular that any serious project will be like mowing an abandoned lawn with tall grass without checking it over carefully--every other yard, you have to stop and pull a dead body or stolen statuary or evil spirits out of the lawn mower.
And if you think Zig Forums's full of LARPers, the Forth community'll make your head explode.

Evan Jenkins

Well, I knew that all current implementations are kinda weak to begin with, but I was thinking about writing my own because that's the kind of thing that I'm interested in. The actual language is what really matters, since my original idea was to use what I learned from other languages to design my own. I don't intend to rely on someone else's work and I have no interest in "communities" (pretty damn sure that they are all crap), so it doesn't matter too much. My interest is to pretend to be living in the golden age of computing and develop basic shit on my own from scratch, pretending that it's being done for the first time, so I can understand how it was done.

Also, I am incapable of teamwork and almost dislike forking, so I have no intent of relying on anyone else. I only care about optimizations. Readability is only important if it doesn't prevent that and makes my own life easier. The machine comes first, other humans come last. I would even say that if you can't read optimized code, then you probably shouldn't be reading it. In my case, it's just fun hobby shit so it will never matter. I'm just trying to learn a lot of shit and occasionally hear what other people have to say, and then use the information to improve my ideas.

Josiah Reed

Still, I think Ada does look nice, but really verbose, and perhaps too close to natural language, which is inefficient,
It's actually *really* good for error-detection. Especially when you're refactoring nested constructs.

and kinda pointless since there is no need for a language to try that hard to be readable by people that haven't studied it.
I flatly disagree; the readability for non-programmers is absolutely needed so that you can have domain-experts who aren't programmers understand it (say physicists or statisticians).

Levi Powell

jesus christ, another group of tech nerds who couldn't produce a string length function or a webpage that does a database query without RCE vulns wrote an article about security, ep. #35723582375. +1 me on HN and twitter
Ada is crap
C is good
programming is hard, the vulns aren't my fault
C is the bestest PL ever made
Ada is crap and impractical
cars crash just from ABS malfunction and ECU vulns and bugs
programming is hard, not my fault
Ada is good! let's use it to make self driving cars

Jaxon Hall

You're not wrong -- perhaps the biggest problem/difficulty is that the industry standardized on C and C++ and has wasted untold millions trying to "fix C" or make a "better C" (Java, C#, Swift, Objective-C, etc.), and now has such emotional investment that it's like looking at a fully internalized sunk-cost fallacy.

Though it really is super-frustrating to be plagued by bugs that are easily detectable if not outright avoided.

Jeremiah Jones

wtf I love Windows and Ada now

John Evans

It's actually *really* good for error-detection. Especially when you're refactoring nested constructs.
Well, I guess assuming that anything about Ada is the way it is because it's good for that wouldn't lead me to a very inaccurate understanding of the language. The language didn't find its niche for nothing.

the readability for non-programmers is absolutely needed so that you can have domain-experts who aren't programmers understand it (say physicists or statisticians).
Well, only for source code that these people are expected to read. Other than that it doesn't matter and it's better to prioritize performance and make it easier to write. At least in my case, I don't think it's even easier to read unless you just don't know the language. Personally, I don't like having to rely on completion, I would rather have a more minimal syntax, but both have their place, I suppose. That's one reason why I enjoyed writing Forth, and it's also one of my reasons for appreciating Lisp even though it makes my head explode (just like graph paper fucks with my head and my eyes because of the excessive amount of squares, Lisp does that with parenthesis). Still, I really like the way that Forth is structured. But if I had to pick a language that is currently in actual use, I think I would pick Ada.

Henry Ramirez

Other than that it doesn't matter and it's better to prioritize performance and make it easier to write.
Why? -- Even disregarding the need for technical non-programmers to understand it, code is read much more than written, so it makes sense to optimize for readability, no?

But if I had to pick a language that is currently in actual use, I think I would pick Ada.
Go for it!
Most of the Ada programmers I've met/talked with are pretty friendly.

Michael Cooper

How long until they get kicked out for being filthy kikes only to sue Adacore for 6 million shekels in libel and copyright infringement damages?

Chase Evans

just like graph paper fucks with my head and my eyes because of the excessive amount of squares, Lisp does that with parenthesis
Interesting, I have developed a "blindness" to both the squares on the graph paper as well as to the parentheses in Lisp. Of course I still see them, but my brain just ignores them.

Henry Perez

code is read much more than written
Yes, by the computer. Computers will read it many more times than humans ever will, and optimizations will increase the program's longevity as well.

so it makes sense to optimize for readability, no?
Someone that can't read optimized source code is clearly not competent enough to do anything with it. If it's unoptimized, then it's not worth using and therefore not worth writing. It's good to keep people that aren't good enough away from your source. Quality control.

Most of the Ada programmers I've met/talked with are pretty friendly.
Well, I don't socialize. But the fact that Ada is actually used for some serious shit makes it easier to respect. It's safe to say that it's better engineered than the languages that crap consumer software is written in. If it's used by the military, and in aviation, then it's definitely doing something right considering how unreliable a lot of the C and C++ software that I have used has been. Then again, I'm sure that the people working in those fields are more competent than average as well.

Jordan Garcia

code is read much more than written
Yes, by the computer. Computers will read it many more times than humans ever will, and optimizations will increase the program's longevity as well.
By the human, too. -- Optimization for the computer is a function/property of the translator; in theory, the source language has little impact here.

Someone that can't read optimized source code is clearly not competent enough to do anything with it. If it's unoptimized, then it's not worth using and therefore not worth writing. It's good to keep people that aren't good enough away from your source. Quality control.
I think you're misunderstanding -- Ada was designed for readability, this is completely orthogonal from the optimizations of generated code. It also makes it easier to maintain; and in my opinion Ada code is a lot easier to "come back to" after X weeks/months.

John Davis

I can't into Ada. I keep writing junk in C.
Am I closet pajeet?

Evan Allen

Learn Pascal so you can wrap your head around the syntax.

Levi Johnson

why can't you?

Brody Reyes

How are you trying to learn it? Have you read the Barnes book?

Parker Baker

Dupix btfo, the thread. C and Unix fags, please don't kill yourselves.

Eli Walker
Eli Walker

Program a lot of NVIDIA firmware; do ya?

Nicholas Barnes
Nicholas Barnes

Learning C requires a healthy, preferably white brain that can be properly damaged. People that don't understand toilets probably don't have that.

Zachary Miller
Zachary Miller

Nobody can write correct and safe C code, it is impossible.

Wyatt Young
Wyatt Young

I'm a Haskell/F#/Scala dev; somebody sell me on Ada. How would I use/implement monads in it? For example, like in: fsharpforfunandprofit.com/rop/

Ethan Rivera
Ethan Rivera

You wouldn't. Haskell has
1. ML features, which are pretty cool, and which you can find in Ada.
2. a bunch of bad decisions
3. a bunch of features that only make sense in the context of #2
you should be well used to looking at other languages and saying "oh... weirdly, to use this I'll have to put aside all that weird stuff I learned in Haskell, like men putting away childhood toys."

Hunter Perez
Hunter Perez

Contrary to popular belief, monads aren't just some workaround to perform I/O, they're practical design patterns that make code easier to reason about, as demonstrated by that user's "railway oriented programming" link.

If Ada claims "ease of maintenance" while not supporting such basic abstractions, then it's a waste of time to learn it when you can just use the equally rigorous safety of Haskell instead.

Tyler Cooper
Tyler Cooper

yeah, yeah, yeah. I ran out of patience for Haskeller cultist bullshit even before I started drinking to try and free up the completely wasted skill points investments that the language encourages. You wouldn't like Ada because it's readable, and 'remotely readable' is a bad code smell to a Haskeller.

Lincoln Martin
Lincoln Martin

Ada is supposed to be non-functional and low-level, so monads have to be thrown out the window by default

Ethan Moore
Ethan Moore

'remotely readable' is a bad code smell to a Haskeller.

I have some bad news for you:
semanticscholar.org/paper/An-Experiment-in-Software-Prototyping-Productivity-Hudak-Jones/4029a3c5b19365ea3e0c453c4245eb184e038c75
Understandability
Ada: A
Haskell: A+

Learnability
Ada: C
Haskell: A

Attached: 13-Figure4-1.png (25.65 KB, 1128x732)

Justin Young
Justin Young

Ada is supposed to be non-functional
I knew Ada was a meme language, but I didn't know it was intentionally trying to be useless, lmao

Kevin Jenkins
Kevin Jenkins

bad news
good joke though. Not going to waste my time digging into how they managed to arrive at Haskell being either of those things.

Lucas Gutierrez
Lucas Gutierrez

<Y-y-you're j-j-just a cult
Get off my board brainlet

Gabriel Gray
Gabriel Gray

file://
tiger
C:

For shit like this I come here

Attached: 3238b7fa43e3a34519a5599003b3906e4050b4d1ab0f38dd6dfc582ae95a868c.jpg (29.84 KB, 400x400)

Liam Campbell
Liam Campbell

ADA was the right choice for something as crucial as this.
kernel written in ADA when?

Thomas Lopez
Thomas Lopez

Enjoy you're overflow or frame attacks

Jayden Hernandez
Jayden Hernandez

I'm a registered sex offender, is Ada the right language for me?

Andrew Myers
Andrew Myers

LGBTPBBQ
No. Ada is too complicated for your cout<< code.
cd..

Josiah Roberts
Josiah Roberts

If a brick landed on your head and took away only your knowledge of Haskell, you'd become a better programmer.

Ayden Robinson
Ayden Robinson

Well, it depends -- some simple monadic things are built into the language; for example, modular arithmetic.

-- Declares a numeric type [0..20], w/ wrap-around.
Type Cycle is mod 20;

Or you could use generics to encapsulate your monad and its generic functionality.
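To make that generics idea concrete, here's a minimal hedged sketch (all names here are invented for illustration, not from any standard library) of an Option-style type wrapped in a generic package -- roughly what a Maybe monad's "return" gives you:

```
--  Hypothetical sketch: a generic "Maybe"-like package.
generic
   type Element is private;
package Maybe is
   --  A variant record: either holds a value or holds nothing.
   type Option (Present : Boolean := False) is record
      case Present is
         when True  => Value : Element;
         when False => null;
      end case;
   end record;

   None : constant Option := (Present => False);

   --  The monadic "return"/"pure": wrap a plain value.
   function Pure (X : Element) return Option is
     ((Present => True, Value => X));
end Maybe;
```

An instantiation like package Int_Maybe is new Maybe (Integer); then gives you Int_Maybe.Option with Pure and None; a "bind" could be written as a further generic over the continuation function.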

Ada is supposed to be non-functional and low-level, so monads have to be thrown out the window by default
It is funny you should say that, because Chris Okasaki has a blog post about using Ada for functional programming:
okasaki.blogspot.com/2008/07/functional-programming-inada.html

kernel written in ADA when?
There is/was some interest in comp.lang.ada about doing an OS in Ada.

I don't know if you'd count it, but there's the Muen Separation Kernel:
muen.codelabs.ch/

Jeremiah Kelly
Jeremiah Kelly

Sorry, that should be [0..19] -- haven't had coffee yet.

Owen Carter
Owen Carter

looks like fun

Benjamin Flores
Benjamin Flores

be the only retard in the whole imageboard ecosystem to shill Ada
nobody listens to me
ff 2 years
more and more people shill Ada
be b& from 4chan because /g/anny unironically shits his pants when he sees cartoon frogs
go to its goat fucker commie twin
see this
Mind explaining why Ada is making a comeback? Cause I don't see any reason.

Adrian Green
Adrian Green

you were shilling Ada on /g/?
I only saw a few of their AoC threads, but I didn't see any Ada.
You missed an opportunity there.

Asher Ortiz
Asher Ortiz

Do you really expect anyone to believe that shit?

Charles Long
Charles Long

The thing I don't get about ADA is the license. Can I write closed-source software with GNAT? Is the compiler itself open source? I've been interested in it for some time, but this issue is holding me back from investing time in learning it.

Thomas Sullivan
Thomas Sullivan

cuckchan nigger confused about why anyone would use the thing he spent 2 years shilling
easily the most pathetic site and users on the internet

Angel Davis
Angel Davis

fsf-gnat is free to use.
adacore-gnat you can freely use for any open source project. commercial projects require a license.
gnat's not the only Ada implementation, either.

Luke Long
Luke Long

What does adacore have that gnat doesn't?

Benjamin Hernandez
Benjamin Hernandez

you were shilling Ada on /g/?
On /g/ and Zig Forums, but it was 2 years ago.

Look at /g/ in the evening (PST), and see how many pepe pics get deleted within minutes.

The thing is I stopped shilling it when I actually learned how to use C++ cause Ada is mostly made for formal methods (which I don't use). But yeah, recently some tard on /g/ unironically blamed Unix when I told him pre-11 C didn't have atomics.

Levi Gonzalez
Levi Gonzalez

here the pepe pics are usually in threads that have nothing to do with technology. this board does not have real mods so they will stay up

Elijah Adams
Elijah Adams

*shrug*. apparently it has a slower release schedule, although adacore-gnat only releases twice a year. All I know is, Fedora 'gnat' doesn't come with the pretty printer (gnatpp), and AdaCore's package does come with it.
why wasn't my insincere shilling persuasive?
btw using C++ now
I don't care about formal methods, but I care about a language not being a nightmare to use correctly:
youtube.com/results?search_query=nightmare of C++

Benjamin Fisher
Benjamin Fisher

should've just linked this one: youtube.com/watch?v=9-_TLTdLGtc

Asher Cox
Asher Cox

Is it possible to use ADA to write GUI apps?

Cameron Smith
Cameron Smith

Skimmed the slides. As expected, the guy's just doing a bunch of shit nobody ever does and blames the language when it doesn't work or gets incomprehensible. You can do that in C, too.
people.eecs.berkeley.edu/~necula/cil/cil016.html

Alexander Scott
Alexander Scott

ok let me correct my greentext of you
btw using a random subset of C++ with lots of hidden performance penalties
I'm not writing template libraries here so I don't need to know how this language works :p

Ryan Hall
Ryan Hall

C is not a subset of C++. They are two distinct languages with their own syntax and semantics.

Lucas Moore
Lucas Moore

C is not a subset of C++
WHERE DO YOU SEE ME SAYING LIKE THAT, YOU MORON

Zachary Edwards
Zachary Edwards

What about GPS ide, can I use it for closed source without paying?

Hunter Edwards
Hunter Edwards

It doesn't matter how you say it. All that matters is that you said it, and you did.
Why such butthurt?

Jonathan Perez
Jonathan Perez

You brainless freak, I said that you are using a subset of C++, which is what you are doing. Since all the horror stories in the video are "shit nobody ever uses", you don't even use range-based for loops apparently. I even called your subset "random", i.e., unique to you, probably slightly different from anyone you work with. I never named your personal subset of C++, and certainly didn't call it "C".

Logan Clark
Logan Clark

Nobody besides Boostfags write all their code in templates.

Next time you try to falseflag as me, refrain from saging. You'll be more credible.

Daniel Allen
Daniel Allen

he doesn't write templates ever because he doesn't understand that part of the language because C++ is a nightmare
he pretends this shameful observation is really about people writing "all their code in templates"
sage to falseflag as whoever this is

Blake Myers
Blake Myers

No wonder NVidia cards are shit.

Attached: yikes.jpg (178.16 KB, 900x900)

Levi Hill
Levi Hill

How can such butthurt exist?

Ayden Phillips
Ayden Phillips

if you don’t casually use the hardest part of a language, you don’t understand it
That mentality might be why you’ve never programmed anything relevant in your life.

Luke Clark
Luke Clark

he never does a thing because he doesn't understand it because his language is impossible to understand
he pretends that this shameful observation is really about how *casually* he does a thing

Ethan Campbell
Ethan Campbell

4um refugees.

Samuel Bell
Samuel Bell

Am I really outside /g/ btw? Cause I don’t see any difference.

Logan Martinez
Logan Martinez

C is not a subset of C++.
That's not quite true, but it's not quite untrue either -- C++ was designed as a C superset and so tried to keep backward compatibility with C -- and, for the most part, it does.

What about GPS ide, can I use it for closed source without paying?

Yes.
The situation w/ GNAT is that there are three possible versions:
(1) AdaCore's Community edition -- this is the GPL restricted RTL one.
(2) AdaCore's Pro edition -- this is unrestricted, TTBOMK, and the most up-to-date.
(3) FSF -- this too is unrestricted, but usually lags behind #1 & #2.

There are other Ada implementations as well, RR Software's Janus/Ada, Green Hills, ICC, Verdix, and a few more.

Owen Turner
Owen Turner

c/c++
Never heard of this language, what is it?

Jayden Foster
Jayden Foster

So in your opinion of those ADA implementations which is the most secure for powerpc and why. Does the ADA implementation write assembly to ADA or does it go assembly > c > ADA? Which is to say what architectures are supported by ADA/its implementations.

As if you are writing for security you might as well not bother with the language if all it supports is x86 or ARM or some shit like that.

Henry Scott
Henry Scott

not using good tools because you despise those who made them
That just makes said language better. Learn from your enemy and beat him at his own game.

Noah Jackson
Noah Jackson

C is not a subset of C++
That's better discussed in another thread.

Logan Gonzalez
Logan Gonzalez

Mind explaining why Ada is making a comeback?
Because it offers some good solutions to existing problems: the native TASK; GENERICs that are (a) not Turing-complete, (b) able to take other generics, values, and subprograms as parameters, (c) essentially providing the functionality of the proposed C++ "concepts"; the functionality of the proposed C++ "modules" provided by PACKAGEs; the SPARK formal-methods provers; and the upcoming PARALLEL blocks/loops.
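As a hedged illustration of that GENERIC point (the names below are invented for this sketch, spec and body shown together): an Ada generic can take a type, a value, and a subprogram as formal parameters, which is where the "concepts"-like feel comes from.

```
--  Hypothetical sketch of a generic reduction over any array type.
generic
   type Item is private;                             --  formal type
   type Index is range <>;
   type Item_Array is array (Index range <>) of Item;
   Identity : Item;                                  --  formal value
   with function Combine (L, R : Item) return Item;  --  formal subprogram
function Reduce (A : Item_Array) return Item;

function Reduce (A : Item_Array) return Item is
   Result : Item := Identity;
begin
   for E of A loop
      Result := Combine (Result, E);
   end loop;
   return Result;
end Reduce;
```

An instantiation such as function Sum is new Reduce (Integer, Positive, Some_Int_Array, 0, "+"); reads like a statement of the requirements the actuals must satisfy -- the compiler checks them at instantiation, not at use.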

c/c++
Never heard of this language, what is it?
Usually a shorthand for "C and/or C++", grouping the two together based on their similarities.

Ada is mostly made for formal methods (which I don't use).
Not really; but it has a much better foundation to build provers on. (More information on types made explicit.)

Is it possible to use ADA to write GUI apps?
Yes.
There's Gnoga.com [down right now], GTK, and CLAW from rrsoftware.

So in your opinion of those ADA implementations which is the most secure for powerpc and why.
I honestly don't know which would be best for PowerPC, as I haven't built any PowerPC programs.

Does the ADA implementation write assembly to ADA or does it go assembly > c > ADA?
This depends highly on the implementation; there's no reason that assembly or C needs to appear anywhere in the bootstrapping process at all. (I'm working on an Ada compiler written purely in Ada, so no C or assembly.)

Which is to say what architectures are supported by ADA/its implementations.
As if you are writing for security you might as well not bother with the lanuage if all it supports is x86 or ARM or some shit like that.
I know people who have compiled non-trivial programs on *VERY* different architectures with little to no alteration in the source code. (In particular, Randy on comp.lang.ada has some excellent stories about odd architectures and his Janus/Ada compiler.)

C is not a subset of C++
That's better discussed in another thread.
Agreed.

Owen Evans
Owen Evans

This is pretty interesting, I think.
I've always had a soft spot for Ada.

Dominic Collins
Dominic Collins

The first was an interesting concept that was originally referred to as `portable assembly` and was based around expanding on another language that didn't even have types. The second was a further expansion on the first that took something simple and stupid and made it complicated and stupid.

Ethan Rodriguez
Ethan Rodriguez

Why not something actually verifiable, like F*?

Julian Foster
Julian Foster

Why not something actually verifiable, like F*?
Maybe because for the firmware they want the ability to do low-level programming? Or maybe because they think that transitioning to the functional paradigm will be too much of a hurdle for their existing programmers.

C = `portable assembly`
But, honestly, Forth does an excellent job of that.

Julian Anderson
Julian Anderson

It's about time there's some good news in technology. Nvidia replacing C with Ada shows that C is not as hard to replace as the shills want you to believe. OP's article is proof that switching from C to Ada does the opposite of what switching from Ada to C and C++ did for the F-35.

You're not wrong -- perhaps the biggest problem/difficulty is that the industry standardized on C and C++ and wasted untold millions trying to "fix C" or make a "Better C" (Java, C#, Swift, Objective-C, etc.) and now has such an emotional investment that it's like looking at a fully internalized Sunk Cost Fallacy.
C and UNIX are the biggest disaster and waste of effort in computer science or possibly any industry ever. Much smaller teams of programmers produced much better languages and operating systems in a lot less time. This is because of how unproductive and inefficient C and UNIX are. They need that much work just to stay up to date. Without 15,600 programmers, Linux (just the kernel) would not be able to continue to run on newer hardware. Better operating systems have fewer programmers because they don't need as many. If 15,600 programmers were told to work on any other operating system, they would literally have nothing to do.

The first was an interesting concept that was originally referred to as `portable assembly`
That's revisionism. C was not meant to be portable or assembly. It was meant to be the equivalent of PL/I on Multics and Lisp on Lisp machines. They only started calling it "portable assembly" when they realized that it couldn't compete with real programming languages. That's around the same time they started their "simple and elegant" buzzword describing kludges like using the tape archiver "tar" to copy directory hierarchies. What really sucks is that C is a lot less portable than other languages like Ada, Pascal, PL/I, BASIC, Lisp, Cobol, and so on. Most languages don't care if characters are byte-addressable or any of that other bullshit. Word-addressed CPUs were handling strings just fine before C.

and was based around expanding on another language that didn't even have types.
This part's right, which is why C sucks so much. Types, arrays, strings, compound literals, objects, exceptions, the preprocessor, and everything else in C and C++ sucks because it was not originally there. There is no coherent design like there is in real programming languages. It's like building a house by having a thousand people nail boards together with no blueprints or idea of what it should look like.

Why am I retraining myself in Ada? Because since 1979 I
have been trying to write reliable code in C. (Definition:
reliable code never gives wrong answers without an explicit
apology.) Trying and failing. I have been frustrated to
the screaming point by trying to write code that could
survive (some) run-time errors in other people's code linked
with it. I'd look wistfully at BSD's three-argument signal
handlers, which at least offered the possibility of providing
hardware-specific recovery code in #ifdefs, but grit my
teeth and struggle on having to write code that would work
in System V as well.

There are times when I feel that clocks are running faster
but the calendar is running backwards. My first serious
programming was done in Burroughs B6700 Extended Algol. I
got used to the idea that if the hardware can't give you the
right answer, it complains, and your ON OVERFLOW statement
has a chance to do something else. That saved my bacon more
than once.

When I met C, it was obviously pathetic compared with the
_real_ languages I'd used, but heck, it ran on a 16-bit
machine, and it was better than 'as'. When the VAX came
out, I was very pleased: "the interrupt on integer overflow
bit is _just_ what I want". Then I was very disappointed:
"the wretched C system _has_ a signal for integer overflow
but makes sure it never happens even when it ought to".

Blake Perez
Blake Perez

Lispfag shows up in an Ada thread
spends most of his post bitching about Unix and C
only mentions Ada offhand a couple times to make it look like he's on topic
insists Linux needs 15k contributors when most of those are random programmers sending in a patch every couple years and the regular contributor count is much smaller
What a loser. You've wasted so much of your life bitching about things you hate that you can't even write a single post focusing on the things you like.
Here's a challenge: try writing a single post about something you like without mentioning C or Unix once. Convince people of your favourite language's and OS's advantages through their features and design philosophy instead of simply namedropping things you like between shitting on things you don't like.

Tyler Rodriguez
Tyler Rodriguez

Yeah. He does this shit. And when no one responds, he will samefag himself. The mods won't do anything about it because "we don't see it as off topic" but derailing threads has always been against the rules. I think the mods are the fag, or it's one of their friends so they won't do anything about it.

Jaxon Flores
Jaxon Flores

Interesting; thank you for sharing this.

Convince people of your favourite language's and OS's advantages through their features and design philosophy instead of simply namedropping things you like between shitting on things you don't like.
Disregarding the excessive language/OS hate, I think there are a few people who have done so in this thread... though sometimes it is helpful to draw comparisons.

Ex:
In Ada you can have arrays (and slices thereof) as proper parameters, whereas in C and C++ you cannot, because their semantics decay the array into a pointer/address.
Type Vector is Array (Positive range <>) of Integer;

Function "+" (Object : Vector) return Long_Long_Integer is
begin
   Return Result : Long_Long_Integer := 0 do
      For Item of Object loop
         Result := Result + Long_Long_Integer (Item);
      end loop;
   end return;
end "+";

which allows our summation (unary "+") to be used on 'Vector' and slices thereof. You simply can't do this with a C-style array because of the aforementioned problems (you would have to pass a separate length parameter, or use a sentinel value like the NUL in C strings).
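For instance, a hedged usage sketch (declarations belong in some declarative region; the values are made up) building on the 'Vector' type and unary "+" above:

```
--  Assuming the Vector type and unary "+" from the post above.
V       : constant Vector := (1, 2, 3, 4, 5);
Total   : constant Long_Long_Integer := +V;           --  the whole array
Partial : constant Long_Long_Integer := +V (2 .. 4);  --  a slice: sums 2, 3, 4
```

No length parameter, no sentinel: the bounds travel with the array, and with every slice of it.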

Robert Murphy
Robert Murphy

It was meant to be the equivalent to PL/I in Multics
What a load. C's minimalism is a reaction to this exact bullshit. Too bad they took the reaction too far.

Christopher Mitchell
Christopher Mitchell

Why don't you take your shit and make your own fucking thread instead of trying to derail every other thread to be about you?

Josiah Long
Josiah Long

Why don't you take your shit and make your own fucking thread instead of trying to derail every other thread to be about you?
What?

Dylan Bell
Dylan Bell

He probably responded to the wrong post.

Tyler Bailey
Tyler Bailey

I think the mods are the fag
Nah, they tolerate mikee too, they're just the mods this board deserves
