There are too many programming languages. Back in the day when most computing breakthroughs were made there were four. People invested in computing could focus on what they were doing instead of bumbling to keep themselves up to date on the latest developments to maintain employability.

Attached: 1479755222816.png (205x205, 38.3K)

Other urls found in this thread:

youtube.com/watch?v=lvclTCDeIsY
stackoverflow.com/questions/16159203/why-does-this-java-program-terminate-despite-that-apparently-it-shouldnt-and-d
twitter.com/AnonBabble

Languages aren't enough. We have to have multiple new frameworks per language every month.

I think all of the languages that people actually use are 15+ years old. Most jobs are for established languages.
Creating new languages is a good way to progress. That's true even if the languages don't end up being used all that much. Barely anybody used ALGOL 68, but it was very influential.
Most computing breakthroughs were made when there were only a few languages because most computing breakthroughs were made early on, and there were only a few languages early on. Remember, correlation is not causation. If A causes B and A causes C then that doesn't mean B causes C.

Attached: Screenshot from 2018-05-12 19-28-03.png (663x518, 28.39K)

If the early people had had to keep in touch with 20+ languages, do you think they would have had time to come up with even the doubly linked list?

It is a problem.

So? You don't have to use every programming language. There's no reason not to keep using C in 2018.

shut up rustfag

People don't have to be in touch with 20+ languages even now.

lol shut the fuck up. nobody who jumps to the latest web PL and web framework every 5 minutes matters, nor does anything they make

You're entirely right. We need to make a new programming language to replace all others.
I'll get started on the logo, you do the rest.

All you really need are C/C++ and some Assembly

There were over 60 new languages of note developed in the 1970s alone. Stop LARPing, Zig Forumsnigger.

Everyone should have their own language and write their own software.

Attached: 7084b7f7949a3e26be4d24395f5f6d6279c05a695ebbb12038f072ca021e63ea.gif (540x540, 1.11M)

Correlation is not necessarily causation. If A causes B and B causes C, A must cause C.

I don't see how that's relevant here in the first place, but it's also not necessarily true. If A directly causes a decrease in C that may balance out the indirect increase via B.

Heh.


Or use existing ones. Scheme and Fortran. That should do it. The rest can be memoryholed.

Really just take everything from every language and reinvent and add it to the standard library of a LISP or Scheme-like language.
You can write documents in Scribble in Racket. Do it right, and LaTeX, MD, HTML/CSS/JS, PS all get collapsed into it.
Then make this nulang compilable to machine code and C, C++, Rust, D, and Go collapse into it.
The same fate waits for Python, Ruby, shell, Perl, Java... all will be dissolved and absorbed into ((((The (International (Language)))))).

C++ has literally no purpose

It's C+classes+destructors, which is pretty damn handy.

There are still basically 4: C++, Java, Python and Javascript. You might need to pick up one or two more for a specific job (e.g. Swift, Ruby, TypeScript) but these are all variants on the above anyway. If you want to explore and expand your mind you can check out Haskell, Coq, Lisp or Rust but these are all strictly optional.


This is the much bigger problem. Ironically it's why bloated languages like C++ and Python are better. Whatever the flaws of the language and standard library, once you know them you don't have to keep relearning frameworks.


This is true, e.g. ideas from more esoteric languages like Haskell influenced C#.
Mostly agreed, although for a counterpoint see Peter Thiel's ideas on how innovation has stalled due to academia and other institutions favoring people with a less adventurous (and less optimistic) outlook.

it's pretty much bloat
pure C is the only way

oh okay, better get a 3000 page spec that changes every year for that then

nope. C++ is ugly and bloated but it's better than C for everything except perhaps embedded programming. Some people can't handle the ugliness of C++ but that's something they have to get over.

C++ gives you much stronger guarantees than C. I'm not a language lawyer so I can't say it with the right terminology, but any instance of a class can only exist in a valid state, i.e. between calling the constructor and the destructor. This is a powerful guarantee that helps prevent errors, because you can say "as long as you have an instance of this class, I can promise certain things about its internal state". This allows RAII and smart pointers.
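
Roughly like this (a toy sketch, all the names are made up):

#include <cstdio>
#include <stdexcept>

// Invariant: a live File always holds an open handle.
class File {
    std::FILE* f;
public:
    explicit File(const char* path) : f(std::fopen(path, "r")) {
        if (!f) throw std::runtime_error("open failed"); // never exists in an invalid state
    }
    ~File() { std::fclose(f); }            // released automatically, exactly once
    File(const File&) = delete;            // no copies that would double-close
    File& operator=(const File&) = delete;
    std::FILE* get() const { return f; }
};

Every exit path (early return, exception, whatever) runs the destructor, so whoever holds a File can assume the handle is open and will get closed. That's the whole trick behind RAII and smart pointers.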

C++ also has templates which allow type-safe generic programming.
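
e.g. a toy one:

// Works for any T with operator<, all checked at compile time.
template <typename T>
const T& max_of(const T& a, const T& b) {
    return (a < b) ? b : a;
}

int main() {
    int    i = max_of(2, 3);       // T deduced as int
    double d = max_of(2.0, 3.5);   // T deduced as double
    // max_of(2, "three");         // refuses to compile, no runtime surprise
    return i + static_cast<int>(d);
}

Compare that to doing generics in C with void* and casts, where the type errors only show up at runtime, if you're lucky.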

C++ does take time to learn, but once you get familiar with it, you don't feel uncomfortable with its complexity. Even though the spec is complex, you have an intuitive feel for what is "correct" in any situation.

>>>/reddit/

Language wars are for Zig Forums autists only. Real coders just fucking code.

Programmers have big egos.

watch this, OP: youtube.com/watch?v=lvclTCDeIsY
in only minutes you will feel a great weight leave your shoulders, forever. You will no longer react to programming languages with an immediate desire to learn them. I envy you--I only gained this freedom after learning Haskell well enough to absolutely despise it.

I got to 2:35. I heard "cis-white men". I stopped.

JFC this is not an argument. Go find some counter data or a reasoned argument. Correlation absolutely is evidence of causation.

It depends on the context.

the context is that one user said "here's a graph. older stuff is more important. don't take this too seriously" and then some other user said something completely senseless in 'reply', and now other(?) anons are sperging out because they interpret the senseless remark as a weak attempt at a rebuttal of the original user's remarks.

Fuck you

That's what it was, user

Attached: whoop.mp4 (320x240, 341.21K)

Nice try user. We know this is a lie.

it can't be, because it doesn't even contradict the original user's remarks.
remark 1: "Remember, correlation is not causation. If A causes B and A causes C then that doesn't mean B causes C."
remark 2: "Correlation is not necessarily causation. If A causes B and B causes C, A must cause C."
both agree that correlation is not causation. One proposes that two phenomena (B and C) can have a common cause (A) -- i.e., although the correlation of B and C is strong, neither causes the other. The other proposes that earlier causes in a causal chain of phenomena can be said to directly cause the later effects: e.g., if you light a fire, and then the fire burns down a house, then "you burned down the house".
Both remarks are true. The second does not contradict the first. The second is not a rebuttal. It just sounds like one because the second user is an idiot.

That's called a shitty attempt

laziness is simply a bad feature, like 'contexts' in Perl. The expressivity you get from monadic operators is entirely offset by how dense and obscure the language becomes in the hands of its community. Typeclasses are OK. If everyone who learned Haskell had learned OCaml first, they'd be a lot impressed with it.

"lot less".
ML is where 100% of any magical feelings come from, when someone's learning Haskell. "wow, types can be actually useful." / "wow look at me I've spent the last hour designing this program and I haven't put anything but types to paper." / "wow, so you can do this instead of returning null in error". That stuff's real. Then you go on to appreciating lame gimmicks: oh, an infinite list of primes... well I guess this language has that cool (ML) stuff, so maybe this is also really useful...
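
For anyone who hasn't seen it, here's the "instead of returning null" idea transliterated into C++17's std::optional (yes, I know, not ML, but it's the same principle, and the names here are made up):

#include <iostream>
#include <optional>
#include <string>

// The type forces the caller to deal with "not found"; there's no null to forget.
std::optional<std::string> lookup_user(int id) {
    if (id == 42) return "terry";
    return std::nullopt;
}

int main() {
    if (auto name = lookup_user(42)) {
        std::cout << *name << '\n';   // only reachable when a value exists
    } else {
        std::cout << "no such user\n";
    }
}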

C++, Java, Python and Javascript

lmaoing at all these fucking front-end developers. No wonder we have all that bloat shit.

C is unsuitable for modern software development.
C++ is based on C but even worse.
I think you mean invariants?
Also
Gno

>>>/reddit/

If you want a hearty laugh, I suggest that you watch at least a minute more.

few slides later
To top it all off, looking at their shithub
Guess they killed themselves.

Yuck! Just give me an Amiga, Atari, 8-bit computer, maybe even TempleOS. Those are fun, this other shit ain't.

That list surprises me. I expected Pajeet Hypertext Processor to be way higher on the list.

This has to be satire.

shut up jsfag

Lua is the best high level language for ease of programming. C is the best high performance high level language.
I don't like assembly because for general purpose programs it's slow, both to execute and to write programs in.

no.
fuck off, even in Java (which is about 100x safer than C++) class invariants are impossible to preserve except by someone who fully understands the language, i.e. literally nobody. see for example stackoverflow.com/questions/16159203/why-does-this-java-program-terminate-despite-that-apparently-it-shouldnt-and-d
C++ templates are trash
the same claim is made about every PL, yet 99.999999999999999% of the users literally don't understand the language. C is one of the biggest offenders here, so adding new features to C isn't helping

You mean SML, OCaml is just SML+PHP. Typeclasses only have an advantage over modules because the languages are text based.

True, I was spouting meme terminology myself.
Better phrased would be
>C, C++ and java are unsuitable for any software development.
The actual state of software proves my point.

I liked the Oz syntax, but unfortunately it is slow as fuck and really only suitable as a research/teaching tool.

>fuck off, even in Java (which is about 100x safer than C++) class invariants are impossible to preserve except by someone who fully understands the language, i.e. literally nobody. see for example stackoverflow.com/questions/16159203/why-does-this-java-program-terminate-despite-that-apparently-it-shouldnt-and-d

I never said that. I was just giving an example of how flaky class invariants are in even the safest mainstream languages. C++ is 100x worse. If you unironically think you're getting better code just by using classes, your code is probably broken as fuck. Classes in fact provide pretty much no useful form of invariants at all. The only useful thing about them is probably the ability to make an Abstract Data Type, which can already be done in any typed PL ever, including SML or Haskell (and stuff like Ada IIRC)
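
To make that concrete, here's a contrived but perfectly-legal-to-compile C++ snippet (hypothetical class) where the constructor's "guarantee" evaporates:

#include <cstring>
#include <stdexcept>

class Fraction {
    int num, den;   // intended invariant: den != 0, "enforced" by the ctor
public:
    Fraction(int n, int d) : num(n), den(d) {
        if (d == 0) throw std::invalid_argument("zero denominator");
    }
    double value() const { return double(num) / den; }
};

int main() {
    Fraction f(1, 2);
    std::memset(&f, 0, sizeof f);  // compiles fine; now den == 0
    // f.value() would now divide by zero. Nothing in the language stopped us:
    // the "invariant" only holds if every line of the program cooperates.
    return 0;
}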

HAHAHAHAHAHAHAHAHAHA
Using that fucking piece-of-shit pajeet language, with its everything-is-a-class garbage, for real-time machine control.

Attached: a50eb56fd359d8525ecd69f0e243619d9a683e1c3dfe878c776867ac538aaf8c.gif (500x281, 1009.49K)

So, when you look at the answers, you find that he wasn't doing what he needed to for those invariants to apply: his class was neither immutable, nor were the fields marked volatile.


This dumb motherfucker is a gold mine. I can't tell if he's retarded or just trolling. In one of his questions he can't understand the error message generated by this "java program":

class A {
    static {
        System.out.println("Hello world");
    }
}

It prints out "Hello world" as expected. Can you GUESS the error message and its cause?

Then, there's this q and a:
Question (not his) is how to do this Python in Java:

for x in range(2, n):
    if n % x == 0:
        print n, 'equals', x, '*', n/x
        break
else:
    # loop fell through without finding a factor
    print n, 'is a prime number'

His answer:

class A {
    public static void main(String[] args) {
        int n = 13;
        found: {
            for (int x : new int[]{2,3,4,5,6,7,8,9,10,11,12})
                if (n % x == 0) {
                    System.out.println("" + n + " equals " + x + "*" + (n/x));
                    break found;
                }
            System.out.println("" + n + " is a prime number");
        }
    }
}

I actually feel bad knowing that there are people who think that this is okay.

There are too many ice cream flavors. Back in the day when most frozen food breakthroughs were made there were four. Ice cream men could focus on what they were doing instead of bumbling to keep themselves up to date on the latest flavors to maintain employability.

The comparison made was with C. C++ is 100x worse than what? Java? SML or Haskell? That's not relevant, because even if C++ was just C with constructors and destructors, it would still be 100x better than ordinary C when it comes to preserving invariants.
It just so happens that they can, at least in C++. Not having to manually manage resources for every single data structure that isn't static or entirely on the stack is pretty useful. Also, unlike in Java, object lifetime in C++ is completely deterministic.

Sepples is impossibly complicated and it doesn't even have real macros. It forces you to write in the object-oriented paradigm in which everything is a class even if it doesn't need to be. 30 years in and they finally added in lambda functions, a feature which has existed in LISP since day one. God, what a clusterfuck of a language.

Could very well be, considering pic related is his avatar. And the fact that within the span of a few weeks he asks a basic question about that hello world as well as one where he is presumably debugging machine control software.
Maybe we are being taken on a ruse cruise by Zig Forums from 5 years ago.


Indeed, you don't need classes and curly bracket vomit to do something like that. Well, if you are using a sane language.
package simpleadt is
   --Keyword here is private
   type point is private
      with Type_Invariant => SomeTest(point);
   --Functions below are just the specifications.
   --Their bodies would be in the package body.
   procedure StepX(P: in out Point); --make a step or whatever
   function GetX(P: in Point) return Integer; --Read X coordinate
private
   --The full type declaration happens in the private part.
   --That point is actually a record with two fields is unknown to clients.
   --They just interact with it via the functions provided in the public part.
   type point is record
      X: Integer := 0;
      Y: Integer := 1;
   end record;
   --The expression function below gets checked at relevant points during runtime.
   function SomeTest(P: Point) return Boolean is (P.Y - P.X = 1);
end simpleadt;

Attached: Avatar.png (328x328, 260.02K)

OH NO NO NO NO

Tough noogies, it keeps out the riff-raff. Plus, you're not constantly second-guessing a shit cumpiler for a shit, halfass standard designed to create security disaster that we're in today. Fuck C and the cianigger it rode in on.

Attached: looks hard.jpg (260x331, 32.23K)

as I said, my point is that people don't know how their PL works, regardless of how "simple" the meme community believes them to be. For example every idiot claims C is simple, yet most C code is garbage and the writer is obviously oblivious of one or more of {the machine, the spec, the implementation}

meanwhile in the software world, there's no good software and changing PL every 5 minutes (and guaranteeing everyone in your company has no understanding of the code) isn't helping

100x worse than C
gno
let's be honest: most people only use C++ for games because it's what's used and they believe something bad will happen if they use a simpler language like C (which isn't a simple language, but at least is simpler than C++). then you have about 3 PL nerds who use C++ because they're too virgin to understand that the benefits C++ provides are all moot because of the language's complexity (again, even C was too complex in the first place, but PL virgins don't understand this because they never got fucked by the edge cases yet)

Wrong.
Why the fuck are you even bringing up malloc? Constructors and destructors have nothing to do with heap allocation. "Deterministic" here means that you can always know at which point in the program deallocation will occur, unlike with most (all?) garbage-collected languages.
Or maybe they like the encapsulation provided by namespaces and classes, and don't want to reinvent the wheel in C every time they need some form of dynamic polymorphism? Maybe they want lambdas? Or perhaps they like having generic containers and algorithms, along with proper strings?
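
To spell out "deterministic" (toy example):

#include <cstdio>

struct Noisy {
    const char* name;
    explicit Noisy(const char* n) : name(n) { std::printf("ctor %s\n", name); }
    ~Noisy() { std::printf("dtor %s\n", name); }  // runs at the closing brace, always
};

int main() {
    Noisy a("a");
    {
        Noisy b("b");
    }                          // "dtor b" prints here, not "eventually"
    std::printf("work\n");
}                              // "dtor a" prints here
// Output: ctor a, ctor b, dtor b, work, dtor a

With a GC you know the memory goes away at some point; here you know the exact line.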

Attached: 0b1df90eb319554d9090fdf46fde49f24f2ec5fd79bd227ed485748d94c4542c.jpg (250x241, 9.03K)

It's impossible to troll on Shit Overflow. Unless you make it really over the top, you'll get buried by a mountain of pajeetposting and retards will believe you're being serious.

Do you mean object?
But Java has primitives. Were you referring to Python?
Really?

Attached: 5613312d8763e584bbc5241192ebd4fb504c1a6882db07da4366d6ae72f96468.jpg (720x720, 25.59K)

I've been trying to compile a list of languages that aren't cancer.

Since programming is only a hobby atm I guess I get to do this.

So far the list is:

Forth, Lisp (mostly Scheme nowadays), SML, and Prolog

Everything else is basically just cancerous.

is dead. It's literally a few autists with their own personal Forth-like languages. Apart from them the only community you'll find are non-coding LARPers, faker than the most faking-it-till-he-makes-it user on Zig Forums (because they don't even have the goal of one day 'making it')
isn't a language. 'Scheme' is barely even a language. Common Lisp is pretty good though.
if you typo a type constructor in a pattern-match, you'll bind a variable instead. OCaml fixes this with case-sensitive type constructors. ATS fixes this by requiring that type constructors in pattern matches look like function calls.
OCaml's a little bit weird, vs. SML, but it's not any more cancerous and you'll get more done with it.
oh come on. Prolog is so dead that more Prolog is written in Prolog-like DSLs inside other languages, than in Prologs themselves. Is that the joke? That only in death do we escape cancer?

Scheme isn't dead yet. It is interesting that some of the most expressive languages (Forth and Lisp) are essentially dead.

Prolog seems fun just for the mind bending experience of logic programming.

I cannot defend SML.

You don't need to.
C/D/C++/Rust/Ada (pick one)
Java/C#/Go/Swift (pick one)
Python/Ruby/Perl (pick one)
Javascript/PHP (pick one)
Now mix them together

C++
Go
Fuck, none of them
If I have to, javascript

a reduced version of Lua which can be compiled into machine code is king, although I don't know whether it's easy to implement the tables that way.

C. Pascal. Scheme. Go. Erlang.
What else could you possibly want?

Are you kidding me? C AND Pascal? Go AND Erlang? Pick one of each.

A good language.

might as well use matlab

>>>/bog/

that's one of the only bad aspects of Lua. I barely notice it though, it's only ever been annoying when using command-line arguments and the like (because those arrays start at 0).

lua fam, lua

Attached: 860e21e67d72af94cd64cb434919d6ba4f742061e01fca2c35a6b6a62d108bd4.jpg (200x200, 13.96K)

>>>Zig Forums

Forth isn't dead since it's used in embedded systems. You could even use it for desktop and server shit if you want to, but there, industry trends are pic related, which is why today you have bloat and botnet galore. And it will get worse as white men are pushed out of IT for political reasons. Fun times ahead!

Attached: load a shit.jpg (1936x2592, 2.7M)

All you need is COBOL

the reason why there are so many languages is because of proprietary languages and web design nonsense. girls want a language that fits their small mind, so people make a language for them, but then there are other pajeets or girls with different degrees of brain retardation, so you have to make a language for them also

Back in the day, there were hundreds of programming languages. There's a paper from 1966 called "The next 700 programming languages" because there were already around 700 in existence at the time. Only a few of them were important, but that doesn't mean people weren't making them. A lot of them were (what we would call) scripting languages for single applications.

That was because computer companies cared about compatibility. The calling conventions supported higher level data structures. Multics and VMS have data descriptors which identify the type of data so you don't have to care about the language as long as the data in memory matches what the descriptor says. The Lisp machines extended it to OOP, closures, GC, and dynamic typing. On Lisp machines, the same data types are shared for every program and the hardware has tags so it knows the type of everything. The same GC is shared for all programs running on the computer. UNIX weenies only care about C, so everyone has to use a C FFI even when they don't want to use C. C++ is from the same company but it's still as much of a second-class citizen on UNIX as it was in 1990. This means fewer common language features are compatible today than they were in the 60s. The only way to share a string is by converting it to a null-terminated string or keep a null on the end, which can cause serious bugs on the C side when the string contains a null somewhere else, and C can't handle garbage collection.
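
If you've never been bitten by the embedded-null thing, here's the whole bug in a few lines (C++11 or later):

#include <cstdio>
#include <cstring>
#include <string>

int main() {
    std::string s("ab\0cd", 5);                    // length-carrying string: 5 chars
    std::printf("%zu\n", s.size());                // 5
    std::printf("%zu\n", std::strlen(s.c_str()));  // 2: the C side stops at the first '\0'
}

Anything on the C side of the FFI silently sees a truncated string, and there's no error anywhere to tell you.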

Of course the most amazing thing about function prototypes that everyone overlooks is that a DECENT environment, with a decent compiler and linker, would be able to check all that stuff at link time! There's nothing wrong with C as it was originally designed, it's just that nobody thought to have link-time parameter checking built into the compilation environment. Rather than someone putting all that work into the linker (what, a few months?) we have a language standard that forces every user to perform the job of link editor in their head. Great. And it's becoming the lingua franca of programming in the 90s. It's grimly amusing to note that UNIX, the cradle of C as it were, lags hopelessly behind DOS in the quality of C programming environments, with the notable exception of Saber-C. Saber-C is another unusual case, however, since it gives you a NICE environment, with compile and link error checking, runtime error checking, and so on, and when you want to actually generate an executable....? You're on your own and you're back to using the outdated piece of shit compiler the vendor gives you with your system - a descendant of the original 'pcc' or some similar abomination, hacked on by generations of grad students on their way to guruhood. Or you can use gcc, if you have the time, the MIPS, and the disk space, and can tolerate its little foibles.

The crowning glory of it all is C++, the birthplace of function prototypes. Function prototypes must have been implemented the way they were because it was easier for the compiler writer to get all the parameters before starting to parse the function, rather than having to figure them out in the ensuing lines. When a language is bent, twisted, and malformed to save the compiler writer a bit of effort, you've got to wonder why the guy's writing compilers in the first place. Now, C++ could really use a decent compilation environment, with link time name resolution. But NnnnnoooooOOOOOOO, rather than "do it right" some late night caffeine and taco fever nightmare scheme was concocted whereby symbols get transformed into DES-encrypted dog vomit and back. C++ is heading for standardization. All of this happened because nobody wanted to put a few hooks for type checking in their linker and object file format. I think there's a lesson in all this, but I'm damned if I can see it.

*The story of the programmer who wrote all his function prototypes to accept (char *) and cast everything to (char *) before calling any function is pure fiction. Really.

fuck off

Attached: shut-up-richard.mp4 (1280x720, 611.29K)

It's from UNIX-Haters, newfag.
He posts this shit in every other thread.

And I bitch about it in every thread.

do you actually think anyone reads the text you quote that you paste in every thread about programming languages?

Engineering is all about finding the objectively best way to do a specific task. The "opinion" fags are what ruined it.

You have to
How about I add Typescript and Coffeescript to the mix?

Coffeescript is fucking dead and Typescript is nothing but a temporary transition before actual Javascript implements types.

"There are only two kinds of languages: the ones people complain about and the ones nobody uses." -- Bjarne Stroustup


I happen to agree with your choice of languages, but it could have been another language family above assembly as well. A lot of people, when I was learning C, did their programming in Pascal. The object-oriented paradigm became popular and both languages got an object-oriented extension. I have more complaints about C++ than I do about C, but I think that is unavoidable when you design languages to mimic human thinking instead of computer logic. I particularly dislike exceptions. In any case I stick with ANSI for both, because that whole ever-evolving language circus and the maintenance required is not for me. I want programming to be about the program and the language to be a tool. I like languages like lua and tcl as scripting extensions to programs, and Perl is my preferred text-processing beast. A POSIX compliant shell on top of that and I rarely touch other languages unless it is to patch existing code. Oh, I forgot.. I do use PHP too, not that I particularly enjoy it, but that's not as much an objection to the language as it is the use case.

My biggest problem is how languages are not stable. I think this is a place where it would be really useful to stick with "do one thing and do that thing well". I don't want to fight with libraries over which version of a programming language we should be using, and I'm glad there are a lot of C and C++ people who agree and appreciate the ANSI standards. We have a serious dependency hell on Linux, much worse than what we made fun of on Windows, because of this. Can we please stop this incompatibility madness? Then I don't care whatever language you use. Oh thank you, I need both Python 2 and 3 for this project, and thanks again for the upgrade to C++11, it really helped my system become a big mess.

Attached: what_a_mess.jpg (1024x609, 148.82K)

Terra is exactly this.

Attached: e6bc507a8157315eebfc293bed14028b0d58ee6448042eb14c9689d5b85232ed.jpg (809x800, 280.73K)

This is a good guide for deciding what to learn. Still, it will be 4 languages to keep tabs on. Like learned natural languages, their conventions will keep bouncing around in your head, making you a worse coder in all of them. Now step into a corporate environment where you possibly need to learn a new language every now and then as you are transferred between teams and projects.

I'm absolutely sure the reason why so many software projects fail (70%) is because of this pointless jumping around and cognitive burden.

... what?

Native bilingual faggot here: I excel in one of the languages, "suck" at the other one. I put suck in quotation marks because I actually am fairly fluent in it, just that fully context switching to it often takes whole minutes since I am not used to speaking it, so I only speak it if it's strictly necessary to avoid making a fool of myself. I also happen to be fairly fluent in written English because I deal every day with that, but I suck at spoken English because I could count the times I had the chance to do it.

Likewise, I deal with several programming languages, at work and at home, and also attempt to abuse the exclusive properties of each language. I have been working with these languages day after day for the past year and now switching between them is fairly cheap. Maybe not free, but I can assure you once the switch is done there are absolutely no downsides, because as they say, knowledge takes up no space, and practice makes perfect.

This may be true, but that's because context switching kills productivity, and also because learning is slow, and mastering even more so. You can also find context switching turnoffs when moving between different files written in the same language as well, btw.

If you're white, it's a good idea to take advantage of your higher IQ and focus on the languages at the top of user's stack. C and Ada ftw! Java, Python, and Javascript are all very easy (and have their place), but the job market for these skills can easily be saturated by the unwashed masses. Also don't overlook VHDL/SystemVerilog, both hardware design and electrical engineering are still relatively unpozed fields (It's not pajeet/gook free, but at least the pajeets there have a higher likelihood of being competent)

lol they are almost the exact same language. His list is shit you can easily achieve.
A proper aim would be to learn a language from each family.

there are only 2 categories (C, Ada, etc; everything else), not 4. "scripting" is a meme. "web" languages like JS or PHP aren't even a real thing, those are just broken as fuck shit

lol a bunch of regex and macros as a compiler (and the spec is just whatever the compiler produces)

lua has only ~1 bad aspect. lol

Just because you don't understand how Windows has dependency hell, doesn't mean it's not there. For example ASLR is impossible to implement on Windows (or at least has been for 15 years. I don't care if it's been fixed since the last time I checked, that's still 15 years to get some basic thing implemented)

no, you don't even understand one of them.

they're literally the same shit, nigger

I am sure you could recite the spec of your language (it is C, right?) to a T, and also every single bug of every single compiler ever produced for it. Or at least that's what you would like to say if you actually programmed.

no, i don't understand any PL either. that's the problem. they're all bloat and full of monkeys amending the spec and implementation. I understand the one I wrote though, since a necessary trait required by the basis of a secure OS is being able to understand every detail of the PL

Yes user its a failure of a response I agree.

...