I know this question is vague but, what are the "best" programming languages and why?

Javascript is the only language you need, anyone who says otherwise is a larper who has never written anything more significant than fizzbuzz

ignore this street shitter
C is the only language you need to know

look up anything significant and it's written in C

Let's say that the best languages are the ones that give you the most practical utility, and that there are multiple best languages due to large gaps in what any individual language provides. The book-retarded will object that Turing equivalence means you can learn a single language and get all possible utility from it; this is only true if you discard 'practical' and pretend you have all the time in the world. So,
1. JavaScript, for browsers, phones, desktops with electron, servers with node, games with JS game engines.
2. C, for unix and embedded
3. C++, for desktop, servers, games
4. C#, for Windows, games
5. Java, for enterprise software. It's basically COBOL at this point.
No others come close. For a specific enough task ("I want to patch this Django website") any language can be #1.
Incidentally, these languages are all shit. Just awful. JS and Java especially. But they are not the best because they're any good; they're the best because they rose from even worse shit and there's now a lot done with them.

Rust is the only language you need, anyone who says otherwise is a larper who has never written anything more significant than fizzbuzz

actually that should be
5. PHP, for WordPress, Infinity NEVER, etc.
6. Java
I hate PHP so much I actually forgot about it. My apologies.

Python is the only language you need, anyone who says otherwise is a larper who has never written anything more significant than fizzbuzz

also, if you drop the vagueness even a bit, and say for example "I want to admin servers", you'll get a much different list from the one above.

And now you have said "otherwise" three times so far...

ok, then how about languages that follow the suckless and unix philosophy pretty well

Tamil is the only language you need, anyone who says otherwise is a larper who has never written anything more significant than fizzbuzz

That's like asking what consistency of shit is the best when shitting on a street.

C++ is the only language you need, anyone who says otherwise is a larper who has never written anything more significant than this fizzbuzz
#include <iostream>
#include <string>
using namespace std;
int main() {
    string words[4] = {"", "fizz", "buzz", "fizzbuzz"};
    int z = 0, m = 810092048;   // 0x30490610: the 1..15 fizzbuzz pattern, 2 bits per value
    for (size_t i = 1; i <= 100; i++) {
        z = m & 3;              // low 2 bits select the word for this i
        m = m >> 2 | z << 28;   // rotate the 30-bit pattern
        if (z) cout << words[z] << '\n';
        else cout << i << '\n';
    }
    return 0;
}
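For anyone squinting at the magic number: 810092048 is 0x30490610, and reading it two bits at a time from the bottom gives the word index for i = 1..15 (0 = plain number, 1 = fizz, 2 = buzz, 3 = fizzbuzz); the shift-or in the loop just rotates that 30-bit pattern so it repeats every 15 numbers. A quick decoder sketch in C, if you don't believe it:
#include <stdio.h>

/* Decode the 2-bit-per-slot pattern packed into 810092048 (0x30490610).
   Slot k (k = 0..14) holds the word index for i = k + 1. */
int main(void) {
    const char *words[4] = {"number", "fizz", "buzz", "fizzbuzz"};
    unsigned m = 810092048;
    for (int i = 1; i <= 15; i++) {
        printf("%2d -> %s\n", i, words[m & 3]);
        m >>= 2;
    }
    return 0;
}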

well
1. C, literally all the suckless software is written in C, no? And if you do anything with Unix (GUI toolkits aside) you'll be working with APIs that are written for/with C, so using C yourself means one less step to using the API. You've also got gdb, ltrace, valgrind, and other friends.
Speaking of, these are the languages supported by gdb: C, C++, D, Go with gccgo or 6g, Objective C, OpenCL C, Fortran, Pascal, Rust, Modula-2, Ada. If your language isn't in this list, it sucks :^)
2. Perl, for some values of unix philosophy
3. Awk, if Perl is too useful for you
4. Forth, for simplicity and minimalism more than unix
If you emphasize the code quality and code not getting too complex parts of suckless, you can argue for languages like Ada, D, Go, maybe Scheme. Whether that argument will be accepted, I dunno. Write some good code and present it along with your argument; bonus points if it's a quality rewrite of some shit C.

gcc.godbolt.org/z/cgWl1l
$ time for x in {1..500}; do ./fb; done >/dev/null
real    0m0.786s
user    0m0.336s
sys     0m0.486s
vs.
#include <stdio.h>
#include <stdlib.h>
int main (void) {
    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0) puts("FizzBuzz");
        else if (i % 3 == 0) puts("Fizz");
        else if (i % 5 == 0) puts("Buzz");
        else printf("%d\n", i);
    }
    return 0;
}

oh yeah, your language might also suck if it's not supported by the Compiler Explorer.

That is a sterile, boring, uninspiring fizzbuzz. Like a good logo, a good fizzbuzz has merit of itself.

C++ btfo

no, this one is boring, and still faster than the C++:
with Ada.Text_IO; use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;
procedure Fb3 is
begin
   for J in 1 .. 100 loop
      if J rem 3 = 0 and J rem 5 /= 0 then
         Put_Line ("Fizz");
      elsif J rem 3 /= 0 and J rem 5 = 0 then
         Put_Line ("Buzz");
      elsif J rem 3 = 0 and J rem 5 = 0 then
         Put_Line ("FizzBuzz");
      else
         Put (J, Width => 0);
         New_Line;
      end if;
   end loop;
end Fb3;
gcc.godbolt.org/z/sj_lQu
all three compiled locally with -O3, but I totally forgot to set that on the others in godbolt. C++'s assembler output remains an epic trilogy with -O3, anyway

youtube.com/watch?v=1S1fISh-pag

Speed is not the only aspect of art, ada user. A good fizzbuzz, much like a good symphony, builds a sense of mystique amongst the laymen viewing the program's structure. You are comparing a child's drawing with a master painter's crown jewel. The time it takes to obtain the output is meaningless; it is the journey to that output which fizzes the buzz of our heart.

wrong, this is the only thing that matters

It takes a true autistic to not appreciate art like that.

flexibility also matters.
give two guys the output as a task. One guy writes it in an editor with some copy and paste. One guy writes a program and runs it.
Now change the desired output and ask them to incorporate it. "Actually I want comma-separated output with the number in the first column and an X in the second column in the Fizz case and an X in the third column in the Buzz case. In the number-only case the output should look like 3,,"
guy with the program: really? OK. done.
guy with the output in an editor: FUCK. YOU.

with Ada.Text_IO; use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;
procedure Fb4 is
begin
   for J in 1 .. 100 loop
      Put (J, Width => 0);
      Put_Line ((if J rem 3 = 0 then ",X" else ",") & (if J rem 5 = 0 then ",X" else ","));
   end loop;
end Fb4;
btw.
1,,
2,,
3,X,
4,,
5,,X
6,X,
7,,
8,,
9,X,
10,,X
11,,
12,X,
13,,
14,,
15,X,X

...

this actually turns out to be twice as slow

Time:0.000071
(with clock code added)
#include <stdio.h>
#include <time.h>
int main (void) {
    float startTime = (float)clock()/CLOCKS_PER_SEC;
    //puts(fizz);
    puts(
        "1\n2\nFizz\n4\nBuzz\nFizz\n7\n8\nFizz\nBuzz\n11\nFizz\n13\n14\n"
        "FizzBuzz\n16\n17\nFizz\n19\nBuzz\nFizz\n22\n23\nFizz\nBuzz\n26\nFizz"
        "\n28\n29\nFizzBuzz\n31\n32\nFizz\n34\nBuzz\nFizz\n37\n38\nFizz\nBuzz\n"
        "41\nFizz\n43\n44\nFizzBuzz\n46\n47\nFizz\n49\nBuzz\nFizz\n52\n53\nFizz\n"
        "Buzz\n56\nFizz\n58\n59\nFizzBuzz\n61\n62\nFizz\n64\nBuzz\nFizz\n67\n68\n"
        "Fizz\nBuzz\n71\nFizz\n73\n74\nFizzBuzz\n76\n77\nFizz\n79\nBuzz\nFizz\n"
        "82\n83\nFizz\nBuzz\n86\nFizz\n88\n89\nFizzBuzz\n91\n92\nFizz\n94\nBuzz"
        "\nFizz\n97\n98\nFizz\nBuzz\n\0"
    );
    float endTime = (float)clock()/CLOCKS_PER_SEC;
    printf("\nTime:%f\n", endTime-startTime);
    return 0;
}
Time:0.000160

great, but what about when you need to fizzbuzz all the way to 1000? or even 100,000? your hands will never keep up. you need another program to generate the string, and then iterate through it and add numbers. at the very least. then load this into your program and print it.
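If you actually want to go that route, the generator is trivial. Here's a rough C sketch (N is arbitrary) that spits out a string-literal table you can paste into the puts-a-giant-string program:
#include <stdio.h>

/* Generate a C string literal containing fizzbuzz for 1..N, suitable for
   pasting into the "just puts() one giant string" program. */
#define N 1000

int main(void) {
    printf("char fizz[] =\n");
    for (int i = 1; i <= N; i++) {
        if (i % 15 == 0)     printf("    \"FizzBuzz\\n\"\n");
        else if (i % 3 == 0) printf("    \"Fizz\\n\"\n");
        else if (i % 5 == 0) printf("    \"Buzz\\n\"\n");
        else                 printf("    \"%d\\n\"\n", i);
    }
    printf(";\n");
    return 0;
}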

i don't really know assembly but this is some hello world shit modified to prove a point
section .text
    global _start       ;must be declared for linker (ld)
_start:                 ;tells linker entry point
    mov edx,len         ;message length
    mov ecx,msg         ;message to write
    mov ebx,1           ;file descriptor (stdout)
    mov eax,4           ;system call number (sys_write)
    int 0x80            ;call kernel
    mov eax,1           ;system call number (sys_exit)
    int 0x80            ;call kernel
section .data
msg db `1\n2\nFizz\n4\nBuzz\nFizz\n7\n8\nFizz\nBuzz\n11\nFizz\n13\n14\nFizzBuzz\n16\n17\nFizz\n19\nBuzz\nFizz\n22\n23\nFizz\nBuzz\n26\nFizz\n28\n29\nFizzBuzz\n31\n32\nFizz\n34\nBuzz\nFizz\n37\n38\nFizz\nBuzz\n41\nFizz\n43\n44\nFizzBuzz\n46\n47\nFizz\n49\nBuzz\nFizz\n52\n53\nFizz\nBuzz\n56\nFizz\n58\n59\nFizzBuzz\n61\n62\nFizz\n64\nBuzz\nFizz\n67\n68\nFizz\nBuzz\n71\nFizz\n73\n74\nFizzBuzz\n76\n77\nFizz\n79\nBuzz\nFizz\n82\n83\nFizz\nBuzz\n86\nFizz\n88\n89\nFizzBuzz\n91\n92\nFizz\n94\nBuzz\nFizz\n97\n98\nFizz\nBuzz`, 0xa    ;string to be printed
len equ $ - msg         ;length of the string
real 0m0.222s
user 0m0.125s
sys 0m0.084s

real 0m0.429s
user 0m0.273s
sys 0m0.135s


clearly this is not the way to go but i was curious what peak performance would look like from 1 to 100, it's obviously not getting faster than simply printing the data to stdout.

why this

is fucking twice as slow is not triggering extreme irritation. clearly the straight assembly version is not twice as slow, it's twice as fast.

...

#include <unistd.h>
#include <sys/syscall.h>
char fizz[900]=
    "1\n2\nFizz\n4\nBuzz\nFizz\n7\n8\nFizz\nBuzz\n11\nFizz\n13\n14\n"
    "FizzBuzz\n16\n17\nFizz\n19\nBuzz\nFizz\n22\n23\nFizz\nBuzz\n26\nFizz"
    "\n28\n29\nFizzBuzz\n31\n32\nFizz\n34\nBuzz\nFizz\n37\n38\nFizz\nBuzz\n"
    "41\nFizz\n43\n44\nFizzBuzz\n46\n47\nFizz\n49\nBuzz\nFizz\n52\n53\nFizz\n"
    "Buzz\n56\nFizz\n58\n59\nFizzBuzz\n61\n62\nFizz\n64\nBuzz\nFizz\n67\n68\n"
    "Fizz\nBuzz\n71\nFizz\n73\n74\nFizzBuzz\n76\n77\nFizz\n79\nBuzz\nFizz\n"
    "82\n83\nFizz\nBuzz\n86\nFizz\n88\n89\nFizzBuzz\n91\n92\nFizz\n94\nBuzz"
    "\nFizz\n97\n98\nFizz\nBuzz\n\0";
int main (void) {
    //syscall(SYS_write, 1, fizz, 900);
    register int syscall_no asm("rax") = 1;
    register int arg1 asm("rdi") = 1;
    register char* arg2 asm("rsi") = fizz;
    register int arg3 asm("rdx") = 900;
    asm("syscall");
    return 0;
}
real 0m0.371s
user 0m0.245s
sys 0m0.111s


still twice as fucking slow
whatever i'm done.

actually no, this is considerably faster
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>
char fizz[900]=
    "1\n2\nFizz\n4\nBuzz\nFizz\n7\n8\nFizz\nBuzz\n11\nFizz\n13\n14\n"
    "FizzBuzz\n16\n17\nFizz\n19\nBuzz\nFizz\n22\n23\nFizz\nBuzz\n26\nFizz"
    "\n28\n29\nFizzBuzz\n31\n32\nFizz\n34\nBuzz\nFizz\n37\n38\nFizz\nBuzz\n"
    "41\nFizz\n43\n44\nFizzBuzz\n46\n47\nFizz\n49\nBuzz\nFizz\n52\n53\nFizz\n"
    "Buzz\n56\nFizz\n58\n59\nFizzBuzz\n61\n62\nFizz\n64\nBuzz\nFizz\n67\n68\n"
    "Fizz\nBuzz\n71\nFizz\n73\n74\nFizzBuzz\n76\n77\nFizz\n79\nBuzz\nFizz\n"
    "82\n83\nFizz\nBuzz\n86\nFizz\n88\n89\nFizzBuzz\n91\n92\nFizz\n94\nBuzz"
    "\nFizz\n97\n98\nFizz\nBuzz\n\0";
int main (void) {
    float startTime = (float)clock()/CLOCKS_PER_SEC;
    //syscall(SYS_write, 1, fizz, 900);
    register int syscall_no asm("rax") = 1;
    register int arg1 asm("rdi") = 1;
    register char* arg2 asm("rsi") = fizz;
    register int arg3 asm("rdx") = 900;
    asm("syscall");
    float endTime = (float)clock()/CLOCKS_PER_SEC;
    printf("\nTime:%f\n", endTime-startTime);
    return 0;
}
Time:0.000028
compared to

Time:0.000160

clearly puts has extreme bloat somewhere.

strace it, dude.
It's really obvious why it takes so much longer.

puts is making a write syscall on every newline

so calling the write syscall directly instead of using puts results in a 500% performance improvement.
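To be fair, you don't have to bypass stdio entirely to cut the syscall count; asking stdio to fully buffer stdout gets you most of the way. A minimal sketch (buffer size picked arbitrarily):
#include <stdio.h>

int main(void) {
    // Make stdout fully buffered instead of line-buffered on a tty,
    // so the whole output goes out in one (or a few) write syscalls.
    static char buf[8192];
    setvbuf(stdout, buf, _IOFBF, sizeof buf);

    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0)     puts("FizzBuzz");
        else if (i % 3 == 0) puts("Fizz");
        else if (i % 5 == 0) puts("Buzz");
        else                 printf("%d\n", i);
    }
    return 0;   // normal exit flushes the buffer
}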

How will all this fizzbuzzery help AI generate prettier waifus? Think of the real issues, people.

interesting
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>
char teststr[2000];
int main (void) {
    char buff[6];
    for (int i = 0; i < 500; i++) {
        sprintf(buff, "%d\n", i);
        strcat(teststr, buff);
    }
    float startTime = (float)clock()/CLOCKS_PER_SEC;
    syscall(SYS_write, 1, teststr, strlen(teststr));
    float endTime = (float)clock()/CLOCKS_PER_SEC;
    printf("\nTime:%f\n", endTime-startTime);
    return 0;
}

user pls.
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>
#define BUFSIZE 2000
char teststr[BUFSIZE];
#define NOW ((float)clock()/CLOCKS_PER_SEC)
struct timings {
    float pre;
    float syscall;
    float puts;
    float printf;
};
int main (void) {
    struct timings times;
    int width;
    int n = 0;
    char *out;
    for (out = teststr; n < 500 && out < &teststr[BUFSIZE]; out += width)
        width = snprintf(out, &teststr[BUFSIZE - 1] - out, "%d\n", n++);
    times.pre = NOW;
    syscall(SYS_write, 1, teststr, out - teststr);
    times.syscall = NOW - times.pre;
    times.pre = NOW;
    puts(teststr);
    fflush(stdout);
    times.puts = NOW - times.pre;
    times.pre = NOW;
    printf("%s",teststr);
    fflush(stdout);
    times.printf = NOW - times.pre;
    fprintf(stderr, "syscall: %f\n" "puts: %f\n" "printf: %f\n",
            times.syscall, times.puts, times.printf);
}

$ ./putwhat
...
syscall: 0.000891
puts: 0.002123
printf: 0.001856
$ ./putwhat >/dev/null
syscall: 0.000010
puts: 0.000148
printf: 0.000007

clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=783504}) = 0
write(1, "0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n10\n11\n12\n13\n"..., 1890) = 1890
clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=798562}) = 0
clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=805497}) = 0
fstat(1, {st_mode=S_IFCHR|0666, st_rdev=makedev(0x1, 0x3), ...}) = 0
ioctl(1, TCGETS, 0x7ffdbcdada70) = -1 ENOTTY (Inappropriate ioctl for device)
brk(NULL) = 0xce8000
brk(0xd09000) = 0xd09000
brk(NULL) = 0xd09000
write(1, "0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n10\n11\n12\n13\n"..., 1891) = 1891
clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=896332}) = 0
clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=905212}) = 0
write(1, "0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n10\n11\n12\n13\n"..., 1890) = 1890
clock_gettime(CLOCK_PROCESS_CPUTIME_ID, {tv_sec=0, tv_nsec=920587}) = 0

Imagine not having integers

D is the best programming language: it's fast, has a package manager, is memory safe, has an amazingly expansive standard library and insane metaprogramming capabilities, and is just a pleasure to write.

After that there's Crystal which is great for writing web servers, just get yourself the Amber framework and you're all good.

That's about all you need tbh.

Funny how this is one of the few threads the Lispfag hasn't replied to yet. It's almost as if he doesn't know jack shit about programming, and can only parrot crap from ancient mailing lists.

Python is the best programming language.

it's slow

you should get your bf's big python out of your ass and learn a real programming language instead.

Learn BASH scripting. If you are a derpsfaggot learn powershell.

Python is by far the superior language if you really need to make your own functions to get things done. Never code from scratch unless you have to.

Challenge: write a simple program to crack a hash in C/C++. Now write the same program in Python. Notice that in Python you didn't have to convert strings, chars, or any faggotry like that.
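For comparison, here's roughly what the C side of that challenge looks like. This is only a sketch using OpenSSL's MD5() (link with -lcrypto), with a made-up wordlist and target; most of the code is exactly the byte/hex conversion being complained about:
#include <stdio.h>
#include <string.h>
#include <openssl/md5.h>

/* Hex-encode a raw MD5 digest so it can be compared against a hex string. */
static void to_hex(const unsigned char *digest, char *out) {
    for (int i = 0; i < MD5_DIGEST_LENGTH; i++)
        sprintf(out + 2 * i, "%02x", digest[i]);
}

int main(void) {
    /* For a self-contained demo, derive the target hash from a known secret;
       in a real crack you'd be handed the hex digest instead. */
    const char *secret = "hunter2";
    unsigned char tdigest[MD5_DIGEST_LENGTH];
    char target[2 * MD5_DIGEST_LENGTH + 1];
    MD5((const unsigned char *)secret, strlen(secret), tdigest);
    to_hex(tdigest, target);

    const char *wordlist[] = { "password", "123456", "hunter2", "letmein" };
    for (size_t i = 0; i < sizeof wordlist / sizeof wordlist[0]; i++) {
        unsigned char digest[MD5_DIGEST_LENGTH];
        char hex[2 * MD5_DIGEST_LENGTH + 1];
        MD5((const unsigned char *)wordlist[i], strlen(wordlist[i]), digest);
        to_hex(digest, hex);
        if (strcmp(hex, target) == 0) {
            printf("cracked: %s\n", wordlist[i]);
            return 0;
        }
    }
    puts("not found");
    return 1;
}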

I have learned a half dozen languages. Python is by far the most useful. Not that C like languages are bad.

It's just I fucking hate troubleshooting software. Fucking Linux developers are the worst. They give instructions on how to build a package, then you go on a dependency goose chase: find the deps, install them, do the build, hit endless error messages, look shit up on search engines, and finally get the thing to run like an hour later.

Go on GitHub. Download a Python script. python somescript doesn't run? OK, python3 somescript. Shit runs as expected. Little troubleshooting.

Simpler language. More human readable. Everything works without bullshit dev excuses. Works on my machine.


Good luck and godspeed user.