Is BASIC still worth learning?

Attached: Listing1[1].jpg (1056x1401, 600.15K)

I guess it's perfect for a child. C=64 was amazing!

Doesn't it rot your brain or something

You gotta give me a source on that.

cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD498.html
More seriously, BASIC is a very poor language. TI-BASIC was one of my first programming languages. Doing anything complex in it was a pain.
It's possible that the lack of complexity made it very slightly easier to learn for absolute beginners, but languages like Python are easy enough and aren't dead ends.
I have fond memories of writing BASIC but I think that's more because of the context I was writing it in than because of the language itself.
If you're not a complete beginner and you don't have prior nostalgic experience with BASIC or historic interest or whatever you won't get anything out of it.

No. Good high level languages exist on almost every platform that once relied on a BASIC ROM for non-asm programming, and they are widespread thanks to the internet.

My intro to programming was teaching myself TI-BASIC while bored in class. The simplicity made it easy to learn, and it was my gateway into programming.

Well, I am an absolute beginner. Should I still learn BASIC or should I just learn Python?

BASIC, if you want to go oldschool or you have a C64 laying around. Otherwise, Python is a good starting point.

at least go with FreeBASIC
freebasic.net/

Attached: prospector.jpg (926x697, 138.71K)

Python. It'll let you do useful things and it won't force you to write in a bad style.

Oh yes it does. It literally won't run the script unless it's formatted in its retarded way.

Even if you dislike Python's formatting it's still peanuts compared to what BASIC makes you do.

"Peanuts" is a pretty bad analogy for ease. They are annoying to crack open and make a mess.

I mean that it's insignificant. Unimportant.
You're comparing BASIC to Python, and the best thing you can think of is "Python forces you to make your code readable"? Come on now. There are dialects of BASIC without support for functions, and that's what you think is worth mentioning?

C is the only way to start. I'm already seeing the ravages in rigor that starting with something as high-level as Python produces in my university.

Why would C be the only way? It may have more rigor than Python, but C is very flawed.
I haven't used it as much as C, but Go seems like it would be a lot more rigorous than C without losing the things that make C rigorous. Or if that's too newfangled, Pascal is explicitly a teaching language, isn't it? How about that one?
Calling C better than Python is one thing, but what could possibly lead you to believe it's the only rigorous option?

One could even write an OS in FreeBasic, if one were so inclined.

wiki.osdev.org/FreeBasic_Bare_Bones

It was never worth learning. It's just an entry language that you'd work on for a few weeks as a child to get you into programming.


If you're 13+ then don't even bother with basic.

lol

Nah. The only real use for BASIC is writing Excel macros, so if that's your thing, it's nice. But if that's your job you could probably just develop full-fledged extensions in C#, which is vastly superior.

The language itself was shit when it came out, but it aged even worse. It's unusable today even if you wanted to use it.


I really don't get how BASIC is supposed to be a "simple language for beginners". I've used probably 10 languages over the years, including C, and I still can't wrap my head around BASIC's asinine syntax whenever I try. It's just so ugly and illogical I get a headache. Honestly even C would be much easier to learn than BASIC. But see for instance PASCAL, a very nice "learning language" that actually abstracts away the language features that are confusing until you understand how computers work. Why couldn't BASIC be like PASCAL? Wtf was Bill thinking?

What are you talking about user?

Is it still worth learning? Depends: are there any modern implementations that can be applied to a modern workflow but don't completely butcher it like VisualBasic?

BASIC is a great language because it's FUN. Using it is just pure joy, it makes programming feel almost like a video game.

No you idiot, you have it all wrong. BASIC wasn't targeted for children, it was targeted towards non-programmers. That was the whole intention of Dartmouth BASIC, the original implementation of it. It ran on serious mainframes and was targeted towards people like engineers and doctors with no experience programming. Home Computers weren't for "children" specifically, they were made for normalfags in general. That's why it made the perfect default user interface in the age right before the GUI became mainstream. I cringe whenever people say BASIC was made for children. It was meant as a default, bare-bones, standardized interface for the computer of the home at a time when the GUI was still a pipedream for many people.

Attached: AAAAAAAAAAAA.jpg (292x292, 13.14K)

It's just linear programming with no defined functions. There's nothing to get, it's pretty much a high-level assembly language. Or maybe a batch processing language like Windows .bat files.

Unless you're talking about modern object-oriented implementations of BASIC because that shit is cancer

Old school BASIC is great to learn with, precisely because there isn't that much to it. And if you're using it on an old computer (or emulator) you can read/write to all areas of memory and call machine language subroutines.

Attached: camel.png (256x3448, 18.11K)

It depends on the dialect. Despite being ancient, the BBC Micro's BASIC was designed for structured programming. But even more common dialects like MBASIC for CP/M (and thus GW-BASIC and QBASIC for DOS) let you GOSUB to a subroutine rather than being forced to do everything via GOTO.

Attached: raffles.png (384x272, 4K)

If your workflow consists of the terminal, then you might find this useful:
moria.de/~michael/bas/

Apparently you can program PIC microcontrollers in some MicroBASIC dialect. Probably makes more sense there than on a modern computer, where the nanny OS limits what you can do.
amazon.co.uk/Programming-Microcontroller-MBASIC-Embedded-Technology-ebook/dp/B005HITT8O

Daily reminder that even knowing about BASIC will greatly limit your ability as a programmer.

Apparently even Commodore BASIC v2 had GOSUB

No. That said, the other day I was watching a YouTube video and I wanted to try something in the video, so I wrote a 15 line QBasic program.
SCREEN 12
WINDOW (-2!, 2!)-(2!, -2!)
TAU! = 3.14159265358979# * 2
REM z! = ((1 + SQR(5)) / 2 - 1)
REM z! = (2! / TAU!)
REM z! = 1 / SQR(2!)
FOR z! = 0! TO 1! STEP .01
    FOR x% = 0 TO 299
        px! = (.01 + x% / 200!) * COS(z! * TAU! * x%)
        py! = (.01 + x% / 200!) * SIN(z! * TAU! * x%)
        CIRCLE (px, py), .01, 1
    NEXT x%
    LINE (-2!, 2!)-(2!, -2!), 0, BF
NEXT z!

The only use I have for BASIC is for a quick and dirty program to do something graphical.

Old school BASIC? Probably not, unless you're going to make something for a Speccy for fun. FreeBASIC is fine in a modern context, although from a financial standpoint you'll still be best off with C and its extended family.

It's too bad Microsoft got rid of QBASIC because it was a really nice and handy tool. I guess they figured everyone would just jump onto GUI shit.
Also: petesqbsite.com/index.php

As an absolute beginner you should learn C. Not C++ - that's a very important distinction. Once you get how C works and how to get things done with it, it will be very easy to learn other languages.

Pascal is SEVERELY underrated. FreePascal is pretty dead ATM (no feature parity with Delphi), but it can do many things much better and more simply than many more modern programming languages.

I don't think Pascal is much of a thing these days, but a friend was a huge fan of it, and I was persuaded that the language has a lot of potential. Definitely worth a try!

Is C worth learning from a hobbyist perspective? I mostly want to do programming as a side activity that I may get serious with if I'm dedicated enough.

C can be fun, but if you're going to learn just one language, pick Python (or something similar). It will let you do useful things with a minimum of fuss. Making useful programs that genuinely save you time or let you do things you couldn't otherwise have done is exhilarating, and C makes that a lot harder than it could be.
If the decision were between learning C and learning BASIC, or learning C and learning nothing, then I'd advise C. But there are other options.
You should look at C if you do decide to get serious, or if you start looking at multiple languages.
I've heard good things about the attached book.

If you need to ask, you're fucking retarded.

Technically yes, but anyone subjected to this idea will question that decision once they learn about the other, better languages.

I think assembler is more fun than basic, anyway.

The python duck typing system is a giant mess that is impossible to crack.

Rude

Hello again StephenLynx. This retarded argument again?

No. Learn C. From K&R.
Learn a BASIC dialect too - it will take all of 30 minutes.

Python for beginners is not good. They have to figure out which pre-packaged system to use, are handed a million choices to make and opportunities to explore, and get overwhelmed immediately. Also, Python is cancer. Learn C and you will learn how a computer operates. Learn Python and you will learn the intricacies of other people's libraries and not much else.

C should be the language you learn after you already know Lisp, Ada, PL/I, Fortran, BASIC, and a lot of other languages, so you know why C sucks. Programming languages have an influence on how you think and have cultures, even though some of these cultures are no longer around. Reading the original papers and code, like I do with Multics and Lisp machines, gives you a better idea of why they did these things. It's usually not bullshit like "to not confuse dumb C programmers", which is the UNIX weenie excuse for why some part of JavaScript or PHP sucks.

And I hacked the renderer code to throw cpp the proper "-DFRAME=%05d" to spit out numbers the way I wanted them. Why did I want the leading zeros? I don't know, I just thought it was cleaner.
So I fired up the animation and let it run for a while (days).
Well, the output was quite amusing (or at least it would have been if I didn't need the results for my thesis defense a week later). The object would go down for a few frames, then jump up and go down a little, then go back to where it maybe should have been, then jump up....
After a little head scratching, I realized that the leading zeros in my frame numbers were causing cpp to treat them as octal values. How precious.
But still, if I say "#define FRAME 00009" then "#if FRAME==00009" should still fire (or it should at least whine at me). Well, 00009==00009 does trigger, but so does 00009==00011.
Huh? Well, some C library thinks that the nine in 00009 isn't octal, so it interprets it as 9 decimal. And 00011 is a fine octal rep of 9 decimal. So, both "#if FRAME==00009" and "#if FRAME==00011" fired and I applied two translate calls to my object geometry. And(!), it's not that having a decimal digit makes the whole number decimal. The string 00019 gets interpreted as 00010 octal plus 9 decimal = 17 decimal. Lovely, not.
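
If you've never been bitten by the octal thing, it's easy to reproduce in ordinary C, not just in cpp conditionals. A throwaway sketch (values made up, it just shows the leading-zero rule):

#include <stdio.h>

int main(void)
{
    /* A leading zero makes an integer literal octal in C. */
    int a = 00011;                      /* octal 011 == 9 decimal */
    int b = 9;

    printf("%d %d %d\n", a, b, a == b); /* prints: 9 9 1 */

    /* "int c = 00019;" wouldn't even compile with a modern compiler
       (9 is not an octal digit); the preprocessor in the story above
       was more "forgiving" and happily mixed bases. */
    return 0;
}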

I feel compelled to submit the following piece of C code:

switch (x)
    default:
        if (prime(x))
            case 2: case 3: case 5: case 7:
                process_prime(x);
        else
            case 4: case 6: case 8: case 9: case 10:
                process_composite(x);

This can be found in Harbison and Steele's "C: A Reference Manual" (page 216 in the second edition). They then remark:
This is, frankly, the most bizarre switch statement we have ever seen that still has pretenses to being purposeful.
In every other programming language the notion of a case dispatch is supported through some rigidly authoritarian construct whose syntax and semantics can be grasped by the most primitive programming chimpanzee. But in C this highly structured notion becomes a thing of sharp edges and loose screws, sort of the programming language equivalent of a closet full of tangled wire hangers.

Do you think you ingratiate yourself to anyone by pointing out the obvious?

Objectively wrong

C is undeniably inexcusably flawed. Just look at this, for example:
archive.is/glSDh (tedunangst.com/flak/post/to-errno-or-to-error)
There are many reasons to keep using it, but calling it good is suspect, and I get confused when people hold it up as the ultimate programming language.

People call it that because it is, especially if you count its bigger brother C++.

C sucked in the 1970s, sucked in 1992, and it still has the same serious problems today that will never be fixed. You have to use a different language if you want the solutions.

>archive.is/glSDh (tedunangst.com/flak/post/to-errno-or-to-error)
Error handling was done right in the 60s, so there's no excuse for C sucking this much.

Not using C is always the better option. It's too hard to use different languages with C because C is intentionally incompatible. Compatibility means parts of programs written in different languages work together without needing a "foreign" language in the middle, like on Multics and the Lisp machines. If Python and Ruby were compatible, you could pass a Ruby string to a Python function and return a Python array/list that is used from Ruby, because they share the same data types and GC.

C is the ultimate in suck because decades old languages didn't have those flaws. The growing popularity of C caused new languages to become worse, like JavaScript, PHP, and Perl. These flaws are crucial to C's existence and all the UNIX languages that copied them, so pointing out mistakes in C makes them look bad too. That's why you have to go outside of UNIX culture to see why they suck.

There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once.
When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".
It would be a good thing if hardware designers would remember that the ANSI C standard provides _two_ forms of "integer" arithmetic: 'unsigned' arithmetic which must wrap around, and 'signed' arithmetic which MAY TRAP (or wrap, or make demons fly out of your nose). "Portable C programmers" know that they CANNOT rely on integer arithmetic _not_ trapping, and they know (if they have done their homework) that there are commercially significant machines where C integer overflow _is_ trapped, so they would rather the Alpha trapped so that they could use the Alpha as a porting base.

Date: Thu, 8 Oct 92 08:26:17 -0400
Of course Lint is useless. It's one of those programs that was written as a student project N^2 years ago and has never been debugged or brought up to date.
I was unaware that proscriptions against free()ing pointers you had never malloc()ed and referencing uninitialized variables were a recent development, but isn't it great that we have (void) so that lint doesn't complain when I don't check the return status from printf()? Even BASIC has on-error-goto - too bad Kernighan and Ritchie couldn't be bothered to put an error handler hook into the language.
By the way, that free() didn't crash the program. Unix memory allocation routines are so nice and trusting - they just prepend a couple of bytes to your pointer and then add it to the free list, regardless of where the pointer points - if you own the memory, you just freed it (congratulations). The real fun starts when you do your next malloc and get a segmentation violation. In my case, I was left scratching my head and wondering why 6 calls deep into getpwnam(3) (a supposedly stable library call, but hey, this is Unix - there are no guarantees), my program suddenly segfaulted without the slightest indication why.
Pfft. Try giving LINT ANSI-C and see what it does.
Isn't it nice that all the different implementations of ANSI-C are *less* compatible with each other than PCC-based compilers? Standards are soooooo wonderful.
Try giving the Sun compiler ANSI-C, for that matter.
With the advent of Solaris, Sun will address that problem by not giving you a C compiler.

Yes, but no nu-Basics. If you're not typing in line numbers you're a fag.

gosub your mom's a whore

From a hobbyist perspective...depends on what you want to get out of the hobby. I mean you can do a lot with FreeBASIC, including interfacing with C if you so choose in the future, so for little spot things it's fine for me. Really though, if you are just tinkering rather than producing serious software, just find a language you can get your head around and stick with it.

It invites spaghetti code, can't be compiled to binary and generally doesn't even use indenting.

So if not Python and not C then what would a good first language be? Lisp?

You're not going to find something that anyone here recommends but nobody here violently disagrees with.
Python is considered a good option even outside this contrarian hellhole. Just go with that.

You were already given your answer. Python. You troll.

Learn python. It's not perfect but by the time you actually care you'll be inherently more capable at learning other languages and in the meantime you'll be learning something you can actually use along with other people. Lisp is an intellectual toy, nothing more.

Old school BASIC and immediately thereafter assembly language, on an 8-bit CPU. Avoid Unix and other environments with a nanny OS and lots of abstractions.

Python and C are fine languages, don't get trolled by memers. So are C++, Java, JavaScript, C#, Rust, Go, and virtually any language with a sizable user base. They're all very easy to get started with. In fact you could try learning a bit of each one and go with the one that seems most enjoyable. They're not necessarily great languages, but by the time you're proficient enough in them you'll be able to judge languages on your own. I'd avoid functional langs like Lisp or Haskell because the tutorials will be confusing and boring at first if you have no programming experience (but as a second or third language definitely worth a try).

Most programming skill translates well, once you learn one decent modern language you're 2/3rds of the way there to learning others. Just don't learn ancient caveman shit like basic or perl.

I'm very disappointed.

Attached: basic.webm (720x480, 4.44M)

Wasn't Elite made in BBC Basic?

C is a good starting option because 1) it's very simple, 2) it's very straightforward, and 3) it gives you important foundational knowledge. Python is an immensely more complicated language and is harder to learn, plus it has an ass-backwards syntax-formatting abomination, so you'd better avoid it until you're comfortable with standard syntax.

Attached: all according to keikaku.jpg (184x184, 52.83K)

Attached: women-programmers.jpg (640x480, 39.06K)

No, you fags. This is from Great Teacher Onizuka (GTO) from the scene where he (the BASIC airhead) wants to become a programmer, buys books for BASIC and then gets an interview for a job.
If you haven't watched it, do that now.
NOW

I watched all 43 episodes and that scene never happened. It's actually from an OVA called "Golden Boy".

Fuck I watched that too and mixed it up. Well whatever. Watch it.

rewatching it now since I totally forgot about it

Fuck off with your botnet shit, kike.

Learn HolyC.

Attached: TempleOS; Terry Davis Jedi Tilt 6;20;2017.webm (640x360, 5.91M)

It doesn't look like it. This is from the FAQ at www.elitehomepage.org
Maybe you're thinking of Ultima 1? The original Apple II game was made in BASIC. Also a whole bunch of early SSI and Avalon Hill games were made in BASIC.

Attached: fifty_mission_crush_d7.jpg (640x971, 168.22K)

I was confused. The first 3D model that would later become the spaceship design was made in BASIC, then the game was written in assembly.

At least according to an old BBC documentary about Elite.

You could probably do the Text Elite in BASIC though, if your computer has enough memory.
iancgbell.clara.net/elite/text/index.htm
elitehomepage.org/text/index.htm

There's also an EMACS version:
sami.salkosuo.net/elite-for-emacs/
github.com/samisalkosuo/elite-for-emacs

Attached: eliteforemacs_img2.png (957x663, 114.92K)

Of fucking course there is.

But that looks like an interesting link. I've always liked those text-based games, ever since I got into text adventures a few years back.

HolyC is all right. Does Terry ever give a reason as to why he didn't write a fully compliant ANSI C compiler instead?

The text trading game goes back a long way. Look into Taipan and Star Trader. But there are lots of other variants and similar games for various computers. I guess Drug Wars also falls under the same genre.
zap.org.au/software/trader/


He explains it in this video:
youtube.com/watch?v=_bTtmSxbcTc

Attached: trade-wars-2002_01.png (640x400, 2.46K)

What does he mean he doesn't want "to port the entire volume of C onto his machine"? When he says remove lines of code, is he talking HolyC code or actual instructions in assembly?

C hardly does anything, so it should be simple and straightforward, but it's very complicated because of all the bullshit. C strings and arrays suck and have non-straightforward semantics, like &a and a both being memory addresses, but with different types. In other languages, a is the array variable itself and taking the address (if the language allows it) gives you a pointer to the whole array. In C, a "decays" to &a[0] unless it's used in sizeof and some other corner cases. C is also missing a standard error handling mechanism, good memory management, real strings, overflow checks, arrays that can actually be passed to another function, and other things that are not complicated but C weenies say are complicated because they don't want to learn anything new, or old in this case, since that was all around before C existed.
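
If you want to see the decay rules for yourself, a toy snippet like this is enough (nothing clever, just printing addresses and sizes):

#include <stdio.h>

void f(int a[10])      /* despite the [10], the parameter is really an int* */
{
    printf("inside f: sizeof a = %zu (just a pointer)\n", sizeof a);
}

int main(void)
{
    int a[10];

    /* Same address three times, but three different types: */
    printf("a     = %p  (decays to int*)\n",           (void *)a);
    printf("&a[0] = %p  (int*)\n",                     (void *)&a[0]);
    printf("&a    = %p  (int (*)[10], whole array)\n", (void *)&a);

    /* sizeof is one of the places where no decay happens: */
    printf("in main:  sizeof a = %zu (the whole array)\n", sizeof a);
    f(a);
    return 0;
}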

> There's nothing wrong with C as it was originally designed,
...bullshite.
Since when is it acceptable for a language to incorporate two entirely diverse concepts such as setf and cadr into the same operator (=), the sole semantic distinction being that if you mean cadr and not setf, you have to bracket your variable with the characters that are used to represent swearing in cartoons? Or do you have to do that if you mean setf, not cadr? Sigh.
Wouldn't hurt to have an error handling hook, real memory allocation (and garbage collection) routines, real data types with machine independent sizes (and string data types that don't barf if you have a NUL in them), reasonable equality testing for all types of variables without having to call some heinous library routine like strncmp, and... and... and... Sheesh.
I've always loved the "elevator controller" paradigm, because C is well suited to programming embedded controllers and not much else. Not that I'd knowingly risk my life in an elevator that was controlled by a program written in C, mind you...

It hasn't been worth learning since the 80s ended.

I know everyone hates it, but I think it's a great language for what it is. It's good for beginners because of how simple and direct it is. It's kind of shit unto itself, but it serves as a good starting point.

No, the former is an address; the latter is a pointer. I can see why a brainlet like you might get confused.
I assume you are talking about when you want to pass an array as an argument to a function? There is no decay; you can still access a[i], there is just no way for the function to know where the end of the array is without the programmer telling it. If you are talking about multi-dimensional arrays then it still doesn't decay; the function just sees it as a 1D array. You can still access every element, you just have to adjust your index accordingly (see the sketch below). If you are using nD arrays I assume you have the mathematical background necessary for this to be nothing more than a minor inconvenience.
You need to stop.
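
The "adjust your index" trick from the post above looks something like this in practice (made-up numbers, just to show the arithmetic):

#include <stdio.h>

/* The callee gets a flat pointer plus the dimensions and does the
   2D indexing by hand: a[i][j] becomes p[i * cols + j]. */
static int sum(const int *p, int rows, int cols)
{
    int total = 0;
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++)
            total += p[i * cols + j];
    return total;
}

int main(void)
{
    int a[3][4] = { {1, 2, 3, 4}, {5, 6, 7, 8}, {9, 10, 11, 12} };
    printf("%d\n", sum(&a[0][0], 3, 4));   /* prints 78 */
    return 0;
}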

BASIC is a good toy for children.

HAHAHA! What a literal faggot!

well it is the foundation on which Microsoft was built
you can figure out how Microsoft built its empire by trying to understand what they were going for

An array variable like a in char a[10] is a pointer? That's not how I understand it, but maybe the next C draft made arrays even worse. In a sane language, it's an array, an object that contains 10 characters, and taking its address gives you a pointer to the whole array.

Of course there's decay. It's not an array anymore or even a pointer to an array, just a pointer to a single element, which you can access like an array because of a hack. The type is different, not that the average C weenie, or even an above average one, knows anything about the bullshit that is C types. In C there's brain-dead bullshit where a[i] actually means *(a + i), so you're actually doing *(a + i) even when a is declared in the same function.
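
The *(a + i) thing isn't an exaggeration either; subscripting is literally defined that way, which is why this useless-but-legal snippet compiles and prints the same element three times:

#include <stdio.h>

int main(void)
{
    int a[4] = {10, 20, 30, 40};

    /* a[2] is defined as *(a + 2), and since addition commutes,
       2[a] names the same element too. */
    printf("%d %d %d\n", a[2], *(a + 2), 2[a]);   /* prints: 30 30 30 */
    return 0;
}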

C multi-dimensional arrays are broken. Having to reinvent wheels and remake important programming concepts from scratch because the language doesn't do it properly is a huge inconvenience, and it sucks even more because you can't pass a multi-dimensional array to another function without a bunch of bullshit. If C did arrays properly, the language would be smaller, have fewer corner cases, and not suck as much.


No, that's exactly right. Languages like Lisp, PL/I, Ada, and even the smallest versions of BASIC do much more than C does, but are much simpler in the areas that C has in common. This causes C programs to be more bloated, including machine code size, than programs in other languages. That's not the only thing that makes a language good, but considering how little C does compared to BASIC, let alone Lisp, the language has no reason to be so bloated and complex. C is complex because it's badly designed, not because it has a lot of features. It doesn't even have a way to handle integer overflow or properly pass an array to a function.

There are many reasons why GNU Emacs is as big as it is while its original ITS counterpart was much smaller:
- C is a horrible language in which to implement such things as a Lisp interpreter and an interactive program. In particular any program that wants to be careful not to crash (and dump core) in the presence of errors has to become bloated because it has to check everywhere. A reasonable condition system would reduce the size of the code.
- Unix is a horrible operating system for which to write an Emacs-like editor because it does not provide adequate support for anything except trivial "Hello world" programs. In particular, there is no standard good way (or even any in many variants) to control your virtual memory sharing properties.
- Unix presents such a poor interaction environment to users (the various shells are pitiful) that GNU Emacs has had to import a lot of the functionality that a minimally adequate "shell" would provide. Many programmers at TLA never directly interact with the shell, GNU Emacs IS their shell, because it is the only adequate choice, and isolates them from the various Unix (and even OS) variants.
Don't complain about TLA programs vs. Unix. The typical workstation Unix requires 3 - 6 Mb just for the kernel, and provides less functionality (at the OS level) than the OSs of yesteryear. It is not surprising that programs that ran on adequate amounts of memory under those OSs have to reimplement some of the functionality that Unix has never provided.

I've already seen 10 years of degenerates who think they're low level or sound because they use C.

Also here's something neat: a space-trading game in one (long) line of BASIC.
worldofspectrum.org/infoseekid.cgi?id=0019039

Attached: straderl.gif (320x240, 3.97K)

Hey UnixHaterAnon, I have a question: Could Lisp machines do real-time DSP in their time?
I ask that because C/C++ and Fortran are the only high-level languages people use to do that, mainly because they aren't garbage collected, and all Lisp implementations I've heard of rely on a GC.
I would love to see a programmable guitar pedal with a Lisp interpreter/JIT compiler embedded in it, but no one has done it yet because it doesn't seem to be a language suitable for that purpose, and I'm glad that C/C++ compilers allow us to mess with audio and video shit with their blast processing capabilities, so I'm not really sold on your UNIX hating tbh m8.

Is something that annoying loser would say.

Just because the file is one line does not mean that the program is one line. There are statement separators (:) everywhere.


What is Forth?
For a DSP, C/C++ and Fortran are just high-level assembly. In fact, they probably aren't using these languages at all, but rather a limited subset that looks like them, superficially. So we can define an "Embedded LISP" that probably doesn't have eval, so it could be aggressively compiled into code for an embedded processor (aggressive here meaning that it is compiled so as to have static memory allocation). No one has done this.

What did you mean by this?
Seriously, take github.com/VCVRack/Rack or github.com/topisani/OTTO for example; it's all pure modern C++.
Of course they don't use every single feature of the language, but doing that in C++ would be pure foolishness, since the whole point of it is to have a bunch of "zero-cost abstractions" at your disposal with a(n almost) seamless integration with C.

That said, yeah, I forgot about Forth and a Lisp dialect for DSP could probably be fine, even though I doubt that linked lists would be the right kind of data structure for that domain.

It's literally all on line 1 tho. Sure, he could have named it 1-screen game instead but what kind of lame ass game would only fit in one statement? This one's actually playable.

> github.com/VCVRack/Rack
< pure modern C++

> github.com/topisani/OTTO
< DSP is written separately in Faust, a functional language designed for audio processing that compiles to C++

When I read DSP, I thought that you meant the class of processor, not the application. As for that, normal LISP can run digital signal processing code fine. A decently optimizing LISP interpreter is probably not going to invoke the GC for any code not written by a complete idiot. For this case, I would say that lousy developers write in C++ because the lack of a garbage collection run hides how shitty their code is.

And another thing: this doesn't really matter. A good compiler is going to produce the best output code, which doesn't necessarily match the internal representation of the language. C/C++ weenies like to believe that they are programming "close to the metal", when, in reality, their compiler is going to convert their code into a functional-like IR, because most code optimizations have been investigated in functional languages, before mangling it even more, ignoring half of the things that they told the compiler to do, and producing a result that has the semantics of their program (as coded). A good LISP compiler will do the same, though LISP has had fewer people doing that work. LISP would be faster than C/C++ if it had half as many people working to optimize it as GCC and Oracle's Java do.

I am tired of hearing your LITHP.

Is smalltalk/squeak still worth learning?

For a non-tech perhaps, or if you are into retro computing.
BASIC would make sense if you went down the engineering route, as I think some of them still use FORTRAN, which has similarities (particularly the older versions), but even then it's not needed in depth.
It was more of a stop-gap between punch cards and modern languages.


It is as simple as it gets - BASICally a numbered list the computer goes through. It is like a first cousin of assembly that talks in vague terms and shits the bed if you don't follow each item on its list to the letter.


Actual programming with limited time for most value? Then C, C++, Java (if you are looking at building apps & using frameworks).
If you want something you can use to quickly automate a task of an activity you're currently doing manually (i.e. where programming would be most useful to you now) - Python, Perl, Bash, JavaScript, PHP.

Attached: ClipboardImage.png (3264x2448 281.63 KB, 12.05M)

That rocked Assembly with its balls out. Those Cambridge mathematicians did some crazy sh*t to fit that 1000-planet system + trading platform + 3D wireframe space flight simulator + classical music score into a measly 64kb.
en.wikipedia.org/wiki/Elite_(video_game)#Development

Unlike today's StarCult the game actually delivered.

I've been getting sort of decent at Perl.
Is Ada a meme?

Just don't use GOTO you pussies

I will beat your faggot ass into oblivion