30 years ago

>every single 10 year old that had a home computer in his house knew how to program in BASIC
>an adult in his early 20s who uses a computer for almost everything doesn't even know what a variable is

what happened?

Attached: moves.gif (300x424, 2.56M)

Computers got too easy to use.

no one has a computer nowadays

Attached: gameoflife.mp4 (1280x720, 4.68M)

>Computers are hardly useful for more than work, thus mostly used by professionals
>Computers are basically a necessity for modern life and everyone has one or several of them

people have smartphones, computers are for the ancients

Attached: whatisacomputer.jpg (1280x720, 69.59K)

In 1988 the computer gaming industry was already big, and the hardware was good enough to do music and pixel art (esp. on Amiga and Atari ST). But even on PC, lots of dudes made things, even if it was just ANSI art, and the BBS scene was pretty strong.
But overall it was still more of a hobbyist environment, where you learned to program games in BASIC or Turbo Pascal and did shit on your own or in small groups, without any corporate/political influence or a bunch of random faggots telling you what you're supposed to like and do.

Attached: A-Talk.jpg (1199x930, 291.78K)

that's a facebook-machine, not a computer

Subhuman normalfags ruin everything.

Was Jobs a mistake?

Attached: ClipboardImage.png (1280x720, 480.59K)

Eternal September, and computers being made easy to use in the dumbest way it could possibly have been done:
>IT JUST WORKS
There was never a good pedagogical system for the masses, and there still isn't one. Using a computer should have required a computer license, but nooo, let's just let anyone drive the car with no license.

Reminder that Jobs sniffed his own farts so hard that it killed him.

We must have many children and teach them the lost art of proper computing and steer them away from the wide road to hell that is mobile platforms.

lol

only rich well-to-do families had computers 30 years ago.

That's the problem. Corps were hiding computers from users under layers of abstraction even back then.

You could still use assembler if you wanted, and many did, because BASIC was too slow for more ambitious projects.

Obligatory

Attached: Terry Davis - Where It All Went Wrong.webm (1280x720, 1.39M)

t. not me
t. me
op teach me how to be good at computers like the good ol days

I don't expect everyone to be able to code, but people should at least understand the basics of using a computer. Like those people who save everything to the goddamn desktop because they don't understand how directories work. That's simply not acceptable.

Millennials are truly a wasted generation.
At least Gen Zyklon B are using SBCs at the age of 10.

/thread

not so much wasted as lost. unfortunately for gen z, the effects of an entire generation being utterly destroyed economically will be felt long after we are dead. the boomers are crashing this plane with no survivors.

They made the machines stupid so they wouldn't scare the nigger cattle, that's what happened. Every baboon and retard can have a pooter nowadays.

There wasn't much abstraction. You didn't get all kinds of libraries and frameworks; you had to write your own stuff. BASIC even gave you direct access to all the computer's memory and ports, and you could overwrite the OS if you wanted to (pretty much the equivalent of ring 0 on modern hardware). But mostly that was useful for POKE'ing machine language instructions into memory and calling them like a subroutine, because BASIC itself was good enough for turn-based or slow-paced games, but if you wanted to do a fast action game, you needed some machine language.
Terry mentions in one of his videos that a very common beginner's program was a memory dumper. You could do something like that in a dozen lines of code, then expand it into an actual hex editor, and later modify it to read/write a floppy disk instead of memory. There weren't any big hurdles in the way like a nanny OS or a complicated filesystem. And best of all, if you fucked up, simply hitting the reset button brought you back to a sane state in one second (assuming your machine had BASIC in ROM or on cartridge); otherwise it took a few more seconds.

Attached: CoverTRS80News.jpg (508x657, 402.42K)
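
For flavor, here's roughly what such a memory dumper looked like, in generic line-numbered 8-bit BASIC (a sketch of my own, not Terry's actual code; addresses and PEEK behaviour vary by machine):

10 REM MEMORY DUMPER - ILLUSTRATIVE SKETCH ONLY
20 INPUT "START ADDRESS"; A
30 FOR R = 1 TO 8
40 PRINT A; ":";
50 FOR I = 0 TO 7
60 PRINT PEEK(A + I);
70 NEXT I
80 PRINT
90 A = A + 8
100 NEXT R

From there the expansion path is obvious: format the output as hex, add POKE on keypress to turn it into a hex editor, then point the same loop at floppy sectors instead of RAM.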

Do you even need to ask?

Attached: RMS on Steve Jobs' Death.jpg (800x450, 55.91K)

install gentoo fgt

The need to learn how to program is far less today than it was before

I don't agree, and not everyone needs to be a programmer, but people don't even know what an operating system is. Even when I was a kid in the 90s we were at least taught basic stuff about computers, dos, and windows. Now they just 'teach' nigger-cattle to use microsoft office, and every aspect of the computer itself is just voodoo.

IQ curve. When computers were only available to top researchers the users were even smarter. As more people get access to computers of fucking course the quality of the average user will go down.

If you think a variable in computer science and a variable in grade school mathematics are substantially different things, then you don't know shit about computers.

...

user, in 1 = x + 2, x is not a variable. It's a constant.

but in y = x, x is a variable

Well actually, it is an equality.

No, it's a constant with a value of y. If you have f(x) = x then x is a variable but I don't think you see that in grade school.

...

They were dumbed down for lower-middle class whites, fag.
In what dimension did Nigs have money to buy computers en masse before the mid-2000s?

y=x is only used for basic algebra. After Year/Grade 8, you use f(x).

As long as the point is getting across.

Attached: 2b4aa324da0089e8c7245734bed4f5c611d1067e852a00546aeb23d8947205b7.jpg (1200x900, 261.22K)

Both x and y are variables, x being independent (taking any value from the function's domain) and y being dependent on x (and thus taking values from a range which depends on both the domain of x and the relationship between x and y established by y = f(x)).
The derivative expressed as dy/dx or y' is hardly "basic algebra". Even in differential equations y is used because it's much simpler to write y (when it is obvious what y is and what it is dependent on) than f(x).
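
To pin the terminology down with a worked example (the specific function here is my own illustration, not from the thread), in LaTeX:

\[ y = f(x) = x^2, \qquad x \in \mathbb{R} \ \text{(independent)}, \qquad y \in [0, \infty) \ \text{(dependent)} \]
\[ \frac{dy}{dx} = y' = f'(x) = 2x \]

x is free to range over the whole domain; y is fixed as soon as x is, which is exactly what "dependent" means, and dy/dx is just compact notation for the derivative of f.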

BASIC came with many computers on just one floppy, while C compilers and the like were nowhere near as easy to get back then.

BASIC was simple. I remember finding it on the Windows 98SE CD, running it, and I could already start to program. These days, a guy wanting to program has to: choose a language, install a compiler, install an IDE, install libraries. Then deal with dumb shit such as declaring variables, semicolons, or, God forbid, forced fucking indentation, instead of just coding.
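
For contrast, the entire "setup" back then was typing something like this into the interpreter and hitting RUN (a made-up snippet, but representative of what QBasic accepted):

10 INPUT "WHAT IS YOUR NAME"; N$
20 PRINT "HELLO, "; N$
30 GOTO 10

No declarations, no compiler, no project files; the interpreter was the whole environment.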

Seems incredibly parochial to imagine that every non-brainlet dedicating their time to learning to program would be constructive. What is physics, electrical engineering, biology, etc.?

No one demands that everyone become an expert hacker; basic computer literacy is enough.
And those are exactly the fields of science where even a little bit of programming know-how can help immensely.

To be fair, people are already taught the bare basics of physics and biology. IMO more about electricity should be taught; I've sometimes felt it necessary. The basics of coding should be taught too, but CS should not (if there's time, add some more math or logic instead). As for dummies: coding is something anyone with a three-digit IQ is able to grasp (with effort, I'm not saying it's easy); don't forget even MBAs manage it with Excel.

Programming is too focused on UNIX languages like C, C++, Java, JavaScript, and PHP, which all suck. BASIC is simple and easy to learn.

I feel compelled to submit the following piece of C code:

switch (x)
    default:
        if (prime(x))
            case 2: case 3: case 5: case 7:
                process_prime(x);
        else
            case 4: case 6: case 8: case 9: case 10:
                process_composite(x);

This can be found in Harbison and Steele's "C: A Reference Manual" (page 216 in the second edition). They then remark:

"This is, frankly, the most bizarre switch statement we have ever seen that still has pretenses to being purposeful."

In every other programming language the notion of a case dispatch is supported through some rigidly authoritarian construct whose syntax and semantics can be grasped by the most primitive programming chimpanzee. But in C this highly structured notion becomes a thing of sharp edges and loose screws, sort of the programming language equivalent of a closet full of tangled wire hangers.

and they grew up to be faggots who post stories on HN about how they used basic when they grew up and they still suck. seriously, everyone there is retarded and they spend their entire life coming up with new memes like "le abstraction doesn't exist" and "security is hard". they don't even know how to do some basic concurrent programming, or how to sanitize input into their retarded database queries
nope, all the kiddies know JS now. but yes unfortunately they don't know what an integer is and all their programs spew undefined and NaN everywhere

Computers went from extremely niche items that only autists would seek out to being extremely common items that are in every single household no matter what.

All 100 of them. kek

Wat. 30 years ago was 1988. There were tons of home computers in homes at that time. What you're talking about is more like the mid-70s, when you had to build your own computer from a kit.
I personally knew well over a dozen other kids with various computers, everything from the Apple II and TRS-80 to the Amstrad CPC and Atari ST. And they all knew how to operate them and write some BASIC. The manuals that came with the computer taught you everything.

Attached: 2.png (384x272, 2.34K)

They used to teach BASIC to every kid in school who wasn't at the lead paint chip eater tier.

Dartmouth pioneered the idea of teaching it to liberal arts majors, and this spread far and wide; some people even credit it with sparking the home computer revolution.

These days the funding and energy is directed toward studies departments so the Chinese can dominate us. Yes, it was commies on the Long March Through The Institutions who ruined education in the USA.

Not my manuals, man. I want a single page. It needs to contain only surface material. I don't need to know the internals of programs. I don't need to know how the outputs work with other programs. Just a single page, preferably with one sentence on it. Also, everything is electronic data now; no more physical copies, ever.

When HP started selling RPN calculators without a full printed manual, I knew we were fucked.

Attached: watch-out.jpg (639x419, 44.48K)

A reality which universities and faggots on the internet refuse to accept is that LOWER-LEVEL LANGUAGES ARE EASIER TO LEARN FOR BEGINNERS; and because they generally mimic the operation of actual hardware better (i.e. you can convert a statement to an approximation of clocks used in your head), PEOPLE WHO LEARN WITH LOWER LEVEL LANGUAGES ARE SUPERIOR PROGRAMMERS.

The reason crap like C# and JavaScript keeps getting taught as a _first_ language is that people who use it in their jobs hope novices will be baby-duck'ed into using it too, just like them. The other reason is that the people who spend their time developing these frameworks and abstractions simply don't want to see something they put work into get ignored.

However, the actual experience for a complete novice using one of these is:

"....... But really, why are you implementing it yourself? Just use ThisFadLibrary instead."

Do you think computer classes should start with something like an Altair 8800?
That would be rad.

C only "mimics" the hardware on hardware made to run C in the first place. RISCs mimic the operation of C because RISCs were designed to run C and UNIX programs. Any instruction not used by a C compiler is considered CISC by RISC weenies. Lisp machines are about efficiency and productivity. They were invented to make dynamic typing and GC and bignums faster because they were slow on most other hardware. Having these features on the lowest levels makes them more efficient and more productive.

C# and JavaScript look like C and they were designed for C and Java programmers. Hating C# and JavaScript means hating UNIX.

You're starting to understand the problem, but this applies even more to C and UNIX. That's why UNIX-Haters is still relevant. That's why they're still shilling Plan 9, an OS that was bad for 1991.

Subject: Hating Unix Means Hating Risc
Date: Fri, 22 Mar 91 21:34:47 EST
From: JW

Hey. This is unix-haters, not RISC-haters.

Look, those guys at berkeley decided to optimise their chip for C and Unix programs. It says so right in their paper. They looked at how C programs tended to behave, and (later) how Unix behaved, and made a chip that worked that way. So what if it's hard to make downward lexical funargs when you have register windows? It's a special-purpose chip, remember?

Only then companies like Sun push their snazzy RISC machines. To make their machines more attractive they proudly point out "and of course it uses the great general-purpose RISC. Why it's so general purpose that it runs Unix and C just great!"

This, I suppose, is a variation on the usual "the way it's done in unix is by definition the general case" disease.

Do you know how I know you are full of shit?

I wouldn't pay even a dollar more for the extra silicon needed to implement that bloat. Especially when you consider the inevitable increase in power consumption.
Cool buzzwords. Are you some sort of corporate manager?
ebin

You would end up saving money, silicon, power, and memory. We're already paying more for x86 bloat and wasted silicon than any extra silicon that would come from a Lisp machine.

people.eecs.berkeley.edu/~kubitron/cs252/handouts/papers/symbolics.pdf

No, but it is more efficient and more productive. UNIX "academic" handouts use those buzzwords too.

cosm.sfasu.edu/gharber/353/notes/Unix_philosophy.pdf
While it's true that UNIX is slow, it's also not productive because of C.

W and X suck because of Y and Z.

> There's nothing wrong with C as it was originally
> designed,
> ...

bullshite.

Since when is it acceptable for a language to incorporate two entirely diverse concepts such as setf and cadr into the same operator (=), the sole semantic distinction being that if you mean cadr and not setf, you have to bracket your variable with the characters that are used to represent swearing in cartoons? Or do you have to do that if you mean setf, not cadr? Sigh.

Wouldn't hurt to have an error handling hook, real memory allocation (and garbage collection) routines, real data types with machine independent sizes (and string data types that don't barf if you have a NUL in them), reasonable equality testing for all types of variables without having to call some heinous library routine like strncmp, and... and... and... Sheesh.

I've always loved the "elevator controller" paradigm, because C is well suited to programming embedded controllers and not much else. Not that I'd knowingly risk my life in an elevator that was controlled by a program written in C, mind you...

Nigger, I use LISP and I can tell you're full of shit.

This is great.

Attached: laughing bird.mp4 (480x480, 949.63K)

That crowd still exists and is probably bigger than ever. They simply don't have the majority anymore. Instead, the largest group is the consumers, who wouldn't have really had computers back then.

Attached: 2018-hey-son-i-found-a-picture-of-your-grandpa-31653661_1.png (500x514, 324.33K)

1978 Autist -- those of incredible mental capacity, who could develop code in their heads, convert it to octal/hexadecimal, type it into the microcomputer's monitor program, and have it work first go. Got their first job at 15 when a university professor noticed their prodigious abilities and introduced them to the MAINFRAME.

2018 Autist -- whacks it to fanart of cartoon characters, still lives with their parent(s) at 26, and their favourite hobby is posting Nazi pix to the Internet because pissing people off gets them excited.

Attached: OPvt220.png (449x337, 2.5K)

Dumb question: can't you just make a Lisp machine on an FPGA?

>Not targeted at common people
>Targeted at common people

Yes. It won't necessarily be fast, though (compared to modern CPUs).

In theory you can; in practice FPGA tools are a bitch to work with, or so I heard.
LoperOS when

FPGA tools are easy to work with

>Automatic storage management is superfluous
Fixed.
Less storage than what? Data in a program written in a garbage-collected (lol) dynamically-typed (lol) language executed on a conventional architecture?
In other words, more complex and wasteful decoding logic.
How about: don't use a program that does "tag checking" if you care about performance.

Most I knew had a Sega or Nintendo, and later a PlayStation and Windows/Mac shit.

Hello /g/

The 'original' Lisp machines would've -loved- FPGAs, as they needed to do things with registers and memory that normal computers weren't great at.

As I was informed, reasonably priced FPGAs with an open SDK are shitty and can't do anything useful.
Open SDKs are usually in a barely-implemented state and can't do anything useful either.
Cool and powerful FPGAs are prohibitively expensive, have proprietary SDKs that are also prohibitively expensive, and said SDKs only work on Windows.

It would be good to be wrong here though.

Muhellinals

Attached: 1521913079896.webm (448x252, 3.98M)

We need a war.

Attached: 1467935694790.gif (290x705, 301.72K)

No. Concentration camps for rehabilitation.

This fucking garbage takes up 4 megs. Meanwhile 30 years ago you could fit pic-related on an 880K floppy.

Attached: Neuromancer.png (320x256, 13.56K)

Proven to be ineffective. The best solution is gas chambers.

Where did it all go wrong?

Attached: dae.jpg (1280x720, 172.54K)

the chief reason why people get cuck'd by companies and salesmen is that they just don't know how a computer works on a basic level, or how to use one.


that video made me so very angry.
Also, if I ever meet the fag who said "anti-social" in the video, I would say: "I'd just like to interject for a moment. What you're referring to as anti-social, is in fact, asocial!"

youtube.com/watch?v=AWXP_Ao-JIk

I might just kill myself after watching that.

bump