The Real Computer Revolution Hasn’t Happened Yet

That'd be so shit. I'd need to explicitly tag all the shit I create. It would be easy to do bad shit like forget to keep the metadata accurate as the data changes. How would the computer even explicitly reference one particular file?

Per-file IDs? Also, the OS itself would probably add per-file tags, but I didn't think of that. Like if it has an image it has an image tag, etc.
Here's an actual working concept of what I'm describing: eecs.harvard.edu/~margo/papers/hotos09/paper.pdf
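Rough sketch (in C, with purely hypothetical struct and function names, not anything taken from the paper) of what a per-file-ID tag store could look like, with the OS stamping some tags automatically:

#include <stdio.h>
#include <string.h>

#define MAX_TAGS 8

/* Hypothetical: every file gets a stable numeric ID at creation, and
   tags hang off that ID instead of off the path. */
struct tagged_file {
    unsigned long id;
    char tags[MAX_TAGS][32];   /* e.g. "image", "oekaki" */
    int ntags;
};

/* Adds a tag unless it's already present or the table is full. */
static void add_tag(struct tagged_file *f, const char *tag)
{
    for (int i = 0; i < f->ntags; i++)
        if (strcmp(f->tags[i], tag) == 0)
            return;
    if (f->ntags < MAX_TAGS)
        strncpy(f->tags[f->ntags++], tag, sizeof f->tags[0] - 1);
}

static int has_tag(const struct tagged_file *f, const char *tag)
{
    for (int i = 0; i < f->ntags; i++)
        if (strcmp(f->tags[i], tag) == 0)
            return 1;
    return 0;
}

int main(void)
{
    struct tagged_file f = { .id = 42 };
    add_tag(&f, "image");   /* stamped by the OS when it sniffs the file type */
    add_tag(&f, "oekaki");  /* added explicitly by the user */
    printf("file %lu tagged image? %d\n", f.id, has_tag(&f, "image"));
    return 0;
}

The point being that the user only supplies the tags the system can't infer; the ID and the obvious tags come for free.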

Attached: Oekaki.png (500x250, 14.92K)

UNIX leaders say they're supposed to be bad and obsolete. UNIX weenies think Multics is primitive and bloated.

It's the same with programming languages. "Why Pascal is not my favorite programming language" is C marketing. He says "Comparing C and Pascal is rather like comparing a Learjet to a Piper Cub", which is like saying Learjets are unreliable, have a lot of design defects, and cause billions of dollars in damage. Comparing Pascal only against C is how I know it's marketing. "The size of an array is part of its type" is true in Ada, PL/I, and Fortran too, but those languages handle the bounds of array parameters better. The C "solution" is array decay, which sucks, so C can't do bounds checking or multidimensional arrays properly. He talks about #include as a good thing, but new languages in 1981 already had modules. Pascal now has modules too, while C is still stuck with #include in 2018 and it can't be fixed. If it weren't marketing for C, it would have included a few more languages to show how they solved those problems, but that would make C look worse than Pascal.
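For anyone who hasn't hit the decay thing, here's the standard demonstration in C (nothing from the essay itself): the size is part of the array's type in the caller, and gone the moment you pass it to a function.

#include <stdio.h>

/* The declared bound 10 is decoration: the parameter is really int *a,
   so the callee has no length to bounds-check against. */
static void print_size(int a[10])
{
    printf("inside callee: %zu bytes\n", sizeof a);   /* sizeof(int *) */
}

int main(void)
{
    int a[10];
    printf("inside caller: %zu bytes\n", sizeof a);   /* 10 * sizeof(int) */
    print_size(a);
    return 0;
}

In Pascal, Ada, PL/I, or Fortran the bound travels with the array one way or another, which is exactly the part the essay glosses over.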

What really sucks is that the buzzword-powered kludges resemble real solutions made by other communities, like dynamic linking, OOP, high level systems languages, and so on, but they don't have the same benefits as the real thing.

Why am I retraining myself in Ada? Because since 1979 I have been trying to write reliable code in C. (Definition: reliable code never gives wrong answers without an explicit apology.) Trying and failing. I have been frustrated to the screaming point by trying to write code that could survive (some) run-time errors in other people's code linked with it. I'd look wistfully at BSD's three-argument signal handlers, which at least offered the possibility of providing hardware-specific recovery code in #ifdefs, but grit my teeth and struggle on having to write code that would work in System V as well.
There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once.
When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".
It would be a good thing if hardware designers would remember that the ANSI C standard provides _two_ forms of "integer" arithmetic: 'unsigned' arithmetic which must wrap around, and 'signed' arithmetic which MAY TRAP (or wrap, or make demons fly out of your nose). "Portable C programmers" know that they CANNOT rely on integer arithmetic _not_ trapping, and they know (if they have done their homework) that there are commercially significant machines where C integer overflow _is_ trapped, so they would rather the Alpha trapped so that they could use the Alpha as a porting base.
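To make the signed/unsigned point concrete, a minimal C sketch (mine, not the quoted author's) of what the standard actually promises:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Unsigned arithmetic is required to wrap modulo 2^N. */
    unsigned int u = UINT_MAX;
    u = u + 1;
    printf("UINT_MAX + 1 wraps to %u\n", u);   /* prints 0 */

    /* Signed overflow is undefined: it may wrap, trap, or do anything
       else, which is why he wants hardware that traps it.
       The only portable move is to check before you add. */
    int s = INT_MAX;
    if (s > INT_MAX - 1)
        puts("INT_MAX + 1 would overflow; not evaluating it");
    else
        s = s + 1;

    return 0;
}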

It was better when everyone had to use 8.3 filenames on a floppy disk without subdirectories. All "improvements" eventually just get leveraged to a greater extent by big brother and big data.

Lisp machines died because they needed dedicated hardware support and Intel microprocessors killed all of that.

this

Computing for its own sake has been tainted by businessfags and normalfags. They vastly outnumber anyone actually interested in computing and are only interested in "getting the job done" whatever that means.

What? Do you want us to make an FPGA implementation of a LISPM and develop an operating system for it as a one-man team? That's a big undertaking.
They don't, though. Both of these languages have char, int, long, etc. This is one reason why they aren't object-oriented languages.

I was doing some LISPM research through my university's research database and found this take on it.
Then came the summer of 1986. A lot of users were concerned with the high price of LISP machines and started running applications on more economical delivery vehicles, like IBM Personal Computer ATs and Sun workstations. Sure, there was a performance differential, but the trade-off was worth it. Why buy a Symbolics 3600 when you could get three Sun computers for the same price? The AI consulting firm DM Data estimated that by the end of 1986, there were more than 6,000 LISP machines installed worldwide -- more than a third of them from Symbolics. However, DM Data also estimated that there were probably fewer than 6,000 LISP programmers qualified to take full advantage of the machine. That means there are LISP machines sitting in companies with nothing to do.

The real computer revolution has come and gone and it has left most of humanity behind, OP.
Tech illiterates are just getting the hand-me-downs from technologists.
It hasn't helped man achieve that much in the way of mental work. It has instead enabled people to amuse themselves with idle entertainment.

LISP machines died because they served no purpose other than to fuel MIT's terrible computer science department.


If LISP ran faster on UNIX and C-based, even Windows, computers than on LISP machines, why is it still constipated in practical use? LISP is nothing more than a fanciful research language that has no real practical purpose.

If LISP runs faster on UNIX and C-based, even Windows, computers, then why even spend the time implementing any kind of LISP hardware when it's unnecessary bloat?

Entertainment will be the death of mankind.


Lisp machines are from the 70s and 80s, you can't compare them to modern-day computers. What if we make a LISP that actually runs fast though?