The Real Computer Revolution Hasn’t Happened Yet

hooktube.com/watch?v=oKg1hTOQXoY
archive.is/kzpa
(This is all the dialogue from the video, for those who will not watch/scrape it.)
Instead of an *actual* revolution, we have the following:
Java, C++, Javascript, Web engineering [sic], Windows, modern Unix derivatives (including Linux, BSDs, OS X, iOS, Minix, Android, Plan 9), x86, ARM, and the whole "Cloud" business. (Also, decades before all this happened, C and Unix essentially extirpated more interesting languages (Pascal, Forth, Fortran, Lisp) and operating systems (ITS, Multics, VMS).)

Basically, post-Smalltalk and post-Symbolics computing is snake oil. We're in the dark ages, utilising repainted hardware and software from the 70s. The only thing that has kept it from collapsing is Moore's law, and that is now failing.
All the things named were pushed as a "next big thing" while the _real_ important technologies have faded away into obscurity. x86 should have died at the hands of Symbolics Ivory or DEC Alpha, any Unix or Windows systems should have been replaced by some offspring of Genera and Smalltalk-80, C++/Java are bloated and inferior to contemporaries like Common Lisp, et cetera, et cetera. The modern computer industry is a joke; it's unable to discern buzzword-powered kludges from the real thing.

I've also provided the essay by Alan Kay in pdf format.

(Disclaimer, I haven't read the PDF.)
You know what that's called? Natural selection.
Get over it. The ugly brute that is C was better, practically speaking, than the rest. Nobody's stopping you from using Pascal if you want to. Get a compiler and IDE, grab a tutorial, register on a newbie forum and start coding like hell.
Tough shit, you being a genius and all. The tyranny of the mediocre majority, stepping on each other's toes rather than standing on each other's shoulders, etcetera.
Are you going to include OOP with its design patterns in that? I would. But back on track:
Yes it did. Every animal has a handheld computer with an internet connection, a high-res video camera, and a touchscreen, sometimes even functioning as a wireless telephone. That's the real computer revolution, and it happened. Giving magic technology to the plebs and them being able to use it even without understanding it.

What is it that you wanted, exactly, in the "real" revolution? Tech that was so arcane and difficult to use that only specialized CS doctors were allowed to touch it?

Why do you assume I can't program? I do program in FreePascal and Common Lisp (SBCL) for (believe it or not) end-user (personal, to be fair) programs. I'm planning on learning Ada soon (that is another language that wasn't allowed to shine because of C/C++). Anyways, that is beside the point.
Alan Kay does not consider Java/C++ to be OOP. His definition of OO is Lisp and Smalltalk.
Alan Kay spearheaded the idea of personal computing* and Lisp Machines had far more documentation than Unix (and every other OS today, really).
This could have been done with the elegance and dignity of computers retained. *history-computer.com/ModernComputer/Personal/Dynabook.html

t-this entire board is gonna be nothing but lisp meanies, isn't it?

This. Where's your modern Lisp OS now? Don't say it's too hard to do. There's tons of super-niche software projects like Genode and 9Front and Haiku, and even super-niche hardware projects like EOMA68 and TALOS II.
a lot of silly anons think OOP is stupid and for pajeets ^.^
I don't think so, but I don't think it's really necessary for a lot of stuff.
maybe making the OOP stuff optional would be a better idea. Languages like Java and C# require you to do all the OOP class and object stuff even if you're not really using it.
apparently for C-like languages to not exist anymore because they hurt his feels UwU
My "real" revolution would be a Free Software revolution, but that's just me.


It feels like you meanies' definition of everything is Lisp and whatever the other dead OSes used.
*giggles*
once again, you lisp meanies' definition of everything good seems to be "not C or C-like."

Attached: UwU.jpg (480x640, 44.57K)

Attached: clisp-win32.png (677x342, 47.38K)

Do you understand the difference between a family of programming languages and one implementation of one dialect of a programming language?
Lisp is jewish, like AWK, but that screenshot is the least coherent way to try to prove it.

The real redpill is database filesystems. UNIX spread the idiotic idea of hierarchical filesystems, while database & metadata-based filesystems are objectively superior.

Attached: canvas.png (500x250, 16.42K)

That's too far back. We need to go back to the paradigm of early Unix, when filesystems were simply directed graphs, no tree structure imposed.

Yeah, no. We need tag-based filesystems only. No filenames, no file extensions. No one needs that shit.

Attached: Oekaki.png (500x250, 22.17K)

All persistent data should be stored in a single flat file, directly mapped to disk storage. The rest of the filesystem is kept around only as an abstraction layer for operating system facilities.

That'd be so shit. I'd need to explicitly tag all the shit I create. It would be easy to do bad shit like forget to keep the metadata accurate as the data changes. How would the computer even explicitly reference one particular file?

Per-file IDs? Also, the OS itself would probably add per-file tags, but I didn't think of that. Like if it has an image it has an image tag, etc.
Here's an actual working concept of what I'm describing: eecs.harvard.edu/~margo/papers/hotos09/paper.pdf

Attached: Oekaki.png (500x250, 14.92K)
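
To make the per-file-ID idea concrete: a minimal sketch in C of a tag-addressed index, assuming a made-up in-memory table (file_entry, find_by_tag, and the sample tags are hypothetical names for illustration, not anything from the hFAD paper). Files are looked up by tag and referenced by a stable numeric ID instead of a path.

/* Minimal sketch of a tag-addressed index: files are referenced by a
   stable numeric ID and found by tag rather than by path name.
   file_entry, find_by_tag and the sample tags are hypothetical. */
#include <stdio.h>
#include <string.h>

struct file_entry {
    unsigned long id;        /* stable per-file identifier */
    const char *tags[4];     /* NULL-terminated list of tags */
};

static struct file_entry table[] = {
    { 1, { "image", "vacation", NULL } },
    { 2, { "text",  "notes",    NULL } },
};

/* Return the ID of the first file carrying the given tag, or 0 if none. */
static unsigned long find_by_tag(const char *tag)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        for (size_t j = 0; table[i].tags[j] != NULL; j++)
            if (strcmp(table[i].tags[j], tag) == 0)
                return table[i].id;
    return 0;
}

int main(void)
{
    printf("file id for tag \"image\": %lu\n", find_by_tag("image"));
    return 0;
}

A real system would obviously persist the index and let the OS and applications attach tags automatically, as suggested above.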

UNIX leaders say they're supposed to be bad and obsolete. UNIX weenies think Multics is primitive and bloated.

It's the same with programming languages. "Why Pascal is not my favorite programming language" is C marketing. He says "Comparing C and Pascal is rather like comparing a Learjet to a Piper Cub" which is like saying Learjets are unreliable, have a lot of design defects, and cause billions of dollars in damage. Comparing Pascal to C is how I know it's marketing. "The size of an array is part of its type" is true in Ada, PL/I, and Fortran too, but the bounds of parameters are handled better. The C "solution" is array decay, which sucks, so C can't do bounds checking or multidimensional arrays properly. He talks about #include as a good thing, but new languages in 1981 already had modules. Pascal now has modules too and C is stuck with #include in 2018 and it can't be fixed. If it wasn't marketing for C, it would have included a few more languages to show how they solved those problems, but that would make C look worse than Pascal.
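
To make the array-decay point concrete, a small standard-C illustration (nothing vendor-specific, any C99 compiler will do): the callee only ever sees a pointer, so the size that was "part of the type" at the call site is gone and no bounds check is possible.

/* C array decay: inside the callee the parameter is silently adjusted
   to a pointer, so the array's size is lost and cannot be checked. */
#include <stdio.h>

static void in_callee(int a[10])                 /* really means: int *a */
{
    printf("inside callee : %zu\n", sizeof a);   /* size of a pointer */
}

int main(void)
{
    int a[10];
    printf("at call site  : %zu\n", sizeof a);   /* 10 * sizeof(int) */
    in_callee(a);
    return 0;
}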

What really sucks is that the buzzword-powered kludges resemble real solutions made by other communities, like dynamic linking, OOP, high level systems languages, and so on, but they don't have the same benefits as the real thing.

Why am I retraining myself in Ada? Because since 1979 I have been trying to write reliable code in C. (Definition: reliable code never gives wrong answers without an explicit apology.) Trying and failing. I have been frustrated to the screaming point by trying to write code that could survive (some) run-time errors in other people's code linked with it. I'd look wistfully at BSD's three-argument signal handlers, which at least offered the possibility of providing hardware-specific recovery code in #ifdefs, but grit my teeth and struggle on having to write code that would work in System V as well.

There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once.

When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".

It would be a good thing if hardware designers would remember that the ANSI C standard provides _two_ forms of "integer" arithmetic: 'unsigned' arithmetic which must wrap around, and 'signed' arithmetic which MAY TRAP (or wrap, or make demons fly out of your nose). "Portable C programmers" know that they CANNOT rely on integer arithmetic _not_ trapping, and they know (if they have done their homework) that there are commercially significant machines where C integer overflow _is_ trapped, so they would rather the Alpha trapped so that they could use the Alpha as a porting base.
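
A short standard-C illustration of the distinction the quote draws, in case anyone doubts it: unsigned arithmetic is required to wrap, while signed overflow is undefined, so a trapping implementation (like the Alpha the author wanted) is perfectly conformant.

/* Unsigned arithmetic must wrap; signed overflow is undefined and a
   conforming implementation is allowed to trap on it. */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int u = UINT_MAX;
    u = u + 1;                    /* well defined: wraps to 0 */
    printf("unsigned wrap: %u\n", u);

    int s = INT_MAX;
    /* s = s + 1; */              /* undefined behaviour: a trapping
                                     machine may legitimately abort here */
    printf("signed max   : %d\n", s);
    return 0;
}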

It was better when everyone had to use 8.3 filenames on a floppy disk without subdirectories. All "improvements" eventually just get leveraged to a greater extent by big brother and big data.

Lisp machines died because they needed dedicated hardware support and Intel microprocessors killed all of that.

this

Computing for its own sake has been tainted by businessfags and normalfags. They vastly outnumber anyone actually interested in computing and are only interested in "getting the job done" whatever that means.

What? Do you want us to make a FPGA implementation of a LISPM and develop an operating system for it as a one man team? That's a big undertaking.
They don't though. Both of these languages have char, int, long, etc. This is one reason why those aren't object oriented languages.

I was doing some LISPM research through my university's research database and found this take on it. Then came the summer of 1986. A lot of users were concerned with the high price of LISP machines and started running applications on more economical delivery vehicles, like IBM Personal Computer ATs and Sun workstations. Sure, there was a performance differential, but the trade-off was worth it. Why buy a Symbolics 3600 when you could get three Sun computers for the same price? The AI consulting firm DM Data estimated that by the end of 1986, there were more than 6,000 LISP machines installed worldwide -- more than a third of them from Symbolics. However, DM Data also estimated that there were probably fewer than 6,000 LISP programmers qualified to take full advantage of the machine. That means there are LISP machines sitting in companies with nothing to do.

The real computer revolution has come and gone and it has left most of humanity behind, OP.
Tech illiterates are just getting the hand-me-downs from technologists.
It hasn't helped man achieve that much in regard to mental work. It instead enabled people to amuse themselves in idle entertainment.

LISP machines died because they served no purpose other than to fuel MIT's terrible computer science department.


If LISP ran faster on UNIX- and C-based computers (even Windows) than on LISP machines, why is it still so constipated in practical use? LISP is nothing more than a fanciful research language that has no real practical purpose.

If LISP runs faster on UNIX- and C-based computers (even Windows), then why even spend the time implementing any kind of LISP hardware when it's unnecessary bloat?

Entertainment will be the death of mankind.


Lisp machines are from the 60s, you can't compare them to modern day computers. What if we make a LISP that actually runs fast though?

You mean those LISP machines that Symbolics were producing in the 80's? Wow.

Damn. I didn't know they made them up to that date. What kind of technology were they based on?

See for yourself.

Attached: genera2.png (898x1142, 169.31K)

Hey (((LISPnigger))), if C is so bad and (((LISP))) is so great, why does every machine in existance use an operating system programmed mostly in C, and why is (((LISP))) seldom used in development?
If it costs millions in dollars then why do companies still use C and are completely fine? They wouldn't use C if it truly cost them money, and they would use (((LISP))) if it had any positives. Natural selection. Also, why did Multics machines have such bad latency compared to a RISC machine like the SGI Indy?

Hard mode:
no obscure and old bbs newsgroup opinion that magically proves X

Attached: computer latency.png (588x846, 131.99K)

ooh so it's because lispy machines need special snowflake hardware and can't work on normal processors.
And don't say that "those processors are made for UNIX! [insert Sun workstation 'General Purpose' blockquote]". Microsoft clearly has not had any issue creating their non-UNIX operating systems on there. They even had support for quite the variety of stuff too, with their now-dead CE version running on MIPS, SuperH, and PowerPC. What is it about Lisp OSes that makes them not run on 99% of CPUs?
Also, I wasn't implying you do it all by yourself, that would be crazy. I was just showing that it wouldn't necessarily take a company the size of Intel or AMD to do stuff.
so wait, to be a """real""" OOP language it can't have data types? Keep in mind that data types in C# are actually objects and have methods such as TryParse and whatnot.

Attached: 484a1a89d2ad2ad3569afa3c1f0a1c91.jpg (670x934, 56.12K)

filtered

kys retard

FPGAs are pretty much the step before ASICs. (The CPU you are using right now is 99.99% an ASIC) Not only are ASICs extremely expensive to manufacture, there is not yet a free software toolchain to produce one unlike the current situation with FPGAs.
A lisp machine refers to both the operating system AND the hardware it is running on.

Because it's easy and every drooling programmer today knows how to program in C; despite Ada being much better for device drivers/critical systems, and Pascal/Lisp being better for end-user programs.
What does this even mean? I think it's idiotic that people use C/C++ for embedded systems/rockets/drivers. They used C++ for the F-35 because of the availability of C++ programmers. But "Worse is Better" is what the industry has stagnated on.

You know how I know you don't program?

Anything dependent on emulation in all its euphemism treadmill incarnations (interpreted runtime, bytecode, garbage collection, etc.) isn't a real programming language. Ultimate redpill is ALGOL/Modula/Oberon.


Reminder M$ tried and failed at exactly this with Longhorn, while Apple's resource forks and Be's BeOS both dipped their toes in the idea.

Attached: Microsoft-Windows-Code-Name-Longhorn-3.png (900x675, 1.11M)

besides your opinion, what objective facts do you have to back what you said up?

I do it as a hobby; yes, I am not employed as a programmer. Why does that matter? C/C++ *is* the wrong choice for critical systems - that should be obvious. They're replacing Ada/SPARK with C/C++ for military systems too, that should be great.
Also, this:
adapower.com/index.php?Command=Class&ClassID=FAQ&CID=328
I shudder to think what would've happened had it been programmed in C.

We already know you don't program.

I still don't understand what gives you that impression. Why is it that if you hate C/Unix you're considered a LARPer here?

You don't show any familiarity with the kinds of problems programmers face in actual practice. You roll around with these ideals that have no practical use, and when applied in real-world scenarios, they are proved time and again to be less powerful, less efficient, and costly experiments that have not aided the programmer, the user, or the company.

Yeah, no. Ada/SPARK is far, far superior to C/C++ for anything critical. The fact that they're being phased out has everything to do with the lack of Ada/SPARK programmers - and nothing to do with C/C++ being the pragmatic solution.

Provide proof. Otherwise your opinion is shit.

Prepare for blockquotes! OwO

you mean this?
sbir.gov/sbirsearch/detail/826011
sel4.systems/Info/Docs/GD-NICTA-whitepaper.pdf
I think it will go really well!!

You are quite a persistent idiot.
adacore.com/papers/safe-secure/
adaic.org/resources/add_content/standards/05rat/html/Rat-1-3-4.html
en.wikipedia.org/wiki/SPARK_(programming_language)
adacore.com/about-ada/benefits-and-features
adahome.com/FAQ/programming.html
en.wikipedia.org/wiki/Steelman_language_requirements

They used C++ in the F-35 JSF. Guess what? The airplane shows duplicates of the targets when networking with friendlies, confusing the pilots to the point that they don't know what is really there and what isn't.

Not exactly the best sources.
Did you write the programs for them?

Nothing you've provided does away with the requirement of a properly trained and hard-working programmer. It's all bytes in the end, and whatever high-level language you use doesn't matter in the long run if you don't know how the low level works. In other words, shut the fuck up.

You're captious and braindead.
Yes, something incredibly apropos to what you were asking that supported my point - bug-prone languages like C/C++ should never be used in critical/military situations. You know how I know you don't program? You think that inherently bug-prone languages are better than (relatively) bug-free languages; because of "pragmatism" and "It's all bytes in the end" (sic) - stop drooling on your keyboard and go back to working your prole job.
Also, here's an article that directly compares C and Ada: archive.is/8F3cB
You'll find a petty, cavilling way to write it off as babble though - I just know it.

Sources are everything. If you can't provide a decent, unbiased source, then your argument is opinion, and I don't care to argue about your opinion.

Here (((you))) go again. Shut the fuck up. If you don't know the low level then you are uneducated for any high-level application of that hardware. Shut the fuck up.

Linux isn't a UNIX derivative. It was merely inspired by MINIX and is currently the best kernel.

I just *knew* you'd pull this card. You didn't read them at all. I have no idea how you're some kind of savant "real" "pragmatic" programmer while never having heard of the safety features built into Ada/SPARK.
Another source: archive.is/tyHlI
You also need to read up on what "Steelman language requirements" are. Air traffic control systems and the fly-by-wire controls on the Boeing are all written in Ada for a reason - programming languages are tools, and some are better for the job than others.

You aren't even worth my time, mouth breather.

Good choice. Get the Barnes book, it's a pretty comfy read.

The first part of my post was meant for

All you can rely upon is ad hominem. Shut the fuck up.

All you can rely on are your Jewish parenthesis. Read about the Steelman language requirements and then realize that using C for safety-critical software is idiotic.
dwheeler.com/steelman/steeltab.htm
dwheeler.com/steelman/steelman.htm

Coming from (((you))), no thanks.

Enjoy the sound of silence friend.

Your entire post is argumentum ad populum. Why do people praise Resnais and Weerasethakul if hacks like Kubrick and Lynch are what normalfags like?

Oyyy veyyyy.

Attached: c689e0c1e160a245c575b21e9e2fdf4c1f968948bd1ca735cb21b6c9c12c1622.jpg.png (1600x1215, 716.99K)

I don't care what you think about the Jews (they are our humble, self-effacing benefactors :^)), but is the way he was spamming the jewish echo/parentheses not obnoxious? Why is it that (in recent years) you can't have discourse without meme verbiage?

Because it's becoming impossible to have discourse in general. The memes are just part of the internet's scenery.

Did (((you))) just appropriate my gender?

Is there an Ada dialect that doesn't look like Pascal?

learn urbit

No. You get used to it after a while though, and in my opinion, it makes the code very easy to read back later.

I'm also a fan of the syntax - the language being case-insensitive is another blessing.

Here, a file system that works
>>>/hydrus/

I like Hydrus in principle, but its developer is almost as incompetent as the Calibre developer, and his unwarranted success doesn't redress the fact that Hydrus is a bloated, unstable pile of shit.

The reason all the better computer solutions lost is Moore's Law. Symbolics could not keep up with Intel and Moore's Law; it's not that their machines were inferior. We are in the perfect era to try something like a modern Lisp machine again.

6 months ago, every thread was turned into a RUST thread, a language which is now seldom mentioned. Now it's LISP machines being shilled relentlessly. I'm starting to wonder if this is the same autist, someone who goes from idea to idea, and feels the need to evangelize his chosen path.

Get out halfnigger.

Honestly, came here to say this.
Have fun decrementing nock, guys.

How's unicode support?

Unicode was a mistake. Supporting hieroglyph moonrunes is completely retarded.

Into the trash it goes, then.
Of course, why would you elaborate on your statement to share your knowledge?

So what stops you people from writing your own Lisp-based user space on top of Linux?

All you need to do is realize that Linux has a language-agnostic system call interface where you just need to put the right arguments in the right registers and issue one instruction. This isn't heavy wizardry, kids. You can easily create a JIT compiler that generates exactly those instructions. Your lisp could easily have (CL probably does have) an address data type with peek/poke functions.

This would allow you to interact with the kernel as a first class citizen and with zero dependencies. You can run your Lisp directly on top of Linux with exactly 0 lines of C code being executed in user space. You can scrap all that Unixy GNU garbage and write your entire user space on top of this language. It wouldn't be POSIX compliant but really who gives a fuck?

Write the init system in Lisp. Write the service manager, and the services themselves, in Lisp. Rewrite the core utilities as Lisp functions. Render graphics from Lisp using kernel mode setting and DRM -- it's literally just a few ioctl calls away. Become the god of your new Lisp world and shape your promised land with your own bare hands.

Just don't attempt to make a Lisp OS for fuck's sake. Linux is too valuable to just throw it away. Use it, build on top of it. Trust me, pretty much all of the complaints about Linux are really complaints about Linux user space programs that retards think are part of the operating system. Just scrap all that and start over -- on top of Linux.
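
For what it's worth, the "arguments in registers, one instruction" claim above is easy to demonstrate without any Lisp at hand. A minimal sketch in C with GCC-style inline assembly, x86-64 Linux only; raw_write is a hypothetical name, and a Lisp JIT would simply emit these same instructions:

/* A write(2) call made without libc: syscall number and arguments go
   in registers, then one instruction. GCC-style inline asm, x86-64
   Linux only. */
static long raw_write(int fd, const void *buf, unsigned long len)
{
    long ret;
    __asm__ volatile ("syscall"
                      : "=a"(ret)
                      : "a"(1 /* __NR_write */), "D"(fd), "S"(buf), "d"(len)
                      : "rcx", "r11", "memory");
    return ret;
}

int main(void)
{
    raw_write(1, "hello from a raw syscall\n", 25);
    return 0;
}

No libc is involved in the call itself, which is the whole point: any language runtime that can load registers and execute one instruction can talk to the kernel directly.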

There's few enough kernels as it is.

Writing a fully functional modern kernel is pretty trivial, even something like VxWorks or QNX fits the bill.

The main asset of something popular like Linux is its panoply of actively maintained hardware drivers.

I guess that's what it all comes down to. Hardware drivers. This shit would be easier if homemade hardware was possible. Then we could intimately tie hardware to software. Maybe that's the next technological revolution?

Do you want to realize your Lisp dream? Or do you want to waste time rewriting Linux?

Look at Emacs. It's the closest we have to the Lisp dream today. Instead of reaching out and covering all user space with Lisp, it contains Lisp inside itself, a safe space, and people little by little write Lispy interfaces to the big scary C world out there so that fellow Lispers don't have to leave Emacs ever again.

I say go out, write your Lisp compiler and let the Lisp code dominate user space.


Of course it is about the drivers. That's what I implied by "Linux is too useful to throw away". Using Linux allows your Lisp code to work with hardware with minimal work on your part. You can concentrate on your dream.

As someone who occasionally pokes at meme platforms like Haiku, Plan 9, and HURD, the absolutely crippling driver situation sometimes makes me wonder why the first program kerneldevs write isn't something along the lines of NDISwrapper.

To what end? Drivers would just be yet another library. Sounds like you're trying to reinvent dpdk.

Fuck off

The benefits of Lisp machines come from the tagged architecture, garbage collection, and single address space. They're made for Lisp, but there is nothing forcing you to use Lisp. Dynamically typed languages would be fast and all programs would be simpler, especially GUI programs and compilers. RISC and UNIX are made for C, so everything else is slow, but you can still use other languages on a RISC.


>C is so bad and (((LISP))) is so great, why does every machine in existance use an operating system programmed mostly in C, and why is (((LISP))) seldom used in development?
C is brain dead and incompatible. You used to be able to combine different languages into a single program because they were compatible. The Lisp machines, Multics, and VMS are examples of that. When languages are compatible, you don't have to be stuck with a language you don't like because you can rewrite parts of the program one part at a time. Everyone outside AT&T understood the importance of this, but you don't appreciate it because you can just download GCC or Clang and make your program depend on millions more lines of bloat. With C, you are forced to do everything the way C does it, like null-terminated strings, array decay, bad memory management, broken error handling, and other bullshit. That means every language has to reinvent wheels and can't share anything with other languages. With Lisp, everything on the computer could share the same packages, objects, classes, functions, bignums, rationals, strings, arrays, structures, GC, and error handling system.

>If it costs millions in dollars then why do companies still use C and are completely fine? They wouldn't use C if it truly cost them money, and they would use (((LISP))) if it had any positives.
They have been told that C is "simple" and that the language doesn't matter at all, so they hire C weenies who had to learn C to keep their university UNIX computer running for more than 15 minutes, and because of the incompatibility and poor design of C, they start depending on null-terminated strings and other C bugs and then they can't get rid of C. When someone compares C to Lisp, Ada, Fortran, or other languages, C shills don't reply by saying how C is better, they just say some bullshit like "meme" or "it doesn't matter", which sucks. C is another example of the extreme contempt the AT&T employees who were paid to make UNIX have for their users.
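
For anyone who hasn't run into it, the null-terminated string problem mentioned above is easy to show in a few lines of standard C: the length isn't carried with the data, so a single embedded zero byte silently truncates the "string" for every library routine.

/* Null-terminated strings: the length is not stored with the data, so
   one embedded zero byte silently truncates the "string" for every
   standard library routine. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char data[] = { 'a', 'b', '\0', 'c', 'd', '\0' };   /* embedded zero byte */
    printf("apparent length: %zu\n", strlen(data));      /* prints 2 */
    return 0;
}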


Bullshit. Why is it that only people who are shilling UNIX languages like C, JavaScript, and PHP say the language doesn't matter but at the same time don't want a non-UNIX language?


I don't like Rust because there's too much UNIX brain damage and it looks like C++/Perl, but it's better than C and C++ (so is Java). I never made a Rust thread or promoted it.


Because Linux sucks. Lisp machines eliminate complexity and bloat from the computer and increase code reuse. Lisp machines don't have panics, OOM killers, broken system calls, broken signals, and all that other UNIX bullshit. They don't need 15 million lines of broken code.

I suspect that the True Un*x Way to do this is to print, not to the printer, but down an Intestine (this is what I call p*pes) that leads to some parser "utility" that is just barely general enough to scan for the page headings and suppress pages that don't have the right page-numbers at the top. The result is intestined directly to the stool file -- uh, that's "spoop file" -- DAMMIT! "SPOOL file" -- whose name you have to know and is different on every machine. Of course if you don't do it right (and you won't) a couple of blank lines will sneak in, and your page headings will come out about five lines down, preceded by text from the south end of the preceding page. Your true weenie won't even try again, he'll be so pleased at having made this cludge almost work.

I need hardly say that I do not want to see a gleeful posting explaining how to do this in gory detail. I remind all listening weenies that I have a Symbolics machine. I can network to your Ux*x, and when it asks me whether my local machine has authorized my UID and GID before it lets me delete your directory, I can simply, and truthfully, say "yes". And it will believe me.

I am reminded of the Charles Addams cartoon depicting a college reunion. Underneath the "Welcome Class of '54" banner sit the alums, all of whom are bums and winos. One bum turns to another and says "I used to think it was just me, but maybe it's the damned school!"

Waste of memory.
waste of CPU cycles
But we already have that?
But you can easily do that with C.
I can easily compile programs to contain little to no bloat with gcc or clang.
With LISP, you are forced to do everything the way LISP does it
no
no
no
arguable
You don't know what you are talking about.
But C wasn't invented by AT&T.
wat

Look, you're so full of shit I don't know where to begin. You're confusing languages, operating systems, and computer architecture.

Lisp machines don't exist anymore, and are inferior to modern hardware in pretty much every measurable aspect. Linux is the most successful operating system ever created: it runs on a metric fuck ton of architectures, is supported by tech giants and hardware makers, has all the drivers you need, lets you shape its user space into whatever you want and is free software.

You can either make a software-based Lisp ecosystem that runs on top of Linux, or you can try to rewrite Linux in Lisp and fail miserably just like all the other purists who came before you.

The rest of us just ignore him and hope that he will eventually focus his interest in LISP in a constructive way.

Attached: shruglife_fullpic_1.jpg (550x400, 43.25K)

Can the Unix hater tell me the benefits of Smalltalk?

I would love to see him write a "pure Lisp" OS. Something tells me that he won't do it any time soon.

You still haven't answered my question on
1. >why does every machine in existance use an operating system programmed mostly in C
2. >and why is (((LISP))) seldom used in development?
3. >If it costs millions in dollars then why do companies still use C and are completely fine?
All your statements contridict themselves are you're on some weird downward spiral on how the shills are out to get you and your shitty language. You seem massively out of touch with reality and specifically use ad-hominim as well as many false equivilences and other logical falicies. Are you a Jew? Because you sound like one.

Let's talk about hardware complexity. Don't (((LISP))) machines require tons of glue logic to implement the interpreter and memory management, instead of a normal general-purpose CPU like M68K, ARM, or MIPS? If the interpreter is what makes (((LISP))) special, why haven't you gotten a BASIC stamp and made the interpreter (((LISP))) based?

Attached: hitler_and_jewish_arguments.jpg (850x446, 194.7K)

Not him, but I already told you that what you're saying is argumentum ad populum. Also, Hitler was an idiot who killed tons of Europeans and is essentially responsible for the state Europe is in now.
>Hurrdurr 14/88 EVRVPE kill all (((niggerz))) XD.
That is what you sound like.

Think about the questions you're asking, dude. What operating systems are there today besides those that were already written in C-based languages, like Windows and Unix-derivatives?
C continues to be used simply because of the huge momentum it has, much like the x86 architecture continues to exist for that reason, even though it's buggy shit that needed to be replaced at least a couple decades ago.

>>>/oven/

Epic!

Use Big 5 + HKSCS then, remove kebabs and poo-in-loos

No one is forcing you to use C. In fact, most documentation for Windows and Mac APIs completely ignores C bindings in favor of .NET and Swift. If you want to use something else, go ahead.

0s and 1s will work even in Monarchy

Note that he was specifically asking about the OS, not what you "can" use to write some software with.
Personally, I don't write anything new in C, and typically only modify existing C code for my needs (or to fix some bugs). On Unix, I normally just use Perl instead, since it can leverage most of the system and library stuff. But Perl itself is in C, and so are many of its modules, and the same probably also holds true for Ruby, Python, etc. So in the end, you're using C one way or another on modern systems.

If you are talking about scripting languages, then whatever the interpreter is written in is what you are using. I don't see how it's difficult to understand that regardless of whatever high-level language you use, everything is ultimately binary instructions and data.

You do realise Zig Forums is "Zig Forums the imageboard" right? We own this place

Well that doesn't sound right. If I was using C, then I'd have to deal myself with undefined behavior cases and its nefarious string handling. I use Perl to avoid that hassle (and also, of course, because I don't want to write a main function and explicitly do #include's for every single one-liner).
Yes, everything does come down to binary instructions in the end. But when your tools are in C, those tools can exhibit undesired side effects to a greater extent than if the same tools were written in Ada or even Pascal. The C problems will exist so long as the foundations (the OS itself, compilers, runtimes, libraries) are written in that language.

If you want to use a different programming language, have at it. However, I don't mind taking the extra time to allocate/free my memory and handle my strings appropriately. It's a challenge in my opinion when I have to fix bugs, a challenge which I know makes me a better programmer than I was before. I have no problems with Perl. That syntax looks difficult to master, but I don't see a pressing need for myself to learn something like that right now.

The problem is the handful of posters coming on here with their insistent demands that everyone defame and stop using C because they don't like it. That's tough nuts. They can take that back to their hugbox and instead let people have real discussions here.

You have C stockholm syndrome, much like I did. Give FreePascal a chance and free yourself.

No thanks. I am learning C because I want to learn C and enjoy learning C. I don't have time to run through another language. Besides, it's insulting to dismiss my reasons for preferring C to pitch something that is inconsequential.