If Lisp is so good, how come nobody has even made a text editor with it? No, Emacs doesn't count, it's not pure lisp

Attached: 1501447229545.jpg (593x796, 80.3K)

Other urls found in this thread:

hooktube.com/watch?v=o4-YnLpLgtk
common-lisp.net/project/climacs/
dreamsongs.com/WorseIsBetter.html
heeltoe.com/retro/mit/mit_cadr_lmss.html
cliki.net/Zeta-C
common-lisp.net/project/movitz/
loper-os.org/
news.ycombinator.com/item?id=1878608
en.wikipedia.org/wiki/ISPF
en.wikipedia.org/wiki/IBM_i_Control_Language
github.com/emacs-mirror/emacs/tree/master/src
twitter.com/NSFWRedditGif

The same reason why 8ch/tech is so good but nobody is making a facebook feed plugin for it. If you know what I mean...

You have absolutely no clue how Lisp works.

The core Lisp interpreter is written in C, so Emacs isn't 100% Lisp.

I don't understand why this has any relevance to reality. Emacs isn't the interpreter; it's the set of editing and programming features a programmer needs to develop software.

A language which can only do maths and logic is useless. Any programming language can do that. If you don't have good I/O facilities it's fucking useless.

...

...

God damnit. Anyway I've literally never programmed in jabbashit in my life. Go write me a replacement for the unix utilities in lisp. Pro tip: some of them are probably completely impossible to write in Lisp.
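For what it's worth, the Unix-utilities challenge is not hard in principle. Here is a line-oriented sketch of cat(1) in portable Common Lisp (it streams whole lines, so it doesn't preserve a missing final newline - a toy, not a drop-in replacement):

```lisp
;; A minimal cat(1) in portable Common Lisp: copy each named file
;; (or standard input, if no arguments) to standard output.
(defun cat (&rest paths)
  (flet ((copy-stream (in)
           (loop for line = (read-line in nil nil)
                 while line
                 do (write-line line))))
    (if (null paths)
        (copy-stream *standard-input*)
        (dolist (path paths)
          (with-open-file (in path :direction :input)
            (copy-stream in))))))

;; (cat "/etc/hostname") prints that file's contents.
```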

Emacs is powered by a lisp interpreter written in C. The rest of it is in lisp but the important backbone is done in C. So much for lisp being what stallman claims it is.

Has there ever been a Lisp compiler for x86, so that the Emacs Lisp interpreter could be rewritten in Lisp?

Edwin, included with MIT/GNU Scheme.

* In another article DB writes:
* > Unix programmers have a bizarre idea of
* > efficiency. Emacs misuses pointers to save a few bytes
* > (while being huge and bloated), XWindows is a pig, but
* > hey, we saved a JMP! :-)

* That's not UNIX, that's MITnix. This massive abuse of
* virtual memory seems to have come in with the MIT "free"
* software: X, the GNU stuff, and so on... --

First of all, memory for PCs (and soon for workstations)
runs for about $30/MB, and 8 additional MB take care of both
X and GNU Emacs.

In addition, I won't say much about X, which I dislike,
although if I'm not mistaken most of the bloat has occurred
because of vendor requests. X 10 was much leaner, and
provided more than sufficient functionality as far as I'm
concerned.

With respect to Emacs, may I remind you that the original
version ran on ITS on a PDP-10, whose address space was 1
moby, i.e. 256 thousand 36-bit words (that's a little over 1
Mbyte). It had plenty of space to contain many large files,
and the actual program was a not-too-large fraction of that
space.

There are many reasons why GNU Emacs is as big as it is
while its original ITS counterpart was much smaller:

- C is a horrible language in which to implement such things
as a Lisp interpreter and an interactive program. In
particular any program that wants to be careful not to crash
(and dump core) in the presence of errors has to become
bloated because it has to check everywhere. A reasonable
condition system would reduce the size of the code.

- Unix is a horrible operating system for which to write an
Emacs-like editor because it does not provide adequate
support for anything except trivial "Hello world" programs.
In particular, there is no standard good way (or even any in
many variants) to control your virtual memory sharing
properties.

- Unix presents such a poor interaction environment to users
(the various shells are pitiful) that GNU Emacs has had to
import a lot of the functionality that a minimally adequate
"shell" would provide. Many programmers at TLA never
directly interact with the shell, GNU Emacs IS their shell,
because it is the only adequate choice, and isolates them
from the various Unix (and even OS) variants.

Don't complain about TLA programs vs. Unix. The typical
workstation Unix requires 3 - 6 Mb just for the kernel, and
provides less functionality (at the OS level) than the OSs
of yesteryear. It is not surprising that programs that ran
on adequate amounts of memory under those OSs have to
reimplement some of the functionality that Unix has never
provided.

What is Unix doing with all that memory? No, don't answer,
I know, it is all those pre-allocated fixed-sized tables and
buffers in the kernel that I'm hardly ever using on my
workstation but must have allocated at ALL times for the
rare times when I actually need them. Any non-brain-damaged
OS would have a powerful internal memory manager, but who
ever said that Unix was an OS?

What is Unix doing with all that file space? No don't
answer. It is providing all sorts of accounting junk which
is excessive for personal machines, and inadequate for large
systems. After all, any important file in the system has
been written by root -- terribly informative. And all that
wonderfully descriptive information after lots of memory
consumed by accounting daemons and megabytes of disk taken
up by the various useless log files.

Just so you won't say that it is only TLA OSs and software
that have such problems, consider everyone's favorite text
formatter, TeX (I'm being sarcastic, although when compared
with troff and relatives...). The original version ran
under FLA on PDP-10s. It is also bloated under Unix, and it
also must go through contortions in order to dump a
pre-loaded version of itself, among other things.

SBCL compiles Lisp directly to native machine code.
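That claim is easy to check from any SBCL REPL: DISASSEMBLE is part of standard Common Lisp, and on SBCL it shows the native code the compiler produced. The exact instructions vary by version and platform, so no output is shown here:

```lisp
;; SBCL compiles every function to native machine code; there is
;; no bytecode stage. DISASSEMBLE prints the generated assembly.
(defun add3 (x)
  (declare (type fixnum x)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ x 3)))

;; At the REPL:
;;   (disassemble #'add3)
;; prints a short run of machine instructions rather than any
;; interpreter dispatch loop.
```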

Lisp is good because it makes you realize there is more to life than programming.

Except Stallman never claimed that. Emacs is extended predominantly in Elisp but can also be extended in other languages. In fact, the new version of Emacs deprecates Linum in favor of a much faster C-backed mode. Why would they do that if Stallman, or the Emacs devs, or whomever you're trying to antagonize (and doing poorly at that) were Lisp elitists? Here's a joke collected in the Emacs docs:

Why are you even conflating Stallman with autistic Lisp elitists? Stallman may love Lisp, and he may be autistic, but he loves free software more than he loves Lisp. If this conversation were even a little less autistic, I would accuse you of being a third party actor for trying to fabricate your own narrative.

No it doesn't

It's called Eshell, you dolt.

Attached: eshell.webm (958x526, 233.86K)

hmm...

Unix shells suck. The shells of the various Lisp Machines were quite different. The Symbolics shell, later called the "Dynamic Lisp Listener", allowed management of commands, completions, defaults, interactive help, etc.
See for example: hooktube.com/watch?v=o4-YnLpLgtk
The interactiveness of that Lisp Machine OS is quite a step up from what any typical shell offers. The problems of that (GUI) approach: it wasn't very sophisticated on the text terminal side; actually, it was quite bad. For development one needed an extended Common Lisp (or Zetalisp), which was a bit too complex for many users.
Really, it's not even hate. Most Unix shells are just dumb. Many people like primitive text-based UIs with lots of corner cases, which make them seem intelligent for remembering obscure commands and obscure options without real UI help.

Take cp. The shell does not know the various options the command takes. It does not know what the types and syntax of those options are, or which options can be combined and which can't. It can't prompt for options, can't check the command syntax before calling it, can't provide any help when the syntax is wrong, and can't deal with errors during command execution. There is no in-context help. It can't reuse prior commands other than by editing them as text. The output of the command is just text, not structured data. The problems are endless.

There *have* been attempts to address this by putting different user interfaces on top - for example, IBM provided an extensive menu-based administration tool for AIX. But no tool is perfect, and pragmatically I've found shells to be far more productive than anything I've ever attempted to replace them with (on a modern OS, that is). Which is the real crux of why we use these tools, and why there's a million different shells. You can even use Lisp-based shells like scsh and esh, or Emacs. But for the most part all these attempts still suffer from, and don't escape, the same general problems.
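As a rough illustration of what the Listener-style approach buys you, here is a toy Common Lisp sketch. DEFINE-COMMAND and the *COMMANDS* registry are made-up names, loosely in the spirit of Genera's command processor, not its real API:

```lisp
;; Toy sketch of a self-describing command table, loosely in the
;; spirit of the Dynamic Lisp Listener. DEFINE-COMMAND and
;; *COMMANDS* are hypothetical, not a real Genera interface.
(defvar *commands* (make-hash-table :test #'equal))

(defmacro define-command (name (&rest args) doc &body body)
  `(setf (gethash ,(string-downcase name) *commands*)
         (list :args ',args :doc ,doc
               :fn (lambda ,(mapcar #'first args) ,@body))))

(define-command "copy-file" ((from pathname) (to pathname))
  "Copy FROM to TO."
  (list :copied from to))   ; stand-in for the real work

;; Because argument names and types are data, the shell can
;; prompt, complete, and syntax-check BEFORE running anything:
(defun describe-command (name)
  (let ((cmd (gethash name *commands*)))
    (format nil "~a ~{~(~a~)~^ ~} -- ~a"
            name (mapcar #'first (getf cmd :args)) (getf cmd :doc))))

;; (describe-command "copy-file")
;; => "copy-file from to -- Copy FROM to TO."
```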

Eshell has saved my ass a few times when I've had a frozen and unresponsive terminal so I opened up emacs to kill it

Brainfuck is turing complete. It's absolutely, indisputably impossible to write multi-threaded or networking programs in standard Brainfuck.

Zmacs you retard

All multithreaded problems are single threaded problems that some asshole made more complicated. Also, there are multithreaded dialects of BF: Weave and brainfork to name two.
brainfuck++ adds networking

And, before you tell me that those aren't standard brainfuck, I will respond that C still doesn't have standardized networking or multithreading, and C++ only has multithreading, but just got it in 2011.

This is your brain on Lisp.

Attached: scsh.png (1498x599, 136.08K)

...

How would your shell know ahead of time what some random program (including ones that don't exist yet) is going to accept as parameters?

Except it did? "ff" is a popular alias for find-file. That's why the image file opened in mpv. Plus, there's nothing spectacular about REPLs. Eshell's strength lies in how it can interact with and be extended by Elisp, not in the fact that it can execute Elisp when you issue a command.

Why don't you try cryptocurrency mining, hashing, or video encoding using only one thread. Usually it's 3 to 4 times slower to do it your autistic single-threaded way. Sure, single-threading is way easier, but its performance is inferior in many situations.

Perhaps, but you get the idea. In standard C one can perform I/O on arbitrary files. In brainfuck, you can only perform I/O on the standard streams. There are some things which the language makes impossible.

Fair enough. Now bring that to feature parity with VStudio and give it GUI.


ganbestigudasaiyio user

common-lisp.net/project/climacs/
haven't used it, but I did use Portable Hemlock for a while.
'pure lisp' isn't important. It's a matter of degree and location. A thin shim of C that makes things work is how Lisps themselves work under Unix. Even Forths will do that, or write the shim in assembler. What matters is that, when you want to improve on Emacs in any way, you get to write in Emacs Lisp - you don't have to write in an even less pleasant language.
With that said, Lisp is not good. Depending on why you want it, go with Erlang or an ML instead.

B-but GUI is bloat!!!
Remember: "Weenix is the operating system preferred by Unix Weenies: typified by poor modularity, poor reliability, hard file deletion, no file version numbers, case sensitivity everywhere, and users who believe that these are all advantages."

This is how retarded you sound.

Why is there such a ton of absolutely ignorant, shit threads these days?
It seems like it's the same guy that made the thread about lisp machines...
He really sounds like a fucktard who just got his degree (if he's not actually a faggot who typed ping in the Windows CMD and thinks he's a high-level hacker), done with only Atom, who knows only Java (if not JavaScript), and comes to shit on the whole programming/hacker culture because of how "elitist" he finds it compared to his own incompetence.

That's really revolting. But, well, it's not like Zig Forums is not /g/ now.

I'm not even commenting on the "anime pic". There are really serious guys who like anime, I'm not saying shit about that, but in the other cases it's a clear sign of ignorance.

The question, too, is why people are feeding such a fucktard. The first three answers should have been "install gentoo", "what does (cadr '(a (b c) d)) eval to? gtfo", or "Go to Zig Forums Questions and Support and get banned, faggot".

Attached: when-the-cheeky-is-breeky.jpg (675x900, 61.02K)

Ironically, this is why GNU Emacs is the ultimate Unix application. Because Emacs is a powerful, flexible, and respectful frontend for Unix tools.

What? Then why did you use them?
It's like an American Jehovah's Witness going to preach in Africa armed only with Japanese Shinto prayers, then complaining that Asian animism doesn't mesh well with JW theology, and that translating from Japanese to !Kung via English is a bitch.

Lisp is a family of languages, not a language.

LISPfags have always been lolcows; threads like this are far older than chans. It's a language that was put on a pedestal in academia, that people based their careers around, yet they ended up just sitting around being smug and condescending while everyone else who actually produced software used other languages. They've always had a cult atmosphere and been as ridiculous as audiophiles in their justifications.

Great way of moving the goalposts. Polite sage.

Get FUCKED.

Aren't Lispers supposed to be the ones who hate Unix? Or did you get your strawmen mixed up?

Yes they are, the same fuckbrains frothing about how lisp machines were so superior to minimalist Unix, yet crying about the impossibility of making complex software with lisp.

You're the only one who's frothing at the mouth right now.

There was a /lisp/ thread, with actual coding and project.
Not this pile of crap.

And no, that's not ridiculous. That's simply people who love what they're doing actually pointing out nice things, and not things that have been developed only with money and capitalism in mind. It's the same in nearly every field: what is worth doing in terms of money almost never matches the nobility of it. Take the pharmaceutical field as an example. You only do what could make you very rich, so you create new molecules that you can protect behind intellectual property, and certainly don't conceive and sell plant mixes, which would be "worthless" in economic terms.
This logic can be applied in every field ever.

Where did Lisp touch you? Show me in the function

I'm not a lispfag, but I'm open to trying something different, since I'm kind of getting fed up with Unix after doing the Linux and BSD shit since 1995. I'm fed up with all the unending upgrades and security patches and other related bullshit. There's got to be something better, but every OS now is a fucking Unix clone, so it's like OS research hit a dead end. Also I'd like an answer to

An operating-system-provided/mandated interface for shell options, using a known format. A standard --options query could give it, for instance.

It's better to produce nothing than to produce garbage.

But following a specific format only fulfills one part of the proposed interface (like getting consistent help). That still leaves:
To do all that in Unix, you have to study the man page. The example he gave as an attempted improvement was the AIX TUI for sysadmins that had "canned" commands you could run, but that only worked for whatever small subset of commands and arguments were baked into it.

Attached: 1459006618899.jpg (451x392, 45.63K)
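To make the idea concrete, here is a toy sketch in Common Lisp. The machine-readable option spec is entirely hypothetical (no such standard exists); the point is only that once the shell has the spec as data, it can reject a bad invocation before exec'ing anything:

```lisp
;; Hypothetical machine-readable option spec, as the thread
;; proposes: each entry is (option takes-argument-p help-text).
;; Positional file arguments are ignored in this sketch.
(defvar *cp-spec*
  '(("-r" nil "copy directories recursively")
    ("-t" t   "copy all sources into DIRECTORY")))

;; With the spec in hand, the shell can validate a command line
;; before ever running cp, and explain what's wrong in context.
(defun check-invocation (spec args)
  (do ((remaining args))
      ((null remaining) "ok")
    (let* ((arg (pop remaining))
           (entry (assoc arg spec :test #'string=)))
      (cond ((null entry)
             (return (format nil "unknown option ~a" arg)))
            ((and (second entry) (null remaining))
             (return (format nil "~a requires an argument (~a)"
                             arg (third entry))))
            ((second entry) (pop remaining))))))

;; (check-invocation *cp-spec* '("-r"))        => "ok"
;; (check-invocation *cp-spec* '("-t" "dest")) => "ok"
;; (check-invocation *cp-spec* '("--zap"))     => "unknown option --zap"
```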

Because there is no other way to do modern computing. On OpenBSD they avoid unbounded string functions like strcpy; instead they bound the length with strlcpy and strlcat. C is not a good language, but they still work with it - the real faggots are people like Linus Torvalds: he still hates Wirth and Dijkstra. Most people still think the antiquated C rules of the world are a Good Thing. (That's why everything is so vulnerable to buffer overflow attacks, among other things.) Well, what are these experts doing today? Copying Wirth, of course (see: Go, Limbo, C compilers getting stronger typing). Ruby (a shitty language, to be fair) stole Modules from Modula. But yes, Wirth was a charlatan - we'll still copy his ideas though, and they won't suspect a thing!

Thank you for no longer abusing the code tags.

"Lisp Machines are something that you think is really cool when you first learn about them, then you come to the realization that pining for them is a waste of time.

I've had a flash of inspiration recently and have been thinking about Lisp Machines a lot in the past three weeks.

But first, a digression. There's an important lesson to be learned about why Symbolics failed. I think Richard Gabriel came to the completely wrong conclusion with "Worse is Better" (dreamsongs.com/WorseIsBetter.html). There are two reasons why:

1. Out of all the LispM-era Lisp hackers, only RMS understood the value of what's now known as Free Software. (If you haven't read it yet, read Steven Levy's Hackers - it describes the MIT/LMI/Symbolics split and how RMS came to start FSF and GNU).

2. Portability is really important.

The key lesson to draw from Unix isn't that "Worse is Better," it's that survivable software is Free and portable. Free because getting software to someone's HDD is 80% of success, and portable because you don't know where people will want to use your software (there are some really weird places).

Symbolics was neither. If Genera had been Free Software, it would by definition still be around today. If Genera had been portable, it's likely Symbolics would never have gone out of business (the Alpha virtual machine would have been done sooner, with less resources, and for more systems).

Being released as Free Software today wouldn't help. Genera's predecessor, MIT CADR, was made available under an MIT-style license in 2004 (heeltoe.com/retro/mit/mit_cadr_lmss.html). There's a VM emulator which runs the code. The whole system is pretty useless.

Now on to the inspiration part:

It's possible to make a very high-performance, portable Lisp operating system on modern hardware. This has been a possibility ever since the Pentium came out. The main bottleneck to conventional Lisp runtime performance is the way operating systems manage memory allocation and virtual memory.

A type-safe runtime that has control over memory layout, virtual memory, and is aware of DMA can provide extremely high throughput for allocation and GC (this has been shown by Azul's Linux patches for their JVM), true zero-copy I/O, almost optimal levels of fragmentation, and excellent locality properties. If you go single address space (and there's no reason not to) and move paging into software (object faulting and specialized array access), you've also eliminated TLB misses.

Throw in the fact that it now becomes trivial to do exokernel-type stuff like for example caching pre-formatted IP packets, and it should be possible to build network servers that have throughput many times that of anything that kernel/user-space split OSes like Linux or FreeBSD are capable of for dynamic content (ie - not just issuing DMA requests from one device to another).

The only problem is device drivers. Lisp doesn't make writing device drivers any more fun, or reduce the number of devices you have to support.

What to do?

The reason I've been thinking about this is that I came across this: cliki.net/Zeta-C

I've heard of Zeta-C multiple times before, but for some reason this time I made the connection - "why not use Zeta-C to compile an OS kernel?"

I explored the idea further, and it seems to me that it wouldn't be an unreasonable amount of work to take the NetBSD device subsystem and have it running on top of a Lisp runtime with the necessary emulation of those parts of the NetBSD kernel that the drivers depend on. If you don't know, NetBSD's device drivers are modular - they're written on top of bus abstraction layers, which are written on top of other abstraction layers (for example, memory-mapped vs port I/O is abstracted). So the actual system twiddling bits can be neatly encapsulated (which isn't necessarily true for Linux drivers, for example).

I'm aware of Movitz (common-lisp.net/project/movitz/) and LoperOS (loper-os.org/). Movitz makes the mistake of trying not to be portable, but there's useful things there. I haven't spoken to Slava about this yet so I don't know what's going on with LoperOS. I am also aware of TUNES, and think it was an interesting waste of time.

The main thing is to get Zeta-C to work on Common Lisp. Then it's to build a new portable, boot-strappable runtime (I think the Portable Standard Lisp approach of having a SYSLISP layered on top of VOPs is the right way to go for this), and either build a compiler targeting that runtime, or adapt the IR-generating parts of one of SBCL, CMUCL or Clozure. Further bootstrapping can be done with SWANK and X11 once a basic networking stack is in place. I think such a system would be quite fun to hack on."

From here: news.ycombinator.com/item?id=1878608

Free Software != $0 price

The good ideas in AIX come from the IBM z and i ancestors. IBM wanted to put some of their technology into a workstation that was between the PC and their real computers, but since they based it on UNIX, it sucked. Every time a computer company's UNIX has a cool feature that didn't come from AT&T, it turns out that it's a bad knock-off of something from another OS, usually an OS that the same company sells or used to sell before shills talked them into hawking UNIX.

en.wikipedia.org/wiki/ISPF
en.wikipedia.org/wiki/IBM_i_Control_Language

(I tried to mail this once before, but (wouldn't 'ya know it), the mailer on life was broken...) Yes, Virginia, there *is* something worse than Unix - it is called "AIX" and is currently being pushed on an unwary public by IBM. But I digress. In case you don't read Usenet, here's a post I just made to alt.folklore.computers:

From: RS
Subject: Hardware Architectures and I/O (was: Re: Jargon file...) **FLAME!!**
Date: Sun, 2 Dec 90 15:43:03 GMT
Lines: 53

In some previous article PT writes:
> Back when there were REAL(tm) computers like 780, a lot of
> time and energy went into designing efficient I/O from the
> CPU bus to the electrons going to the disk or tty.

Damn right, but even the 780 was a step down. Get your KL-10 documentation set out and read about *them*.
- Front-end PDP-11s that did Tops-20's command completion.
- Separate I/O and memory buses.
- 8-ported (that's eight, son) memory that talked to the I/O front-end machines for *real* DMA, not cycle stealing!

> Sure OS's and apps have gotten bloated, but when you put a
> chip like the MIPS R3000 on a machine barely more advanced
> than an IBM-AT you end up with a toy that can think fast
> but can't do anything. I can't really blame companies
> like DEC and Sun for producing mismatched hardware,
> because their marketing drones are constantly trying to
> undercut each other in price. It's a hell of a lot more
> expensive to ship a product with a well designed I/O
> system than to drop in a "killer bitchen" CPU chip;
> occasionally someone makes the attempt to design a great
> piece of hardware, and you end up with something not half
> bad (like the DECstation 5000, which is only crippled by
> Ultrix)

You left out the worst offender of them all - IBM. The RS-6000 may crank out 27 MIPS, but it can't context switch or handle interrupts worth sh*t. You can lower machine performance to the point of unusability by FTPing a file from another machine on the same ethernet segment!

Next time you get a chance to play with an RS-6000, try this: pop about a dozen xterms, iconify them, put the icons in a row, and wave the pointer back and forth over them as fast as you can. Astounding, no? The highlighting on the icons will keep bouncing back and forth long after you stop waving the pointer. My personal record is 20 seconds. Makes a Sun-2 running Display PostScript seem astoundingly fast.

RS-6000s also have an annoying tendency to "lock up" for a few seconds (5 < x < 15) and then return to normal - I'm told that this is normal and due to paging activity. The microchannel card cage design is pretty bad too - sure, you can put cards in, but God help you if you have to take them back out! And you better tighten down the retaining screws all the way... or the first time you look at the card funny it will pop out.

To its credit, I must say it compiles GNU Emacs faster than any other machine I've used, but I do more with a workstation than just run compiles. And, if you think Ultrix is bad, it's only because you haven't tried AIX.

So, what you're saying is that there is nothing intrinsic to the problem that requires multithreading, and that I was right? Your performance constraints require multithreading, not the solution to the problem.

There is nothing that the language makes impossible. Brainfuck has a dialect that opens files. A language either is Turing complete, and thus nothing is impossible, or it is not, and thus it only solves specialized problems. Everything else is environment, which can be libraried over. There are multithreaded, networked, GUI applications written in C despite the C language not providing any of these things. The environment provides pthreads, Berkeley sockets, and X Windows. As far as C is concerned, these are black boxes that do magic, but have a C interface so that they can be called as C functions. We don't change the language just because we add these magical functions. Lisp is no different.
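To illustrate the point in Lisp's case: with the CFFI library (a real, widely used portable FFI for Common Lisp), a libc function becomes just another Lisp function. A minimal sketch, assuming Quicklisp is available to load CFFI:

```lisp
;; Lisp treats foreign code the way C treats pthreads or sockets:
;; a black box behind a typed interface. Here libc's getpid is
;; exposed as an ordinary Lisp function via CFFI.
(ql:quickload :cffi)  ; assumes Quicklisp is installed

(cffi:defcfun ("getpid" %getpid) :int)

(defun current-pid ()
  "Return the PID of the running Lisp process, via libc."
  (%getpid))

;; (current-pid) returns the process id as an integer.
```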

That's some pure academia horseshit. In the real world, there are all sorts of things a "Turing complete" language can be useless for. How would a driver be written in Lisp, without cheating and calling out to a different language, if the memory has specific physical alignment constraints, can't be moved by compaction, and needs some concept of volatile? The language exists in some imaginary space that doesn't exist, which is why it's total ass for real software development.

Because everything that was learned since forever in computing is bad? The Unix mindset is a compilation of past practices; it's not perfect and can be corrected/forked into something better. Whine all you want, but the Unix practices are mostly solid and work well, long-term speaking.

Everything you said is so patently false that I don't even know where to start; congrats.

maybe start with a massive blockquote, mister lisp meanie!

Why do people assume that if someone isn't fellating the "eunuch way" in their post, they must be the "lisp meanie"? Maybe *nix systems aren't as good as you think they are.

I feel the same way. I don't know who these people are that are pushing this Unix/Lisp dichotomy, but I feel like there's some kind of conspiracy going on to deface both.

How is a driver written in Pascal? All languages that allow writing drivers call out to either another language (usually assembly) or a metalanguage to satisfy the constraints of the hardware. All languages exist in an imaginary space. You may say, "Not so with C", but that is fundamentally incorrect.

All drivers written in C do stupid things with locations in memory that are semantically well defined (by the language) and semantically well defined (by the hardware), but the coupling is entirely enforced by the programmer (unless that programmer works for certain hardware companies, then there is no coupling). That the memory corruption that C stupidly allows to happen happens to cause the hardware to behave in a certain way is outside the scope of the language, and that anomalous behavior can be replicated in any language by giving the programmer an API to do stupid.

I want to reiterate that: a device driver is, to any programming language, a program or function that performs anomalous actions that happen to cause the hardware to behave in a certain way. The language holds the programmer's beer and hopes all turns out well for him. The fact that stupid is foundational to the C language, but has to be wrapped in APIs in better languages, is a testament to C's poor design.
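In that spirit, here is what the "API to do stupid" looks like from Common Lisp, using the CFFI library's raw-pointer operators. CFFI is real; the register address below is made up for illustration, and actually evaluating the write against an arbitrary address is exactly as unsafe as the equivalent C:

```lisp
;; Raw memory poking as an explicit API rather than a language
;; default. Morally equivalent to C's
;;   *(volatile uint32_t *)(REGISTER_BASE + offset) = value;
(ql:quickload :cffi)  ; assumes Quicklisp is installed

(defconstant +register-base+ #x0c200000
  "Hypothetical memory-mapped device register block.")

(defun write-register (offset value)
  "Store VALUE as a 32-bit word at +REGISTER-BASE+ plus OFFSET."
  (setf (cffi:mem-ref (cffi:make-pointer (+ +register-base+ offset))
                      :uint32)
        value))

;; (write-register 4 #xdeadbeef) -- don't actually run this
;; against a made-up address; it will fault, just like in C.
```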

It's actually starting to turn me into a mean old man. I've been studying up on security issues, and lay security people (security people who aren't developers) rattle on about buffer overflows. The thing is, buffer overflows are primarily a C/C++ problem. Java, Lisp, BASIC, shit, even COBOL largely won't let you do it. If you give a pajeet any one of these languages and have him write a string processing program, the only exploits will likely be in the implementation of the language, not his code. One has to be a professional programmer to write code that has a buffer overflow vulnerability. Give a pajeet C/C++ and it will have over 9000 vulnerabilities, because the language itself lends itself to writing insecure code. And a professional programmer writing a version without those vulnerabilities will produce code two orders of magnitude longer, because the language itself makes writing correct code hard and tedious.

The fact is that C/C++ have taken over software development because programmers have a culture of writing shitty code. We have a culture that values code that looks like eye cancer, works by happenstance, and is more brittle than a marshmallow frozen in liquid nitrogen.
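The buffer-overflow claim above is easy to demonstrate. A minimal Common Lisp sketch (default safety settings assumed; a (safety 0) build could elide the check): the one-past-the-end write that silently corrupts memory in C signals a catchable error here.

```lisp
;; The same off-by-one that smashes the stack in C just signals a
;; catchable condition in Lisp: AREF is bounds-checked by default.
(defun overflow-attempt ()
  (let ((buf (make-array 8 :element-type 'character
                           :initial-element #\x)))
    (handler-case
        (setf (aref buf 8) #\!)   ; one past the end, as in C
      (error (c)
        (declare (ignore c))
        :caught))))

;; (overflow-attempt) => :CAUGHT -- no memory corruption occurs.
```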

People were saying that C was absolute garbage when it was new/novel, and they knew it would set computing back by decades (it destroyed the lessons we learned from Lisp). Somehow, people praise C today. Look how far we've regressed - what was detritus yesterday is considered the cream of the crop today. Imagine where we'd be now if Lisp and Genera had won the systems programming wars instead of C. Genera was a Symbolics Lisp Machine OS where everything was a hackable Lisp object. Another interesting OS was Oberon, which had an infinite desktop with a text REPL at mouse point. Most "innovations" today are on the kernel end, and they basically all relate to performance.

Also, I forgot the image.

Attached: genera2.png (898x1142, 169.31K)

Hey, MIT Lisp junkie, explain this
If LISP machines were designed to run LISP efficiently, why was it that the big bad UNIX and C machines, even Windows, were running LISP faster? And if these machines were running LISP faster, why is LISP still demonstrably and detrimentally slower on them compared to most other languages running on the same hardware?

Why does the cult of Good Enough pervade all of modern computing? Some of us want brilliance, not hacks. Lisp Machines, though largely forgotten, can never be un-created. My standard of comparison for _any_ technology will always be everything previously achieved by mankind, rather than what is available on the market today.

What's your point?

Probably all they've ever seen is Unix and Windows. Never so much as touched a VAX or even a simple 8-bit computer. So it's like being born in the Matrix.

It wasn't that big bad UNIX and C machines were running Lisp faster, it was that vending machine processors were running Lisp faster. Chips like the 8080 had a place in cheap hardware. And these chips were meant for hand coded assembly. Only after the chips became popular were they optimized for running C. The companies selling Lisp machines weren't selling just processors; they were selling integrated machines. You know the saying "jack of all trades, master of none". Intel is a master of making processors, but they don't really sell computers. Lisp machines were the whole computer, so the processor, while optimized for Lisp, was still inferior to the mass production ones coming out for hand coded assembly (from companies that only made processors).

For me, I don't particularly care for Lisp. I just hate what C has done to people. I'm talking to people who couldn't imagine writing a device driver in COBOL. It would be ugly, but there is nothing magical about C that makes C doable where COBOL isn't. The thing about Lisp machines is that they demonstrate this: C is a "fast" language because it is being run on hardware that has been optimized to run C. The hardware doesn't have to be optimized to run C. ENTER is still an instruction. There are good arguments from academia that if Lisp hardware had the same development budget, it would be faster hardware (if we remove certain core functions of Lisp that aren't as functional).

It's a silly excuse. Cray made specialty hardware for specialized programs and their shit was extremely fast. Sun made both generalized and specialty systems and their shit was extremely fast. They both focused on forcing programmers to work like the hardware wanted rather than the LISP machine approach of forcing the hardware to work like the mathfags wanted. Shockingly, mapping the programmer to the hardware seemed to work a lot better.

I don't thank you for ignoring what I asked. If LISP machines were designed to run LISP efficiently, then why did the machines which you say are "optimized for C" run LISP faster than the machines which were designed to run LISP?

Hey lispfags, write a text editor that is graphical, portable and has feature parity with VSCode. Needs to be 100% lisp.

I'll taunt you as eternal unaccomplished poser faggots until you do.

Attached: 1452122840821.png (644x635, 122.83K)

>>>/g/

emacs

Attached: mpv-shot0032.png (1920x1080, 1.15M)

I set the bar lower as requesting VStudio feature parity caused nervous kvetching before.

Do you know what .c and .h files are, idiot?

github.com/emacs-mirror/emacs/tree/master/src

Considering you want it to be portable, that means that I can't assume everyone has a LISPM to run it on. We need some sort of runtime in order to run the editor. The actual editor part of emacs is 100% lisp.

...

...

all (you) thread sliding to avoid this post

and your clairvoyance in calling out (you)s is right on par with your programming skills, error rate 71%.

Prove it.

Attached: Capture.PNG (307x161, 4.92K)

Terrible accuracy.

Attached: terrible.png (368x192, 9.71K)

I don't get why someone would waste time writing a text editor in Lisp, when they've already got those things for Lisp. It's like asking me to write a thing for Python but do it in Perl. Well fuck that, I don't give a shit about Python, so I'm not gonna waste my time.

Wow! How did (you) know?!

Attached: Screenshot at 2018-06-02 10-25-20.png (418x190, 11.84K)

The core point is that lispfags don't actually write software with lisp.

What about clojure? A bunch of people are using it. And what about scheme and racket? Are you just fixated on some unique version of Lisp or what?

You lisp retards are like communists: asked to prove one simple thing that would somehow validate your claims to superiority despite all the plentiful evidence otherwise, you never deliver even that one bit. All you do is dodge.

Way to project, retard. I don't even know any Lisp at all, but apparently people use it, because clojure and racket are getting a lot of attention these days. I think you're just a C tard who has something to prove, to make up for your lack of "something" in your pants, so you picked the (pointlessly) hardest language to write correct code in, to be all macho and shit. I've seen your kind of retards posting on Usenet, ganging up on newbies who fuck up some C program (which is inevitable in any language, but even more so in C); they love to go all aggro at him and prove how much they memorized the C standard and how they know all the undefined behavior cases (this is the retard olympics at its finest!)
Keep training at C, moron. Maybe in another 50 years, when you're in the nursing home, you'll finally figure out that you could have saved yourself all kinds of trouble simply by writing in Ada or even Pascal.

The irony is that both Ada and Pascal were considered bondage and Discipline languages; but modern C is much more restrictive than modern Ada/FreePascal.

Bondage and Discipline.

Javafags were desperate for a scripting language that would be less verbose than Java while still meeting project requirements and took whatever they could get. Having had to write Java, I don't blame them.
What about them? They're practically unused outside of academia and memes.

Attached: gnusnap.png (900x675, 674.22K)

Everything you apropooed about me was wrong. I'm not projecting anything. I'm observing. Just look at this fucking thread, all that greentext is here in some form or another.

Lispers are massive poser faggots and I absolutely fucking hate poser faggots. Now, I was just playing with some filters in the LineageOS gallery app, which is written in Java, and oh boy it works well; filter preview is instantaneous. Yes, that terrible, horrible, forced-OO Java that lispers lament about.

Fine, you posers can't write a goddamn basic 21st-century GUI text editor with git integration and all the other basic shit taken for granted now, just admit it, everyone can witness it.

Well, let's lower the bar again: write a photo gallery like that, with a bunch of filters. Surely you can do that much, if a bunch of dudes in their free time manage to do it in such a horrible, inferior language and get it to run on a few hundred devices too. Fucking posers!

anything past assembly isn't good and is for brainlets

Assembly is infantilized Intel bullshit that doesn't represent the way the processor really works

Intel's one of the very few hardware companies that let you write assembly the way the processor really works, but brainlets couldn't handle it, and the Itanium became the Itanic. If you want your head fucked with, check out how you write assembly for one. I had to do that a long time ago and it was mind-bending. Compiler authors weren't prepared at that time.
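For the curious, here's a rough sketch of the kind of thing that made IA-64 mind-bending: for a small if/else you don't branch at all. You compute a complementary predicate pair with one compare, issue both arms, and let the predicates decide which one takes effect; you also mark dependency boundaries yourself with explicit stops. Syntax is from memory, so treat it as illustrative rather than assembler-ready:

```asm
        // one compare sets a complementary predicate pair
        cmp.eq  p6, p7 = r32, r33 ;;   // ';;' is an explicit stop: the
                                       // next group depends on p6/p7
        // both "arms" are issued; the predicate picks the survivor
  (p6)  add     r8 = 1, r0             // r0 is hardwired to zero
  (p7)  add     r8 = 2, r0
```

On top of predication, the programmer (or compiler) packs instructions into fixed bundles whose templates constrain which units they can go to, which is exactly the scheduling burden compiler authors weren't ready to carry.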

Because those machines were faster in general. Because Intel and Motorola were good at making fast hardware, and Symbolics was a bunch of programmers trying to be a hardware company.