
Attempts at standardizing userlands are long forgotten, now a mess; attempts at standardizing graphical interfaces are mostly forgotten, now a mess; attempts at standardizing inits are still being fought over, and a mess as of now. It's the UNIX way, and the community doesn't want to do anything about it. It's best to accept that we will keep opting for the most bloated, least sane choices, with whatever small reason to justify them, until we end up with literal Windows and macOS clones.

Attached: mario-sleeping.gif (500x513, 426.87K)


...

We must show them THE WORD and preach THE GOSPEL against their aimless, HEATHEN disarray. Only THE GOOD BOOK can save them from their SINS and show them THE WAY from which they have STRAYED. Out of it can emerge a NEW AGE of ENLIGHTENED desktop environments!

Attached: hig-mac-1992.png (843x1082, 1.31M)

The problem is that UNIXtards would be happy using a 2260 terminal and paying per hour to dial into a nearby mainframe. When we have swaths of silicon available to us today, *why* (pray tell) am I expected to work only from a text console -- is it because a GUI is "bloat"?
digibarn.com/collections/screenshots/xerox-star-8010/
Has anything really changed since the term "WIMP" was newly minted? (Or, really, since Engelbart's Mother of All Demos?) I'm going to say no.

Attached: xerox_star.jpg (1550x1564, 217.81K)

What's the point of standardizing a graphical interface when there is only going to be one implementation of it ever created?

That was the height of bugfree, user-friendly Macintosh (and computers in general).

People never knew how good they had it.

The key point is that everything on a given GUI works the same way, not that all GUIs work alike.


I would actually say that bugginess was a major downside of that era (both for Macs and for most of their competitors, e.g. IBM, Atari, Commodore, Acorn, etc.). But the overall user-friendliness of the System 7 Macs was unparalleled, because of numerous, mutually reinforcing factors:


Really, given that equivalents exist for everything from AmigaOS, to IRIX, to NeXTSTEP, to RISC OS, why hasn't somebody made a retro knockoff Linux DE for the classic Mac System? Even just a Finder-clone file browser would be an utter godsend.

That doesn't explain the lack of bugs.

cristal.inria.fr/~weis/info/commandline.html

Oops, I got those first two post links reversed, that part was in response to the other guy.

TL;DR

Mac UI being good pre-OS X was a meme. Other than the technical issues (freezing all the time due to having no preemption, quacking at you for unknown reasons, merging things into the title bar in ways no one understood, buckshot windows, etc.), it had straight-up on-another-fucking-planet ideas, like having to drag your disk to the trash to eject it. I don't think I ever saw that explained to someone who didn't treat it like they were having their leg pulled.

What are buckshot windows? The quacking error sound is easy to understand: it means you can't use that area of the screen. I don't know what you're talking about with the title bar or freezing issues. You didn't have to drag the disk to the trash to eject it. You could go to the Special menu, like a normal person, and select Eject, since you have to use the Special menu to empty the Trash anyway.

...

What's going on here, Zig Forums?
We have this thread shitting on Uriel, that one shitting on cat-v, and countless posts about how C and Unix are shitty, despite the fact that the poster is using a Unix-like OS programmed in C and a web browser with at least some components written in C.


Here's this red herring again: people blaming programmer laziness on Unix. The Unix philosophy says: develop small, specialized programs that work together, and use text interfaces because text is universal (though binary is possible). There's nothing in there saying to produce buggy, unreliable programs whenever possible.
If you are talking about the Worse is Better paper, even the author admits he is attacking a strawman.

The "you use it so you must like it" mentality is patently false -- some of the employed people here use Windows; do you think they like it? The problem with C and UNIX culture is that it takes millions of lines of code for little effect (the browsers and operating systems you're talking about). Forget lambda as a hardware primitive, Eunuch lovers cannot even imagine that CPU architectures with built-in array bounds and type checking (which obsolete the entire computer security field as it now exists, in just the same way modern medicine obsoletes bloodletting) COULD have existed, let alone DID exist. On a Lisp Machine, these checks are always performed — because on a genuine LispM they cost nothing. C and UNIX are not simple - rather, they make you do boilerplate nonsense that was already trivialized years before they were engendered. This is a quote from the inventor of Forth:
"Complexity is the problem. Moving it from hardware to software, or vice versa, doesn't help. Simplicity is the only answer. There was a product many years ago called the Canon Cat. It was a simple, dedicated word processor; done very nicely in Forth. Didn't succeed commercially. But then, most products don't. I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth. No environmental group saying: Count the parts in a hybrid car to judge its efficiency or reliability or maintainability. All I can do is provide existence proofs: Forth is a simple language; OKAD is a simple design tool; GreenArrays offers simple computer chips. No one is paying any attention."

You have been brainwashed into thinking only UNIX can be simple. To the now-denigrated Lispers we owe garbage collection, all the roots of the modern GUI, dynamic typing, lexical scope, the very idea of a single-user computer workstation, and countless other innovations which so many people believe to have simply dropped from the sky. As an addendum, please read these:
xach.com/naggum/articles/[email protected]

xach.com/naggum/articles/[email protected]

Prove it.

Prove what? Greenspun's tenth rule? Look around you at what modern computing is. Now, rivet your eyes onto the world of 1979: dspace.mit.edu/handle/1721.1/5731

You haven't proved anything.

I'm so tired of you knuckle draggers asking delphic questions of me over and over like talking heads, and then, when I try to ascertain what you could be asking and give a cogent response, you tell me I haven't done x and the answer you're looking for is y; but y is not defined anywhere.

I assume he means
not that the Zig Forumsyp realizes that the burden of proof is on him

Prove it. It should be simple enough for you, instead of deflecting from the issue.

...

You realize there are language-specific ISAs other than the ones you kvetch about, right? Like the MAJC, PicoJava, and Jazelle, for Java. Or the Novix NC4000 and MISC M17 for Forth. Or, for that matter, AT&T Hobbit for C.

None of these solve the fact that storage/memory-hogging runtimes and stop-the-world GC are an inherent flaw of emulated/interpreted/bytecode languages that no amount of ASIC acceleration can solve.

The microcomputers which dethroned your overpriced Symbolics machines ran Lisp code faster than the dedicated hardware.
There's still absolutely nothing stopping someone from making a Lisp OS today targeting modern ISAs. Plan 9 fags have 9front, BeOS fags have Haiku; there are loads of niche operating systems out there, and even fucking Rustfags have their own meme microkernel. Lispfags, on the other hand, have made less progress at realizing their grandiose dreams than fucking Hurd.

...

Show me one single counterexample. Incremental doesn't cut it.

Think of GIMP, where an application was made out of lots of little windows. That style was popularized by Apple.

Even Apple recognizes that it was bad.

BeOS was the paradigm that you needed, but you did not deserve. You rejected her, and she died for your sins.

Attached: serial_experiments_lain_stinger.jpg (476x345, 24.41K)

Yes, these languages are useless and the people who extol them are nothing more than navel gazing hipsters.
Totally nihil ad rem comment. Yes, they existed; how is that apropos? You already know what I'm going to say -- they're deficient.
Lisp compilers can go to machine code (SBCL). If I write Common Lisp code and run it with an interpreter, does that make Common Lisp a scripting language? If I compile that same Common Lisp code, does that make Common Lisp a programming language? What if my compiled Common Lisp code is eval'ing other Common Lisp code? Is Common Lisp then simultaneously a scripting language and a programming language?
I don't know about other people, but *I* am not saying we have to only ever use Lisp, because on a Lisp Machine everything is compatible. You could use eval and apply, for example, so you could use other programs on your machine without having to re-implement them yourself.

Nigger detected.

Attached: Terry Davis Command Line-pvjSNoGL1-g.webm (640x360, 1.25M)

...

Year of the Haiku when?

Show me a Lisp "binary" without the megabytes of runtime boilerplate typical of "standalone" scripting languages: the sort of sub-KB embedded microcontroller snippet any real compiled language easily churns out.

I hate threads like these. They bring the fogies out of the woodwork, and we have to watch as they stimulate their desiccated circumcised penises while talking about the "halcyon days" -- it's horrific.

look up on-the-fly garbage collection

He's just going to use this as an excuse to push Lisp Machines again.

Standardizing userlands is a risky step that led to systemd-blob and freedesktop.org GNOME dictatorship.
I've heard good press about NetBSD userland on Hurd and/or Minix but have not tried it myself.

What is windowmaker?

I write software in Chicken Scheme

Still doesn't eliminate the penalty, though at least it's theoretically practical for all types of applications.


Can you point to some innovative, obscure new UI design concepts that blow away what was done 3-4 decades ago?


I said Macintosh, not NeXT.

But what if I want to go out of bounds or intermix types in stupid ways?
When you say they cost nothing, what you mean is that you've already paid the cost in advance, so if you don't use it you're just paying for nothing. You think extra hardware features are just free?

It doesn't even need to be electronic.

Like other compiled lisplikes, it has a lot of bloat. It hides it within a huge runtime library.
user@anon:~/tmp$ cat hello-world.scm
(print "Hello, world!")
user@anon:~/tmp$ csc hello-world.scm && strip hello-world && ./hello-world
Hello, world!
user@anon:~/tmp$ ls -alt hello-world
-rwxr-xr-x 1 user anon 10672 Jul 2 01:48 hello-world
user@anon:~/tmp$ ldd hello-world
	linux-vdso.so.1 (0x00007fffd28bb000)
	libchicken.so.8 => /usr/lib/libchicken.so.8 (0x00007f620c4a0000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f620c190000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f620bf80000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f620bbe0000)
	/lib64/ld-linux-x86-64.so.2 (0x00007f620ce00000)
user@anon:~/tmp$ ls -atl /usr/lib/libchicken.so.8
-rw-r--r-- 1 root root 4477320 Dec 14 2016 /usr/lib/libchicken.so.8
It's also a toy language.
Does CHICKEN support native threads?
Native threads are not supported for two reasons. One, the runtime system is not reentrant. Two, concurrency implemented properly would require mandatory locking of every object that could be potentially shared between two threads. The garbage-collection algorithm would then become much more complex and inefficient, since the location of every object has to be accessed via a thread synchronization protocol. Such a design would make native threads in CHICKEN essentially equivalent to Unix processes and shared memory.

There's no way to magically get bounds checking for free.

Sleep tight, don' let the bed bugs bite! :)
youtube.com/watch?v=oLRC878fEZE

Attached: tired____snores__by_chibiirose-d5hu2hv.png (825x780, 488.33K)

Well, some folks are trying.

elementary.io/docs/human-interface-guidelines#human-interface-guidelines

youtube.com/watch?v=r2CbbBLVaPk

Think before you vomit your emotions into the reply form.

no u

Meanwhile, Google and Microsoft are slowly going back to the dumb terminal: a pretty UI phoning the mainframe for software and licenses.

Why so butt hurt?

Literally stop feeding trolls, it's easy to tell which posters here are trolls.

Just have your CPU trap on out of bounds reads and writes.

Year of the BSD desktop when?

Kill yourself, retard. I don't want to be forced to subscribe to your shitty view of how software should be. Linux is great because you can put whatever you want on top of it. You can literally scrap everything and rewrite it all.

aka decision fatigue
aka demoralisation

Lisp isn't designed to be freestanding like C was. Lisp binaries are about as complex as Go binaries.


So what? The compiler will just generate a C version of the runtime and compile it along with your program. Just because it compiles to C doesn't mean the Lisp runtime gets eliminated.

lol, just make a new standard.

These aren't decisions lusers like you will be making anyway. You will just download whatever is popular. Meanwhile, the people that care enough to try to make something better will just do their thing. Maybe it will fail. Maybe it will be the next thing you'll be mindlessly using. Nobody knows until it's done.

Yep, just another useless luser OS. When are we going to get systems designed for experts to use, without dumbing programs down to fit into some retard's worldview?

Standards mean you don't have to waste time making sure your shit works for 999 vastly different systems

t.non-retard

Reported.

top kek
I'll keep posting anyway

You don't HAVE to do that, moron. Simply make the decision to use Linux, start actually programming for it, and use all the features it has, instead of making some lowest-common-denominator 'portable' abstraction layer that makes your software garbage. Ironically, it's standards that make people do this. Retards obsess over shit like POSIX to this day, even though it's shit and every actual implementation differentiates itself through non-standard features.


It doesn't have to be text, retard. Where did I say it had to be text-based interface?

You said it yourself, the problem with POSIX is that people are not actually following the standard.
But still, there's a shitton of software that will compile perfectly fine on BSDs and Linuxes alike. Why? Thanks to POSIX.
Is POSIX a great standard? No. Is it better than no standard? Certainly.

Wait a second. You're literally arguing that all the non-POSIX features shouldn't exist because they're not in POSIX. The world should just stop until the POSIX gods come down from their sky thrones and design some shitty interface. You're retarded, kiddo.

Software compiles fine because it uses the same API. Usually, it's glibc. POSIX doesn't even run code; it's just toilet paper I wipe my ass with.

Absolutely not. Where did you get that idea?

By saying they're not following the standard.

what a meme
the ideal normalfag-free operating system is simply a blank sandbox of easy-to-use-but-esoteric tools that a user uses to sculpt the GUI and interface.

Right up there with "human readable format"


The other poster's point is quite obviously "if anything you're doing could be done in compliance with a preexisting standard, it should be".


or
Mountains of braindead open source frontends say otherwise

Yeah, if you want your program to be junk, go ahead and use POSIX garbage like select instead of epoll or kqueue. Hey, at least it's portable garbage!

Everything in *N*X/Linux/Windows is inherently 1000% garbage, the only reason to use any of them is compatibility and preexisting codebases.

There's no point in standardizing userlands any more, every OS and distro has its own ideas and (if commercial) lock-ins to make switching difficult. Red Hat's a total mess of filth now, big surprise that the government's favorite Linux is trash.

For once, the macfag is right.

It hurts. You could theoretically just take bits and pieces of what already exists that you can trust, and replace the bloated, unauditable shit with your own work, but the real dilemma is that nobody has the time for that, and web browsers throw a wrench into that plan by being so damn bloated that you could never write your own even if you wanted to (and I don't) and still be capable of browsing the normalfag sites that you might need down the line. HTML, JS, and CSS have all gotten way out of hand.

All the desktop environments are either too bloated or don't do what I want. I hate Linux just as much as I hate Windows and Mac. Fucking CIA niggers ruin everything. There's simply no way to trust anything. I think Qubes, live distros, or having a separate machine for each purpose is the right idea at this point, but I really wish to go back to having just one computer, running one operating system, and just having things work the way I want them to. Preferably without the blue pill. A man can dream.


First thing I do in GIMP is enable single window mode.

Attached: 1511395906.webm (640x512, 330.22K)

Tripfagging doesn't necessarily reduce quality; it depends on the context. But it reduces anonymity and goes against all the principles of anonymous boards, so could you please consider not using it when it's not necessary?

It does not reduce anonymity in the slightest. His real name is obviously not "[k00l]shamoanjac"

bait

This tbh.

Actually taking the time to read this book. So far, a lot of it is stuff that's pretty obvious nowadays, but I think it still might be worth a read. Thank you user - I think more or less the entire industry (webdevs and UNIX lovers alike) has forgotten how to make good, clean, intuitive, and lean software. I never really loved Macs, but I think their use of skeuomorphism (they call them "metaphors" in the book) was always really charming.

In thinking about this over the years, I imagine the only real solution, since the security/threading/process management in today's VAX-derived OSes is such shit, is to stuff the entire thing inside a tiny, tightly written, mature RTOS (e.g. QNX, OS-9, VxWorks). Once that's done, chop every tiny part of the running browser (individual elements like text streams, images, stylesheets, scripts, etc.) into separate processes atop the RTOS kernel, which would protect the host OS from having to directly touch the morass of filth inside the browser. This would allow better scaling of the browser's features, far finer threading, better separation of security, stricter control over scheduling for interactivity, and make it impossible for an entire page (let alone the whole browser) to slow down the host OS, crash, or leak memory from a single corrupt or malignant asset.


While I've always had a special fondness for the specific designs chosen for System 7-era Macs, I think the main lesson to take is simply the idea of HIGs themselves: write a HIG, keep it as much the same as you can for as long as possible, and ensure everyone, whether 1st or 3rd party, conforms to it.

Attached: Standing The Test of Time.png (847x424, 507.34K)