There's the good kind of overengineering, and then there are people who step so far over the edge that it's horrifying and fascinating at the same time.

Give examples, particularly funny ones, of unnecessarily complex programs, systems, and devices. It doesn't have to be software, although that's probably where most examples will be found.
What's an existing sensible substitute for your example?

Systemd, dbus, pulseaudio, whatever are of course allowed, as long as you give horrifyingly fascinating details.

I'll start with
A cross-platform "app" to move things to the trash, with companion apps to empty the trash and delete files and folders, and its own API. Of course written entirely in JavaScript.
I'd replace it with either nothing or a shell oneliner.

Attached: 46486415314.png (980x996, 17.74K)

Windows 10.
Who thought that putting 20 different card games in it is a good choice?!

Can someone tell me if Emacs (GNU/Emacs?) qualifies?

That hunk of junk is a prime example.

Windows 98 had a bunch of extra bullshit screensavers/themes/sounds/backgrounds etc iirc

Things like: ?
I think the CoC is unironic though?

Attached: ITS SO BEAUTIFUL.jpg (1920x1080, 311.77K)

Microsoft® Excel®

Attached: bulk_rename_tool.jpg (650x442, 128.35K)

i see ur a man of culture too fam

before learning powershell (or scripting) this was very useful for renaming downloaded media. no-one uses the same naming convention
I'd rather one over-featured tool than 50 smaller ones, at least when using GUIs in which inter-program piping doesn't really exist

At first I thought "that wouldn't be cross platform".
Then I noticed that this thing has almost 30 lines of code just in cli.js alone. That is ten times the number of platforms it claims to support.

It's more of an editor platform that's programmable and extendable, so you're not obligated to fill it with features to have it work as intended. I'd say no, even though it has a lot of stuff in by default.

just wow.

I'm more fascinated than horrified tbh. Pretty amazing.

Attached: 154854548746.jpg (693x342, 18.91K)

I'd like to complain about the average page weighing around 2MB. But that's nothing compared to the fact that browsers now ship with a fucking compiler and that while browsing the web you just download and execute code written by hell knows who. I'd like to say that ideally a browser shouldn't have to compile anything, rather it should render pages and possibly execute some very simplistic scripts.

But that wouldn't be indicative of how insane the industry is.

Almost everything that has anything to do with the web is overengineered. A couple of examples:
Just imagine. C, a low-level language, is compiled into a subset of JS, a dynamically typed and absurdly high-level interpreted language, which is then compiled before running. It's not that it is overengineered in the way it's done, rather, the very existence of this thing is overengineering. I highly recommend checking out the "code generation" part on the wikipedia page. Some solid examples of just how insane this is.
Needs no explanation. A fucking browser engine is bundled with your program. And not just some engine - it's Chromium, which is almost an operating system in itself; it even contains drivers. So your clipboard manager or whatever the fuck you wish to implement with this wonderful technology is gonna carry a joystick driver among tons of other garbage.

I'm really curious about the alternative to the current state of things. Web isn't used for documents anymore, it's a GUI framework for applications. How would anyone make this mess less haram without just saying "We should stop making web apps."?

That entire program in linux:
for i in `ls -1`; do mv $i `echo $i|sed -e 's///'`; done
Change out ls for find, stat, etc. as required, and expand the sed script as necessary for multiple changes.
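A null-safe version of that loop might look like the sketch below. The draft_/final_ substitution is a made-up example (not from the thread), and the `${f/…}` replacement is bash-specific:

```shell
# Hypothetical example: rename "draft_*" to "final_*" without choking on
# spaces in filenames. Sketch only; breaks if the directory path itself
# happens to contain "draft_".
rename_drafts() {
    find "$1" -maxdepth 1 -name 'draft_*' -print0 |
    while IFS= read -r -d '' f; do
        mv -- "$f" "${f/draft_/final_}"   # bash substring replacement
    done
}
```

The -print0/read -d '' pairing is what keeps whitespace (and even newlines) in names from splitting arguments.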

Nothing will beat the pile of spaghetti a novice OOP programmer will generate when trying to implement "patterns". I've seen monstrosities that make an abstract-hammer-factory-generator-proxy look sensible in comparison.

I'll pick on myself a bit. I wanted a single keystroke to dim the laptop and secondary screen. So I wrote an AutoHotkey script for the keyboard binding, which called a powershell script that presented a GUI with dimmer options, which then called a .NET library I wrote in C#, which makes several attempts to dim the screen using various methods, as not all are compatible with all GPU's and monitors. It was horrendous but it worked.

You cannot. "Web app" is essentially a fancy word for next gen proprietary software.

"Stringreturners" unironically made me think of Django. Thank god I never had to work with java a lot.

"Web App" is a fancy term for "software I don't need to install to run". The problem is distribution and web browsers "solved" it.

All web browsers intended for general use and whities.

I would actually use that. What is the point of GUIs if not that? Beats the muh bloated "minimalism" meme, where a mere calculator has 200k LOC, a memory footprint bigger than its creator's ego, and on top of that only the most basic features.

Excel devs are pure magikikes.

it shows, #staylobotimized

Except they haven't really and that isn't really a problem.
Crossplatform apps were made possible quite a long time ago by stuff like java and qt.

With web apps they get nearly complete control, they can move crucial functionality over to the server side, thus preventing "piracy" way more effectively than with traditional proprietary software. Web apps are perfect spyware and spying is often more profitable than "selling" software. There's a reason normies got free win10 updates. Also webapps are cheap to produce as there are tons of web developers already out there.

Since non-software is allowed, let me present you with the most over-engineered mess: hygiene products. Take for example soap: it's a solved problem and the last big innovation was making it liquid. Where do you go from there? You bloat it up with shit like fifty different aromas. Who cares how the soap smells, do you go around smelling other people's hands?

Or how about shower gel? For men it was additives like "power kick" or "super sport" and for women it was "silk-smooth" components. How the fuck is this supposed to work? Do they expect me to soap up and stand there wet and naked in the shower for twenty minutes waiting for the "re-energizing particles" to get absorbed through my skin before I rinse it off?

Or toothpaste with its useless stripes. I remember the marketing as a kid: the white component is for shining white teeth, the blue component is for fresh breath and the red component is for healthy gums. Because that's how it works, the components have to be color-coded. Not that it matters, because once you put it in your mouth it's all mushed together anyway. Is there anyone who thinks the stripes do anything? I also remember when I was a kid all the toothpaste would have "sugar-free" written on it. Like WTF, is there toothpaste that is not sugar-free?

Shit like this is why I learned the shell. It is literally faster to write a snippet like that from scratch than to figure out this mess.

Attached: serveimage.jpeg (1200x800, 142.48K)

Or you could just use thunar's bulk rename

Ha! Great examples.

The whole of society pretty much. Transport is a bloat - we should have everything we need right under our nose. I mean not that long ago we grew our own food, now we go to the supermarket to buy it (and a bunch of shit we don't need). Jobs are bloat - most are useless and could be replaced by AI even right now. Filling out papers - a massive fucking bloat, why do we even need that.

1. Only allow the scripting environment to make visual changes, and calculations. Do not let it dynamically load resources or communicate (or maybe make a permission system where normal scripts are fine, but you'll have to allow scripts to make connections).
2. Make your website so that disabling JS will not make your website unusable. It's fine if some fancy effects are missing, but usability should not depend on JS.
There should be more steps, but those are the only two that come to my mind right away.

Every time

rm(){ mv $1 ~/.trash/;}

I installed Cygwin in Windows just to use as a file manager - Bash is literally the best file manager because you can easily rename massive amounts of files

There are so many edge cases with your shell script; you won't even need a filename with newlines for it to stumble.
It works fine, until it doesn't, and then you have to figure out what exactly happened to fix the results, unless you want to rollback to a backup you might not have.
Your best bet would be to use either of the thoroughly developed tools from the existing mess:
This is also an example that shows why some "overengineering" might actually be required for a tool to work correctly in weirder situations.
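For instance, even a plain space is enough; a minimal demonstration (temp paths only, nothing destructive):

```shell
# Demonstrate the word-splitting failure: the unquoted $i splits
# "two words.txt" into two loop iterations, so neither mv target exists.
tmp=$(mktemp -d)
touch "$tmp/two words.txt"
for i in $(ls "$tmp"); do
    mv "$tmp/$i" "$tmp/$i.bak" 2>/dev/null   # both attempts fail silently
done
# the file survives untouched -- the "works until it doesn't" part
```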

Game platforms like Steam - which is actually a communicator, a web browser and a shop (did I miss something?). Discord is a funny example too - it started as a communicator; now it's a game platform, not to mention it contains a web browser too.

Also ATX motherboards and tax system in my country.

You might want to change that to
rm(){ mv "$@" ~/.trash/; }
if you intend to delete more than one item.
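Fleshed out a little, as a sketch: ~/.trash is just this thread's convention (the freedesktop spec uses ~/.local/share/Trash), and the timestamp suffix (GNU date) avoids silently clobbering an earlier deletion of the same name.

```shell
# Sketch of a safer trash(): quotes everything, handles multiple
# arguments, and suffixes a timestamp so deleting two files with the
# same name doesn't overwrite the first.
trash() {
    mkdir -p "$HOME/.trash" || return 1
    for f in "$@"; do
        mv -- "$f" "$HOME/.trash/$(basename -- "$f").$(date +%s%N)" || return 1
    done
}
```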

You can contract it too by dropping the sed and using shell parameter expansion instead.
for i in `find -name ""`; do mv $i ${i//};done
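Quoted, and with a concrete (made-up) pattern filled in, the parameter-expansion approach could look like this sketch - spaces to underscores via `${var// /_}`, no ls or sed:

```shell
# Hypothetical pattern: turn spaces into underscores using only a glob
# and bash parameter expansion. Assumes the directory path itself
# contains no spaces (the expansion runs over the whole path).
despace() {
    for f in "$1"/*' '*; do
        [ -e "$f" ] || continue        # glob matched nothing
        mv -- "$f" "${f// /_}"
    done
}
```

Globbing instead of parsing ls output is what keeps the whitespace intact.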

It was a simple example. If your file naming/download saving is such a disorganized shit you require weird edge cases you probably have bigger issues to address, and such useful tools as rename are worth getting to know.

Pic 1 is what happens when you put autists chasing after fictional paradigms&theories in charge of your military R&D, pic 2 is much of the same but with war-profiteering jews at the helm instead.

Attached: ClipboardImage.png (900x657 8.17 MB, 92.56K)

such a predictable argument.
It is better to have a ready-to-use script that handles everything correctly than to spend time checking for edge cases, dealing with incorrect results, and making sure all the filenames (which might not even be yours) are okay and don't contain anything unexpected.
Also, test.

Attached: abc def rtxo̘̠̞̪̥̫̙ͦp̻͔̘̺̯̾ͫ͊͌ͨ̓ͅî̸ş͈͉͕́̆f̼̪ͦ͗̾͗a̿ͮ͋ͤ͐̌ͯg̕z.jpg (700x979 505.47 KB, 505.47K)

Which almost always contains toxic fluoride. At least all the garbage ones with the age warning.
It's probably mushed together from the start. Afterwards it gets colored and then shoved in the tube together.
White would just be way too boring - t. toxic toothpaste manufacturers
sage for offtopic

Funny note about edge cases: wget and firefox leave the %01 quoted, curl -O doesn't even try to unquote the name, resulting in 'filename too long' on ext4 / tmpfs.

That's not gonna work if there are files with spaces

Overengineered or not?
GNU coreutils cat:;a=history;f=src/cat.c;h=3c319511c767f65d2e420b3bff8fa6197ddbb37b;hb=HEAD
Can you say that some of the things in these are always invalid?
On the other hand, should a standard with a 15M PDF spec even exist?

Stop larping
rename 's/faggot/(You)/' *faggot*

No shit. Did anyone claim it would? No.

Also no shit.


It was implied that this replacement program would replace the original while maintaining its features.
Bulk Rename Tool works very well with spaces and executables. A replacement which doesn't is not a replacement.

Did you happen to see the film Wall-E?

First, you don't need the parameter -1 with ls if the output is not a terminal. You can see this by doing ls|cat. It's strange to have a program trying to identify the target of its output, but that's what happens.
Now, mass file renaming is incredibly easy to mess up and hard to fix, and I did my share of it. Today I use Emacs' wdired for this task: C-u C-x d, add the parameter -R for recursive listing, then C-x C-q to activate wdired mode, and you can edit filenames like you're editing trivial text, including regexp find/replace. After you're done, C-c C-c to finish or C-c C-k if you made a big mistake.
Your solution will break with spaces and other special characters in the file name.

Before Emacs, I went through this ritual since I had to be careful to not mess up:
First, the quoted original names:
ls -Q>/tmp/before
The parameter -Q is exclusive to GNU, but you can quote names with sed.
Then, the basic modified file names without spaces or upper-case letters:
ls|tr '[A-Z ]' '[a-z_]'>/tmp/after
Then I used an editor to clean up the "after" file even more, like removing '!' or '[]'; you can do this with tr or sed, but it is error-prone in my experience.
Finally, I pasted the two lists together, edited the result to place 'mv --' at the start of every line, and ran it:
paste /tmp/before /tmp/after>/tmp/final

Thinking about it now, I could have used find with -type f in combination with -print0 and xargs to ease the task a little, but whatever.
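That find/-print0/xargs variant might look like the sketch below. It mirrors the ls|tr ritual above but transforms only the basename so directory components stay intact; all names are hypothetical, and it is still imperfect (command substitution eats trailing newlines in names), but spaces are handled.

```shell
# Lowercase and de-space basenames of regular files under a directory,
# null-safely. Each file is handed to a one-shot sh via xargs.
tidy_names() {
    find "$1" -type f -print0 |
    xargs -0 -n1 sh -c '
        dir=$(dirname -- "$1"); base=$(basename -- "$1")
        new=$dir/$(printf %s "$base" | tr "A-Z " "a-z_")
        [ "$1" = "$new" ] || mv -- "$1" "$new"
    ' sh
}
```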

Mozilla Thunderbird
instead of being a clean, simple mail client, it is bloatware that has a fucking web browser engine inside it (Firefox's Gecko). and this shit needs 100MB of space.

or was this thread more for overengineering in features, not in code and bloat? but thunderbird still could fit

Microsoft Office and LibreOffice

You can do this with literally everything. protip: including your own suckless life

Congrats you just overwrote a file in your trash folder and you have no idea where any of these things even came from.

Example: KDE

can someone explain to me what's the fucking point of trash?
if I click delete file I want it deleted, not in trash
if I accidentally deleted a file I didn't want to, I would recover it with file recovery software, or from backup. but it never happened to me to delete a file I didn't want to delete

So you can revert a deletion that you did accidentally.
like a true UNIX weenie

Counterpoint: quite often, emails have a text/html body. While you certainly could save each such body to a file and open it in a browser, that would be a loss of convenience not worth saving an extra 100M.
Also, it supports RSS... which might indeed be bloat.

but to display an html document you don't need an entire browser engine, with cookies, javascript and other shit

That's a lot more lines than I would have expected.

I remember someone posting how /bin/true was written on different BSDs, Linux and whatnot as an example of a very simple program. I think most of the differences had their reason for being there, but it was interesting nonetheless.

I'm no suckless autist. And no, you can't replace everything with nothing. My point was that the tools needed to get the desired behaviour are already in place - so why have an unnecessarily complex replacement? I also reiterated that by saying you could have a shell function that does the same with barely any effort.
The files in the trash could also be moved into a date-ordered separate folder inside the trash by a cronjob, so you don't overwrite files, should you intend to keep the trash for more than a few days.

The point is not to strip every task down to a mere shell script with reduced functionality. The point is to avoid having unnecessarily complex redundancies, which aren't fully understood, for things that are already in place.
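The cronjob idea could be sketched like this (all names hypothetical; subfolders already matching the YYYY-MM-DD pattern are left alone on later runs):

```shell
# Sweep loose trash entries into a dated subfolder; run from cron, e.g.
#   @daily trash-sweep ~/.trash
trash_sweep() {
    day=$(date +%F)                     # YYYY-MM-DD
    mkdir -p "$1/$day"
    find "$1" -maxdepth 1 -mindepth 1 \
         ! -name '[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]' \
         -exec mv -- {} "$1/$day/" \;
}
```

Pruning old dated folders after a few days would then be a second find with -mtime, left out here.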

why don't you just kill yourself then, because since you're useless, you're bloat too.

absolute state of Zig Forums mongoloids and humanity in general. go outside and just be a fucking human rather than a ZOGbot for fucks sake

No ... seriously ... wtaf?

Attached: wtaf.jpg (606x419, 72.76K)

this guy just so f--king gets me

while accepting that this is just redundant words: corporations should have FFFFUUUUU ... continued to ignore the internet and left the rest of us to step around the personal webpages of 255 animated gifs and get on with the real business of internets

Attached: love-this-post-sloth.png (637x475, 637.3K)

That screenshot is somewhat atypical, as such tools don't usually have every single boolean operator expanded onscreen at once.

I think you're forgetting the best part of Electron, that elevates it from "*yawn*, another typical webdev brainfart" to true Lovecraftian masterpiece.

Instead of using your web browser, or just the IE/Edge/Safari/Chrome library provided by every single modern OS, it includes its own copy of Chromium. Every single Electron "app" you install puts its own redundant copy of Chromium & Node on your drive, with the justification that using anything except the one single version of Chromium the "programmer" targeted might "introduce bugs" into a platform (HTML, CSS, JS) allegedly adopted for the specific reason that it's standardized.

On top of this, even if you have multiple Electron apps using the exact same version of every single framework, they will still eat up >100MB on your drive for every single Electron app installed, and when you run them simultaneously, each one will load all >100MB of its identical copy of Electron into RAM.

On a similar note, I would nominate the inclusion of invariably outdated copies of FFMPEG or whatever inside web browsers themselves, instead of just passing the content of <video> and <audio> tags to the OS.

Only real solution would be what's already happening on cellphones with "site-specific apps", but done right. Each new service has its own standard (like eMail, IRC, FTP, USENET, Gopher, etc.) and maybe an official testbed reference client/server. In the case of 99% of web 2.0 sites including imageboards, some sort of CRUD Db protocol would probably encompass all of their functionality, and allow you to use native GUI widgets instead of cobbling everything together in HTML/CSS/JS.

I totally disagree but let's pretend this is true.
So, user, how come the OS didn't even bother to write it? I want you to actually think about that, if you haven't already.

It would have been a better joke if they were using Spring properly. It's bad enough as is without going overboard with wrapper classes.


How come the maker of your hammer didn't bother to build your house?

Attached: 1441084904745.jpg (380x380, 50.17K)

Maximalism is a good thing, but hardly anyone does it because it requires a lot of work and using your brain. It's sometimes called "Big Design Up Front" and the Worse is Better paper calls it the "big complex system." What it means is that the system is designed completely before it is implemented. Any mistakes and bugs found in testing will be fixed before the final system, never pushed onto users. The quality of everything is better in addition to doing more and having more features.

Nothing in UNIX was designed like this, if you can call it designed at all. UNIX bullshit sucks because it was originally minimalist and expanded to allow it to be used where it wasn't originally meant to be used, even though it already sucked at its original job. JavaScript was "designed" in 10 days. C with Classes was only "14% larger" (IIRC) than C. C compilers were originally able to run on a PDP-11.

That's pure UNIX kludgery. Compile a UNIX language "designed" for systems programming on a PDP-11 into a UNIX language "designed" for web scripting and use it for applications running on a browser.

That's the UNIX method of "code reuse" (actually anti-reuse) known as static linking. Even UNIX linker itself "reuses" the obsolete tape archiver "ar" format that was replaced by "tar" in the 70s.

This is the only example of actual "maximalism" in this entire thread.

It doesn't do the same thing as that program at all and it doesn't even work on all file names. This is the kind of brain damage the UNIX shell causes. You could also write a file renamer in Brainfuck, but that doesn't mean a Brainfuck interpreter is a file renamer.

HTTP is more UNIX bullshit. It even has spelling errors from the broken UNIX spell checker.

That's literally the same justification used by UNIX weenies for static linking instead of dynamic linking. After all, C and POSIX are "standards" too. Dynamic linking solves this problem on small and large scales. Dynamically linked libraries since Multics were able to share the same memory, so there was only one copy in RAM on the entire computer. The UNIX method of static linking requires a separate copy for each program. The same philosophy that leads to each program having a separate copy of printf for "bug compatibility" also leads to them having a separate copy of Chromium.

>On a similar note, I would nominate the inclusion of invariably outdated copies of FFMPEG or whatever inside web browsers themselves, instead of just passing the content of <video> and <audio> tags to the OS.
That's more of the UNIX static linking philosophy. There should be one copy of each type of video or audio decoder on the computer (unless the user specifically wants alternative programs for some reason) and every program that handles video or audio should be able to use it automatically.

Please tell me it doesn't always have to be like this. Please tell me that this is only the result of force-fitting code into 640K to run on .5 MIP machines. Please tell me that when Unix runs on 1 MIP workstations with a few meg of memory it will finally grow up. What? You say this is on a many-MIP multiprocessor machine with tens of megs of memory? And gigabytes of disk (but no room for user stuff)? Never mind. I'll go quietly. I promise. ARGHHHH!!!!!!!!

HTTP is literal garbage. I'm currently building an HTTP 1 and 2 server for fun. Or so I thought. It isn't fun at all :(


You know how people make fun of *nix graphical user interfaces for being ugly and clunky?
Well, this is a case where the Windows offerings are vastly, vastly uglier.

Attached: bulk-rename-utility-550x376.png (550x376, 42.1K)


Dwarf Fortress.

I usually find your pasta amusing, as I absolutely despise *N*X and its descendants, but isn't the *N*X philosophy's stance that anything complex enough to be linked should instead be a completely separate program piped to by the user through a shell script?

It's still in pre-alpha. The game is supposed to be Toady One's magnum opus, the old-fashioned kind that takes most of your career to build.

But they were not installed by default, you had to explicitly select them in the installer.

elegant as fuck

I came across that abomination when I needed something equivalent to "A Better Finder Renamer" that I used back in my Mac days. Pic related, dig the difference?

Attached: app_icon_with_screenshot.png (1000x550, 168.02K)

That toothpaste looks like the French flag. Then again, it could just be white. A white flag is fine too.

Are you playing dumb? The issue with your shell script is that it doesn't do everything that fuckugly GUI does out of the box, and the effort required to expand its functionality to parity is enormously larger than the effort required to understand the fuckugly GUI.

A simple example that shows perfectly why the FOSS community is in general unsuccessful, and why exceptions like VLC made it.

We don't exist to be "useful" though. But we've been brainwashed so hard by capitalism that that's what we expect.

Also, note that ABFR and its stablemates date back to the System 7 days, back when "Mac user" meant people who still cared about writing soundly designed GUIs instead of being vacuous hipsters.


There's software designed by code monkeys. There's software designed by physics guys. Then there's old ham radio software

Attached: multipsk.gif (1152x864, 124.82K)

Big design up front is not really what I meant when I started the thread - I thought that would become clearer after reading the OP. The term maximalism was meant more as an antagonist to minimalism as it seems to be commonly understood. You seem to just define maximalism differently for the sake of your argument - but if it sparks discussion: good. That being said, basically all of your answers reference unix; not sure that's a sensible approach. Don't get sidetracked every few seconds.
In the OP I tried to distinguish between good and bad kinds of "overengineering": the good kind being where there's an admirable amount of effort and complexity that serves a purpose, where the effort is directed and solves a problem that is worth solving (the last point not necessarily holding in the case of overengineering).

Also: I'm aware that suckless and others advocate for static linking that still keeps the results small, but I wouldn't necessarily call that common practice even among unix users. The common practice seems to be dynamic linking.
That being said, the line between good and bad specimens may not always be clear.

Looks like a military operating panel

Attached: 97-0.png (1165x768, 854.5K)

That's the "ideal" UNIX philosophy, but even the biggest weenies realized that that wasn't usable back in the 70s. That's why UNIX is stuck with some shitty hack based on an obsolete predecessor of "tar" ("ar") that hasn't been used for anything else since the 70s.

What I'm saying is that most of the examples in this thread of "maximalist" software that sucks are UNIX bullshit, not maximalist at all. They started out minimalist (including C++) and gradually grew because the minimalist "designs" were inadequate. When I said maximalist is big design up front, I meant that being big is part of the intended philosophy, not something that happened because they didn't think about 90% of the purpose of their software.

That's because C and UNIX are the reasons computing sucks. The more you know about what was done in the 60s and 70s, the more you will hate C and UNIX too. It's not just the fact that UNIX and C are worse than 80s technology like Lisp machines and Xerox computers, it's that they're worse in 2019 than what we already had in the 60s. If these weenies went into plumbing in the 60s instead of programming and had the same level of "success", the entire Western world would be shitting in the street and spending billions of dollars to "research" whether to wipe with a leaf or their bare hand (with suckless advocating you shouldn't do it at all), all because they didn't know how to install a toilet (and still haven't learned 50 years later).

The multiple copies of Chromium and Electron come from the philosophy of UNIX static linking. There was nothing wrong with static linking for its time, in the 50s and early 60s. Most of those computers didn't have an OS and file system that we have today or that Multics and other 60s mainframes had. Dynamic linking was invented to reuse code by sharing code and it works. UNIX weenies use "dynamic linking" when they make system calls, and they're the same thing on the computers that ran Multics, but they don't understand this because that's not what the PDP-11 did.

Subject: who hates what... From: PD

It's worth noting that the originators of UNIX, almost to a man, despise current UNIX implementations, and UNIX as it is being touted these days. UNIX-as-perceived is the product of people like Bill Joy, someone whose value can be judged from both the appearance and implementation of vi. It's probably also true that Gary Kildall (who wrote CP/M) hates MS-DOS. And I know a lot of old RSTS/E people who wouldn't wish VAX/VMS on their worst enemies. (actually, I wouldn't wish Unix on my worst enemies either) Just because the original product was smaller (creating less total lossage) doesn't mean that it was any less horrendous on a percentage basis. And Unix is fertile ground for hunting for lossage. I mean, how many {operating systems, software packages, computers} are so horrible that they can support a mailing list upon which people talk about how badly they suck.

Underrated post.

Care to expand on why this isn't usable?
Another classic example of the unix haters fag blaming common developer kludges on Unix. People lazily tacking on unfitting features instead of either making it fit or designing a new, better tool is an old problem predating Unix and even computing: kludges and niggerrigging have existed since the dawn of mankind and will continue until our extinction.
Much of modern plumbing is unnecessary, overengineered waste disposal which wastes valuable fertilizers. A mix of composting toilets, proper waste treatment plants, and smaller communities surrounded by farmland would be much more efficient in the long term.

Macfags get the fuck out of this board with your $1000 overheating craptops that don't have a cooling system that actually cools the components but instead just channels air into a separate section that somehow is supposed to cool the computer.
If your computer reaches 80c when playing banished you should throw it in the fucking garbage.
but you can't because you paid $1000 fucking dollary doos for it

And what I said was that you just redefined what the thread is about in order to talk about Unix again. It's getting tiresome, really. Instead of further talking about the big design up front approach, you just rant about Unix.
That may or may not be, but the reason for you getting sidetracked is that you are unable to leave Unix out of the equation, no matter what the discussion is about.
Chromium and Electron are good examples of "unix" software? Have you read what I wrote about static linking? You disregard whatever is being said and just talk about unix again.
conductor we have a problem, conductor we have a problem

Attached: 4565465465465.jpg (480x211, 9.41K)

Because it doesn't work. Pipes need multiple processes and context switches to do what the same program on a good OS would do with a plain function call or memory access. It's not only slower, it requires programs to be split up in unnatural ways and needs all this extra code to serialize and copy data and read it back. Programmers have to do more work just to make things slower and less reliable. You can probably imagine how bad it would suck if every .o file had to be a separate executable in a separate process. That's why UNIX copied 20 year old (at the time) assembly technology. Combining multiple .o files into one program is done by the UNIX linker, which is based on the shitty "ar" tape archive format.

They are not common developer kludges, they are found only in UNIX. They might be common today, but that's only because of the spread of C and UNIX. It was the equivalent to Pajeet software, some cheap outsourced crap companies could license without having to hire good programmers. AT&T was able to get away with it because they didn't make the hardware. Blaming the PDP-11 or VAX hardware for UNIX software bugs didn't matter to AT&T.

For examples of big design up front, look at the design of Ada and Multics.

"Systemd, dbus, pulseaudio" are Linux software, and Linux is a clone of UNIX. "Of course written entirely in Javascript" is more UNIX bullshit because JavaScript is a UNIX language based on C and Java. You're giving examples of UNIX (technically "UNIX-like") bloatware, so how can I avoid blaming UNIX? The only non-UNIX software are that bulk file renamer and that ham radio program.

They're good examples of how the UNIX way doesn't scale to modern software, and yes, Chromium and Electron themselves are literally UNIX software. Chromium is based on WebKit from KDE. Electron combines Chromium with node.js, based on the Chromium V8 JavaScript engine.

Compare that to how it would be done on a normal GUI system like Lisp machines, Xerox Alto, pre-OS X Macs, and even Windows. There was software with embedded Internet Explorer since Windows 98, basically doing the same thing as Electron. Users never needed multiple copies of Internet Explorer bundled with each program. The UNIX way has been like this ever since each C program that used printf needed to bundle its own copy.

Suckless programs are small despite the redundancy because they don't do 90% of what users want. Compare them to real software that had real memory constraints like the old Macs, Lisp machines, and Xerox Alto. That was not 10% software, but they had far smaller programs and memory requirements than this suckless garbage.

Subject: why Unix sucks

Some Andrew weenie, writing of Unix buffer-length bugs, says:
> The big ones are grep(1) and sort(1). Their "silent truncation" have introduced the most heinous of subtle bugs in shell script database programs. Bugs that don't show up until the system has been working perfectly for a long time, and when they do show up, their only clue might be that some inverted index doesn't have as many matches as were expected.
Unix encourages, by egregious example, the most irresponsible programming style imaginable. No error checking. No error messages. No conscience. If a student here turned in code like that, I'd flunk his ass. Unix software comes as close to real software as Teenage Mutant Ninja Turtles comes to the classic Three Musketeers: a childish, vulgar, totally unsatisfying imitation.

Yep, you're delusional. Kludges have existed outside computing for millennia yet according to you compsci was somehow immune to this common human trend before those dastardly weenies and AT&T (which was bad unlike the other wise corporations and universities) discovered the kludge and magically convinced everyone else to do it too.
If we play the "based on" game, all this leads back to Multics.
JavaScript is a Multics language.
Chromium and Electron can trace their roots back to Multics.
Systemd, dbus, and pulseaudio are all Multics software too.
Nevermind that many of these took great influence from Microsoft Windows (itself based off OpenVMS) and how all these examples of bloat are from software and OSes deviating from their source's design philosophy. According to you, anything bad in software comes from Unix and its philosophy so you refuse to consider how other operating systems and impure design could also have a negative impact.
t. someone who's never used it
That is a compiler issue. Older compilers often optimized more for memory use than speed and running software through them would probably produce even smaller binaries, especially the non-X11 programs.

As for static linking, it isn't nearly as redundant as people think, because statically linking a library doesn't include the entire thing, only the components the program actually uses. Shit like Electron is a bad example because it bundles an entire browser with your program whether you need its functionality or not (you never do), not to mention that the most commonly used libc, glibc, intentionally nerfs static linking anyway. glibc ensures static linking is rarely used outside niche musl distros.
Static linking is not ideal compared to Multics' dynamic linking. It is, however, superior to Unix's godawful take on dynamic linking for everything except proprietary software.

Attached: 1306700560228.png (480x480, 127.17K)