Can we please talk about how UN*X shell is still not fixed in 2018?

All they need to do is add "path" and "flag" data types separate from strings. Currently in shell everything is a string, and that breaks everything.

Attached: unix meme.jpg (1040x690, 61.44K)
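
The complaint is easy to reproduce; a throwaway-directory sketch (GNU rm assumed, since it accepts options anywhere in argv):

```shell
dir=$(mktemp -d)
cd "$dir"
mkdir subdir
touch ./-rf file.txt   # the ./ prefix is needed just to create the file
rm *                   # the glob picks up -rf, which rm parses as flags, not a filename
ls                     # file.txt and subdir are gone; only -rf survives
```

Nothing here is a typo or an attack: -rf is an ordinary filename, but the shell hands it to rm as a plain string, indistinguishable from an option.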


how many servers are you responsible for?

two

what shell?

You're complaining about fundamental Unix limitations, not about the shell. Good luck fixing that without throwing out everything.

You're too retarded to see the real problem. What would fix POSIX would be to standardize a simple format for tabular and tree data. For tabular, simply use the RS and FS variables (and forbid ASCII FS..US in filenames); for trees, I don't know what would be best, but something even simpler than JSON would be appreciated.
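
A sketch of what the tabular half could look like with today's tools: awk already has FS, and ASCII Unit Separator (0x1F) can't collide with commas or spaces in the data (the record format here is invented for illustration):

```shell
US=$(printf '\037')    # ASCII Unit Separator, inside the FS..US range mentioned above
# Two records, one per line; fields are delimited by US, so commas and
# spaces inside a field are harmless.
printf '%s\n' "Doe, John${US}42" "Roe, Jane${US}37" |
    awk -F "$US" '{ print $1 " is " $2 }'
```

Forbidding FS..US in filenames is exactly what makes a delimiter like this safe to standardize.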

UNIX can't be fixed. UNIX has the same problems it had in the 70s and 1991 except now the shills are better at tricking users into not realizing they are problems.

Date: Fri, 1 Mar 91 14:27:50 -0800
Subject: A Joke In Every Sentence

In a fit of work avoidance, I just flipped open a copy of The Sun Observer, and read the following from a ``helpful hints'' type column. It is NOT A JOKE.

Using noclobber and read only files only protects you from a few occasional mistakes. A potentially catastrophic error is typing rm * .o instead of rm *.o. In the blink of an eye, all of your files would be gone. A simple yet effective preventive measure is to create a file called -i in the directory in which you want extra protection: touch ./-i

In the above case, the * is expanded to match all of the filenames in the directory. Because the file -i is alphabetically listed before any file except those that start with the characters !#$%&`()*+,. the rm command sees the -i file as an argument. When rm is executed with the -i argument, files will not be deleted unless you verify the action.

This still isn't perfect. If you have a file that starts with a comma in the directory, it will come before the file starting with a dash and rm will not get the -i argument. SunOS's make utility creates files starting with a comma if the make file understands hidden dependencies by having the following line in the make file: .KEEP-STATE:

The -i file also won't save you from errors like rm [a-z]* .o

The bit about files that appear before -i is garbled in the original. I blame nroff.
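
The trick from the quote still works; a scratch-directory sketch (GNU rm assumed, every prompt answered with n):

```shell
dir=$(mktemp -d)
cd "$dir"
touch ./-i a.o ab.txt
# The glob expands to include the file named -i, which rm takes as its
# interactive flag, so every real deletion now requires confirmation.
printf 'n\nn\n' | rm * 2>/dev/null
ls    # a.o and ab.txt survive: both prompts were declined
```

The same string-vs-flag confusion that makes rm * dangerous is here repurposed as a guard rail.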

Go use Windows, faggot. Your retarded quote made by a retard doesn't even relate to the thread. Also, nobody wants to discuss with an autist confusing philosophy and implementation (inb4 not handling errors and all this bullshit are part of the philosophy, they're not).

Sad. Many such cases!

Attached: comfy windows.png (1200x756, 548.78K)

You are a superpleb.

There is only one flaw in Unix shells.
zfs destroy or zpool destroy.
Pretty easy to fix though, just write a wrapper that provides verification for the destroy command.

How is this related to sh?

True Unix has never been tried.

UNIX was a mistake.

You guys do know other shells exist, right?

Is bash the only shell that has these problems?

That's a pretty clever failsafe for important dirs.
I wouldn't call it an issue; who names their files with a starting - anyway?

who names their files starting with a . ... oh wait like half my home directory.
Watch as I transform a bug into a feature.

Who doesn't pay attention to what they're about to do before hitting Enter on an "rm" command?

Oh wait retards.

see OP pic

...

Yes, there is more than one command where you should pay attention to what you fucking typed before hitting enter. Computers are not supposed to be used by goldfish that can only remember one thing.

Attached: Aaaaah.jpg (1000x559, 50.14K)

Does your editor warn you when you try to exit without saving?

Shell is a thin wrapper around fork and exec. The exec calls themselves don't have this problem; args will be passed to the process unsplit.

There are certain requirements for something to be a POSIX shell. Same goes for Unix. Unquoted vars being split on $IFS is one of those things.
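
The $IFS behavior is easy to see with set --, which replaces the positional parameters (plain POSIX sh, nothing else assumed):

```shell
f='look ma, spaces.txt'
set -- $f      # unquoted: the value is word-split on $IFS before anything sees it
unquoted=$#    # 3 words
set -- "$f"    # quoted: one argument, passed through unsplit
quoted=$#      # 1 word
echo "unquoted=$unquoted quoted=$quoted"
```

The process on the other side of exec only ever sees the result; the splitting is purely the shell's doing.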

Not him, but vim won't let you :quit an unsaved buffer. You can still ZQ or :q! without trouble, though.

Passing unsplit strings to a process is doable even in bourne sh, if unnecessarily tedious.
Telling the process which arguments are flags and which ones aren't (the topic of this thread) is impossible to do consistently. Not all software supports the -- separator.

sudo apt-get install -y powershell

Or, like, use Python.

jesus you're retarded

Attached: d280ffcbfb1393d2a5dceb2e13a6a4eadcb32a2529d3b30b9c012869c1624058.jpeg (470x470, 42.7K)

the virgin linux loser
-must have autistic colors for the few cli programs he uses
-complains when distros dont add user to sudoers by default
-thinks i3 + vim makes him look cool
-would use a systemd distro if it wasn't uncool
-unironically compiles everything from source
-bloated core components
-doesn't care about his package manager as long as it has dependency resolution
THE CHAD BSD BEASTIE
-has disabled colors from appearing on his system
-runs everything as root. fears nothing
-"yeah, i'd use dwm if it weren't so bloated"
-doesn't have to worry about an init being sane, they all are
-thanks god each day for binary package management
-minimal, fast core components. starts an hour long rant in irc when his system doesn't boot in less than 3 seconds
-urges devs to make a back-end package manager with no dependency resolution and to make it the default package manager

Yes, it is, and that's by design. Being able to write shell scripts is just a convenience for gluing simple things together. If you need a more powerful scripting system then use a proper scripting language:

#!/usr/local/bin/guile -s
!#
(system* "touch" "look ma, spaces.txt")

From the documentation:
- Scheme Procedure: system* . args
  Execute the command indicated by ARGS. The first element must be a string indicating the command to be executed, and the remaining items must be strings representing each of the arguments to that command. This function returns the exit status of the command as provided by `waitpid'. This value can be handled with `status:exit-val' and the related functions. `system*' is similar to `system', but accepts only one string per argument, and performs no shell interpretation. The command is executed using fork and execlp. Accordingly this function may be safer than `system' in situations where shell interpretation is not required. Example: (system* "echo" "foo" "bar")

BTW, system* will also not carry out shell expansion, so
(system* "rm" "*")
Will actually try to delete the file named '*', and if it doesn't exist it will throw an error:
scheme@(guile-user)> (system* "rm" "*")
rm: *: No such file or directory
$1 = 256

You can use 'system' (without the asterisk at the end) if you really want to expand wildcards. It's all about choosing the right tool for the job.

That's not the problem this thread complains about. Try:
#!/usr/local/bin/guile -s
!#
(system* "touch" "-foo")
This problem lies deeper than the shell.

(Not to mention that sh has no trouble with 'touch "look ma, spaces.txt"' so you haven't demonstrated anything sh can't trivially do, even though guile offers plenty)

You have to fight the shell to get what Guile (or any other scripting language) gives you for free. Yes, you could put quotation marks around the file name, but what if instead of a fixed file name you splice in the output of another command? As shell scripts grow more complex you have to write them more defensively. I use a linter, so I find most of the obscure details before the code even runs, but just look at all this shit (the rules SCxxxx):
github.com/koalaman/shellcheck/wiki

Only the shell has to support it, and bash does it the right way.

No.
Run "touch -foo". Notice that it doesn't create a file named -foo. You have to use "touch -- -foo".
Now install jpegoptim and make a JPG file called -foo.jpg. Run "jpegoptim -foo.jpg" and "jpegoptim -- -foo.jpg". Notice that neither work.
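
For tools that do honor it, -- is easy to verify with touch (GNU or BSD touch; jpegoptim's claimed behavior is not reproduced here):

```shell
dir=$(mktemp -d)
cd "$dir"
touch -foo 2>/dev/null || echo "touch rejected -foo: it was parsed as options"
touch -- -foo    # -- ends option parsing, so -foo becomes a filename
ls               # the file -foo now exists
```

Whether -- works at all is up to each program's argument parser, which is the whole problem.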

jpegoptim "-foo.jpg"

You don't understand how shells work.
Quotes are for the shell's convenience. They don't get passed to the process.

okay, but does it work

No.
If you want to gain an understanding of what processes can and can't see, play around with passing arguments to this Python script:
#!/usr/bin/env python3
import sys
print(sys.argv)
It's very simple, if unexpected to beginners.
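
The same experiment can be inlined without saving a script file, for example:

```shell
# The process sees whatever survived the shell's quoting and splitting;
# by the time argv exists, the quotes are long gone.
python3 -c 'import sys; print(sys.argv[1:])' 'look ma, spaces.txt'   # → ['look ma, spaces.txt']
python3 -c 'import sys; print(sys.argv[1:])' look ma, spaces.txt     # → ['look', 'ma,', 'spaces.txt']
```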

...

So lisp has a list data type and that lets you work with spaces correctly

but you still have the problem of paths whose names start with -, since to the shell itself they're all just strings.

You'd need to go deeper to fix this. It can't be repaired just by making a better way to call system.

Yeah, you're right. I was mostly looking at the OP pic and how it's a non-issue when using a scripting language. The only real solution for this is to forget about the shell and use system calls through a scripting language's FFI.

Try single quotes or percent signs.

How fucking hard is it to add quotes, you fucking mong?

int execv(const char *path, char *const argv[]);
Look at that shit.

A solution is to specify a new ENV-based protocol. Let executables signal that they support it, in which case options can be passed in through environment variables prefixed with OPT.
It's a gigantic hack, but it gives you proper key-value pairs.
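
A self-contained sketch of that protocol (the OPT_ prefix and mytool are invented here; a shell function stands in for a cooperating executable):

```shell
# A stand-in for a program that opted into the hypothetical protocol:
# options arrive as OPT_* environment variables, so every positional
# argument is unambiguously data, never a flag.
mytool() {
    echo "depth=${OPT_DEPTH:-10} file=$1"
}

OPT_DEPTH=20 mytool -foo.jpg    # → depth=20 file=-foo.jpg
```

Key-value options and data travel on separate channels, which is exactly what a flat argv can't give you.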

God I hate you "Unix" fuckheads. Unix was a proprietary system 50 years ago, no sane person is using it anymore.

Bash is the shell of the GNU operating system.

Unix is also a rough standard that's used as the basis of a distressingly large share of widely used operating systems.
Which name would you use for this occasion?

unix-like

Just alias rm as rm -i

Mac OS X users are often obnoxious, but I've never had any reason to question their sanity.
opengroup.org/openbrand/register/

UN*X shell is unfixable nigger. adding two types isn't gonna fix shit (well, it will fix 1/10000000 of the problems)

Every OS that exists right now is basically the same bullshit as UNIX, nigger.

see OP pic

um you're pretty lost my 'Doze using buddy. most of the people here use Linux which is a UNIX.

Attached: 1518140158872.png (750x1020, 438.77K)

TempleOS
RiscOS
FreeDOS

Everything in UNIX was fixed by Plan 9. Try out its shell, rc; it truly is better than the inferior UNIX sh.

Does rc solve the particular problem expressed in this thread's OP's text?

How many are YOU responsible for?

BSD has won by attribution

In this scenario, you're like a feminist screaming about how you should be allowed to shit in your husband's cereal and piss on his sheets and acts surprised when he reacts appropriately.

...

And GNU is Not Unix.

The question was for him.
Obviously every single other piece of software guards against loss of user work, because the assumption is that user work is valuable. Unix-likes don't, because the user's work is assumed to be worthless and temporary, or backed up on a fucking external tape.
It's a server OS; the ~1.0% desktop usage speaks volumes.

No, it's because normie OS tools are for retards, the lowest common denominator. The trash can (so you can restore), the auto save (because they don't know how to save), the iCloud (because they would rather Apple store their shit than have to deal with backups themselves).

Saving at all is for normies with shit-tier memories.

That would make Plan 9 a clone of Multics, but it's not. It doesn't use segmentation, it still has null-terminated strings, the commands are mostly ports of UNIX commands, it's still written in C, it still has broken fixed-length buffers, and dynamic linking was removed instead of fixed.


No, it just made the documentation worse so you have to look at the source code to figure anything out. AT&T shills learned in the 80s that bad documentation and bad code can force programmers to learn C in order to understand and fix the OS, which is why so many C weenies appeared even though everyone knew C sucks.


"The user work is assumed to be worthless and temporary" is true, but backups are assumed to be worthless and temporary too.
It sucks at servers even more, but companies can afford to hire a permanent weenie and buy more expensive hardware that can handle the wasted cycles and slow UNIX I/O.

Date: Fri 12 Apr 91 13:27:28-PDT
Subject: Re: Un*x Is The Pits

... Stanford had a system called "labrea". It was (is?) a vax 750 with 10 fuji eagles (4.5 Gbytes, which was a lot when it was first around...) Tape handling has always been a real weak point of unix. Any real operating system has much better backup/restore capabilities, and a lot of these are 10s of years old...

From: BW

OK... /home/gn is back on-line and ready for use. There were several read errors on the old disk while the files were being transferred, and it's impossible to tell exactly which files were affected... If anyone finds any trashed or incomplete files, let me know and I'll try to retrieve them from backup tapes.

Doesn't it give you that warm, confident feeling to know that things aren't Quite Right, but that you'll have to wait 'til you need something to know Just What? Life on the Edge.

Get with the program -- remember -- 90% is good enough. If it's good enough for Ritchie, it's good enough for me!

why though?

Normalfags have no business using a shell just like they have no business playing around in a construction site. Powerful tools are dangerous.

jpegoptim ./-foo.jpg

I live to suffer.

That works. But what if you're writing a script and don't have fine control over the filenames? You can't blindly prepend ./; it doesn't work for absolute filenames. So now you have to check the first character of each filename, which is mildly annoying in bash and an absolute pain in Bourne sh.
And that's without getting into tools that take arbitrary string arguments that don't represent filenames, and so don't support that workaround.
Unix is terrible around the edges. Central cases are nice and simple, edge cases are impossible to deal with consistently, if at all.
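
For the filename case specifically, the first-character check is at least expressible in portable sh (sanitize is a name invented here for illustration):

```shell
sanitize() {
    case $1 in
        -*) printf '%s\n' "./$1" ;;   # relative name starting with '-': prefix ./
        *)  printf '%s\n' "$1" ;;     # absolute paths and ordinary names pass through
    esac
}
sanitize -foo.jpg       # → ./-foo.jpg
sanitize /tmp/foo.jpg   # → /tmp/foo.jpg
```

It only helps for arguments that really are filenames; arbitrary string arguments have no such escape hatch, which is the point above.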

You can do jpegoptim `pwd`/-foo.jpg

No, you can't. Let's say the filename is inside $fname, and we're in /home/user. Then `pwd`/$fname for /tmp/foo.jpg becomes /home/user//tmp/foo.jpg, which is wrong.
Another problem is that you didn't quote it, so if there are spaces in the directory path it breaks.

then use find

I'm thinking of a case in which the filename was passed in by the user of the script and there's no other reason to mangle it.

woah you are so cool

Hard hats don't really get in the way at all. Real coal miners don't wear respirators because they are a PITA.

What does the analogy even mean? Making it so that mv, cp and rm don't assume your data is easily replaced is nowhere near adding some terrible user burden. Nor is fixing retarded bugs.

when confronted with logical refutations these people just sperg out with false analogies instead of ever conceding defeat.

THIS is why you need a "path" data type, separate from the "flag" data type.

Why the fuck am I typing RM then

inb4 *rm

What the fuck do you even want? If you want your hand held through basic file operations then use a desktop's UI. If you want to do low level operations that are often unsafe use a shell.

Can't you just use quotes?

Unix filenames are pretty bad, in some cases they're even executable. The alternative is breaking compatibility and convenience though so just be less shit. A single find command can fix all your bullshit filenames on your whole system at once.
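
One sketch of such a find command (POSIX find and sh; collisions with an existing clean name are not handled):

```shell
dir=$(mktemp -d)
cd "$dir"
touch ./-bad ok
# rename every entry whose basename starts with '-', deepest first
find . -depth -name '-*' -exec sh -c '
    for f; do
        b=${f##*/}                     # basename, e.g. -bad
        mv -- "$f" "${f%/*}/${b#-}"    # strip the leading dash
    done' sh {} +
ls    # -bad has become bad; ok is untouched
```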

I know right OP?
If only they were using a binary format with well-defined boundaries, like a null character. But maybe systemd could manage a shell too; that will fix the problem!

Attached: extremely painful.png (226x261, 73.69K)

quotes help but do not completely resolve the issue.

having separate data types for paths, flags and strings would fix it 100%.

NUL-separated strings are actually a very, very good way to work. I learned it from uriel's non-harmful list (STOMP).
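
The NUL idea is easy to demonstrate: newline-delimited output miscounts filenames that contain newlines, NUL-delimited output does not (find -print0 and xargs -0 are widespread but not POSIX):

```shell
dir=$(mktemp -d)
cd "$dir"
touch 'a b.txt' "$(printf 'c\nd.txt')"   # one name with a space, one with an embedded newline
find . -type f | wc -l                   # 3 "lines" for only 2 files
# NUL is the one byte a Unix filename can never contain, so it delimits safely:
n=$(find . -type f -print0 | xargs -0 sh -c 'echo $#' sh)
echo "$n"                                # → 2
```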

STOP GIVING HIM IDEAS!

Attached: 1417833922538.gif (264x320, 943.66K)

That Bitch deserved it.

t. Java nigger

Everything being a string is what makes Bash and similar shells so powerful and simple. Delimiting shit isn't that hard. Git gud pls

Attached: 1YqdJUgT.jpg (251x251, 12.01K)

>>>/telaviv/
>>>/bog/
>>>/facebook/

Actually, between Intel, AMD, ARM, IBM, and Nvidia, Nvidia is the ONLY one without an R&D Center in Israel, that makes Nvidia the least Jewish of them all

#include <stdio.h>
#include <string.h>

void options() {
    fprintf(stderr,
        "{\"flags\": {"
        "\"--help:-h\": [\"void\", \"Shows help text\"],"
        "\"--version:-v\": [\"void\", \"Shows version\"],"
        "\"--depth\": [\"int\", 0, 100, 20, \"Recursion limit\"]"
        "}, \"args\": ["
        "[\"file\", \"path\", \"File to read\"],"
        "[\"report?\", \"path?\", \"Output file to write a report to\"]"
        "]}");
}

// ...

int main(int argc, char **argv) {
    // ...
    if (argc > 1 && strcmp(argv[1], "--OPTIONS") == 0) {
        options();
        return 0;
    }
    // ...
}
The shell will then automatically attempt to call --OPTIONS on programs and try to get valid JSON, and then validate the arguments. Of course this is still a hack to duct-tape UNIX, but it's better than nothing. Now you can have the shell understand flags, paths and value types.

True jews don't make it obvious they're jews.

What a (((coincidence))).