Bash appreciation thread

If you don't need complex data structures or fast execution speeds, Bash is often the best tool for automation. The only time Bash is a poor choice is when you have to parse a user's input. For example, Bash is perfectly adequate and secure for generating a static website, but horribly insecure for parsing the responses of users on a dynamic website with web forms (not to mention SLOW).

What is the most interesting or complex thing you've ever done with a Bash script? For me it was a script that crawls a GitHub Enterprise server with its REST API and mirrors all repos and branches, in case the official GHE backup scripts break or we want to move off GHE to something else. I also wrote a script that creates Jira tickets from a Jenkins pipeline with custom fields and the whole nine yards.

Other urls found in this thread:

repo.or.cz/q3cpma-shell-scripts.git/blob/HEAD:/portability.csv
repo.or.cz/misc-tools.git/tree:
github.com/midipix-project/slibtool

I believe you meant "shell scripting", and you want to write with POSIX-compliance in mind, because not everyone uses Bash.

Is that even that big of an issue? I see the point in using POSIX whenever feasible, but as long as you declare the shebang to be #!/bin/bash it should be obvious to the system that the script will only run in Bash.

...

Why would you bother writing POSIX compliant "whenever feasible" and ignore all the useful additional features that Bash has to offer when you're confining yourself and your users to Bash anyway?

This. Bash is useful af

Bet you didn't know it was created by a black man.

As an expert in POSIX sh scripting, I believe it's the right tool if:
1) You really know ALL the pitfalls of sh (shellcheck can be useful for newbies). This includes making your scripts work with, or at least detect, filenames with embedded newlines.
2) You know the difference between POSIX sh and bash
3) You made a list like repo.or.cz/q3cpma-shell-scripts.git/blob/HEAD:/portability.csv
4) You don't fall into quoting hell (i.e. being forced to use eval)
The only tools I really had to write in C are at repo.or.cz/misc-tools.git/tree: natural sorting, multibyte string cutting (tfw GNU coreutils still aren't multibyte aware), url {de,en}coding and html decoding.
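On point 1, a minimal sketch of the newline-safe pattern being alluded to: never word-split the output of ls or find; instead let find hand each real path to a child shell as an argument, which survives any byte a filename can legally contain. (The test filename here is made up for illustration.)

```shell
#!/bin/sh
set -eu

dir=$(mktemp -d)
printf 'x' > "$dir/bad
name"    # a filename with an embedded newline

# Wrong: for f in $(ls "$dir") would split this into two bogus names.
# Right: -exec passes each real path as a positional argument, untouched
# by the shell's word splitting.
find "$dir" -type f -exec sh -c '
    for f do
        printf "found: %s byte(s)\n" "$(wc -c < "$f" | tr -d "[:space:]")"
    done
' sh {} +
# prints: found: 1 byte(s)

rm -rf "$dir"
```

The same trick works with `find -print0 | xargs -0` where those extensions are available; `-exec ... {} +` is the POSIX-portable form.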

That's called a nigger in this part of town, boyo.

kek

That's only because bash is garbage though, as an example:
#!/usr/bin/env bash
x=10
echo {1..10} # output: 1 2 3 4 5 6 7 8 9 10
echo {1..$x} # output: {1..10}
This stupid bullshit forces you to use seq instead, but the very reason bash has sequence expressions is that spawning the seq binary is slow; yet there's no way around this except an ugly eval. In zsh this isn't an issue: {1..$x} works as expected.
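A small sketch of the problem and the usual workarounds. The root cause is that brace expansion happens before parameter expansion, so {1..$x} never sees the value of x; note bash also has a C-style for loop that sidesteps both seq and eval:

```shell
#!/usr/bin/env bash
x=5

echo {1..5}      # -> 1 2 3 4 5
echo {1..$x}     # -> {1..5}   (brace expansion ran before $x expanded)

# Workarounds:
seq -s ' ' 1 "$x"           # external binary (slow in tight loops)
eval "echo {1..$x}"         # works, but eval invites quoting bugs

# Often the cleanest bash-native answer is the C-style for loop:
for ((i = 1; i <= x; i++)); do printf '%s ' "$i"; done; echo
```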

I'm no civic nationalist, but I actually typed nigger out initially (reflex) , and figured I'd give him an honorary upgrade. You're not really a nigger if you can make something as useful as bash.

You're right. We should protest this by using another shell as the basis of our scripts, thus making ours equally un-POSIX. Better yet, why not just use Python and stop the dick swinging?

That startup time, tho...

bash is the cancer of the Linux userland and why real developers were so happy to replace it with potteringware despite all its problems. It's no surprise it's nigger tech.

We were going to use python to replace shellscript. That was RedHat's plan in the late '90s and why they promoted it so heavily. But the startup time never really improved and killed any hope of that.

Relax, it wasn't as though he was considered honorary Aryan.

No thanks friend. Bash scripting is a necessary evil for me; I use it only for the simplest of scripts. 99% of the time I'd rather use Perl for my scripting, even though I hate reading Perl code, even code I wrote myself.

if you don't need functions that can return some fucking value as well.

If you don't need consistent run-times across different versions.

Personally I think the biggest drawback of bash is no fucking return values. Everything is a procedure in the traditional sense.

It's just parts of ksh and csh cobled together in an unholy mess.


Just be consistent in your function writing. Use return for error codes, stdout for actual return values, and stderr for info/warn/errors.
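A minimal sketch of that convention, with a made-up lookup_port function: exit status signals success or failure, stdout carries the value, stderr carries diagnostics.

```shell
#!/bin/sh
# Convention: return -> error code, stdout -> value, stderr -> diagnostics.
lookup_port() {
    service=$1
    if [ "$service" != "http" ]; then
        echo "lookup_port: unknown service: $service" >&2
        return 1
    fi
    echo 80            # the "return value" travels over stdout
}

if port=$(lookup_port http); then
    echo "http listens on $port"          # -> http listens on 80
fi
lookup_port gopher 2>/dev/null || echo "lookup failed with status $?"
```

Because the value comes back via command substitution, callers can both branch on the exit status and capture the result in one line.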

Didn't say there aren't some uses. But you're definitely touching the dirty parts of sh scripting once you start using eval.

What do you even want to return? If you want to return a string do the following:
where_to_poo () {
    echo 'loo'
}
echo "Poo in $(where_to_poo)."

Return values are for exit codes, same as with the return values of commands.

Why use bash or perl when you can use python?
1. Python runs on every operating system
2. It comes preinstalled on most operating systems
3. It has return values
4. It's easier to read
5. It has more libraries
6. It can be maintained and extended more easily
7. It can do most things you would do with bash in fewer lines of code
Unless you are on something like a home router, there is no reason not to use Python for automation instead of bash, unless you want to frustrate yourself.

It's slow as fuck to start so can't replace scripts that get called frequently, it's not guaranteed to be present as part of either UNIX or LSB, and it's 5 megs even for a minimal install so is difficult to budget for in firmware.

which is why i said most
which is why i mentioned the bit about being on things like a home router


got me there

Where are the pipelines? Also, it's slow as shit.

Python works with Unix pipes in pretty much the same ways as any other programming language which supports them

Meaning that you have to use four statements to open, write to, read from, then close the pipe instead of just doing cmd1 | cmd2.

Don't try to compare a glueing language to a scripting language, it doesn't work.

It's much more tedious than a proper shell language, but it's not quite as bad as you think.
Send bytes into command, get bytes out:
>>> import subprocess as sp
>>> sp.run(['tac'], input=b'foo\nbar\n', stdout=sp.PIPE).stdout
b'bar\nfoo\n'
The same thing, but with strings:
>>> sp.run(['tac'], input='foo\nbar\n', stdout=sp.PIPE, universal_newlines=True).stdout
'bar\nfoo\n'
Chain three commands (seq 10 | sort -R | head -n 1) and read the output:
>>> seq = sp.Popen(['seq', '10'], stdout=sp.PIPE)
>>> shuf = sp.Popen(['sort', '-R'], stdin=seq.stdout, stdout=sp.PIPE)
>>> head = sp.Popen(['head', '-n', '1'], stdin=shuf.stdout, stdout=sp.PIPE)
>>> head.stdout.read()
b'3\n'

But that's a big pain, honestly. Sure, emulating sh in python is better than the opposite, but python is clearly not made for this.
The concept of an extremely high-level (untyped, or typed like AWK) language acting as boilerplate between other scripts and binaries is something that must be kept. If you standardized the good parts of the modern shells (mainly arrays), you'd only have to fix the lack of consistency across the various *utils. Mainly, we'd need a good tabular format to pass data around: some RS and FS env variables, with at least a couple of characters (\t and \n sound nice) forbidden in filenames.
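A rough sketch of that tabular idea as it already half-works today: if every tool agrees on tab as the field separator and newline as the record separator (and both are banned from filenames), plain pipes become a passable data format. The field names here are invented for illustration.

```shell
#!/bin/sh
# Tab-separated records flowing through a pipeline: name<TAB>score.
# sort needs a literal tab for -t, which printf supplies.
printf '%s\t%s\n' alice 42 bob 7 carol 19 |
    sort -t "$(printf '\t')" -k2,2nr |              # numeric sort on field 2, descending
    awk -F '\t' '{ printf "%s scored %s\n", $1, $2 }'
# -> alice scored 42
#    carol scored 19
#    bob scored 7
```

The fragility the post complains about shows up as soon as a field can contain a tab or newline, which is exactly why those bytes would need to be forbidden by convention.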

TCL

[code]
puts [ exec tac

i am retard?

It's okay, it happens to the best of us

It's not, though. Compare Popen to coprocs. It's nearly impossible to tame coprocs and make it safe and sane and what people usually do instead is just use named pipes and a giant ball of shellscript to manage them. It's nearly impossible to make a reliable solution with shellscript whereas you can do it in a few lines of python.
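For reference, here is about the smallest coproc that works: a long-lived cat child wired up over a bidirectional pipe pair. Even this toy needs explicit fd bookkeeping, and everything stateful (buffering, EOF, cleanup on error) is on you, which is the tameness problem being described.

```shell
#!/usr/bin/env bash
# Minimal bash coproc: CAT[1] is our write end (the child's stdin),
# CAT[0] is our read end (the child's stdout).
coproc CAT { cat; }

echo "ping" >&"${CAT[1]}"       # write one line to the child
read -r reply <&"${CAT[0]}"     # read one line back
echo "got: $reply"              # -> got: ping

exec {CAT[1]}>&-                # close our write end so cat sees EOF
wait "$CAT_PID"
```

Compare with the Popen chain above: the Python objects carry their own fds and lifetimes, while here every descriptor and the child's PID are loose global state you must manage by hand.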

I should add, when comparing python and shell solutions, consider what a fully reliable solution requires. Shellscript has an evil property: it's very easy to get the 80% solution, but nearly impossible to turn it into the 100% solution. It entices you to start your work in it as it's easier than anything else, then after you're committed you realize your code will always be buggy shit that can't be fixed without an order of magnitude more code and you should have used something else.
libtool is a good example of that trap. It's horribly buggy and inefficient due to shellscript but too large and complex for anyone to want to rewrite it in a better language. So we've suffered with it for decades. Probably most of the "compile" time of a gentooman's box is just running libtool.

yes

Performance isn't an argument, since Python is a lot slower than sh and even bash. Python, like sh, should never be used for anything more than scripting (and libtool is more than scripting).
By the way, you might be interested in github.com/midipix-project/slibtool

What does this even mean?
How do you compare the performance of Python and bash? What do you mean by "sh", given that it's a language with no de facto canonical implementation, other than maybe bash?

I suppose you could do this:
$ cat bench.sh
i=0
while [ $i -lt 1000000 ]; do
    i=$((i+1))
done
$ cat bench.py
i = 0
while i < 1000000:
    i += 1
$ time bash bench.sh
6.98user
$ time dash bench.sh
1.78user
$ time python3 bench.py
0.15user
$ time python2 bench.py
0.07user
But that's uninteresting because most shell scripts don't involve anything like that.
You could instead compare a Python solution with a proper shell solution that uses external commands, but then you're not testing the performance of the shell itself, and the bash/sh contrast is moot.
People successfully use Python in the real world for much more than scripting. You can still think that that's a bad idea, but it's not comparable to sh, which is basically impossible to use properly for more than scripting. It's really weird to equate the two like that.

Well, you're right, it means nothing. By sh, I meant the most popular implementations (barring a symlink to bash), which are the ash variants: dash, NetBSD ash and busybox ash.

It's faster because the real work is done by ultra optimized C binaries, most of the time. But yeah, the language itself is pretty slow (because it really doesn't need to be fast).
On an interesting note, I did an experiment some time ago (calculating word frequency with some regexp filtering on some big text) and mawk was like five to seven times faster than Python.

How did you write the Python implementation? collections.Counter can give some surprising speed-ups because it implements a certain three-line function in C.
mawk probably is better at that kind of task in general though.
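For the curious, a generic sketch of the kind of word-frequency job being benchmarked; the original corpus and regexp filter aren't given, so the lowercasing and [^a-z] filter here are stand-ins:

```shell
#!/bin/sh
# Word frequency: lowercase, strip non-letters, count, sort by count.
printf 'The cat saw the CAT\n' |
    awk '{
        for (i = 1; i <= NF; i++) {
            w = tolower($i)
            gsub(/[^a-z]/, "", w)     # crude regexp filtering
            if (w != "") count[w]++
        }
    }
    END { for (w in count) print count[w], w }' |
    sort -rn
# "the" and "cat" appear twice, "saw" once
```

Run under mawk instead of gawk where available; mawk's interpreter is what made it so fast in the experiment above.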

BASH is the cancer of UNIX.

It was a split -> filter -> map (for tolower) -> Counter pipeline.