Submitted 1 week ago by downhomechunk@midwest.social to selfhosted@lemmy.world
https://community-scripts.github.io/ProxmoxVE/
I only discovered this recently, and it’s very handy.
Piping scripts directly to bash is a security risk. You can always download the scripts, inspect them and run locally if you so choose.
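The download-inspect-run workflow the post describes can be sketched in a few commands. This is a generic sketch, not the project's documented install procedure; the URL is a placeholder, and the download step is simulated locally here so the rest runs offline:

```shell
# 1. Download instead of piping straight to bash. In practice:
#    curl -fsSL https://example.com/install.sh -o install.sh
# Simulated locally with a stand-in script so this sketch is runnable:
printf '#!/usr/bin/env bash\necho "installing..."\n' > install.sh

# 2. Inspect it before deciding to run it (interactively: less, or an editor).
head -n 20 install.sh

# 3. Only after reviewing, run it explicitly.
bash install.sh
```

The point of the extra step is that the script you read and the script you run are now provably the same file.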
Piping scripts directly to bash is a security risk. You can always download the scripts, inspect them and run locally if you so choose.
This entire trend needs to die. Package managers exist. Use them. Shun and shame sites that promote shell script installers.
Apples and oranges.
Package managers only install a package with defaults. These helper scripts are designed to take the user through a final config that isn’t provided by the package defaults.
No need to be elitist about such things.
Package managers only install a package with defaults. These helper scripts are designed to take the user through a final config that isn’t provided by the package defaults.
This is trivially solved by having a “setup” script that is also installed by the package manager.
No, package installers support configuration. Plenty of packages (e.g. postfix) prompt for configuration at install time.
Apples and oranges.
Package managers only install a package with defaults. These helper scripts are designed to take the user through a final config that isn’t provided by the package defaults.
Whether there’s a setup wizard doesn’t have anything to do with whether the tool comes from a package manager or not. Run “apt install ddclient”, for example, and it’ll immediately guide you through all the configuration steps for the program instead of just dumping a binary and some config text files in /etc/.
So that’s not the bottleneck or contradiction here. It’s just very unfortunate that setup wizards are not very popular as soon as you leave Windows and OSX ecosystems.
Heellll no, the scripts are publicly available to read over if you’re sketched out. They save you so much time actually getting to use the service. 98% of my homelab is from these same helper scripts too.
RIP tteck
Have you ever looked at what were once tteck’s scripts? They’re a spaghetti of calls to other scripts. It’s not pretty.
You can install with package managers and include with it a helper script to setup the service. No big deal.
But can you spot the difference between http://myservice.com/script.sh
and http://myserv1ce.com/script.sh
if you use a font that doesn’t make it clear? If you get people used to just copy/pasting/running scripts then there’s a risk they’ll run something entirely different by accident.
There’s no good reason to install things this way.
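The lookalike-URL risk above is mechanically checkable: never eyeball two URLs, compare them byte for byte. A throwaway sketch using the two URLs from the comment (the temp filenames are arbitrary):

```shell
# The two URLs differ by a single byte: the letter "i" vs the digit "1".
a="http://myservice.com/script.sh"
b="http://myserv1ce.com/script.sh"

# A string compare catches what the eye (or a bad font) may miss:
if [ "$a" != "$b" ]; then
  echo "URLs differ"
fi

# cmp pinpoints the first differing byte:
printf '%s' "$a" > a.txt
printf '%s' "$b" > b.txt
cmp a.txt b.txt || true
```

Of course this only helps if you have the legitimate URL to compare against, which is exactly what a package manager's signed metadata gives you for free.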
I don’t like that an adversary could modify that link or its contents without much detection or any logging.
When you compare it to package managers with immutable versioning, that’s a big drawback. If someone were modifying PyPI or npm packages, I would be surprised if it went undetected.
Realistically, is that an issue? Probably not. But I do try to reduce my exposure when I can.
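You can recover some of that immutability yourself by pinning the script to a checksum you recorded when you first vetted it. A minimal sketch (the filename and contents are stand-ins; in practice the expected hash is recorded once at vetting time, not recomputed):

```shell
# At vetting time: record the hash of the script you reviewed.
printf 'echo hello\n' > vetted.sh
expected=$(sha256sum vetted.sh | awk '{print $1}')

# Later, before running: refuse to execute if the content changed.
actual=$(sha256sum vetted.sh | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH; refusing to run" >&2
  exit 1
fi
```

This is essentially what a package manager's signed repository metadata does for every package, automatically.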
Fun fact: a malicious server can detect the difference between you loading the script for inspection in your browser and you doing curl | sh, and could serve an entirely different script.
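One real discrimination signal is the User-Agent header (more sophisticated attacks reportedly sniff response-read timing, since a shell executing a pipe consumes the stream differently than a plain download). A sketch of the server-side logic, reduced to a local shell function so it runs standalone; the filenames and function are hypothetical:

```shell
# Hypothetical server-side dispatch, keyed on the client's User-Agent.
serve_script() {
  ua="$1"
  case "$ua" in
    Mozilla*) cat benign.sh ;;   # browsers get the innocent script
    curl*)    cat payload.sh ;;  # curl | sh gets the malicious one
    *)        cat benign.sh ;;
  esac
}

printf 'echo benign\n'    > benign.sh
printf 'echo malicious\n' > payload.sh

serve_script "Mozilla/5.0"   # what you see when inspecting in a browser
serve_script "curl/8.5.0"    # what actually gets piped into your shell
```

The takeaway: inspecting the URL in a browser proves nothing about what curl will receive; only inspecting the exact bytes you downloaded does.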
Yeah - it’s remarkable that I receive pushback about it. I guess it’s down to the technical immaturity of your average home-gamer vs. people who support Linux systems for a living?
That’s why I also self host the scripts I’ve vetted…
Piping scripts directly to bash is a security risk
Nobody has ever explained why. What is the difference between executing a script directly from curl, and adding a repository which downloads a package which contains a script?
The URL can point to a different file. People can post maliciously similar URLs and trick you into running something else.
With a repository you have some semblance of “people have looked at this before”. Packages are signed and it will provide a standard way to uninstall and upgrade in the future.
There’s literally no good reason to replace it with a shell script on a website.
IMO these kinds of poor man’s automation scripts are only useful to novice sysadmins, but those are exactly the kind of people who shouldn’t be running scripts piped from the internet: it’s risky behaviour, and they don’t get the experience of doing it manually that would move them beyond novice.
That said, let’s not gate keep. If novices don’t want to gain experience actually doing sysadmin work and level up their abilities and just want stuff that will probably work but that they’ll not be able to fix easily if it doesn’t, at least it’s a starting point and when things break some of them will look deeper.
That said, let’s not gate keep.
This shouldn’t be an excuse for promoting risky behavior.
I asked repository maintainers and they said “LXC is not for apps” and of course docker is a good way to waste your weekends. So we don’t have repositories, we have scripts.
If you disagree, go tell them:
discuss.linuxcontainers.org/t/…/14946
Until then, people who have sacrificed enough of their weekends to the Linux gods will keep piping internet text into their root consoles.
“I’ll do what’s easy even if it’s not good” is a terrible approach to, well, anything. I would expect people in this community to look for guidance on what the best way to do things is. Seems I’m wrong.
Custard@lemmy.world 1 week ago
Didn’t the creator of these scripts recently pass away, or am I misremembering?