I set up a quick demonstration to show risks of curl|bash and how a bad-actor could potentially hide a malicious script.
It’s nothing new or groundbreaking, but I figure it never hurts to have another reminder.
Submitted 1 day ago by K3can@lemmy.radio to selfhosted@lemmy.world
https://blog.k3can.us/posts/2026/feb/dontcurlbash/
Use our easy bash oneliner to install our software!
Looks inside script
if [ "$(command -v apt-get)" ]; then apt-get install app; else echo "Unsupported OS"; fi
Still less annoying than trying to build something from source that the dev claims has like 3 dependencies but in reality requires 500 MB of random packages you've never even heard of, all while their build system doesn't do any pre-compilation checking, so the build fails after a solid hour of compiling.
Oh, people will keep using it no matter how much you warn them.
Proxmox-helper-scripts is a perfect example. They'll agree with you until that site comes up, and then it's "it'll never, ever get hacked and subverted, nope, can't happen, impossible".
Wankers.
I was looking at that very thing last night.
But then I realized, "why can't immich just create usable packages like we had before?" and noped back out.
But, for a moment, I was sure a little inspection and testing would make the Internet equivalent of an NYC MTA coinsucker magically safe. It looked so eeeeasy.
Yes, this has risks. At the same time, any time you run any piece of software you are facing the same risks, especially if that software is updated from the internet. Take a look at the NIST docs on software supply chain risks.
Not completely correct. A lot of updaters use signatures to verify that what was downloaded was signed by the correct key.
With curl | bash there is no such check in place.
So strictly speaking, it is not the same.
Signatures do not help if your distribution infra gets compromised. See Solarwinds and the more recent node.js incidents.
This is a bit like saying crossing the street blindfolded while juggling chainsaws and crossing the street on a pedestrian crossing while the light is red for cars both carry risk. Sure. One’s a terrible idea though.
But those are two very different things. I can very easily give you a one-liner using curl|bash that will compromise your system. To get the same level of compromise through a properly authenticated channel such as apt/pacman/etc., you would need to either compromise their private keys and attack before they notice and rotate them, or sneak malicious code into an official package. Either of those is orders of magnitude more difficult than writing a simple bash script.
I would feel more comfortable running curl | bash from a trusted provider than doing apt-get from an unknown software repo. What you are trying to do is establish trust in your supply chain; the delivery vehicle is less important.
Apt is great
Never have I ever piped curl to bash.
An alternative that avoids the user-agent trick is curl | cat, which just prints the result of the first command to the console. curl -o filename.sh will write it to a script file that you can review, then mark executable and run if you deem it safe. That's safer than doing a curl | cat followed by a curl | bash, because it's still possible for the second curl to return a different set of commands.
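The download-then-review workflow described above, as a sketch (the URL and filename are placeholders):

```shell
# Save the script to a file instead of piping it straight into bash.
curl -fsSL -o install.sh "https://example.com/install.sh"

less install.sh      # review the exact bytes you downloaded

# Only after review: the file you inspected is the file you execute,
# so the server cannot serve different content between the two steps.
chmod +x install.sh
./install.sh
```

This closes the gap the comment points out: there is only one fetch, so there is no second request for the server to answer differently.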
You can control the user agent with curl: spoof a browser's user agent for one fetch, then do a second fetch with the normal curl user agent, and compare the results to detect malicious URLs in an automated way.
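A minimal sketch of that comparison, with a placeholder URL; a diff between the two responses flags user-agent cloaking:

```shell
# Fetch the same URL with a browser-like User-Agent and with curl's default,
# then diff the two responses. A mismatch suggests the server serves
# different content to browsers than to curl. Placeholder URL.
url="https://example.com/install.sh"

curl -fsSL -A "Mozilla/5.0 (X11; Linux x86_64) Firefox/125.0" "$url" -o as_browser.sh
curl -fsSL "$url" -o as_curl.sh

if diff -q as_browser.sh as_curl.sh >/dev/null; then
    echo "responses match"
else
    echo "WARNING: server returned different content to browser vs. curl" >&2
fi
```

Caveat: legitimate servers (CDNs, load balancers) can also vary responses between requests, so a mismatch is a signal worth investigating, not proof of malice.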
A command-line analyzer tool would be nice for people who aren't as familiar with the commands and arguments (and to defeat obfuscation), though deciding what an arbitrary script will do is undecidable in general, so it won't likely ever be completely foolproof. Though maybe it could be if it were run in a sandbox to observe what it actually does instead of just being analyzed statically.
Running arbitrary text from the internet through an interpreter… what could possibly go wrong.
Curl bash is no different than manually running an sh script you don't know…
True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.
No, it is different, as it adds an entire layer of indirection and unknown to the mix, increasing the risk in the process.
Anytime I see a project that had this in their install instructions, I don’t use that project.
It shows how dumb the devs are
Yes, this is the correct approach from a security perspective.
You mean blindly running code is bad? /s
I never thought about opening it in a browser. I always used curl to download such a script and view it where it was supposed to be run.
a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.
Wow, I never thought anyone would be that dumb.
Why wouldn’t they just wget it, read it, and then execute it?
Oh, the example in the article is the nice version of this attack.
Checking the script downloaded by wget or curl and then piping curl to bash is still a terrible idea, as you have no guarantee you'll get the same script in both cases.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| DNS | Domain Name Service/System |
| HTTP | Hypertext Transfer Protocol, the Web |
| PiHole | Network-wide ad-blocker (DNS sinkhole) |
[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40] [FAQ] [Full list] [Contact] [Source code]
Good bot
This helped a lot. I had no clue I could post the curl string in the URL bar of a browser to view the script. Thanks for the education!
You had no idea you could paste a url into a browser’s location bar ?
You didn't know that the tool for handling URLs, written in C (very creatively named cURL), handles URLs? It's also written in C, if you didn't know.
Shit are URLs esoteric knowledge now?
@K3can@lemmy.radio love the early 2000s stylesheet/color theme of your blog 🙂
ssfckdt@lemmy.blahaj.zone 2 hours ago
I’m a bit lost with
You… You just… You just dump the curl output to a file, examine that, and then run it if it's good.
Just a weird imagined sequence to me.
martini1992@lemmy.ml 1 hour ago
Worse than that: the server can change its response based on user agent, so you need to curl it to a file first; a browser could be served a completely different response.
K3can@lemmy.radio 13 minutes ago
Which is exactly what is demonstrated in the post. 🙃