I recently read the blog post “In Praise of --dry-run”. I love --dry-run mode! It can be immensely useful for preventing Bad Things™️ from happening. The blog post reminded me of a small terminal/shell/Bash trick that I have used for years but never shared on the Internet:
An example
Imagine you want to download many profile pictures from the Internet. All the images are stored in a CSV file containing 1) the name of the person and 2) a URL to the image. Something like this:
name,url
peter,https://images.example-cdn.io/gallery/pt.png
lisa,https://cdn.fictionalassets.net/ui/icons/liz.png
maria,https://media.mocksite.org/uploads/2026/01/mariaa.png
caren,https://assets.imaginarylab.com/products/c.png
chris,https://static.placeholderhub.dev/images/christoffer_columbus.png
You would like to download every image and store it locally on your disk as {name}.png. I.e., the URL https://images.example-cdn.io/gallery/pt.png would be downloaded to peter.png.
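If you would like to follow along, a heredoc can recreate the file; I am assuming it is saved as images.csv, which is the name the commands below use:
$ cat > images.csv <<'EOF'
name,url
peter,https://images.example-cdn.io/gallery/pt.png
lisa,https://cdn.fictionalassets.net/ui/icons/liz.png
maria,https://media.mocksite.org/uploads/2026/01/mariaa.png
caren,https://assets.imaginarylab.com/products/c.png
chris,https://static.placeholderhub.dev/images/christoffer_columbus.png
EOF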
I would approach this as a three-step process:
- Generating the shell commands that would do all the work (dry-run).
- Reviewing the commands.
- Executing all the commands.
Let me go through these steps one after the other:
Generating the shell commands (dry-run)
Initially, I would figure out how to download one image and verify that the command succeeded:
$ curl -o peter.png 'https://images.example-cdn.io/gallery/pt.png'
$ ls
peter.png
$
I would probably also want curl to treat HTTP errors as failures (--fail), suppress the progress meter (--silent), and still print error messages (--show-error):
$ rm peter.png
$ curl --fail --show-error --silent -o peter.png 'https://images.example-cdn.io/gallery/pt.png'
$ ls
peter.png
$
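To double-check the non-zero exit, I could point the same command at a path that does not exist and inspect $? (the 404 path below is made up, and the exact error text varies between curl versions):
$ curl --fail --show-error --silent -o missing.png 'https://images.example-cdn.io/gallery/does-not-exist.png'
curl: (22) The requested URL returned error: 404
$ echo $?
22
$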
My next step would be to generate that same curl command from the CSV. First, I would skip the header line of the file:
$ cat images.csv | tail -n +2
peter,https://images.example-cdn.io/gallery/pt.png
lisa,https://cdn.fictionalassets.net/ui/icons/liz.png
maria,https://media.mocksite.org/uploads/2026/01/mariaa.png
caren,https://assets.imaginarylab.com/products/c.png
chris,https://static.placeholderhub.dev/images/christoffer_columbus.png
Since I would only want to recreate my earlier curl invocation, I would focus on the first line:
$ cat images.csv | tail -n +2 | head -n 1
peter,https://images.example-cdn.io/gallery/pt.png
From here on I would generate the curl command using my tool of choice, awk:
$ cat images.csv | tail -n +2 | head -n 1 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", $2;}'
curl --fail --show-error --silent -o peter.png https://images.example-cdn.io/gallery/pt.png
Notice how this would not execute anything. It would simply output what the shell command would look like. Dry-run, FTW! To reduce the likelihood of any weird escape sequences[1], I would probably also wrap the URL in single quotes:
$ cat images.csv | tail -n +2 | head -n 1 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}'
curl --fail --show-error --silent -o peter.png 'https://images.example-cdn.io/gallery/pt.png'
Removing head -n 1, I would now have all the commands needed to download the images:
$ cat images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}'
curl --fail --show-error --silent -o peter.png 'https://images.example-cdn.io/gallery/pt.png'
curl --fail --show-error --silent -o lisa.png 'https://cdn.fictionalassets.net/ui/icons/liz.png'
curl --fail --show-error --silent -o maria.png 'https://media.mocksite.org/uploads/2026/01/mariaa.png'
curl --fail --show-error --silent -o caren.png 'https://assets.imaginarylab.com/products/c.png'
curl --fail --show-error --silent -o chris.png 'https://static.placeholderhub.dev/images/christoffer_columbus.png'
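One caveat: the single quotes only help as long as a URL does not itself contain a single quote (see footnote [1]). A defensive tweak, sketched here under the assumption that suspicious rows are better dropped than executed, would be to skip any row whose URL contains one:
$ cat images.csv | tail -n +2 \
| awk -F, 'index($2, "'\''") { next } { print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}'
In practice I might also print the offending row to stderr rather than silently dropping it.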
Reviewing and testing a command
Once I had all the shell commands generated, I would skim through them to see if anything looked odd. I would also pick one command at random, copy/paste it into my terminal, and execute it to confirm that it works:
$ curl --fail --show-error --silent -o maria.png 'https://media.mocksite.org/uploads/2026/01/mariaa.png'
$ ls
maria.png
peter.png
$
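If I did not want to pick by hand, shuf can do the random sampling for me (assuming GNU coreutils is available; your pick will of course differ):
$ cat images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| shuf -n 1
curl --fail --show-error --silent -o caren.png 'https://assets.imaginarylab.com/products/c.png'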
Executing all the commands
If everything looked good, I would pipe all of the output into a shell (sh, bash, …) to execute it:
$ cat images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| sh
$ ls
caren.png
chris.png
lisa.png
maria.png
peter.png
$
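If I wanted the run to abort at the first failed download, and to see each command echoed as it executes, I could hand the shell the standard -e and -x options:
$ cat images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| sh -ex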
Bonus: Slow commands
If the execution were really slow, I would use pv instead of cat to get a progress bar, an ETA, and the number of images downloaded per second:
$ pv --line-mode images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| sh
4,00 0:00:05 [10 /s] [=====================================> ] 66% ETA 0:15:00
$
If I needed to parallelise downloads, I would use xargs:
$ pv --line-mode images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| xargs -P10 -I{} sh -c '{}'
4,00 0:00:05 [10 /s] [=====================================> ] 66% ETA 0:15:00
$
The -P10 flag is what makes xargs run 10 downloads in parallel.
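If GNU parallel happened to be installed (an extra dependency, so treat this as a sketch), it can consume the generated commands directly from stdin and run a given number of jobs at once:
$ pv --line-mode images.csv | tail -n +2 \
| awk -F, '{print "curl --fail --show-error --silent -o", $1 ".png", "'\''" $2 "'\''" ;}' \
| parallel -j10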
Closing thoughts
Generating shell commands, verifying them, and then piping them into a shell can be a powerful way to get repetitive tasks done quickly and safely.
[1] I am well aware of other escape attack vectors here, but let’s assume the file came from a trusted source!