The main reason I use wget is that it automatically retries downloads, which is vital on my not-so-great internet. I wish that option were on by default in curl too, since half the world uses curl and then has to keep re-starting 400 MB downloads by hand, which I have trouble finishing.
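For reference, a sketch of what opting in to retries looks like in curl; the `fetch` wrapper name and URL are made up for illustration, and `--retry-all-errors` needs a reasonably recent curl (7.71+):

```shell
# Retries are opt-in in curl:
#   --retry 5            retry up to 5 times on transient failures
#   --retry-all-errors   also retry errors curl considers permanent
#   --continue-at -      resume a partial transfer where it left off
fetch() {
  curl --fail --location --retry 5 --retry-all-errors --continue-at - -O "$1"
}
# fetch "https://example.com/big-file.bin"
```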
Then, if you look at the script, it's basically that, with some minor cleanup and nicer error handling.
And you can use `-q` to make it skip the curlrc and use only the arguments you pass to it. curl has an amazing amount of power, but that also means it has a lot of options for things folks take for granted (exactly like retries, etc.)
Makes me think that a lot of unattended shell scripts out there should probably use -q, in case someone has a .curlrc altering curl's behaviour and thus breaking the script's expectations.
Same can be said for wget with --no-config (and really for any app running on any system): if you're in the business of automating it, those config features must not be ignored (or should be outright overridden to match your script's needs).
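A minimal sketch of that isolation for both tools (wrapper names and URL are hypothetical); one gotcha worth knowing is that curl only honors `-q` (long form `--disable`) when it is the first argument:

```shell
#!/bin/sh
# Run the tools without reading the user's ~/.curlrc / wgetrc, so the
# script behaves the same on every machine. For curl, -q must come FIRST.
fetch_curl() { curl -q --fail --retry 3 -O "$1"; }
fetch_wget() { wget --no-config "$1"; }
# fetch_curl "https://example.com/data.tar.gz"
```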
In Docker containers, though, it's safer to assume no pre-loaded rc files are embedded (but you 100% want to check your source container for things like that); when running in some user's workspace, the need to be careful is real.
Most of the time, though, those options are safe; the only time you really need to avoid auto-retries is when you're trying to get super accurate data (like when observing odd behavior: you don't want things to simply "auto-work" while fine-tuning something, or while triaging an upstream server or DNS issue).
I often write my scripts invoking `\curl -q ...` so that no user alias of curl kicks in and I get consistent no-config behaviour on my system and others (although Linux curl vs. macOS curl, and GNU vs. BSD vs. BusyBox builds of other binaries, are always the real fun part). (If the user has their own shell function named curl, then it's on them to know their system is special and results may be inconsistent.)
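Worth spelling out the caveat in that parenthetical: the backslash only defeats aliases, while a shell function with the same name still wins. A quick demonstration using `echo` as a harmless stand-in for any shadowed command:

```shell
# A function shadowing a command name, the way a user-defined curl() would:
echo() { printf 'shadowed\n'; }

\echo hi          # backslash skips alias expansion only; prints "shadowed"
command echo hi   # `command` bypasses functions AND aliases; prints "hi"
```

If you want to dodge functions too, `command curl -q ...` is the stronger form.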
Otherwise you have to specify -O for every URL. For the same reason, remove $1 and rely on all parameters being appended at the end of the line; the example above will only download the first URL.
(I personally think this should have been the default from the beginning: -O should have set the behaviour for all following URLs until changed, but it's too late to change that now.)
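For what it's worth, curl later added a flag that gets close to that default: `--remote-name-all` applies `-O` to every URL on the command line. The wrapper functions below are hypothetical, just to show the two invocations side by side:

```shell
# -O applies only to the URL right after it, so each URL needs its own -O;
# --remote-name-all (curl >= 7.19.0) turns -O on for every URL at once.
per_url()  { curl -O "$1" -O "$2"; }
all_urls() { curl --remote-name-all "$@"; }
# all_urls https://example.com/a.tar.gz https://example.com/b.tar.gz
```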
There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.
> There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.
I don't think that's the case; that behavior is opt-in, as indicated in wget's manpage:
> This can currently result in extra round-trips to the server for a "HEAD" request, and is known to suffer from a few bugs, which is why it is not currently enabled by default.
Curl, by default, does not retry broken connections or use caching the way wget does, for example, and there are quite a few other differences, each needing its own arguments. You can see the list in this project.
macOS comes with curl but not wget by default. Sure, you can easily install it with Homebrew, but it would be nice to have wget's functionality out of the box.
A more apt comparison: you have scissors that are more complex to use but give better results when used correctly, yet you're lazy, so you buy new basic scissors instead.
I felt weird reading the title too, because I've been downloading with `curl` for the last 2-3 years, so why this?
And also, I'm pretty sure `wget` can do it, and do it better.
As others pointed out, you can do that; you can also set the flags in .curlrc, or write a script if you want multiple URLs downloaded in parallel (not possible with an alias), or now you can just use wcurl :)
Note: wcurl sets a few more flags than that; it also percent-encodes whitespace in the URLs and downloads multiple URLs in parallel.
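A rough sketch of the script-vs-alias point: curl itself has been able to parallelize since 7.66, so a tiny wrapper (this is illustrative, not wcurl's actual code) might look like:

```shell
#!/bin/sh
# Hypothetical mini-wrapper: fetch every URL argument in parallel, naming
# each file after its remote name. -q keeps the user's .curlrc out of it;
# --parallel needs curl >= 7.66.0.
pcurl() {
  curl -q --parallel --remote-name-all --fail --location --retry 5 "$@"
}
# pcurl https://example.com/a.iso https://example.com/b.iso
```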
Sadly I can’t, since it depends on the util-linux version of getopt, which means it fails on BSD and macOS systems. Understandable, since that getopt is always available on the specific target the script was written for, and it does make life easier.
I knew getopt was Linux-specific, but I thought the only impact was that the long-form arguments (--opt) would not work. It turns out the script doesn't run at all instead.
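For anyone wanting the portable route: the POSIX `getopts` shell builtin works the same on Linux, BSD, and macOS, at the cost of supporting only short options. A minimal sketch (option letters and variable names made up for illustration):

```shell
#!/bin/sh
# Portable argument parsing with the getopts builtin; util-linux getopt(1)
# adds --long options but only exists on Linux.
parse_args() {
  out="" verbose=0 OPTIND=1
  while getopts "o:v" flag; do
    case "$flag" in
      o) out="$OPTARG" ;;    # -o FILE: output name
      v) verbose=1 ;;        # -v: verbose mode
      *) return 2 ;;         # unknown option
    esac
  done
  shift $((OPTIND - 1))
  rest="$*"                  # remaining args, e.g. the URLs
}
parse_args -v -o file.txt url1 url2
```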
We should be able to fix this within the next few days, thank you!
wcurl does a lot more than that, though: parallel downloads, retries, following redirects (and other things that are all described in the blog post).
There's also some value in providing a ready-to-use wrapper for users that install the curl package. Maybe wcurl will also show up in other distributions after a while, especially since Daniel also likes the idea.