
I appreciate it, but why not just:

   alias wcurl="curl -# -O $1" 
(I just tested this on macOS).


The alias is closer to:

    alias wcurl="curl -# --location --remote-name --remote-time --retry 10 --retry-max-time 10 --continue-at -"
The main reason I use wget is that it automatically retries downloads, which is vital for my not-so-great internet connection, and an option I wish curl enabled by default, since half the world uses curl and then keeps trying to manually restart 400MB downloads, which I have trouble finishing.

Then, if you look at the script, it's basically that, with some minor cleaning up and nicer error handling.
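As a sketch, that flag set fits more naturally in a shell function than an alias (an alias can't place `"$@"` where it wants). `wcurlish` is a made-up name, and the flags are taken from the comment above, not from wcurl's actual source:

```shell
# Hypothetical wcurl-like wrapper; flag set copied from the comment above.
# --remote-name saves each file under its URL's basename,
# --continue-at - resumes partial downloads where they left off.
wcurlish() {
  curl --location --remote-name --remote-time \
       --retry 10 --retry-max-time 10 --continue-at - "$@"
}
```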


FWIW, I think you can add some of those options to .curlrc if you wanted them to be added every time you invoke curl.


correct; .curlrc for the win!

and you can use `-q` to have it skip the curlrc and use only the args you pass to it. curl has an amazing amount of power, but that also means it has a lot of options for things folks take for granted (exactly like retries, etc.)
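For example, a `~/.curlrc` that bakes in some of those defaults might look like this (option names come from curl's long flags with the dashes dropped; the retry numbers are just illustrative):

```
# ~/.curlrc -- applied to every curl invocation unless -q is passed
location
remote-time
retry = 10
retry-max-time = 10
```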


Makes me think that probably a lot of unattended shell scripts out there should use -q in case someone has a .curlrc altering the behaviour of curl, and thus breaking the expectations of the script.

Maybe even should be a shellcheck rule.
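A minimal sketch of that practice, assuming a POSIX shell: `-q` must be the FIRST argument for curl to ignore `~/.curlrc`, and every behaviour the script relies on is then spelled out explicitly (`fetch` is just an illustrative name):

```shell
#!/bin/sh
# -q first: ignore any ~/.curlrc so the script behaves the same everywhere.
# The remaining flags make the script's expectations explicit.
fetch() {
  curl -q --fail --silent --show-error --location "$@"
}
```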


The same can be said for wget with --no-config (and really for any app that runs on any system): if you're in the business of automating it, those config features need to not be ignored (or be outright overridden to what your script expects). In Docker containers it's safer to assume no pre-loaded rc files are embedded (but you 100% want to check your source container for things like that); for running in some user's workspace, though, the need to be careful is real.

Most of the time, though, those options should be safe; the only time you really need to turn off auto-retries is when trying to get super accurate data (like when observing odd behavior: you don't want things to simply "auto-work" when fine-tuning something, or when triaging an upstream server or DNS issue).

I often write my scripts execing calls like `\curl -q ...` so that I bypass any user alias of curl and ensure consistent no-config behaviour across my system and others (although GNU vs. mac vs. busybox builds of binaries are always the real fun part). (If the user has their own bash function named curl, then it's on them to know their system will be special compared to others and results may be inconsistent.)


Oh wow, I thought only functions could take parameters!

It must be relatively recent, because those old answers do not mention it: https://stackoverflow.com/questions/7131670/make-a-bash-alia... https://stackoverflow.com/questions/34340575/zsh-alias-with-...

On the other hand, I tested it in both bash and zsh, and it works in both.


Instead of -O you want to use

  --remote-name-all
otherwise you have to specify -O for every URL. For the same reason, remove $1 and rely on all parameters being appended at the end of the line. The example above will only download the first URL.
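The difference can be demonstrated offline with `file://` URLs (a sketch; the paths are just temp files created for the demo):

```shell
# With plain -O you would need one -O per URL;
# --remote-name-all applies the remote-name behaviour to every URL.
src=$(mktemp -d)
dst=$(mktemp -d)
echo one > "$src/a.txt"
echo two > "$src/b.txt"
cd "$dst"
curl --silent --remote-name-all "file://$src/a.txt" "file://$src/b.txt"
# both a.txt and b.txt now exist in $dst
```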

(I personally think this should have been the default from the beginning: -O should have set the behaviour for all following parameters until changed. But it is too late to change now.)

There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.


> There is also --remote-header-name (-J) which takes the remote file name from the header instead of the URL, which is what wget does.

I don't think that's the case, that behavior is opt-in as indicated in wget's manpage:

> This can currently result in extra round-trips to the server for a "HEAD" request, and is known to suffer from a few bugs, which is why it is not currently enabled by default.


By default, curl does not handle reconnections or use caching like wget does, among many other things, and covering them takes quite a few arguments. You can see the list in this project.


Why not just use wget if you need all that? Wget does yet more useful things.


macOS comes with curl but no wget by default. Sure, you can easily install it with Homebrew, but it would be nice to have wget functionality by default.


Then you would need two tools instead of one


I've been trying to tell people my hammer is perfect for cutting their hair, they won't listen.

Until... I show them.

They become unresponsive, but their hair is removed.

Job done.


A more apt comparison here is that you already have scissors, which are more complex to use and give better results when used correctly. But you are lazy and get new basic scissors instead.


Well no, I was suggesting "the golden hammer." ... where every problem is a nail.


So? The tools are minuscule. Use the right tool for the right job. Shoehorning curl into the role of wget is silly.


BUT BUT BUT - neither wget nor curl let me read my email!

BRB, forking curl to add email client support.

https://programmingphilosophy.org/posts/zawinskis-law/


Good news then, because curl supports IMAP!


> Shoehorning curl into the role of wget is silly.

How is it silly if the identical functionality is there, but you just need to use more command-line arguments?


The functionality is not identical. Wget is much more high level. You should not need to be an HTTP and curl expert to download a file lol.


I felt weird reading the title too, because for the past 2-3 years I have downloaded with `curl`, so why this? And also, I am pretty sure `wget` can do it, and better too.


As others pointed out, you can do that, you can also set them in .curlrc, or you can write a script if you want to allow for multiple URLs to be downloaded in parallel (not possible with an alias), or now you can just use wcurl :)

Note: wcurl sets a few more flags than that; it also percent-encodes whitespace in the URLs and downloads multiple URLs in parallel.
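The whitespace-encoding step can be sketched in a few lines (this is an illustration, not wcurl's actual code; `encode_spaces` is a made-up name):

```shell
# Replace literal spaces in a URL with %20 before handing it to curl,
# so a pasted URL like "https://host/some file.txt" doesn't break.
encode_spaces() {
  printf '%s\n' "$1" | sed 's/ /%20/g'
}

encode_spaces "https://example.com/some file.txt"
# -> https://example.com/some%20file.txt
```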


“or now you can just use wcurl :)”

Sadly I can’t, since it depends on the util-linux version of getopt, which means it fails on BSD and macOS systems. Understandable, since that version is always available on the specific target the script was written for, and it does make life easier.


argh, we are looking into fixing that now.

I knew getopt was Linux-specific, but I thought the only impact was that the long-form arguments (--opt) would not work. It turns out it doesn't run at all instead.

We should be able to fix this within the next few days, thank you!


> We should be able to fix this within the next few days, thank you!

It's fixed now, should work in non-linux environments.


wcurl does a lot more than that, though: parallel downloads, retries, following redirects (and other things that are all described in the blog post).

There's also some value in providing a ready-to-use wrapper for users that install the curl package. Maybe wcurl will also show up in other distributions after a while, especially since Daniel also likes the idea.


> parallel downloads, retries, following redirects

There's aria2


I felt the same but upon checking they do more than I expected. I like it.


You often want redirection too, so -L for --location.


Totally agree! That one has burned me a lot. I generally use 'curl -LO' now instead of wget.



