> This is one often repeated reason why some users reach for wget instead of curl on the command line.
Is there a reason not to do that? I've always used wget and curl interchangeably, because both meet my moderate needs. Is there a reason I should avoid wget?
Sometimes when using containers, curl is already installed and I use that. Other times it's wget. I might want to skip adding "apt-get install wget" etc.
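A rough sketch of the kind of wrapper I end up with in that situation; the function name, flags, and example URL are just my own habit, nothing standard:

    fetch() {
      # use whichever of curl/wget the image already has installed
      if command -v curl >/dev/null 2>&1; then
        curl -fsSL -o "$2" "$1"
      elif command -v wget >/dev/null 2>&1; then
        wget -q -O "$2" "$1"
      else
        echo "neither curl nor wget found" >&2
        return 1
      fi
    }
    # usage: fetch https://example.com/archive.tar.gz archive.tar.gz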
curl's output flag is -o while wget's is -O; when I have a head full of tasks and code I only remember that they are different and tend to wget things, unless of course I am on a BSD and reach for fetch.
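For the record (and my own future reference), the asymmetry is roughly this, with example.com standing in for a real host:

    # curl: lower-case -o names the output file; upper-case -O keeps the remote name
    curl -o out.tar.gz https://example.com/release.tar.gz
    curl -O https://example.com/release.tar.gz
    # wget: upper-case -O names the output file; lower-case -o writes the log file
    wget -O out.tar.gz https://example.com/release.tar.gz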
Is curl able to retry from the middle of a file? Beyond the sane defaults, that's where wget has been nice for me: if the connection drops, it can pick right back up where it left off.
Yes, curl can resume partial downloads. You need to pass `-C -` to do so, which is a pretty good example of "the command line flags aren't easy to remember"
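For anyone searching later, something like this (URL made up) resumes an interrupted download:

    # -C - tells curl to work out the resume offset from the existing partial file
    curl -C - -O https://example.com/big-image.iso
    # wget's equivalent is -c / --continue
    wget -c https://example.com/big-image.iso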
It's also yet another example of "could you please just do the right thing by default".
curl wants to be a swiss army knife, and so you'll need to configure everything just right so you don't accidentally get the bottle opener instead of the downloader. wget just downloads, and does it well.
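To make that concrete, here's the rough equivalence I keep in my head (placeholder URL):

    # wget's defaults already do the "just download it" thing
    wget https://example.com/file.tar.gz
    # curl needs a few flags to behave comparably:
    #   -f fail on HTTP errors, -L follow redirects, -O keep the remote filename
    curl -fLO https://example.com/file.tar.gz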
I preferred this for years, but beware of the new management, the telemetry, and how little mention there is of it. This is from the desktop app, but I've seen Little Snitch / OpenSnitch warn about the CLI as well:
HTTPie creator here. Just a clarification: HTTPie is still under the same management, and the CLI doesn’t have any telemetry. Little Snitch is probably warning you about requests to packages.httpie.io/latest.json [0]. That is done to let you know when a new version becomes available. It’s a static file hosted through GitHub pages [1] without any tracking. However, there’s currently a bug [2] that prevents the response from getting cached so that the request fires frequently for some users. It’s already fixed in a PR [3] and will be part of the upcoming release.
Good to know; however, updates are a job for my package manager. Imagine dig or awk phoning home for updates? I could see a flag to ask for it specifically.
This is what I've been using for a while now as well. I wasn't aware of the telemetry in the CLI, as mentioned in another reply. That's something I'll need to look into.
I was fully expecting wcurl to be an alternative name for curl on Windows, because Microsoft made "curl" an alias for their own HTTP downloader command in PowerShell.
`curl.exe` is actually the real-deal curl now, though it has fewer options built in than a typical Linux distro build. I don't know if they still alias it in PowerShell, but you can always use curl.exe to get the real thing.
So if you're wget-ing a file from a server that uses HTTPS and HSTS, but you only specify http:// (or no protocol at all), then the next time you wget from that server, it will automatically change the URL to https://
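You can watch this happen in wget's HSTS cache if you're curious; the host here is illustrative:

    # first fetch: the server answers with a Strict-Transport-Security header
    wget http://example.com/file.tar.gz
    # wget records the host in ~/.wget-hsts, so later plain-http requests
    # to that host get rewritten to https automatically
    cat ~/.wget-hsts
    # opt out per invocation with --no-hsts if you really need plain http
    wget --no-hsts http://example.com/file.tar.gz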
If you have a closed system, then you have two options: use plain HTTP if you really trust the environment, or run your own CA and have trusted HTTPS. Having untrusted HTTPS and disabling verification is a double waste of time.
That's OK; that's how you normally do it. But then the second step is adding that CA to the trusted store on all relevant clients, so that it can actually be verified. (Otherwise why bother with the CA? Just self-sign everything individually.)
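On Debian/Ubuntu-style systems that second step is pretty painless; the cert filename and the internal.example host below are placeholders:

    # install the internal CA system-wide so curl, wget, etc. all trust it
    sudo cp my-ca.crt /usr/local/share/ca-certificates/
    sudo update-ca-certificates
    # or scope it to a single invocation instead of the whole system
    curl --cacert my-ca.crt https://internal.example/file
    wget --ca-certificate=my-ca.crt https://internal.example/file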
So let me get this straight: your IT won't do something, you're too lazy to add one flag to your scripts, so your solution is to ask that everyone have their security downgraded instead? That's... one way to approach tech issues.
Just don't do that. Some of us (hello) live in countries that perform, or have tried to perform, HTTPS MITM on a massive scale, and only had to roll it back because so much well-behaved shit broke.
If software suddenly started accepting invalid certificates, they would have no incentive to roll it back. HTTPS would make zero sense then.
This doesn't make it a good idea to break HTTPS by default. Defaults matter: if everything ignored HTTPS errors by default, I would be talking to you over a MITMed connection right now. Because so much software stopped working, they had to roll back that braindead idea in less than a day.
A MITM situation is relevant even without a credential and isn't at all about privacy: an attacker can swap out a different file for the one you wanted to download.
Add the signing authorities to your system's certificate store if it's that big of an annoyance. Or make your own custom alias that includes -k. But this absolutely cannot be the default. HTTPS ensures that you are connected to the server you think you are and that no one is messing with your data in transit.
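If you really must, keep it explicit and opt-in, something along these lines (alias names and CA path invented):

    # a deliberately-named opt-in escape hatch, never the default
    alias curl-insecure='curl --insecure'
    # better: point curl at the internal CA instead of turning verification off
    alias labcurl='curl --cacert ~/lab-ca.pem'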
I totally understand this isn't popular. But even if it doesn't originate from a certificate chain, it is still encrypted between you and the website. Having the certificate chain lets you know the certificate is part of a chain of trust and prevents MITM.
If you're downloading and running a binary or script (pretty common use of curl), anyone on your local network (and beyond) will be able to modify the file and thus take over your machine.
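Which is exactly the risk with the usual one-liner pattern (placeholder URL): over plain http, or with verification disabled, there's nothing stopping an on-path attacker from handing you a different script.

    # anything on the path can rewrite this response before sh runs it
    curl http://example.com/install.sh | sh
    # with https and certificate verification intact, that substitution is no longer trivial
    curl -fsSL https://example.com/install.sh | sh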
> - Auto follow HTTP redirects by providing -L option by default
The script already passes "--location", the non-shortened version of the argument.
For the other things, maybe the safer and more scalable approach is for the script to check whether it's being run interactively and, if so, default to prompting the user for what action to take on detectable issues (like missing basic auth or invalid certs on secure domains), or otherwise log an example version of the command to run as part of the stderr output. Apart from sidestepping the debate about what the default should be, this would still be allowed by e.g. the Debian repo maintainers.
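A minimal sketch of what that interactivity check could look like; the prompt text and the flag it falls back to are invented for illustration:

    if [ -t 0 ] && [ -t 1 ]; then
      # stdin and stdout are a terminal: safe to ask the user
      printf 'Certificate could not be verified. Retry without verification? [y/N] ' >&2
      read -r answer
      [ "$answer" = "y" ] && extra_args="--insecure"
    else
      # non-interactive (cron, CI): just leave a hint on stderr
      echo 'hint: re-run with --insecure if you trust this host' >&2
    fi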
Good point on --location. I normally use -L, so I missed that.
It does default to https as the proto-default if no scheme is provided. In your example it could default to interactively asking the user, but that may fail or hang in automated scenarios.
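For reference, curl itself has a flag for exactly that behavior; example.com is a stand-in:

    # treat scheme-less URLs as https instead of curl's historical http default
    curl --proto-default https example.com/file.tar.gz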
Anyways it’s a good thought exercise. It’s hard to satisfy all use cases equally
I tried it out; it works. Personally I'd write something like this in Rust, not shell. It's not hard at all: you can basically ignore the borrow checker, because all the program is doing is spawning processes and a bit of text processing.
The main issue with writing it in shell instead is incorrect error handling. I see a case that's a bit sketchy there. For now, maybe don't expose wcurl to user- or attacker-controlled input.