# curl

Andrew Pennebaker

https://github.com/mcandre/cheatsheets/blob/master/curl.md

curl is a command line downloader for Web pages and other network resources.
# Install

```
$ apt-get install curl
$ brew install curl
C:\> choco install curl
```
# Configure

Persistent options live in `$HOME/.curlrc`.
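A sample `.curlrc`, with options chosen purely for illustration. In the config file, long options are written one per line without the leading dashes:

```
# follow HTTP redirects automatically
location
# show a simple progress bar instead of the default meter
progress-bar
```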
# Protocols

curl supports many protocols, including:

- HTTP
- HTTPS
- FTP
- SFTP
# Examples

Page through a document:

```
$ curl <URL> | less
```

Resume an interrupted download (`-C -` lets curl work out the offset; it needs a saved file to resume into):

```
$ curl -C - -O <URL>
```

Quote URLs that contain special characters:

```
$ curl "<URL>"
```

Save under the remote filename:

```
$ curl -O <URL>
```

Save under a filename of your choosing:

```
$ curl -o <filename> <URL>
```
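A sketch contrasting `-O` and `-o`, using a throwaway local Web server (Python's built-in `http.server`, an assumption of this example) so the downloads actually succeed:

```shell
# Serve the current directory on a local port.
echo "hello" > page.txt
python3 -m http.server 8124 >/dev/null 2>&1 &
SERVER=$!
sleep 1

mkdir -p out
cd out
# -O keeps the remote filename: saves as page.txt
curl -s -O http://localhost:8124/page.txt
# -o saves under a name you choose: saves as copy.txt
curl -s -o copy.txt http://localhost:8124/page.txt
cd ..

kill $SERVER 2>/dev/null || true
```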
# Spoof user agent

```
-A Firefox
```

# Debug

Show verbose request and response details:

```
-v
```

# POST JSON

```
-X POST
-H "Content-Type: application/json"
-d '{ ... }'
```

Or, reading the request body from a file:

```
-d @<JSON file>
```
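Putting the flags above together: a sketch that POSTs a JSON body to a throwaway local server. The payload and endpoint are placeholders; Python's `http.server` answers POST with 501, which is still enough to exercise the request flags:

```shell
# Throwaway local endpoint; swap in your real API URL.
python3 -m http.server 8123 >/dev/null 2>&1 &
SERVER=$!
sleep 1

# Keeping the body in a file sidesteps shell-quoting headaches.
cat > payload.json <<'EOF'
{"name": "widget", "count": 3}
EOF

# curl exits 0 on HTTP errors (like this 501) unless --fail is given.
curl -s -X POST \
     -H "Content-Type: application/json" \
     -d @payload.json \
     -o response.html \
     http://localhost:8123/api

kill $SERVER 2>/dev/null || true
```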
# Mirror

See curlmirror.pl.
# HDFS

Instead of `hdfs://<host>:8020/<path>`, use:

```
$ curl http://<host>:50075/streamFile/<path>
```
# CouchDB

Instead of `couchdb://<authority>`, use:

```
$ curl http://<host>:5984/<authority>
```
# Alternatives

- wget specializes in recursive Web crawling.
- lftp specializes in FTP transfers.
- scp specializes in file transfers over SSH.
- WWW::Mechanize is a Perl library for fine-tuned Web crawling.