                                  _   _ ____  _     
  Project                     ___| | | |  _ \| |    
                             / __| | | | |_) | |    
                            | (__| |_| |  _ <| |___ 
                             \___|\___/|_| \_\_____|
NAME
       curl  -  get  a  URL with FTP, TELNET, LDAP, GOPHER, DICT,
       FILE, HTTP or HTTPS syntax.

SYNOPSIS
       curl [options] url

DESCRIPTION
       curl is a client  to  get  documents/files  from  servers,
       using  any  of  the  supported  protocols.  The command is
       designed to work without user interaction or any  kind  of
       interactivity.

       curl offers a busload of useful tricks like proxy support,
       user authentication, ftp upload, HTTP post,  SSL  (https:)
       connections, cookies, file transfer resume and more.

URL
       The  URL  syntax  is  protocol  dependent.  You'll  find a
       detailed description in RFC 2396.

       You can specify multiple URLs or parts of URLs by  writing
       part sets within braces as in:

        http://site.{one,two,three}.com

       or  you  can get sequences of alphanumeric series by using
       [] as in:

        ftp://ftp.numericals.com/file[1-100].txt
         ftp://ftp.numericals.com/file[001-100].txt   (with leading zeros)
        ftp://ftp.letters.com/file[a-z].txt

       It  is  possible  to  specify up to 9 sets or series for a
       URL, but no nesting is supported at the moment:

         http://www.any.org/archive[1996-1999]/volume[1-4]part{a,b,c,index}.html
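The [] series can be tried without a network by globbing local files over the file:// protocol; this is only a sketch (paths invented for illustration) and assumes a curl binary is installed:

```shell
# Create two small local files, then fetch both with one globbed URL.
mkdir -p /tmp/globdemo
printf 'A' > /tmp/globdemo/file1.txt
printf 'B' > /tmp/globdemo/file2.txt
# Quote the URL so the shell doesn't touch the brackets.
curl -s "file:///tmp/globdemo/file[1-2].txt"
```

Both files are written to stdout, one after the other.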

OPTIONS
       -a/--append
               (FTP) When used in an FTP upload, this will tell
               curl to append to the target file instead of
               overwriting it. If the file doesn't exist, it will
               be created.

       -A/--user-agent <agent string>
               (HTTP) Specify the User-Agent string to send to the
               HTTP server. Some badly done CGIs fail if it's not
               set to "Mozilla/4.0". To encode blanks in the
               string, surround the string with single quote
               marks. This can also be set with the -H/--header
               flag of course.

       -b/--cookie <name=data>
              (HTTP)  Pass  the  data  to  the  HTTP  server as a
              cookie.  It  is  supposedly  the  data   previously
              received  from  the server in a "Set-Cookie:" line.
              The data should be  in  the  format  "NAME1=VALUE1;
              NAME2=VALUE2".

              If no '=' letter is used in the line, it is treated
              as a filename to  use  to  read  previously  stored
              cookie  lines  from,  which  should be used in this
              session if they match. Using this method also acti-
              vates  the  "cookie  parser"  which  will make curl
              record incoming cookies too, which may be handy  if
              you're   using   this   in   combination  with  the
              -L/--location option. The file format of  the  file
              to  read  cookies from should be plain HTTP headers
              or the netscape cookie file format.

              NOTE that the file specified  with  -b/--cookie  is
              only  used  as  input. No cookies will be stored in
              the file. To store cookies, save the  HTTP  headers
              to a file using -D/--dump-header!
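For reference, each line of a netscape-format cookie file is tab-separated: domain, tail-match flag, path, secure flag, expiry time, cookie name and value. A minimal sketch (domain and values invented for illustration):

```
# Netscape HTTP Cookie File
.example.com	TRUE	/	FALSE	0	sessionid	abc123
```

A file in this format, or a file of saved plain HTTP headers, can be handed to -b/--cookie.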

       -B/--ftp-ascii
              (FTP/LDAP)  Use  ASCII transfer when getting an FTP
               file or LDAP info. For FTP, this can also be
               enforced by using a URL that ends with ";type=A".

       -c/--continue
              Continue/Resume  a  previous  file  transfer.  This
              instructs  curl  to  continue appending data on the
              file where it was previously left, possibly because
              of a broken connection to the server. There must be
              a named physical file to  append  to  for  this  to
               work. Note: Upload resume depends on a command
               named SIZE, not always present in all FTP servers!
              Upload resume is for FTP only.  HTTP resume is only
              possible with HTTP/1.1 or later servers.

       -C/--continue-at <offset>
              Continue/Resume a previous  file  transfer  at  the
              given  offset. The given offset is the exact number
               of bytes that will be skipped, counted from the
               beginning of the source file, before it is
               transferred to the destination. If used with uploads,
              the  ftp  server  command  SIZE will not be used by
              curl. Upload resume is for FTP only.   HTTP  resume
              is only possible with HTTP/1.1 or later servers.

       -d/--data <data>
              (HTTP)  Sends  the specified data in a POST request
              to the HTTP server. Note  that  the  data  is  sent
              exactly as specified with no extra processing.  The
              data is expected to  be  "url-encoded".  This  will
              cause curl to pass the data to the server using the
              content-type     application/x-www-form-urlencoded.
              Compare to -F.

              If  you  start the data with the letter @, the rest
              should be a file name to read the data from,  or  -
              if  you want curl to read the data from stdin.  The
              contents of the file must already be url-encoded.

       -D/--dump-header <file>
              (HTTP/FTP) Write the HTTP  headers  to  this  file.
              Write  the  FTP file info to this file if -I/--head
              is used.

               This option is handy to use when you want to store
               the cookies that an HTTP site sends to you. The
               cookies could then be read in a second curl
               invocation by using the -b/--cookie option!

       -e/--referer <URL>
              (HTTP)  Sends the "Referer Page" information to the
              HTTP server. Some badly done CGIs fail if it's  not
              set. This can also be set with the -H/--header flag
              of course.

       -E/--cert <certificate[:password]>
              (HTTPS) Tells curl to use the specified certificate
              file  when  getting a file with HTTPS. The certifi-
              cate must be in PEM format.  If the optional  pass-
              word isn't specified, it will be queried for on the
              terminal. Note that this certificate is the private
              key and the private certificate concatenated!

       -f/--fail
               (HTTP) Fail silently (no output at all) on server
               errors. This is mostly done to better enable
               scripts etc to deal with failed attempts. In
               normal cases when an HTTP server fails to deliver
               a document, it returns an HTML document stating so
               (which often also describes why and more). This
               flag will prevent curl from outputting that and
               fail silently instead.

       -F/--form <name=content>
              (HTTP) This lets curl emulate a filled in  form  in
              which  a  user  has pressed the submit button. This
              causes curl to POST  data  using  the  content-type
              multipart/form-data   according  to  RFC1867.  This
              enables uploading of binary files etc. To force the
              'content'  part  to be read from a file, prefix the
              file name with an @ sign.  Example,  to  send  your
              password  file  to  the server, where 'password' is
              the name of the  form-field  to  which  /etc/passwd
              will be the input:
              curl -F password=@/etc/passwd www.mypasswords.com

               To read the file's content from stdin instead of a
               file, use - where the file name should've been.

       -h/--help
              Usage help.

       -H/--header <header>
              (HTTP) Extra header to use when getting a web page.
              You  may  specify any number of extra headers. Note
              that if you should add a custom header that has the
              same  name  as  one of the internal ones curl would
              use,  your  externally  set  header  will  be  used
              instead  of  the  internal  one. This allows you to
              make even trickier stuff than curl  would  normally
              do.  You  should not replace internally set headers
              without knowing perfectly well what you're doing.

       -i/--include
              (HTTP) Include the HTTP-header in the  output.  The
              HTTP-header  includes things like server-name, date
              of the document, HTTP-version and more...

       -I/--head
              (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers
              feature  the  command  HEAD  which this uses to get
               nothing but the header of a document. When used on
               an FTP file, curl displays the file size only.

       -K/--config <config file>
              Specify  which  config  file to read curl arguments
              from. The config file is a text file in which  com-
              mand  line arguments can be written which then will
              be used as if they were written on the actual  com-
              mand  line. If the first column of a config line is
              a '#' character, the  rest  of  the  line  will  be
              treated as a comment.

              Specify  the  filename as '-' to make curl read the
              file from stdin.
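As a sketch, a config file is just command line arguments written one per line; the URL and file name below are invented for illustration:

```
# fetch quietly, follow Location: headers, save to a local file
-s
-L
-o saved.html
http://www.example.com/
```

Running 'curl -K myconfig' would then behave as if those arguments were given on the actual command line.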

       -l/--list-only
              (FTP) When listing an FTP  directory,  this  switch
              forces  a name-only view.  Especially useful if you
              want to machine-parse the contents of an FTP direc-
              tory  since the normal directory view doesn't use a
              standard look or format.

       -L/--location
              (HTTP/HTTPS)  If  the  server  reports   that   the
              requested  page has a different location (indicated
               with the header line Location:) this flag will make
               curl retry the request on the new location.
              If used together with -i or -I,  headers  from  all
              requested pages will be shown.

       -m/--max-time <seconds>
              Maximum  time  in  seconds that you allow the whole
              operation to take.  This is useful  for  preventing
              your  batch jobs from hanging for hours due to slow
              networks or links going down.   This  doesn't  work
              properly in win32 systems.

       -M/--manual
              Manual. Display the huge help text.

       -n/--netrc
              Makes  curl scan the .netrc file in the user's home
              directory for login name and password. This is typ-
              ically  used  for  ftp  on unix. If used with http,
              curl will enable user authentication. See  netrc(5)
               for details on the file format. Curl will not
               complain if that file doesn't have the right
               permissions (it should not be world or group
               readable). The environment variable "HOME" is used
               to find the home directory.

               A quick and very simple example of how to set up a
               .netrc to allow curl to ftp to the machine
               host.domain.com with user name 'myself' and
               password 'secret':

              machine host.domain.com user myself password secret

       -N/--no-buffer
              Disables the buffering of  the  output  stream.  In
              normal  work  situations,  curl will use a standard
              buffered output stream that will  have  the  effect
              that  it will output the data in chunks, not neces-
              sarily exactly when the data arrives.   Using  this
              option will disable that buffering.

       -o/--output <file>
              Write  output  to  <file> instead of stdout. If you
              are using {} or [] to fetch multiple documents, you
              can  use  '#'  followed  by  a number in the <file>
              specifier. That variable will be replaced with  the
              current string for the URL being fetched. Like in:

                curl http://{one,two}.site.com -o "file_#1.txt"

              or use several variables like:

                curl http://{site,host}.host[1-5].com -o "#1_#2"
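The '#1' variable can likewise be tried locally over the file:// protocol (paths invented for illustration; a sketch assuming a curl binary is installed):

```shell
# Fetch two globbed local files, saving each under its glob string.
mkdir -p /tmp/odemo
printf 'one' > /tmp/odemo/file1.txt
printf 'two' > /tmp/odemo/file2.txt
curl -s "file:///tmp/odemo/file[1-2].txt" -o "/tmp/odemo/out_#1.txt"
```

This produces /tmp/odemo/out_1.txt and /tmp/odemo/out_2.txt.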

       -O/--remote-name
              Write  output to a local file named like the remote
              file we get. (Only the file part of the remote file
              is used, the path is cut off.)

       -P/--ftpport <address>
              (FTP)  Reverses  the  initiator/listener roles when
              connecting with ftp. This switch makes Curl use the
              PORT  command  instead  of  PASV. In practice, PORT
              tells the server to connect to the client's  speci-
              fied  address  and port, while PASV asks the server
              for an ip address and port to connect to. <address>
              should be one of:

              interface   i.e "eth0" to specify which interface's
                          IP address you want to use  (Unix only)

              IP address  i.e  "192.168.10.1" to specify exact IP
                          number

              host name   i.e "my.host.domain" to specify machine

              -           (any  single-letter  string) to make it
                          pick the machine's default

       -q     If used as the first parameter on the command line,
              the $HOME/.curlrc file will not be read and used as
              a config file.

        -Q/--quote <command>
              (FTP) Send an arbitrary command to the  remote  FTP
              server,  by  using the QUOTE command of the server.
              Not all servers support this command, and  the  set
              of  QUOTE  commands are server specific! Quote com-
              mands are sent BEFORE the transfer is taking place.
              To  make  commands  take  place  after a successful
               transfer, prefix them with a dash '-'. You may
               specify any number of commands to be run before and
              after the transfer. If the server  returns  failure
              for  one of the commands, the entire operation will
              be aborted.

       -r/--range <range>
               (HTTP/FTP) Retrieve a byte range (i.e. a partial
               document) from an HTTP/1.1 or FTP server. Ranges
               can be specified in a number of ways.

              0-499     specifies the first 500 bytes

              500-999   specifies the second 500 bytes

              -500      specifies the last 500 bytes

               9500-     specifies the bytes from offset 9500 and
                         forward

              0-0,-1    specifies   the   first   and  last  byte
                        only(*)(H)

              500-700,600-799
                        specifies 300 bytes from offset 500(H)

              100-199,500-599
                        specifies   two   separate   100    bytes
                        ranges(*)(H)

       (*) = NOTE that this will cause the server to reply with a
       multipart response!

       You should also be aware that many HTTP/1.1 servers do not
       have this feature enabled, so that when you attempt to get
       a range, you'll instead get the whole document.

       FTP range downloads only support the simple syntax 'start-
       stop'  (optionally  with  one  of the numbers omitted). It
       depends on the non-RFC command SIZE.

       -s/--silent
              Silent mode. Don't show  progress  meter  or  error
              messages.  Makes Curl mute.

       -S/--show-error
               When used with -s it makes curl show an error
               message if it fails.

       -t/--upload
              Transfer the stdin data to the specified file. Curl
              will read everything from stdin until EOF and store
               with the supplied name. If this is used on an
               HTTP(S) server, the PUT command will be used.

       -T/--upload-file <file>
              Like  -t,  but  this  transfers the specified local
              file. If there is no file  part  in  the  specified
              URL,  Curl  will  append  the local file name. NOTE
              that you must use a trailing / on the  last  direc-
              tory  to really prove to Curl that there is no file
              name or curl will think that  your  last  directory
              name is the remote file name to use. That will most
               likely cause the upload operation to fail. If this
               is used on an HTTP(S) server, the PUT command will
               be used.
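A sketch of -T without a network, copying through a file:// URL (paths invented for illustration; assumes a curl binary is installed):

```shell
# "Upload" a local file to a file:// destination with -T.
printf 'payload' > /tmp/tdemo-src.txt
curl -s -T /tmp/tdemo-src.txt file:///tmp/tdemo-dst.txt
```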

       -u/--user <user:password>
              Specify user and password to use when fetching. See
              README.curl  for  detailed  examples  of how to use
              this. If no password is specified,  curl  will  ask
              for it interactively.

       -U/--proxy-user <user:password>
              Specify   user   and  password  to  use  for  Proxy
              authentication. If no password is  specified,  curl
              will ask for it interactively.

       -v/--verbose
              Makes  the  fetching more verbose/talkative. Mostly
              usable for debugging. Lines starting with '>' means
              data  sent by curl, '<' means data received by curl
              that is hidden in normal cases and  lines  starting
              with '*' means additional info provided by curl.

       -V/--version
              Displays  the  full  version  of  curl, libcurl and
              other 3rd party  libraries  linked  with  the  exe-
              cutable.

       -w/--write-out <format>
              Defines  what to display after a completed and suc-
              cessful operation. The format is a string that  may
               contain plain text mixed with any number of
               variables. The string can be given directly, read
               from a particular file by specifying "@filename",
               or read from stdin by writing "@-".

              The  variables present in the output format will be
              substituted by the value or text that  curl  thinks
              fit,  as  described below. All variables are speci-
              fied like %{variable_name} and to output a normal %
               you just write them like %%. You can output a
               newline by using \n, a carriage return with \r and
               a tab space with \t.

              NOTE:  The  %-letter  is  a  special  letter in the
              win32-environment, where all occurrences of %  must
              be doubled when using this option.

               The variables available at this point are:

              url_effective  The  URL that was fetched last. This
                             is mostly meaningful if you've  told
                             curl to follow location: headers.

              http_code      The numerical code that was found in
                             the last retrieved HTTP(S) page.

              time_total     The total time, in seconds, that the
                             full operation lasted. The time will
                             be displayed with millisecond  reso-
                             lution.

              time_namelookup
                             The  time,  in seconds, it took from
                             the start until the  name  resolving
                             was completed.

               time_connect   The time, in seconds, it took from
                             the start until the connect  to  the
                             remote  host  (or  proxy)  was  com-
                             pleted.

              time_pretransfer
                             The time, in seconds, it  took  from
                             the start until the file transfer is
                             just about to begin.  This  includes
                             all  pre-transfer commands and nego-
                             tiations that are  specific  to  the
                             particular protocol(s) involved.

              size_download  The  total amount of bytes that were
                             downloaded.

              size_upload    The total amount of bytes that  were
                             uploaded.

              speed_download The average download speed that curl
                             measured for the complete  download.

               speed_upload   The average upload speed that curl
                              measured for the complete upload.
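A sketch of -w without a network, using the file:// protocol (path invented for illustration; assumes a curl binary is installed):

```shell
# Print the number of downloaded bytes once the transfer completes.
printf 'hello world' > /tmp/wdemo.txt
curl -s -o /dev/null -w "%{size_download}\n" file:///tmp/wdemo.txt
```

Here it would print 11, the byte count of the local file.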

       -x/--proxy <proxyhost[:port]>
               Use specified proxy. If the port number is not
               specified, port 1080 is assumed.

       -X/--request <command>
              (HTTP) Specifies a custom request to use when  com-
              municating  with  the  HTTP  server.  The specified
              request will be used instead of the  standard  GET.
              Read  the  HTTP  1.1  specification for details and
              explanations.

              (FTP) Specifies a custom FTP command to use instead
              of LIST when doing file lists with ftp.

       -y/--speed-time <time>
              If  a download is slower than speed-limit bytes per
              second during a  speed-time  period,  the  download
               gets aborted. If speed-time is used, the default
               speed-limit will be 1 unless set with -Y.

       -Y/--speed-limit <speed>
              If a download is slower than this given  speed,  in
              bytes  per  second,  for speed-time seconds it gets
               aborted. speed-time is set with -y and is 30 if not
               set.

       -z/--time-cond <date expression>
              (HTTP) Request to get a file that has been modified
              later than the given time and date, or one that has
              been modified before that time. The date expression
              can be all sorts of date strings or if  it  doesn't
              match  any  internal ones, it tries to get the time
              from a given file name instead! See the GNU date(1)
              man page for date expression details.

              Start  the  date expression with a dash (-) to make
              it request for a document that is  older  than  the
              given  date/time,  default  is  a  document that is
              newer than the specified date/time.

       -3/--sslv3
              (HTTPS) Forces curl to use SSL version 3 when nego-
              tiating with a remote SSL server.

       -2/--sslv2
              (HTTPS) Forces curl to use SSL version 2 when nego-
              tiating with a remote SSL server.

       -#/--progress-bar
              Make  curl  display  progress  information   as   a
              progress bar instead of the default statistics.

       --crlf (FTP)  Convert LF to CRLF in upload. Useful for MVS
              (OS/390).

       --stderr <file>
              Redirect all writes to stderr to the specified file
              instead.  If  the  file  name is a plain '-', it is
              instead written to stdout. This option has no point
              when  you're  using a shell with decent redirecting
              capabilities.

FILES
       ~/.curlrc
              Default config file.

ENVIRONMENT
       HTTP_PROXY [protocol://]<host>[:port]
              Sets proxy server to use for HTTP.

       HTTPS_PROXY [protocol://]<host>[:port]
              Sets proxy server to use for HTTPS.

       FTP_PROXY [protocol://]<host>[:port]
              Sets proxy server to use for FTP.

       GOPHER_PROXY [protocol://]<host>[:port]
              Sets proxy server to use for GOPHER.

       ALL_PROXY [protocol://]<host>[:port]
              Sets proxy server to use  if  no  protocol-specific
               proxy is set.

        NO_PROXY <comma-separated list of hosts>
               list of host names that shouldn't go through any
               proxy. If set to an asterisk '*' only, it matches
               all hosts.

       COLUMNS <integer>
              The  width  of  the  terminal.   This variable only
              affects curl  when  the  --progress-bar  option  is
              used.

EXIT CODES
       There  exists  a  bunch of different error codes and their
       corresponding error messages that may  appear  during  bad
       conditions.  At  the  time of this writing, the exit codes
       are:

       1      Unsupported protocol. This build  of  curl  has  no
              support for this protocol.

       2      Failed to initialize.

       3      URL malformat. The syntax was not correct.

       4      URL  user  malformatted.  The  user-part of the URL
              syntax was not correct.

       5      Couldn't resolve proxy. The given proxy host  could
              not be resolved.

       6      Couldn't  resolve  host.  The given remote host was
              not resolved.

       7      Failed to connect to host.

       8      FTP weird server reply. The server sent  data  curl
              couldn't parse.

       9      FTP access denied. The server denied login.

       10     FTP  user/password  incorrect.  Either  one or both
              were not accepted by the server.

       11     FTP weird PASS reply. Curl couldn't parse the reply
              sent to the PASS request.

       12     FTP weird USER reply. Curl couldn't parse the reply
              sent to the USER request.

        13     FTP weird PASV reply. Curl couldn't parse the reply
              sent to the PASV request.

        14     FTP weird 227 format. Curl couldn't parse the
               227-line the server sent.

       15     FTP can't get host. Couldn't resolve the host IP we
              got in the 227-line.

       16     FTP  can't  reconnect. Couldn't connect to the host
              we got in the 227-line.

       17     FTP couldn't set binary. Couldn't  change  transfer
              method to binary.

        18     Partial file. Only a part of the file was
               transferred.

       19     FTP couldn't RETR file. The RETR command failed.

       20     FTP write error. The transfer was reported  bad  by
              the server.

       21     FTP  quote  error.  A  quote command returned error
              from the server.

       22     HTTP not found. The requested page was  not  found.
              This return code only appears if --fail is used.

       23     Write  error.  Curl  couldn't write data to a local
              filesystem or similar.

       24     Malformat user. User name badly specified.

       25     FTP couldn't STOR file. The server denied the  STOR
              operation.

       26     Read error. Various reading problems.

       27     Out  of memory. A memory allocation request failed.

       28     Operation timeout. The  specified  time-out  period
              was reached according to the conditions.

       29     FTP  couldn't  set  ASCII.  The  server returned an
              unknown reply.

       30     FTP PORT failed. The PORT command failed.

       31     FTP couldn't use REST. The REST command failed.

       32     FTP couldn't use SIZE. The SIZE command failed. The
              command  is  an  extension to the original FTP spec
              RFC 959.

       33     HTTP range error. The range "command" didn't  work.

       34     HTTP  post  error. Internal post-request generation
              error.

        35     SSL connect error. The SSL handshaking failed.

       36     FTP bad download resume. Couldn't continue an  ear-
              lier aborted download.

       37     FILE  couldn't  read file. Failed to open the file.
              Permissions?

       38     LDAP cannot bind. LDAP bind operation failed.

       39     LDAP search failed.

       40     Library not found. The LDAP library was not  found.

       41     Function  not  found.  A required LDAP function was
              not found.

       XX     There will appear more error codes here  in  future
              releases.  The  existing  ones  are  meant to never
              change.

BUGS
       If you do find  any  (or  have  other  suggestions),  mail
       Daniel Stenberg <Daniel.Stenberg@haxx.nu>.

AUTHORS / CONTRIBUTORS
        - Daniel Stenberg <Daniel.Stenberg@haxx.nu>
        - Rafael Sagula <sagula@inf.ufrgs.br>
        - Sampo Kellomaki <sampo@iki.fi>
        - Linas Vepstas <linas@linas.org>
        - Bjorn Reese <breese@mail1.stofanet.dk>
        - Johan Anderson <johan@homemail.com>
         - Kjell Ericson <Kjell.Ericson@haxx.nu>
        - Troy Engel <tengel@sonic.net>
        - Ryan Nelson <ryan@inch.com>
        - Bjorn Stenberg <Bjorn.Stenberg@haxx.nu>
        - Angus Mackay <amackay@gus.ml.org>
        - Eric Young <eay@cryptsoft.com>
        - Simon Dick <simond@totally.irrelevant.org>
        - Oren Tirosh <oren@monty.hishome.net>
        - Steven G. Johnson <stevenj@alum.mit.edu>
        - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>
         - Andrés García <ornalux@redestb.es>
        - Douglas E. Wegscheid <wegscd@whirlpool.com>
        - Mark Butler <butlerm@xmission.com>
        - Eric Thelin <eric@generation-i.com>
        - Marc Boucher <marc@mbsi.ca>
        - Greg Onufer <Greg.Onufer@Eng.Sun.COM>
        - Doug Kaufman <dkaufman@rahul.net>
        - David Eriksson <david@2good.com>
        - Ralph Beckmann <rabe@uni-paderborn.de>
        - T. Yamada <tai@imasy.or.jp>
        - Lars J. Aas <larsa@sim.no>
         - Jörn Hartroth <Joern.Hartroth@telekom.de>
        - Matthew Clarke <clamat@van.maves.ca>
        - Linus Nielsen <Linus.Nielsen@haxx.nu>
        - Felix von Leitner <felix@convergence.de>
        - Dan Zitter <dzitter@zitter.net>
        - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>
        - Chris Maltby <chris@aurema.com>
        - Ron Zapp <rzapper@yahoo.com>
        - Paul Marquis <pmarquis@iname.com>
        - Ellis Pritchard <ellis@citria.com>
        - Damien Adant <dams@usa.net>
        - Chris <cbayliss@csc.come>
        - Marco G. Salvagno <mgs@whiz.cjb.net>

WWW
       http://curl.haxx.nu

FTP
       ftp://ftp.sunet.se/pub/www/utilities/curl/

SEE ALSO
       ftp(1), wget(1), snarf(1)

LATEST VERSION

  You always find news about what's going on as well as the latest versions
  from the curl web pages, located at:

        http://curl.haxx.nu

SIMPLE USAGE

  Get the main page from netscape's web-server:

        curl http://www.netscape.com/

  Get the root README file from funet's ftp-server:

        curl ftp://ftp.funet.fi/README

  Get a gopher document from funet's gopher server:

        curl gopher://gopher.funet.fi

  Get a web page from a server using port 8000:

        curl http://www.weirdserver.com:8000/

  Get a list of the root directory of an FTP site:

        curl ftp://ftp.fts.frontec.se/

  Get the definition of curl from a dictionary:

        curl dict://dict.org/m:curl

DOWNLOAD TO A FILE

  Get a web page and store in a local file:

        curl -o thatpage.html http://www.netscape.com/

  Get a web page and store in a local file, make the local file get the name
  of the remote document (if no file name part is specified in the URL, this
  will fail):

        curl -O http://www.netscape.com/index.html

USING PASSWORDS

 FTP

   To ftp files using name+passwd, include them in the URL like:

        curl ftp://name:passwd@machine.domain:port/full/path/to/file

   or specify them with the -u flag like

        curl -u name:passwd ftp://machine.domain:port/full/path/to/file

 HTTP

   The HTTP URL doesn't support user and password in the URL string. Curl
   does support that anyway to provide an ftp-style interface, so you can
   pick a file like:

        curl http://name:passwd@machine.domain/full/path/to/file

   or specify user and password separately like in

        curl -u name:passwd http://machine.domain/full/path/to/file

   NOTE! Since HTTP URLs don't support user and password, you can't use that
   style when using Curl via a proxy. You _must_ use the -u style fetch
   in such circumstances.

 HTTPS

   Probably most commonly used with private certificates, as explained below.

 GOPHER

   Curl features no password support for gopher.

PROXY

 Get an ftp file using a proxy named my-proxy that uses port 888:

        curl -x my-proxy:888 ftp://ftp.leachsite.com/README

 Get a file from a HTTP server that requires user and password, using the
 same proxy as above:

        curl -u user:passwd -x my-proxy:888 http://www.get.this/

 Some proxies require special authentication. Specify it by using -U as
 above:

        curl -U user:passwd -x my-proxy:888 http://www.get.this/

 See also the environment variables Curl supports that offer further proxy
 control.

RANGES

  HTTP 1.1 introduced byte-ranges. Using these, a client can request to get
  only one or more subparts of a specified document. Curl supports this with
  the -r flag.

  Get the first 100 bytes of a document:

        curl -r 0-99 http://www.get.this/

  Get the last 500 bytes of a document:

        curl -r -500 http://www.get.this/

  Curl also supports simple ranges for FTP files. There you can only
  specify the start and stop positions.

  Get the first 100 bytes of a document using FTP:

        curl -r 0-99 ftp://www.get.this/README  

UPLOADING

 FTP

  Upload all data on stdin to a specified ftp site:

        curl -t ftp://ftp.upload.com/myfile

  Upload data from a specified file, login with user and password:

        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile

  Upload a local file to the remote site, and use the local file name remote
  too:
 
        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/

  Upload a local file to get appended to the remote file using ftp:

        curl -T localfile -a ftp://ftp.upload.com/remotefile

  NOTE: Curl does not support ftp upload through a proxy! The reason for this
  is simply that proxies are seldom configured to allow this and that no
  author has supplied code that makes it possible!

 HTTP

  Upload all data on stdin to a specified http site:

        curl -t http://www.upload.com/myfile

  Note that the http server must have been configured to accept PUT before
  this can be done successfully.

  For other ways to do http data upload, see the POST section below.

VERBOSE / DEBUG

  If curl fails where it isn't supposed to, if the servers don't let you
  in, if you can't understand the responses: use the -v flag to get VERBOSE
  fetching. Curl will output lots of info and all data it sends and
  receives in order to let the user see all client-server interaction.

        curl -v ftp://ftp.upload.com/

DETAILED INFORMATION

  Different protocols provide different ways of getting detailed information
  about specific files/documents. To get curl to show detailed information
  about a single file, use the -I/--head option. It displays all
  available info on a single file for HTTP and FTP. The HTTP information is a
  lot more extensive.

  For HTTP, you can get the header information (the same as -I would show)
  shown before the data by using -i/--include. Curl understands the
  -D/--dump-header option when getting files from both FTP and HTTP, and it
  will then store the headers in the specified file.

  Store the HTTP headers in a separate file:

        curl --dump-header headers.txt curl.haxx.nu

  Note that headers stored in a separate file can be very useful at a later
  time if you want curl to use cookies sent by the server. More about that in
  the cookies section.

POST (HTTP)

  It's easy to post data using curl. This is done using the -d <data>
  option.  The post data must be urlencoded.

  Post a simple "name" and "phone" guestbook entry:

        curl -d "name=Rafael%20Sagula&phone=3320780" \
                http://www.where.com/guest.cgi
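
  Since the -d data must be urlencoded, a tiny shell sketch (handling only
  the space character, which is all the example above needs) can build the
  argument before calling curl:

```shell
# Percent-encode spaces in a form value before passing it to curl -d.
# Only spaces are handled here; a real encoder must cover all reserved
# characters, not just the blank.
value="Rafael Sagula"
encoded=$(printf '%s' "$value" | sed 's/ /%20/g')
printf 'name=%s&phone=3320780\n' "$encoded"
# prints "name=Rafael%20Sagula&phone=3320780", which you would then
# hand to: curl -d "name=$encoded&phone=3320780" http://www.where.com/guest.cgi
```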

  While -d uses the application/x-www-form-urlencoded mime-type, generally
  understood by CGIs and similar, curl also supports the more capable
  multipart/form-data type. This latter type supports things like file upload.

  -F accepts parameters like -F "name=contents". If you want the contents to
  be read from a file, use <@filename> as contents. When specifying a file,
  you can also specify which content type the file is, by appending
  ';type=<mime type>' to the file name. You can also post the contents of
  several files in one field. This way, the field name 'coolfiles' can be
  used to send three files with different content types:

        curl -F "coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html" \
        http://www.post.com/postit.cgi

  If content-type is not specified, curl will try to guess from the extension
  (it only knows a few), or use the previously specified type (from an earlier
  file if several files are specified in a list), or finally use the default
  type 'text/plain'.

  Emulate a fill-in form with -F. Let's say you fill in three fields in a
  form. One field is the file name to post, one field is your name and one
  field is a file description. We want to post the file we have written named
  "cooltext.txt". To let curl do the posting of this data instead of your
  favourite browser, you have to check out the HTML of the form page to get to
  know the names of the input fields. In our example, the input field names are
  'file', 'yourname' and 'filedescription'.

        curl -F "file=@cooltext.txt" -F "yourname=Daniel" \
             -F "filedescription=Cool text file with cool text inside" \
             http://www.post.com/postit.cgi

  So, to send two files in one post you can do it in two ways:

  1. Send multiple files in a single "field" with a single field name:
 
        curl -F "pictures=@dog.gif,cat.gif" 
 
  2. Send two fields with two field names: 

        curl -F "docpicture=@dog.gif" -F "catpicture=@cat.gif" 

REFERER

  An HTTP request can include information about the address that referred
  to the actual page, and curl allows the user to specify that referrer on
  the command line. It is especially useful to fool or trick stupid servers
  or CGI scripts that rely on that information being available or containing
  certain data.

        curl -e www.coolsite.com http://www.showme.com/

USER AGENT

  An HTTP request can include information about the browser
  that generated the request. Curl allows it to be specified on the command
  line. It is especially useful to fool or trick stupid servers or CGI
  scripts that only accept certain browsers.

  Example:

  curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/

  Other common strings:
    'Mozilla/3.0 (Win95; I)'     Netscape Version 3 for Windows 95
    'Mozilla/3.04 (Win95; U)'    Netscape Version 3 for Windows 95
    'Mozilla/2.02 (OS/2; U)'     Netscape Version 2 for OS/2
    'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)'           NS for AIX
    'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)'      NS for Linux

  Note that Internet Explorer tries hard to be compatible in every way:
    'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)'    MSIE for W95

  Mozilla is not the only possible User-Agent name:
    'Konqueror/1.0'             KDE File Manager desktop client
    'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser

COOKIES

  Cookies are generally used by web servers to keep state information at the
  client's side. The server sets cookies by sending a response line in the
  headers that looks like 'Set-Cookie: <data>' where the data part then
  typically contains a set of NAME=VALUE pairs (separated by semicolons ';'
  like "NAME1=VALUE1; NAME2=VALUE2;"). The server can also specify for what
  path the "cookie" should be used (by specifying "path=value"), when the
  cookie should expire ("expires=DATE"), for what domain to use it
  ("domain=NAME") and if it should be used on secure connections only
  ("secure").

  If you've received a page from a server that contains a header like:
        Set-Cookie: sessionid=boo123; path="/foo";

  it means the server wants that first pair passed on when we get anything in
  a path beginning with "/foo".
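
  A small shell sketch (assuming exactly the header text above) can pull
  that pair out of the header so it can be handed back with -b:

```shell
# Extract the NAME=VALUE pair from a Set-Cookie header line so it can
# be sent back with curl -b. The header text is the example above.
header='Set-Cookie: sessionid=boo123; path="/foo";'
pair=$(printf '%s' "$header" | sed 's/^Set-Cookie: //; s/;.*//')
echo "$pair"
# prints "sessionid=boo123"; then: curl -b "$pair" www.sillypage.com
```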

  Example, get a page that wants my name passed in a cookie:

        curl -b "name=Daniel" www.sillypage.com

  Curl also has the ability to use previously received cookies in following
  sessions. If you get cookies from a server and store them in a file in a
  manner similar to:

        curl --dump-header headers www.example.com

  ... you can then, in a second connection to that (or another) site, use
  the cookies from the 'headers' file like:

        curl -b headers www.example.com

  Note that by specifying -b you enable the "cookie awareness" and with -L
  you can make curl follow a Location: header (which often is used in
  combination with cookies). So if a site sends cookies and a location, you
  can use a non-existing file to trigger the cookie awareness like:

        curl -L -b empty-file www.example.com

  The file to read cookies from must be formatted using plain HTTP headers OR
  as netscape's cookie file. Curl will determine what kind it is based on the
  file contents.
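
  As a sketch of what a netscape-style cookie file looks like (the domain,
  expiry timestamp and values below are made up), each line holds seven
  tab-separated fields: domain, a TRUE/FALSE flag for whether subdomains
  match, path, a secure flag, expiry as a unix timestamp, name and value:

```
# Netscape HTTP Cookie File
.sillypage.com	TRUE	/	FALSE	946684799	name	Daniel
```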

PROGRESS METER

  The progress meter exists to show a user that something actually is
  happening. The different fields in the output have the following meaning:

  % Total    % Received % Xferd  Average Speed          Time             Curr.
                                 Dload  Upload Total    Current  Left    Speed
  0  151M    0 38608    0     0   9406      0  4:41:43  0:00:04  4:41:39  9287

  From left-to-right:
   %             - percentage completed of the whole transfer
   Total         - total size of the whole expected transfer
   %             - percentage completed of the download
   Received      - currently downloaded amount of bytes
   %             - percentage completed of the upload
   Xferd         - currently uploaded amount of bytes
   Average Speed
   Dload         - the average transfer speed of the download
   Average Speed
   Upload        - the average transfer speed of the upload
   Time Total    - expected time to complete the operation
   Time Current  - time passed since the invocation
   Time Left     - expected time left to completion
   Curr.Speed    - the average transfer speed over the last 5 seconds (the
                   first 5 seconds of a transfer are based on less time, of
                   course.)

  The -# option will display a totally different progress bar that doesn't
  need much explanation!

SPEED LIMIT

  Curl lets the user set conditions regarding transfer speed that must be
  met to let the transfer keep going. By using the switches -y and -Y you
  can make curl abort transfers if the transfer speed doesn't exceed your
  given lowest limit for a specified time.

  To let curl abandon downloading this page if it's slower than 3000 bytes per
  second for 1 minute, run:

        curl -y 3000 -Y 60 www.far-away-site.com

  This can very well be used in combination with the overall time limit, so
  that the above operation must be completed in full within 30 minutes:

        curl -m 1800 -y 3000 -Y 60 www.far-away-site.com

CONFIG FILE

  Curl automatically tries to read the .curlrc file (or _curlrc file on win32
  systems) from the user's home dir on startup. The config file should be
  made up of normal command line switches. Comments can be used within the
  file: if the first letter on a line is a '#', the rest of the line is
  treated as a comment.

  Example, set default time out and proxy in a config file:

        # We want a 30 minute timeout:
        -m 1800
        # ... and we use a proxy for all accesses:
        -x proxy.our.domain.com:8080

  White spaces ARE significant at the end of lines, but all white spaces
  leading up to the first characters of each line are ignored.

  Prevent curl from reading the default file by using -q as the first command
  line parameter, like:

        curl -q www.thatsite.com

  Force curl to get and display a local help page in case it is invoked
  without URL by making a config file similar to:

        # default url to get
        http://help.with.curl.com/curlhelp.html

  You can specify another config file to be read by using the -K/--config
  flag. If you set the config file name to "-" it'll read the config from
  stdin, which can be handy if you want to hide options from being visible
  in process tables etc:

        echo "-u user:passwd" | curl -K - http://that.secret.site.com

EXTRA HEADERS

  When using curl in your own very special programs, you may end up needing
  to pass on your own custom headers when getting a web page. You can do
  this by using the -H flag.

  Example, send the header "X-you-and-me: yes" to the server when getting a
  page:

        curl -H "X-you-and-me: yes" www.love.com

  This can also be useful in case you want curl to send a different text in
  a header than it normally does. The -H header you specify then replaces the
  header curl would normally send.

FTP and PATH NAMES

  Do note that when getting files with the ftp:// URL, the given path is
  relative to the directory you enter. To get the file 'README' from your
  home directory at your ftp site, do:

        curl ftp://user:passwd@my.site.com/README

  But if you want the README file from the root directory of that very same
  site, you need to specify the absolute file name:

        curl ftp://user:passwd@my.site.com//README

  (I.e. with an extra slash in front of the file name.)

FTP and firewalls

  The FTP protocol requires one of the involved parties to open a second
  connection as soon as data is about to get transferred. There are two ways
  to do this.

  The default way for curl is to issue the PASV command, which causes the
  server to open another port and await another connection performed by the
  client. This is good if the client is behind a firewall that doesn't allow
  incoming connections.

        curl ftp.download.com

  If the server, for example, is behind a firewall that doesn't allow
  connections on ports other than 21 (or if it just doesn't support the PASV
  command), the other way is to use the PORT command and instruct the server
  to connect to the client on the IP number and port given as parameters to
  the PORT command.

  The -P flag to curl allows for different options. Your machine may have
  several IP-addresses and/or network interfaces and curl allows you to select
  which of them to use. Default address can also be used:

        curl -P - ftp.download.com

  Download with PORT but use the IP address of our 'le0' interface:

        curl -P le0 ftp.download.com

  Download with PORT but use 192.168.0.10 as our IP address to use:

        curl -P 192.168.0.10 ftp.download.com

HTTPS

  Secure HTTP requires SSL libraries to be installed and used when curl is
  built. If that is done, curl is capable of retrieving and posting documents
  using the HTTPS protocol.

  Example:

        curl https://www.secure-site.com

  Curl is also capable of using your personal certificates to get/post files
  from sites that require valid certificates. The only drawback is that the
  certificate needs to be in PEM-format. PEM is a standard and open format to
  store certificates with, but it is not used by the most commonly used
  browsers (Netscape and MSIE both use the so called PKCS#12 format). If you
  want curl to use the certificates you use with your (favourite) browser, you
  may need to download/compile a converter that can convert your browser's
  formatted certificates to PEM formatted ones. This kind of converter is
  included in recent versions of OpenSSL, and for older versions Dr Stephen
  N. Henson has written a patch for SSLeay that adds this functionality. You
  can get his patch (that requires an SSLeay installation) from his site at:
  http://www.drh-consultancy.demon.co.uk/

  Example on how to automatically retrieve a document using a certificate with
  a personal password:

        curl -E /path/to/cert.pem:password https://secure.site.com/

  If you neglect to specify the password on the command line, you will be
  prompted for the correct password before any data can be received.

  Many older SSL servers have problems with SSLv3 or TLS, which newer
  versions of OpenSSL etc use, so it is sometimes useful to specify what
  SSL version curl should use. Use -3 or -2 to specify the exact SSL version
  to use:

        curl -2 https://secure.site.com/

  Otherwise, curl will first attempt to use v3 and then v2.

RESUMING FILE TRANSFERS

 To continue a file transfer where it was previously aborted, curl supports
 resume on http(s) downloads as well as ftp uploads and downloads.

 Continue downloading a document:

        curl -c -o file ftp://ftp.server.com/path/file

 Continue uploading a document(*1):

        curl -c -T file ftp://ftp.server.com/path/file

 Continue downloading a document from a web server(*2):

        curl -c -o file http://www.server.com/

 (*1) = This requires that the ftp server supports the non-standard command
        SIZE. If it doesn't, curl will say so.

 (*2) = This requires that the web server supports at least HTTP/1.1. If it
        doesn't, curl will say so.

TIME CONDITIONS

 HTTP allows a client to specify a time condition for the document it
 requests, using If-Modified-Since or If-Unmodified-Since. Curl allows you
 to specify them with the -z/--time-cond flag.

 For example, you can easily make a download that only gets performed if the
 remote file is newer than a local copy. It would be made like:

        curl -z local.html http://remote.server.com/remote.html

 Or you can download a file only if the local file is newer than the remote
 one. Do this by prepending the date string with a '-', as in:

        curl -z -local.html http://remote.server.com/remote.html

 You can specify a "free text" date as condition. Tell curl to only download
 the file if it was updated since yesterday:

        curl -z yesterday http://remote.server.com/remote.html

 Curl will then accept a wide range of date formats. You can always make the
 date check the other way around by prepending it with a dash '-'.

DICT

  For fun try

        curl dict://dict.org/m:curl
        curl dict://dict.org/d:heisenbug:jargon
        curl dict://dict.org/d:daniel:web1913

  Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'
  and 'lookup'. For example,

        curl dict://dict.org/find:curl

  Commands that break the URL description of the RFC (but not the DICT
  protocol) are

        curl dict://dict.org/show:db
        curl dict://dict.org/show:strat

  Authentication is still missing (but this is not required by the RFC).

LDAP

  If you have installed the OpenLDAP library, curl can take advantage of it
  and offer ldap:// support.

  LDAP is a complex thing and writing an LDAP query is not an easy task. I do
  advise you to dig up the syntax description for that elsewhere, RFC 1959 if
  no other place is better.

  To show you an example, this is how I can get all people from my local LDAP
  server that have a certain sub-domain in their email address:

        curl -B "ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se"

  If I want the same info in HTML format, I can get it by not using the -B
  (enforce ASCII) flag.

ENVIRONMENT VARIABLES

  Curl reads and understands the following environment variables:

        HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY

  They should be set for protocol-specific proxies. A general proxy should be
  set with
        
        ALL_PROXY

  A comma-separated list of host names that shouldn't go through any proxy is
  set in (a single asterisk, '*', matches all hosts)

        NO_PROXY

  If a tail substring of the domain-path for a host matches one of these
  strings, transactions with that node will not be proxied.
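
  The tail-substring rule can be sketched in plain shell (the host name and
  list below are made-up examples, and this sketch ignores the special '*'
  entry; curl's real matching lives in its C source):

```shell
# Sketch of the NO_PROXY tail-substring rule: a host bypasses the
# proxy when its name ends with any entry in the comma-separated list.
no_proxy="haxx.nu,localhost"
host="curl.haxx.nu"
use_proxy=yes
# Split the list on commas and compare each entry against the host's tail.
for entry in $(printf '%s' "$no_proxy" | tr ',' ' '); do
  case "$host" in
    *"$entry") use_proxy=no ;;   # tail substring matches
  esac
done
echo "$use_proxy"                # prints "no": haxx.nu matches
```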


  The usage of the -x/--proxy flag overrides the environment variables.

NETRC

  Unix introduced the .netrc concept a long time ago. It is a way for a user
  to specify name and password for commonly visited ftp sites in a file so
  that you don't have to type them in each time you visit those sites. You
  realize this is a big security risk if someone else gets hold of your
  passwords, so therefore most unix programs won't read this file unless it
  is only readable by yourself (curl doesn't care though).

  Curl supports .netrc files if told to (using the -n/--netrc option). This
  is not restricted to ftp only; curl can use it for all protocols where
  authentication is used.

  A very simple .netrc file could look something like:

        machine curl.haxx.nu login iamdaniel password mysecret

CUSTOM OUTPUT

  To better allow script programmers to get to know about the progress of
  curl, the -w/--write-out option was introduced. Using this, you can specify
  what information from the previous transfer you want to extract.

  To display the amount of bytes downloaded together with some text and an
  ending newline:

        curl -w 'We downloaded %{size_download} bytes\n' www.download.com

MAILING LIST

  We have an open mailing list to discuss curl, its development and things
  relevant to this.

  To subscribe, mail curl-request@contactor.se with "subscribe <your email
  address>" in the body.

  To post to the list, mail curl@contactor.se.

  To unsubscribe, mail curl-request@contactor.se with "unsubscribe <your
  subscribed email address>" in the body.


