                         README file for Sslurp! 2.64
                         ============================

Contents:        Description
                 How to install
                 Where to find the latest version
                 Revision History
                 Contacting the author


                                   Description
                                   ===========

Sslurp! can retrieve Web pages from an HTTP (WWW) server. It can be configured
to follow all hyperlinks on the page that lead to other pages on the same
server. Images on the pages can be retrieved as well. All pages are stored on
disk and can be viewed later using your web browser.
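The "same server" rule described above can be sketched in a few lines of
Python (illustrative only — this is not Sslurp's actual code; the function
name `should_follow` is made up for this example):

```python
# Follow a link only when it stays on the same host as the start URL.
from urllib.parse import urljoin, urlparse

def should_follow(start_url, link):
    """Return True if `link` (possibly relative) resolves to the start host."""
    target = urljoin(start_url, link)          # resolve relative links
    return urlparse(target).hostname == urlparse(start_url).hostname
```

A crawler applying this rule would queue `/page2.html` found on
`http://example.com/index.html`, but skip a link to `http://other.org/`.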

Sslurp! also contains a simple proxy server for viewing the downloaded pages
and for filtered Internet access.

Sslurp! can make use of a proxy HTTP server, speeding up the whole procedure.
Sslurp! requires at least one HPFS partition!

Sslurp! is Freeware, i.e. you can use and distribute it for free. However,
you may not sell or rent it! Sslurp! is still copyrighted software.


                                 How to install
                                 ==============

Installation is simple. First, unpack the ZIP file (hey, you already did it!).
Second, copy the files to some directory. Start the program. Select
"Setup/Options" from the menu to configure the software. Press F1 to get help.

If you upgrade from a beta version and would like to keep your settings, you
must rename SPIDER.INI or WSUCK.INI to SSLURP.INI!
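That rename step can be scripted as follows. This is a hypothetical helper
shown in POSIX shell for illustration; on OS/2 itself the equivalent command
is simply "RENAME SPIDER.INI SSLURP.INI".

```shell
# upgrade_ini: keep beta settings by renaming the old INI file to SSLURP.INI.
# SPIDER.INI is checked first; only one old file is expected to exist.
upgrade_ini() {
    if [ -f SPIDER.INI ]; then
        mv SPIDER.INI SSLURP.INI
    elif [ -f WSUCK.INI ]; then
        mv WSUCK.INI SSLURP.INI
    fi
}
```

Run `upgrade_ini` once in the program directory before starting the new
version, so it finds the settings under the new name.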


                        Where to find the latest version
                        ================================

New versions are uploaded to these locations:

http://www.nefkom.net/miho/sslurp/ - This is the official Sslurp homepage.
                                     You can download the latest version
                                     there. The FAQ list is available there
                                     as well.


                                   To-Do list
                                   ==========

This is a list of ideas for upcoming versions:

- Pause/continue downloading, with option to delete items from pending list.



                             Revision History (> 1.0)
                             ========================

Version 2.64:

  # Line breaks within quotes inside HTML tags are now ignored instead of
    being interpreted as whitespace.


Version 2.63:

  * Proxy attempts to return correct content-type when file was not downloaded
    with Sslurp.
  * When a receive error occurred before receiving the item body, a partial
    re-try is scheduled.
  # When a connection attempt fails, Sslurp pauses for 1/2 second to
    reduce CPU load.


Version 2.62:

  * Possible crash with internal proxy and very long URLs.


Version 2.6:

  # Better error handling in internal proxy
  + Retry number and link level are displayed in the URL list.
  * Updated items may have been appended to local files instead of replacing
    them.
  + Command line option -H to hide task list entry
  # Forward slash added to server URLs


Version 2.5:

  * Options were not saved when -N command line option was used.
  + Recognizes BASE tag
  # Better error handling in internal proxy


Version 2.41:

  * HTTP 1.1 servers may have sent extra bytes in a retrieved item ("chunked
    transfer encoding"). Modified request.


Version 2.4:

  * URLs starting with "." or ".." were resolved incorrectly.
  + The internal proxy prints out the matching filter entry when a URL is
    filtered out.
  + You can specify file name extensions of inline images.


Version 2.3:

  + When an item was retrieved incompletely from an HTTP 1.1 server, Sslurp
    continues the partial download instead of retrieving the whole item
    when re-trying the download (incomplete downloads have to be made with
    Sslurp 2.3 or newer for this feature to work; older incomplete downloads
    are handled like before).
  * Removed some problems with line breaks within links.
  + Background images within table elements (TABLE, TR, TD, TH) are now
    recognized and handled like other inline images.


Version 2.2:

  # More items in the history drop down list (35).
  # Changed order of items in history when existing item is reselected.
  + Proxy status is displayed in the status line.
  * Possible memory leak in proxy.
  + Menu item to reload the proxy's filter definition file at runtime.


Version 2.1:

  * Retry limit did not work.
  * Posting form data via Sslurp's proxy server did not work correctly for
    Netscape 4.04 (don't know about other browsers). The connection
    stalled.


Version 2.0:

  + "Date", "Expires" and "Content-Type" header lines are stored in file's
    EAs.
  + Sslurp now contains a simple HTTP proxy server. See online help for
    details.
  * Sslurp retried some downloads when it shouldn't.
  * Single CRs in HTML tags were not recognized correctly.


Version 1.8:

  * When specifying a URL on the command line, Sslurp did not start
    downloading automatically.
  # Changed order of pages in setup notebook.
  + Ability to retry failed downloads.
  + Timestamp of stored items is set to date of last modification provided
    by the server. Sslurp still stores the same timestamp in the item's EAs
    and uses that for conditional (If-Modified-Since) requests.
  + The URL of retrieved items is stored in the .SUBJECT EA.


Version 1.7:

  + After starting with one URL the entry line and dropdown list stay active.
    Another URL can be entered and is queued for download.
  * Extremely long URLs may have caused a crash when Sslurp tried to save
    them to disk.
  # Escape sequences (%xx) in URLs are now de-escaped when building the local
    file name.
  + Displays size of retrieved files in pending list.
  * Sslurp crashed when trying to retrieve an item that was only partially
    retrieved in an earlier session.


Version 1.6:

  # Handles HTML comments differently
  # Added "Host" header line for multi-hosted web servers


Version 1.5:

  + Sslurp calculates the cps rate after receiving an item.
  * List was not cleared correctly in automated mode
  * "!" in links was not processed correctly
  + New command line option "-O" to specify a different file for logging.
  # Changed log file format to use a 4-digit year
  + Generates a "Referer" header line in requests
  * Command line options -E and -X didn't work correctly


Version 1.4:

  # New user interface. Now a list of processed and pending URLs
    is displayed. The status information is updated when the URL is processed.
  * URLs with trailing "/" have matched any file name extension. Now these
    URLs only match with "html".
  # Incomplete downloads are no longer recorded as "successful", i.e. they're
    re-loaded next time.


Version 1.3:

  # Drop down list is closed when "Start" button is pressed.
  # Pressing ENTER in the drop down list is equivalent to pressing the
    "Start" button.
  * Colons in URLs are converted.
  # Options are displayed in debug log messages


Version 1.2:

  * Problems with downloading Applets.
  * Extremely long HTML tags are now skipped. They may have caused crashes.
  # Leading whitespace in URLs is now skipped.
  * Some characters in URLs were unnecessarily converted for local storage.
  * Exclusion by extension didn't work correctly.


Version 1.1:

  * APPLET tag without CODEBASE attribute was not processed.
  # Option "Max link level" does no longer apply to images and applets.
  + Possibility to exclude a set of link extensions from being downloaded.
  + "Use proxy" option is available as command line switch



                              Contacting the author
                              =====================

Sslurp! was written by Michael Hohner. You can reach him at

  Internet:   miho@nefkom.net


13 March 2001, Michael Hohner
