===============
File 3INTRO.TXT
  Introduction
===============
Formatted with non-proportional font (Courier)
[Text essentially unchanged since last test, history updated]


Vesselin Bontchev (c) 1994, Klaus Brunnstein (c) 1997-2002


With the growth of internetworking, and with the growing complexity of
systems and software, threats to individual and enterprise computing
grow equally. A growing number of users and institutions become ever
more dependent upon the availability, reliability and functional
behaviour of their IT and network systems. Moreover, the storage,
processing and transfer of sensitive information requires protective
measures against malevolent attackers and malicious software.

For some time, malicious software was mainly understood to be of a
"viral nature": when such a piece of software entered one's PC, it
could spread by self-replication, either on the system level (boot, MBR
and DIR viruses) or via infected programs (*.COM, *.EXE, *.SYS, *.BAT
etc).

With the proliferation of networks, new forms of malicious software
appeared. Besides malicious software which is able to self-replicate
("viruses"), the diversity of malicious software grows further: trojan
horses, droppers, intended (though imperfect) viruses, hoaxes
(=malevolent jokes), hostile agents (such as hostile Java applets and
malicious ActiveX controls) and other forms of malicious net-"work"
(such as email bombs or worms) appear. As users cannot distinguish
whether such malevolent software is "just viral" or otherwise dangerous,
the detection of more general forms of malicious software becomes a
major requirement from the side of customers. 

With growing numbers of PC users, a market of AntiViral products
developed to help users fight such software. Moreover, several
magazines started testing the quality of AV products using their own
(usually small) virus databases. The quality of such tests has rather
often been discussed "controversially".

With further growth of file/boot viruses (>40,000 file viruses) and with
the advent of document-related viruses (using macro languages to infect
master templates), there is an urgent need for professional tests of
anti-virus products. There are several reasons for that. The main one is
that the anti-virus products are not something that the end user is able
to evaluate him/herself. When the user buys a word processor, s/he can
easily see whether it works according to the expectations and whether it
performs the job it is supposed to perform. Not so with the anti-virus
products. An anti-virus product may be installed and started every day,
but its real anti-virus part enters into action (and shows whether it
is any good) only during a real virus attack; then, however, its proper
work may significantly influence the user's productivity for some time.
Fortunately, regardless of all the media hype, users experience computer
viruses in relatively rare cases. A user could use an anti-virus product
a whole year, if not more, without needing its anti-virus capabilities
to stop a virus attack.

Another reason is that an anti-virus product is extremely difficult to
test. In order to test a word processor, one only needs the manual and
some (potentially big) text files. In order to test an anti-virus
product, one needs a lot of things. First of all, the tester of such a
product must have a deep and intimate knowledge of how computer viruses
work, what their methods of attack are, and what the methods are to
thwart those attacks. The tester must know the principles on which the
anti-virus products work. And last, but not least: the tester must have
access to a fairly rich and well-organized virus collection. The ideal
person who has all of the above is the anti-virus researcher.

Unfortunately, the anti-virus researchers are hard to come by. Most of
them are busy developing and selling their own products. As such, they
cannot test other people's anti-virus products - because the results
will always be biased towards their own. Therefore, one needs an independent
anti-virus researcher, in order to test an anti-virus product properly.
The number of independent anti-virus researchers in the world can
probably be counted on the fingers of one hand.

Yet another problem is obtaining the necessary resources for a good
anti-virus product test. As mentioned, those tests are very difficult to
perform. They require a lot of disk space, a variety of hardware, a lot
of man-hours to complete. The main question is - how to get the money to
fund all this?

One solution is to have the anti-virus companies pay for the tests.
After all, results are usually very useful to them (in the form of bug
reports), and sometimes can be used for advertising. This approach is
followed by the UK AntiVirus working group, which developed a draft of
"AntiViral Functionalities" to be certified within the European ITSEC
scheme (or possibly within the scope of the US/European "Common
Criteria"). Within such a scheme, an AV producer can apply for an AV
certificate which is given after due analysis including proper tests.
(Unfortunately, activities in this direction seem to have been stopped
meanwhile).

Another solution is to have the users of the test results pay for the
tests - regardless of whether they are an anti-virus company that just
wants to see how well their product performs compared to others, or
end users trying to select "the best" anti-virus product. The main
problem with this solution is that, in order to obtain some sellable
results, one needs money in advance - to do all the tests.

One possible basis for independent testing could be a university
institute which specializes in computer and network security. Students
may be interested in studying methods of, and counter-measures against,
self-replicating code. Within the 4-semester courses on IT/Network
Security at the Faculty for Informatics, University of Hamburg, several
students have specialised (including examination work) on virus
detection. 

For the test published here, facilities of the Virus Test Center at
University of Hamburg were available. Though seven students and one
professor worked on preparation and tests for more than 7 months, much
more wo/man power, time, and computer equipment would have been helpful. We are
aware that our test results are limited and need improvement in several
directions (more platforms, more methods including on-access scanning,
cleaning infected macro objects, etc). Moreover, our results are limited
in time as both the viral databases grow and new scanner engines become
available.

Nevertheless, we have decided to distribute these results to the
interested public, for free. Of course, if you like them and are in a
position to donate money or hardware to VTC-Hamburg, we would highly
appreciate it.

One last problem concerns anti-virus products, especially those of the
scanner type: they are modified very often. This means that their
production cycle is forced to be shorter than for other kinds of
software products. Usually, the part that comes up short is quality
control. If it is too difficult for the end user to assess the quality
of the product, it is often too tempting to put more effort into making
the product look pretty, instead of making it a strong anti-virus
tool. Therefore, it is urgent that professional tests of anti-virus
products are performed, and the results published, so that the general
public can see what they are really paying for.

Unfortunately, even for the competent anti-virus researcher, performing
a professional test of an anti-virus product is often too difficult, a
nearly impossible task. Such products often consist of several parts -
scanners, monitoring programs, integrity checkers. The latter two kinds
of programs must be tested for how well they perform against each of
the known attacks on that particular kind of anti-virus defense. Just
implementing those attacks is a difficult and tedious job. But even such
products rely to some degree on proper detection of viruses by their
scanners.

Usually the part of the product that is the easiest to test is the
scanner. Even that should be done by a professional anti-virus
researcher, instead of the usual magazine reviewer, because there are a
lot of pitfalls to watch for. A full description of how a professional
test of an anti-virus product is performed is outside the scope of this
document; it is described in other papers.

Nevertheless, the urgent need for good tests of anti-virus products
prompted us to use our knowledge and technical facilities to test some
of the popular products on the market. This document contains the
results of those tests. Our intention is to continue repeating it
periodically, as new anti-virus products, or new versions of the old
anti-virus products appear, and as long as we are able to update our
virus and malware testbeds suitably.

Please note that the quality of our tests is far from perfect; read the
file 9EPILOG.TXT for some points on what is missing from our tests.
Nevertheless, we feel that the results that we can provide are of
better quality than many so-called reviews of anti-virus products that
we have seen so far. We are concentrating our efforts on the anti-virus
side of the problem and leave the evaluation of the pretty user
interfaces and the structure of the manuals to the magazine reviewers.

We hope that our results may help the end user to select a better
product to protect him/her from computer viruses. Whether we have
succeeded in achieving our goal, only the users themselves can tell.

------------------ History of VTC (PC-related) AV tests: -----------

Before July 1994:  Several reports of boot/file virus detection
                   published by Vesselin V. Bontchev during his
                   time as PhD student at VTC 

July 19, 1994:     Last official boot/file virus detection test 
                   released by Vesselin Bontchev (see /1994-07).

------------------ Test "1997-02" -----------------------------------

May, 1996:         Foundation of new AV Product file/boot virus 
                   test group; establishment of Macro Virus Database;
                   preparation of test equipment (NT server/clients), 
                   preparing test procedures

November 30, 1996: Standard Virus Databases frozen for test; contact
                   to AV producers to get actual scanners, 
                   or download from Internet where available

December 01, 1996 - December 23, 1996: Pretest of test procedures

January 6 -
    February 14, 1997: Update of AV products, actual test runs.

February 14, 1997: First draft of report distributed to interested
                   AV experts (AV-TEST@informatik.uni-hamburg.de)
                   including members from CARO (=Computer Antivirus 
                   Research Organisation) and selected AV-producers

February 20, 1997: Final release of "1997-02" test results

------------------ Test "1997-07" -----------------------------------

April 30, 1997:    Standard Virus Databases frozen for test

June 22, 1997:     Period for submission of AV products ended

July 22, 1997:     Test Report "1997-07" released.

------------------ Test "1998-02" -----------------------------------

November 30, 1997: Virus and Malware databases frozen for tests

January 23,1998:   Period for submission of AV products ended

February-March 9:  Problems with several scanners solved

March 16,1998:     Test Report "1998-02" released.

------------------ Test "1998-10" -----------------------------------

April 30, 1998:    VTC Virus/Malware Databases frozen for test 
                   "1998-07"

June 19, 1998:     Period for submission of AV products ended.
                   Serious problems with some bugs of Windows NT (see
                   8PROBLMS.TXT): several scan runs had to be performed
                   repeatedly to assure that all files had been
                   scanned. Other reasons also contributed to delaying
                   the publication of test report "1998-07".

November 23,1998:  Test Report "1998-10" published.
                
------------------ Test "1999-03" -----------------------------------

October 30, 1998:  VTC Virus/Malware Databases (both full/zoo and 
                   ITW) frozen for test "1999-03"

January 17, 1999:  Last date to submit AV products for VTC test;
                   Platforms: DOS, Windows 98, Windows NT (4.0)
                   Testbeds: file, macro, boot and malware databases,
                   plus VKIT and 4 Polymorphic file testbeds. ITW
                   viruses packed with 4 packers, selection of non-
                   malicious (file,macro) objects to test ability
                   of AV products to avoid "false-positive diagnosis"

April 15, 1999:    Publication of Test Report "1999-03"

------------------ Test "1999-09" -----------------------------------

March 31, 1999:    VTC Virus/Malware Databases (full zoo and
                   ITW) frozen for test "1999-09"

May 16, 1999:      Latest date to submit AV products for VTC test;
                   Platforms: DOS, Windows 98, Windows NT (4.0)
                   Testbeds: file, macro, boot and malware databases,
                   plus VKIT and 6 Polymorphic file testbeds. ITW
                   viruses packed with 4 packers, selection of non-
                   malicious (file,macro) objects to test ability
                   of AV products to avoid "false-positive diagnosis"

September 30,1999: Publication of Test Report "1999-09"

------------------ Test "2000-02" and Test "2000-04" ----------------

October 30,1999:   VTC Virus/Malware Databases frozen for test 
                   "2000-04"; apart from updated infective
                   (viral and wormy) and non-infective (trojanic)
                   malware objects, this test will include both
                   on-demand AV products (as before) and on-access
                   detection, as well as more packing methods.
December 1, 1999:  Period of submission for this test ends

February 26, 2000: Macro virus/malware results published: "2000-02"
May 26, 2000:      Test report "2000-04" published (full report)

------------------- Test Plan: "2000-08" ---------------------------
April 20, 2000:    Decision to review and automate VTC test methods;
                   decision that next test will concentrate on
                   macro viruses/malware (VBA/VBA5,VBS) including
                   repair test (ART) of ITW macro viruses

May 10, 2000:      VTC VBA/VBS testbeds frozen for macro virus/
                   malware detection and repair test

June 01, 2000:     Products to be submitted for test "2000-07"

September 24,2000: Millennium Test report "2000-08" published 

---------------------------------------------------------------------

June 01, 2000:     Decision to test the ability of AV products
                   to properly repair (clean) macro viruses from
                   an infected Excel (97) and Word (97) document;
                   this test was performed within a diploma thesis
                   of two students: Martin Retsch and Stefan Tode

November 19, 2000: AntiVirus Repair Test (ART) published

----------------------------------------------------------------------

October 30,2000:   VTC Virus/Malware Databases frozen for test 
                   "2001-04"; test addresses products for 5 platforms
                   (DOS, Win-NT, Win-98, Win-2000, and Linux (SuSE))

December 11, 2000: Period of submission for this test ends

May 31, 2001:      Test report "2001-04" published (full report)

--------------------------------------------------------------------------

May 1, 2001:      First "Heureka" test: detection of macro/script viruses
                                        and malware for 2 testbeds AFTER
                                        product/signature date 
                  Testbed #1: viruses/malware 
                              found between Nov.1, 2000 and Jan.31, 2001
                  Testbed #2: viruses/malware
                              found between Feb.1 and April 30, 2001

July 17, 2001:    Test report "Heureka 2001" published


--------------------------------------------------------------------------

April 30, 2001:    VTC Virus/Malware databases frozen for test
                   "2001-10"; test addresses detection of macro and script
                   viruses&malware on 5 platforms (DOS,WNT,W2k,W98,Linux)

June 25, 2001:     Product submission period ends

November 30, 2001: Test report "2001-10" published

--------------------------------------------------------------------------

January 6,2002:    "Heureka Test 2002-02" - the 2nd "pro-active test":
                    Scanners submitted for VTC test 2001-10 were tested
                    against 2 incremental testbeds for macro and script 
                    viruses detected 
                        testbed#1: between May 1 and July 31, 2001
                        testbed#2: between August 1 and October 31, 2001
February 28, 2002:  test period ends
March 31, 2002:     test report ready
April 4, 2002:      corrected test report available from VTC website       

---------------------------------------------------------------------------

Next test plans:      Spring 2002 File/Boot/Macro/Script/Exotic virus test 
----------------      "2002-04": testbeds are frozen on October 31, 2001
                      Products to be submitted December 17, 2001.
                      Publication of test report planned: April 2002
--------------------------------------------------------------------------
