============================================================
                    File 0XECSUM.TXT:
                   "EXECUTIVE SUMMARY" 
    Heureka-2 AntiVirus/AntiMalware Product Test "2002-02"
     antiVirus Test Center (VTC), University of Hamburg 
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

     ***************************************************************
                       Content of this file:
     ***************************************************************
     1. Foreword of the Editor
        .1 Reasons for Heureka tests
        .2 From Heureka-1 to Heureka-2 test
     2. Testbeds used in VTC Heureka-2 test "2002-02"
     3. Products participating in Heureka-2 Test "2002-02"
        Table ES2: List of AV products in VTC Heureka-2 test
 
     4. Results of on-demand persistent detection under Windows-NT
        4.1.1 Development of Zoo Macro virus detection rates
              Table "Heureka-2.MZ"
        4.1.2 Discussion of essential MZ results
        4.2.1 Development of ITW Macro virus detection rates
              Table "Heureka-2.MI"
        4.2.2 Discussion of essential MI results
        4.3.1 Development of Macro Malware detection rates
              Table "Heureka-2.MM"
        4.3.2 Discussion of essential MM results
        4.4.1 Development of Zoo Script virus detection rates
              Table "Heureka-2.SZ"
        4.4.2 Discussion of essential SZ results
        4.5.1 Development of ITW Script virus detection rates
              Table "Heureka-2.SI"
        4.5.2 Discussion of essential SI results
        4.6.1 Development of Script Malware detection rates
              Table "Heureka-2.SM"
        4.6.2 Discussion of essential SM results

        4.7 Comparison of Heureka-1 and Heureka-2 results

        4.8 Grading WNT products according to "Heureka-2" results

     5. Availability of full test results
     6. Copyright, License, and Disclaimer
    *****************************************************************



1. Foreword of the Editor:
==========================

1.1 Reasons for Heureka tests
-----------------------------
When VTC test "2001-04" was discussed, the question was raised
whether the tested products would also be able to detect viruses
found significantly after their engine and signature date.

Methodologically, AntiVirus products are able to detect "new"
viruses when these resemble, to a certain degree, an already
known virus family: in this case, variants of a given family are
usually detected with "generic" methods. An AV product may also
analyse, usually in a specific - often called "heuristic" - mode,
whether there are symptoms which may relate to some viral
mechanism. In the first case, AV products detect such viruses in
their "normal" modes, esp. including those which give "optimum
detection" (as usually addressed in VTC tests). But in the latter
case, a special switch must be set to enable the heuristic mode.

In order to start from a known basis of detection from which we may
compare developments over two consecutive 3-month periods, we
decided to use the same products and options as in the last VTC
test. Consequently, though this test addresses detection of viruses
hitherto unknown to any AV product, it is NOT a "heuristic test"
(where we would have to use options adapted to heuristic search).
We therefore named it a "Heureka" test (from Greek: "I have found").


1.2 From Heureka-1 to Heureka-2 test:
-------------------------------------
The following restrictions apply (partially due to the fact that
Heureka tests overlap with ongoing "normal" tests):
        - Heureka (1,2) tests address only products under W-NT,
        - testbeds include macro and script viruses, both in zoo
          and In-The-Wild, plus related non-viral malware.

Testbeds for Heureka-1 test:
      - the reference testbed was frozen on October 31, 2000;
        testbed ".011" contained all zoo macro and script viruses
        which were first reported between November 1, 2000 and
        January 31, 2001
      - testbed ".014" contained all zoo macro and script viruses
        first reported between February 1 and April 30, 2001.
      - for ITW macro viruses, those viruses were selected
        from the related Wildlists (January and April) which were
        then first reported to be In-The-Wild. Evidently, this
        included viruses which were probably in the zoo used
        in test "2001-10".

Testbeds for Heureka-2 test:
      - the reference testbed was frozen on April 30, 2001;
        testbed ".017" contained all zoo macro and script viruses
	and related malware which were first reported between 
        May 1 and June 31, 2001
      - testbed ".01A" contained all zoo macro and script viruses
	and related malware which were first reported between 
        July 1 and October 31, 2001.
      - for ITW macro viruses, those viruses were selected
        from the related Wildlists (July and October 2001) which 
	were then first reported to be In-The-Wild. Evidently, these
        testbeds included viruses which were probably in the zoo 
	used in test "2001-04".

After establishment of the testbeds, the test crew worked very
reliably during the semester holidays (when most students have to
work for their living, as we do not collect money from AV producers
whose products participate in VTC tests, from which we could pay
students for their valuable work and time) and produced results
within 4 weeks. Most work in VTC tests rests on the shoulders of
our test crew, and the editors wish to thank them all for their
devotion and hard work. Some delay was caused by several duties of
the main report author, which regrettably delayed the publication
of this report (intended for publication in February 2002).

     
2. Testbeds used in Heureka-2 test "2002-02":
=============================================
The sizes of the different VTC testbeds are given in the following table
(for detailed indices of VTC testbeds, see file "a3testbed.txt") 
								
 Table ES1: Content of VTC test Heureka-2 databases:
 ============================================================================
 Zoo objects.017: 357 newly reported Macro  viruses in 1259 infected  objects
                  164 newly reported Script viruses in  382 infected  objects
                   22 newly reported Macro  malware in   30 different objects
                   37 newly reported Script malware in   73 different objects
 ITW objects.017:  10 newly reported Macro  viruses in   37 infected  objects
                   22 newly reported Script viruses in   67 infected  objects
 ----------------------------------------------------------------------------
 Zoo objects.01A: 176 newly reported Macro  viruses in  678 infected  objects
                  102 newly reported Script viruses in  184 infected  objects
                    7 newly reported Macro  malware in    7 different objects
                   23 newly reported Script malware in   49 different objects
 ITW objects.01A:   7 newly reported Macro  viruses in   14 infected  objects
                    6 newly reported Script viruses in    6 different objects
 ============================================================================

 For comparison, the "full" reference testbeds in VTC test "2001-10" are:
 ------------------------------------------------------------------------------
 Zoo objects.014: 6762 newly reported Macro  viruses in 21677 infected  objects
                   588 newly reported Script viruses in  1079 infected  objects
                   426 newly reported Macro  malware in   683 different objects
                    73 newly reported Script malware in   167 different objects
 ITW objects.014:  143 newly reported Macro  viruses in  1308 infected  objects
                    22 newly reported Script viruses in    30 different objects
 ------------------------------------------------------------------------------
           

3. Products participating in Heureka Test "2002-02":
====================================================
As general reference, compare results (W-NT section) of VTC test "2001-04".
For test "2002-02", the following *** 21 *** AntiVirus products (addressed
in subsequent tables by 3-letter codes, see A5CodNam for product naming)
under Windows-NT (NT 4.0) were tested:

    Table ES2: List of AV products in Heureka-2 test
    ================================================
         --------------------------------------------------------
           Products submitted for aVTC test under Windows-NT:
         --------------------------------------------------------
           ANT     v(def): 6.8.0.2          sig: June 05,2001
           AVA     v(def): unknown          sig: unknown
           AVG     v(def): 6.0.263          sig: June 22,2001
           AVK     v(def): 10.0.167         sig: June 21,2001
           AVP     v(def): 3.5.133.0        sig: June 01,2001
           AVX     v(def): 6.1              sig: June 18,2001
           CMD     v(def): 4.61.5           sig: June 25,2001
           DRW     v(def): 4.25             sig: June 20,2001
           FPR     v(def): 3.09d            sig: June 25,2001
           FPW     v(def): 3.09d            sig: June 25,2001
           FSE     v(def): 1.00.1251        sig: June 20,2001
                   scan eng fprot:  3.09.507
                   scan eng avp:    3.55.3210
                   scan eng orion:  1.02.15
           IKA     v(def): 5.01             sig: June 25,2001
           INO     v(def): 6.0.85           sig: June 14,2001
           MR2     v(def): 1.17             sig: June 25,2001
           NVC     v(def): 5.00.25          sig: June 19,2001
           PAV     v(def): 3.5.133.0        sig: June 23,2001
           QHL     v(def): 6.02             sig: June 28,2001
           RAD     v(def): 8.1.001          sig: June 25,2001
           RAV     v(def): 8.2.001,
                   scan eng: 8.3            sig: June 25,2001
           SCN     v(def): 4144,
                   scan eng: 4.1.40         sig: June 20,2001
           VSP     v(def): 12.22.1          sig: June 25,2001
         --------------------------------------------------------
         One product (NAV) was withdrawn from this test
         due to "new engines".

Detailed results including precision and reliability of virus and
malware identification are presented in 6gwnt.txt, and an analysis
(evaluation) of results is presented in 7evalwnt.txt.


4. Results of on-demand persistent detection under Windows-NT:
==============================================================
In the following section, the results are analysed in some detail.
Many more details, including detection of infected objects as well
as precision and reliability of detection, can be found in 6gwnt.txt;
a further analysis (evaluation) of results is given in 7evalwnt.txt.


4.1.1 Development of Zoo macro virus detection rates:
=====================================================
Table Heureka-2.MZ summarizes Zoo macro virus detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | Loss in |   New viruses | Loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed    6762 100.0%  |    357 100.0% |         |    176 100.0% |
------------------------+---------------+---------+---------------+----------
ANT        6566  97.1%  |    221  61.9% |  -35.2% |     77  43.8% |  -53.3%
AVA        6604  97.7%  |    254  71.1% |  -26.6% |     97  55.1% |  -42.6%
AVG        6651  98.4%  |    318  89.1% |   -9.3% |    117  66.5% |  -31.9%
AVK        6762 100.0%  |    288  80.7% |  -19.3% |     69  39.2% |  -60.8%
AVP        6761 100.0%  |    292  81.8% |  -18.2% |     70  39.8% |  -60.2%
AVX        6703  99.1%  |    343  96.1% |   -3.0% |    166  94.3% |   -4.8%
CMD        6760 100.0%  |    324  90.8% |   -9.2% |    128  72.7% |  -27.3%
DRW        6725  99.5%  |    344  96.4% |   -3.1% |    169  96.0% |   -3.5%
FPR        6760 100.0%  |    322  90.2% |   -9.8% |    127  72.2% |  -27.8%
FPW        6760 100.0%  |    322  90.2% |   -9.8% |    127  72.2% |  -27.8%
FSE        6762 100.0%  |    341  95.5% |   -4.5% |    151  85.8% |  -14.2%
IKA        6451  95.4%  |    290  81.2% |  -14.2% |    107  60.8% |  -34.6%
INO        6755  99.9%  |    339  95.0% |   -4.9% |    167  94.9% |   -5.0%
MR2          44   0.7%  |      6   1.7% |    1.0% |      5   2.8% |    2.1%
NVC        6751  99.8%  |    223  62.5% |  -37.3% |     50  28.4% |  -71.4%
PAV        6762 100.0%  |    292  81.8% |  -18.2% |     70  39.8% |  -60.2%
QHL           0   0.0%  |      0   0.0% |    0.0% |      0   0.0% |    0.0%
RAV        6726  99.5%  |    330  92.4% |   -7.1% |    134  76.1% |  -23.4%
SCN        6762 100.0%  |    349  97.8% |   -2.2% |    167  94.9% |   -5.1%
VSP           1   0.0%  |      0   0.0% |    0.0% |      1   0.6% |    0.6%
------------------------+---------------+---------+---------------+----------
Mean ALL:        85.7%  |         72.8% |  -11.5% |         56.8% |  -27.6%
Mean rel:        99.2%  |         85.6% |  -10.5% |         66.6% |  -25.1%
------------------------+---------------+---------+---------------+----------
Remarks: "Mean ALL" is the mean value of virus and file identification
            calculated over ALL related entries.
         "Mean rel" is the relative mean value of virus and file
            identification calculated only for those entries
            with a minimum detection rate "minrate", where
                   minrate = 65% for zoo viruses,
                   minrate = 95% for ITW viruses, and
                   minrate = 60% for malware.

         Definition of "loss vector":
            Loss in 3 months = Loss vector #1
                             = detection rate in months 1-3
                               minus detection rate in reference test
            Loss in 6 months = Loss vector #2
                             = detection rate in months 4-6
                               minus detection rate in reference test
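The rate, loss-vector and mean definitions above can be sketched as a
small calculation. The following Python fragment is purely
illustrative (it is NOT part of the VTC test tools); the example
figures are taken from the SCN row of table "Heureka-2.MZ":

```python
# Illustrative sketch (not a VTC tool): detection rates, loss
# vectors, and the two mean values defined in the remarks above.

def rate(detected, testbed_size):
    """Detection rate in percent, e.g. 349 of 357 objects."""
    return 100.0 * detected / testbed_size

def loss_vector(ref_rate, period_rate):
    """Loss = detection rate in the period minus reference rate."""
    return period_rate - ref_rate

def mean_all(rates):
    """'Mean ALL': plain mean over ALL tested products."""
    return sum(rates) / len(rates)

def mean_rel(rates, minrate):
    """'Mean rel': mean only over products reaching 'minrate'
    (65% zoo viruses, 95% ITW viruses, 60% malware)."""
    qualified = [r for r in rates if r >= minrate]
    return sum(qualified) / len(qualified)

# SCN, zoo macro viruses: 6762/6762 in reference, 349/357 in months 1-3
scn_ref  = rate(6762, 6762)            # 100.0%
scn_3mon = rate(349, 357)              # ~97.8%
print(round(loss_vector(scn_ref, scn_3mon), 1))   # -2.2
```

Rounding each rate to one decimal before subtracting, as the printed
tables do, can shift a loss vector by 0.1%, which explains small
apparent inconsistencies between columns.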

4.1.2 Discussion of essential results:
--------------------------------------
	  (1) For zoo macro viruses, best products are able to 
              detect more than 90% of those viruses reported within 
              first 3-month period and more than 80% within second 
              3-month period after product/signature delivery:

                        SCN   (100.0%  97.8%  94.9%)
                        FSE   (100.0%  95.5%  85.8%)
                        INO   ( 99.9%  95.0%  94.9%)
                        DRW   ( 99.5%  96.4%  96.0%)
                        AVX   ( 99.1%  96.1%  94.3%)

	  (2) During the first 3 months, mean loss in detection
              ability is 11.5% (overall), and it is slightly
              better (10.5%) when products with extremely low 
	      detection rates are not counted. The following
              products behave best in first 3-month period:

			SCN        (97.8%)
                        DRW        (96.4%)
                        AVX        (96.1%)
                        FSE        (95.5%)
                        INO        (95.0%)

          (3) In months 4-6, the loss in detection quality grows
              quickly, with a mean loss of 27.6% (overall) and
              25.1% when products with extremely low detection
              rates are not counted.


     *************************************************************
     Result "Heureka-2.MZ":  concerning new zoo macro viruses,
                             the following 4 products miss 
                             less than 10% over 6 months:
              -----------------------------------------------------
		DRW     after 3 months:  -  3.1% 	
                        after 6 months:  -  3.5%
              -----------------------------------------------------
		SCN     after 3 months:  -  2.2% 	
                        after 6 months:  -  5.1%
              -----------------------------------------------------
		INO     after 3 months:  -  4.9% 	
                        after 6 months:  -  5.0%
              -----------------------------------------------------
		AVX     after 3 months:  -  3.0% 	
                        after 6 months:  -  4.8%
     **************************************************************
             And the following product misses less than 20% 
             over 6 months:
		FSE     after 3 months:  - 4.5% 	
                        after 6 months:  -14.2%
     **************************************************************



4.2.1 Development of ITW macro virus detection rates:
=====================================================

Table "Heureka-2.MI" summarizes In-The-Wild macro virus detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | loss in |   New viruses | loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed     143 100.0%  |     17 100.0% |         |      7 100.0% |
------------------------+---------------+---------+---------------+----------
ANT         142  99.3%  |     14  82.4% |  -16.9% |      5  71.4% |  -27.9%
AVA         143 100.0%  |     16  94.1% |   -5.9% |      5  71.4% |  -28.6%
AVG         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
AVK         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
AVP         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
AVX         143 100.0%  |     16  94.1% |   -5.9% |      6  85.7% |  -14.3%
CMD         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
DRW         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
FPR         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
FPW         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
FSE         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
IKA         142  99.3%  |     17 100.0% |    0.7% |      7 100.0% |    0.7%
INO         143 100.0%  |     16  94.1% |   -5.9% |      7 100.0% |    0.0%
MR2          13   9.1%  |      0   0.0% |   -9.1% |      0   0.0% |   -9.1%
NVC         143 100.0%  |     17 100.0% |    0.0% |      6  85.7% |  -14.3%
PAV         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
QHL           0   0.0%  |      0   0.0% |    0.0% |      0   0.0% |    0.0%
RAV         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
SCN         143 100.0%  |     17 100.0% |    0.0% |      7 100.0% |    0.0%
VSP           0   0.0%  |      0   0.0% |    0.0% |      0   0.0% |    0.0%
------------------------+---------------+---------+---------------+----------
Mean ALL:        86.7%  |         86.7% |   -2.2% |         80.7% |   -4.7%
Mean rel:        99.9%  |         99.9% |   -2.4% |         95.0% |   -5.2%
------------------------+---------------+---------+---------------+----------
Remark: concerning calculation of mean values, see the remarks under
        the 1st table ("Heureka-2.MZ").


4.2.2 Discussion of essential results:
--------------------------------------
          (0) Due to the small number of ITW Macro viruses detected in
              each 3-month period, we just discuss findings but do not
              grade products based on such potentially insignificant
              figures.

          (1) For macro ITW viruses, the majority of products detect
              all ITW viruses even after 6 months. The following
              products consistently detect ALL macro ITW viruses in
              the reference test as well as after 3 and 6 months,
              ALL with perfect detection vectors (100% 100% 100%):

                       AVG, AVK, AVP, CMD, DRW, FPR, FPW,
                           FSE, INO, PAV, RAV, SCN.

	   (2) In comparison with Heureka-1 test, where 6 products
               detected ALL ITW macro viruses, the situation has
               improved significantly.             


     *************************************************************
     Result "Heureka-2.MI":  concerning new Macro ITW viruses,
                             the following 12 products miss
                             NO ITW virus during 6 months:
                             AVG, AVK, AVP, CMD, DRW, FPR, FPW,
                             FSE, INO, PAV, RAV, SCN.
     **************************************************************


4.3.1 Development of macro malware detection rates:
==================================================

Table "Heureka-2.MM" summarizes macro malware detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | loss in |   New viruses | loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed     426 100.0%  |     22 100.0% |         |      7 100.0% |
------------------------+---------------+---------+---------------+----------
ANT         378  88.7%  |     10  45.5% |  -43.2% |      4  57.1% |  -31.6%
AVA         377  88.5%  |     11  50.0% |  -38.5% |      3  42.9% |  -45.6%
AVG         352  82.6%  |     14  63.6% |  -19.0% |      1  14.3% |  -68.3%
AVK         425  99.8%  |     12  54.5% |  -45.3% |      3  42.9% |  -56.9%
AVP         425  99.8%  |     14  63.6% |  -36.2% |      3  42.9% |  -56.9%
AVX         392  92.0%  |     18  81.8% |  -10.2% |      7 100.0% |    8.0%
CMD         424  99.5%  |     15  68.2% |  -31.3% |      4  57.1% |  -42.4%
DRW         387  90.8%  |     18  81.8% |   -9.0% |      7 100.0% |    9.2%
FPR         424  99.5%  |     15  68.2% |  -31.3% |      4  57.1% |  -42.4%
FPW         424  99.5%  |     15  68.2% |  -31.3% |      4  57.1% |  -42.4%
FSE         425  99.8%  |     17  77.3% |  -22.5% |      4  57.1% |  -42.7%
IKA         383  89.9%  |     16  72.7% |  -17.2% |      5  71.4% |  -18.5%
INO         398  93.4%  |     15  68.2% |  -25.2% |      5  71.4% |  -22.0%
MR2         135  31.7%  |      0   0.0% |  -31.7% |      2  28.6% |   -3.1%
NVC         421  98.8%  |     12  54.5% |  -44.3% |      3  42.9% |  -55.9%
PAV         426 100.0%  |     14  63.6% |  -36.4% |      3  42.9% |  -57.1%
QHL           0   0.0%  |      0   0.0% |    0.0% |      0   0.0% |    0.0%
RAV         416  97.7%  |     17  77.3% |  -20.4% |      5  71.4% |  -26.3%
SCN         426 100.0%  |     17  77.3% |  -22.7% |      3  42.9% |  -57.1%
VSP           1   0.2%  |      0   0.0% |   -0.2% |      0   0.0% |   -0.2%
------------------------+---------------+---------+---------------+----------
Mean ALL:        83.4%  |         56.8% |  -25.8% |         50.0% |  -32.6%
Mean rel:        95.0%  |         66.8% |  -28.7% |         55.6% |  -36.2%
------------------------+---------------+---------+---------------+----------
Remark: concerning calculation of mean values, see the remarks under
        the 1st table ("Heureka-2.MZ").


4.3.2 Discussion of essential results:
--------------------------------------
          (0) Due to the small number of Macro Malware detected in each
              3-month period, we just discuss findings but do not grade
              products based on such potentially insignificant figures.

          (1) For non-replicating Macro Malware, detection quality is -
              in the mean - significantly less developed than the
              detection of replicating malware (i.e. viruses & worms).
              The mean malware detection rate of tested products
              (except those with extremely insufficient detection
              rates) degrades from 95.0% (in the reference test) to
              66.8% (after 3 months) and further down to 55.6% (after
              6 months).

	  (2) Some products even improve their detection rates, as the
 	      following detection vectors indicate:

			DRW ( 90.8%   81.8%  100.0%)
			AVX ( 92.0%   81.8%  100.0%)

              This may indicate that the heuristic mechanisms of these
              products are very well developed; but with the relatively
              small set of samples (7 for months 4-6), it canNOT be
              determined whether this result is an artefact of the
              statistical evaluation.

          (3) The following products lose less than 20% detection rate
              over each 3-month period, but they do not start with
              optimum detection rates in the reference test:

			DRW ( 90.8%   81.8%  100.0%)
			AVX ( 92.0%   81.8%  100.0%)
			IKA ( 89.9%   72.7%   71.4%)

	  (4) Those products which detected almost all malware samples
              with "fresh" signatures in the reference test (esp. PAV
              and SCN) lost significantly more detection rate compared 
              to mean loss. This may indicate that these products apply
              mechanisms of exact identification instead of heuristics.

          (5) In comparison with Heureka-1 test results, those products
              then scoring best (FSE, SCN: loss after 6 months: -31.7%)
              now show a much larger loss in detection rate (-42.7%,
              -57.1%).
             

     *******************************************************************
     Result "Heureka-2.MM": The persistency of non-replicative malware
                         detection needs significant improvement. Only
                         3 products lose less than 40% detection
                         quality over six months, but all three products
                         have less than optimum detection rates in the
                         reference test.
     *******************************************************************


4.4.1 Development of zoo Script virus detection rates:
======================================================

Table "Heureka-2.SZ" summarizes Zoo script detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | loss in |   New viruses | loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed     588 100.0%  |    164 100.0% |         |    102 100.0% |
------------------------+---------------+---------+---------------+----------
ANT         481  81.8%  |     42  25.6% |  -56.2% |     12  11.8% |  -70.0%
AVA         174  29.6%  |     32  19.5% |  -10.1% |     11  10.8% |  -18.8%
AVG         370  62.9%  |     85  51.8% |  -11.1% |     40  39.2% |  -23.7%
AVK         588 100.0%  |    126  76.8% |  -23.2% |     52  51.0% |  -49.0%
AVP         588 100.0%  |    126  76.8% |  -23.2% |     49  48.0% |  -52.0%
AVX         412  70.1%  |     89  54.3% |  -15.8% |     31  30.4% |  -39.7%
CMD         552  93.9%  |    104  63.4% |  -30.5% |     46  45.1% |  -48.8%
DRW         561  95.4%  |    136  82.9% |  -12.5% |     72  70.6% |  -24.8%
FPR         558  94.9%  |    104  63.4% |  -31.5% |     46  45.1% |  -49.8%
FPW         556  94.6%  |    104  63.4% |  -31.2% |     46  45.1% |  -49.5%
FSE         588 100.0%  |    141  86.0% |  -14.0% |     71  69.6% |  -30.4%
IKA         457  77.7%  |    104  63.4% |  -14.3% |     49  48.0% |  -29.7%
INO         559  95.1%  |     78  47.6% |  -47.5% |     34  33.3% |  -61.8%
MR2         490  83.3%  |     93  56.7% |  -26.6% |     45  44.1% |  -39.2%
NVC         537  91.3%  |     74  45.1% |  -46.2% |     27  26.5% |  -64.8%
PAV         588 100.0%  |    126  76.8% |  -23.2% |     47  46.1% |  -53.9%
QHL           1   0.2%  |      1   0.6% |    0.4% |      1   1.0% |    0.8%
RAV         485  82.5%  |      0   0.0% |  -82.5% |      0   0.0% |  -82.5%
SCN         587  99.8%  |    134  81.7% |  -18.1% |     58  56.9% |  -42.9%
VSP         494  84.0%  |     93  56.7% |  -27.3% |     45  44.1% |  -39.9%
------------------------+---------------+---------+---------------+----------
Mean ALL:        78.7%  |         54.6% |  -27.2% |         38.3% |  -43.6%
Mean rel:        86.6%  |         60.7% |  -28.6% |         42.5% |  -45.8%
------------------------+---------------+---------+---------------+----------
Remark: concerning calculation of mean values, see the remarks under
        the 1st table ("Heureka-2.MZ").


4.4.2 Discussion of essential results:
--------------------------------------
          (1) Heuristic detection of script viruses is significantly
              less developed than detection of macro viruses, as a
              comparison of mean losses in detection rates (without
              those products with inadequate detection rates) shows:

                                              reference   after     after
                                                 test   3 months  6 months
              detection rate of macro viruses   99.2%     85.6%     66.6%
              detection rate of script viruses  86.6%     60.7%     42.5%

              For zoo script viruses, best products are able to detect
              more than 80% of those zoo viruses reported within
              3 months after product delivery and more than 60% of
              those viruses reported after 6 months:

                        FSE   (100.0%  86.0%  69.6%)
                        DRW   ( 95.4%  82.9%  70.6%)

	      In addition, the following products (which detected at least 
              90% in the reference test) lost less than 20% in the first 
              3-month period but lost more than 40% in the second period:

			SCN   ( 99.8%  81.7%  56.9%)

          (2) During the first 3 months, the mean loss in detection
              ability is 28.6%. In months 4-6, the loss in detection
              quality grows quickly, with a mean loss of 45.8%.
           
          (3) In order to classify product behaviour, we grade 
              products according to loss in detection quality. 
              When considering only products with losses up to 
              40% after 6 months (ordered according to highest 
              detection rates after 6 months), the following 
              products behaved best in "Heureka-2" test:

                        ------------------------------------
                        detection rate   loss in    loss in
           AV product    in ref-test    month 1-3  month 4-6
              --------+-------------------------------------
              DRW          95.4%        -12.5%      -24.8%
              FSE         100.0%        -14.0%      -30.4%
                        ------------------------------------
              SCN          99.8%        -18.1%     (-42.9%)
                        ------------------------------------


     ********************************************************************
     Result "Heureka-2.SZ": Zoo script virus detection is significantly
                            less well developed compared with macro
                            virus detection; losses in detection rates
                            are roughly twice as high as with macro
                            viruses.

                            The following 2 products miss less than 15%
                            after 3 months and about 30% after 6 months:

                                DRW     after 3 months:  - 12.5%
                                        after 6 months:  - 24.8%
                                --------------------------------
                                FSE     after 3 months:  - 14.0%
                                        after 6 months:  - 30.4%

                            For AV companies, there is a strong need to
                            improve persistent detection methods, esp.
                            as this category addresses many mass-
                            emailing viruses!

                            For customers, the strong advice is to
                            update AV products much more often for
                            script virus detection than for macro
                            virus detection.
     ********************************************************************



4.5.1 Development of ITW Script virus detection rates:
======================================================

Table "Heureka-2.SI" summarizes In-The-Wild script virus detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | loss in |   New viruses | loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed      19 100.0%  |     10 100.0% |         |      6 100.0% |
------------------------+---------------+---------+---------------+----------
ANT          19 100.0%  |     10 100.0% |    0.0% |      2  33.3% |  -66.7%
AVA          18  94.7%  |     10 100.0% |    5.3% |      2  33.3% |  -61.4%
AVG          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
AVK          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
AVP          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
AVX          19 100.0%  |     10 100.0% |    0.0% |      2  33.3% |  -66.7%
CMD          19 100.0%  |     10 100.0% |    0.0% |      5  83.3% |  -16.7%
DRW          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
FPR          19 100.0%  |     10 100.0% |    0.0% |      5  83.3% |  -16.7%
FPW          19 100.0%  |     10 100.0% |    0.0% |      5  83.3% |  -16.7%
FSE          19 100.0%  |     10 100.0% |    0.0% |      5  83.3% |  -16.7%
IKA          18  94.7%  |     10 100.0% |    5.3% |      4  66.7% |  -28.0%
INO          19 100.0%  |      7  70.0% |  -30.0% |      1  16.7% |  -83.3%
MR2          17  89.5%  |     10 100.0% |   10.5% |      1  16.7% |  -72.8%
NVC          19 100.0%  |     10 100.0% |    0.0% |      3  50.0% |  -50.0%
PAV          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
QHL           1   5.3%  |      1  10.0% |    4.7% |      1  16.7% |   11.4%
RAV          18  94.7%  |      0   0.0% |  -94.7% |      0   0.0% |  -94.7%
SCN          19 100.0%  |     10 100.0% |    0.0% |      4  66.7% |  -33.3%
VSP          17  89.5%  |     10 100.0% |   10.5% |      1  16.7% |  -72.8%
------------------------+---------------+---------+---------------+----------
Mean ALL:        89.5%  |         89.0% |   -4.4% |         50.8% |  -43.7%
Mean rel:        98.2%  |         98.3% |   -4.4% |         53.5% |  -45.4%
------------------------+---------------+---------+---------------+----------


4.5.2 Discussion of essential results:
--------------------------------------
	  (0) Due to the small number of ITW Script viruses detected in 
              each 3-month period, we just discuss findings but don't grade
              products based on such potentially insignificant figures.


	  (1) For script ITW viruses, the majority of products detect
 	      all ITW viruses after 3 months, but detection rates are
              significantly reduced after 6 months. 

     **********************************************************
     Result "Heureka-2.SI": concerning new script ITW viruses,
	                    detection rates degrade much faster
                            after 3 months than for macro ITW 
                            viruses. 
     **********************************************************
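The percentages and loss figures in the tables above follow from simple
arithmetic: the detection rate is the number of samples detected divided
by the testbed size, and the loss is the difference (in percentage
points) between a later period's rate and the reference rate. A minimal
sketch of this calculation (an assumption inferred from the published
figures, not VTC's original tooling), using the AVA row of Table
"Heureka-2.SI" as an example:

```python
# Sketch of the detection-rate and loss arithmetic behind the
# Heureka-2 tables. This is inferred from the published figures;
# it is not VTC's own evaluation code.

def rate(detected, testbed):
    """Detection rate in percent, rounded to one decimal place."""
    return round(100.0 * detected / testbed, 1)

def loss(later_rate, reference_rate):
    """Loss (negative) or gain (positive) in percentage points."""
    return round(later_rate - reference_rate, 1)

# AVA in Table "Heureka-2.SI":
# reference: 18 of 19 ITW viruses; after 6 months: 2 of 6 new viruses
ref = rate(18, 19)          # 94.7
after_6m = rate(2, 6)       # 33.3
print(loss(after_6m, ref))  # -61.4
```

The same two-step calculation reproduces the other rows, e.g. ANT's
6-month loss of -66.7% (2 of 6 detected, against a 100.0% reference).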


4.6.1 Development of Script Malware detection rates:
====================================================

Table "Heureka-2.SM" summarizes script malware detection results:
------------------------+---------------+---------+---------------+----------
             Viruses    |   New viruses | loss in |   New viruses | loss in 
Scanner      detected   |    detected   | 3 months|    detected   | 6 months
------------------------+---------------+---------+---------------+----------
Status:   April 30,2001 |   July 31,2001|         |October 31,2001|
Testbed      22 100.0%  |     37 100.0%           |     73 100.0%
------------------------+---------------+---------+---------------+----------
ANT           0   0.0%  |      7  18.9% |   18.9% |      5   6.8% |    6.8%
AVA         ---     0%  |      1   2.7% |    2.7% |      2   2.7% |    2.7%
AVG           5  22.7%  |      4  10.8% |  -11.9% |      4   5.5% |  -17.2%
AVK          22 100.0%  |     25  67.6% |  -32.4% |     20  27.4% |  -72.6%
AVP          22 100.0%  |     28  75.7% |  -24.3% |     20  27.4% |  -72.6%
AVX           2   9.1%  |     10  27.0% |   17.9% |      8  11.0% |    1.9%
CMD          14  63.6%  |      8  21.6% |  -42.0% |      4   5.5% |  -58.1%
DRW           8  36.4%  |     19  51.4% |   15.0% |     21  28.8% |   -7.6%
FPR          14  63.6%  |      8  21.6% |  -42.0% |      4   5.5% |  -58.1%
FPW          14  63.6%  |      8  21.6% |  -42.0% |      4   5.5% |  -58.1%
FSE          22 100.0%  |     32  86.5% |  -13.5% |     24  32.9% |  -67.1%
IKA           8  36.4%  |     15  40.5% |    4.1% |     11  15.1% |  -21.3%
INO          15  68.2%  |      9  24.3% |  -43.9% |     14  19.2% |  -49.0%
MR2           5  22.7%  |      4  10.8% |  -11.9% |     10  13.7% |   -9.0%
NVC           2   9.1%  |      5  13.5% |    4.4% |      4   5.5% |   -3.6%
PAV          22 100.0%  |     25  67.6% |  -32.4% |     20  27.4% |  -72.6%
QHL           1   4.5%  |      1   2.7% |   -1.8% |      1   1.4% |   -3.1%
RAV          18  81.8%  |      0   0.0% |  -81.8% |      0   0.0% |  -81.8%
SCN          22 100.0%  |     27  73.0% |  -27.0% |     21  28.8% |  -71.2%
VSP           5  22.7%  |      4  10.8% |  -11.9% |     10  13.7% |   -9.0%
------------------------+---------------+---------+---------------+----------
Mean ALL:        49.6%  |         32.4% |  -32.4% |         14.2% |  -36.0%
Mean rel:       (63.1%) |        (37.8%)| (-46.0%)|        (22.3%)| (-68.2%)
------------------------+---------------+---------+---------------+----------
Remark: concerning calculation of mean values: see 1st table "Eval WNT.MZ"


4.6.2 Discussion of essential results:
--------------------------------------
	  (0) When comparing the numbers of macro and script malware detected 
              within two consecutive 3-month periods, many more samples of
              script malware were detected in the latter.

	  (1) For non-replicative Script Malware, detection quality starts
              at a significantly lower level (63.1%), and quality degrades
              much faster than for replicative malware (i.e. viruses & worms). 
              The mean malware detection rate of tested products (except 
              those with extremely insufficient detection rates) degrades 
              from 63.1% (in the reference test) to 37.8% (after 3 months) 
              and further down to 22.3% (after 6 months). 
             
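The tables report both "Mean ALL" (the plain mean over every tested
product) and "Mean rel" (the mean with extremely insufficient products
excluded). The exact exclusion rule is not spelled out in this summary;
the following sketch therefore assumes, purely for illustration, a cut-off
of 10% detection, which happens to reproduce the 6-month figures of Table
"Heureka-2.SM":

```python
# Sketch of the "Mean ALL" vs. "Mean rel" calculation. The exclusion
# rule behind "Mean rel" is not stated in this summary; as an
# assumption, products below a 10% detection rate are treated as
# "extremely insufficient" and left out.

INSUFFICIENT_THRESHOLD = 10.0  # assumed cut-off, in percent

def mean_all(rates):
    """Plain mean over every tested product, in percent."""
    return round(sum(rates) / len(rates), 1)

def mean_rel(rates, threshold=INSUFFICIENT_THRESHOLD):
    """Mean over products at or above the assumed threshold."""
    kept = [r for r in rates if r >= threshold]
    return round(sum(kept) / len(kept), 1)

# 6-month script malware detection rates from Table "Heureka-2.SM"
rates_6m = [6.8, 2.7, 5.5, 27.4, 27.4, 11.0, 5.5, 28.8, 5.5, 5.5,
            32.9, 15.1, 19.2, 13.7, 5.5, 27.4, 1.4, 0.0, 28.8, 13.7]
print(mean_all(rates_6m))  # 14.2
print(mean_rel(rates_6m))  # 22.3
```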

     *******************************************************************
     Result "Heureka-2.SM": The persistence of non-replicative malware
                         detection needs significant improvement. 
                         Customers of AV products are advised to update
                         their products much faster for detection of
                         trojanic script malware than for macro malware.
     *******************************************************************



4.7 Comparison of Heureka-1 and Heureka-2 results:
==================================================

The following table lists essential results of Heureka tests:
                                     Mean detection rates in:
                               Heureka-1                   Heureka-2
                      ---------------------------+-----------------------------
                                after     after                after     after
                    reference  3 months  6 months  reference  3 months  6 months
-------------------------------------------------+-----------------------------
MZ=Macro zoo viruses   90.8%     73.7%     66.0%     99.2%     85.6%     66.6%
MI=Macro ITW viruses   91.8%     89.7%     83.0%     99.9%     99.9%     95.0%
MM=Macro Malware       87.1%     61.8%     56.4%     95.0%     66.8%     55.6%
--------------------------------------------------------------------------------
SZ=Script zoo viruses  83.4%     61.0%     49.3%     86.6%     60.7%     42.5%
SI=Script ITW viruses   ---       ---       ---      98.2%     98.3%     53.5%
SM=Script Malware       ---       ---       ---      63.1%     37.8%     22.3%
--------------------------------------------------------------------------------

Concerning detection rates for macro viruses, both In-The-Wild and in the
zoo, AV products have improved their detection rates significantly, both
overall and after the first 3-month period. But the loss of detection
quality in the second 3-month period is much stronger than before (the
results for ITW viruses may be influenced by the small number of newly
found viruses).

Concerning zoo script viruses, detection rates are stable on an insufficient
level.


4.8 Grading WNT products according to "Heureka-II" results:
===========================================================

In comparing products, some behave "rather well". Over two 3-month periods,
the following products behaved best (although they also need significant 
improvement):
				DRW and FSE

Moreover, the following products behave best in the first 3-month period:
			 DRW, FSE, SCN, AVX and INO.
	
But as some testbeds were rather small, and as the loss in script virus
detection quality was so dominant, we decided NOT to grade any product 
in VTC's grading scheme. We nevertheless hope that AV companies do their 
best to improve their generic and heuristic detection mechanisms. And we
strongly advise customers to update their products and signatures as 
often as possible.


5. Availability of full test results:
======================================
Much more information about this test, its methods and viral databases,
as well as detailed test results, is available for anonymous FTP
download from VTC's homepage (VTC is part of Working Group AGN):

             ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comments and critical remarks that help VTC improve its test 
methods will be warmly welcomed. 

The next comparative test will evaluate file, boot, macro (VBA/VBA5) 
and script virus as well as non-viral malware detection. This test is 
planned for February until June 2002, with viral databases frozen on 
October 31, 2001. This will be followed by another Heureka test 
(Heureka-3) which again evaluates detection rates for macro/script 
viruses/malware detected in 2 consecutive 3-month periods AFTER 
product submission.

On behalf of the VTC Test Crew: 
                            	   Dr. Klaus Brunnstein  
                                 (Hamburg: March 31, 2002)
                             

6. Copyright, License, and Disclaimer:
=======================================
This publication is (C) Copyright 2002 by Klaus Brunnstein and the 
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies of
this information in electronic form, provided that this is done for
free, that contents of the information are not changed in any way, and
that origin of this information is explicitly mentioned. It is esp.
permitted to store and distribute this set of text files at university
or other public mirror sites where security/safety related information
is stored for unrestricted public access for free. 

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University, and
this agreement must be in explicit writing, prior to any publication. 

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

                                        Prof. Dr. Klaus Brunnstein
                                      University of Hamburg, Germany
                                             (March 31, 2002)

