Contents of the SAMPLES subdirectories:

1:	Using LEXGEN with command-line options

2:	Using metacommands to specify LEXGEN options

3:	Putting multiple input file definitions in a single file

4:	Setting up a text table using the -T option

5:	Setting up a flags table using the -F option

6:	Using LUTHOR functions to tokenize an input file

7:	Using LexPushFile to process an INCLUDE file

8:	Using a second LexStream to process an INI file

9:	Using LLTs to load lex tables

10:	Using Input functions to supply lex input

11:	Using LexInputByCall to retrieve tokens

12:	Changing lex tables in midstream

13:	Recovering from a bad token using AdvanceChar

14:	Using LexInfo

15:	Using LexInputString and LexSubmitString to tokenize a submitted string
