module Genlex : BatGenlex

type token = Stdlib.Genlex.token =
  | Kwd of string
  | Ident of string
  | Int of int
  | Float of float
  | String of string
  | Char of char
The type of tokens. The lexical classes are: Int and Float
for integer and floating-point numbers; String for
string literals, enclosed in double quotes; Char for
character literals, enclosed in single quotes; Ident for
identifiers (either sequences of letters, digits, underscores
and quotes, or sequences of ``operator characters'' such as
+, *, etc); and Kwd for keywords (either identifiers or
single ``special characters'' such as (, }, etc).
val make_lexer : string list -> char Stdlib.Stream.t -> token Stdlib.Stream.t

Construct the lexer function. The first argument is the list of
keywords. An identifier s is returned as Kwd s if s
belongs to this list, and as Ident s otherwise.
A special character s is returned as Kwd s if s
belongs to this list, and causes a lexical error (exception
Parse_error) otherwise. Blanks and newlines are skipped.
Comments delimited by (* and *) are skipped as well,
and can be nested.
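
A minimal usage sketch, assuming the standard Stream module is available as on this page; the keyword list and the input string are made up for illustration only:

  (* Sketch: tokenize a small expression with a handful of keywords.
     The keyword list and input string are illustrative. *)
  let lexer = Genlex.make_lexer ["let"; "="; "+"; "("; ")"]

  let () =
    let tokens = lexer (Stream.of_string "let x = 1 + 2") in
    Stream.iter
      (function
        | Genlex.Kwd k    -> Printf.printf "Kwd %s\n" k
        | Genlex.Ident id -> Printf.printf "Ident %s\n" id
        | Genlex.Int i    -> Printf.printf "Int %d\n" i
        | Genlex.Float f  -> Printf.printf "Float %g\n" f
        | Genlex.String s -> Printf.printf "String %S\n" s
        | Genlex.Char c   -> Printf.printf "Char %C\n" c)
      tokens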
type lexer_error =
  | IllegalCharacter of char
  | NotReallyAChar
  | NotReallyAnEscape
  | EndOfStream
exception LexerError of lexer_error * int
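
A minimal sketch of reporting a LexerError; treating the integer carried by the exception as a position in the input is an assumption, and tokenize_safely is a hypothetical helper introduced only for illustration:

  (* Sketch: render a lexer_error as a message. *)
  let describe_error = function
    | Genlex.IllegalCharacter c -> Printf.sprintf "illegal character %C" c
    | Genlex.NotReallyAChar     -> "malformed character literal"
    | Genlex.NotReallyAnEscape  -> "malformed escape sequence"
    | Genlex.EndOfStream        -> "unexpected end of input"

  (* Hypothetical helper: run a lexing action, reporting any LexerError.
     The integer is assumed to be an input position. *)
  let tokenize_safely run =
    try run () with
    | Genlex.LexerError (err, pos) ->
        Printf.eprintf "lexer error at %d: %s\n" pos (describe_error err);
        exit 1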
type t
A lexer
val of_list : string list -> t

Create a lexer from a list of keywords.

val to_stream_filter : t -> char Stdlib.Stream.t -> token Stdlib.Stream.t

Apply the lexer to a stream.

val to_enum_filter : t -> char BatEnum.t -> token BatEnum.t

Apply the lexer to an enum.

val to_lazy_list_filter : t -> char BatLazyList.t -> token BatLazyList.t

Apply the lexer to a lazy list.
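
A minimal sketch combining of_list with to_enum_filter. It assumes BatString.enum and BatEnum.iter from Batteries are available, and uses string_of_token (documented just below) to print each token; the keyword list and input are illustrative only:

  (* Sketch: build a lexer from a keyword list and run it over a
     string viewed as a char enum. *)
  let lexer = Genlex.of_list ["let"; "="; "+"; "("; ")"]

  let () =
    BatString.enum "let x = 1 + 2"
    |> Genlex.to_enum_filter lexer
    |> BatEnum.iter (fun tok -> print_endline (Genlex.string_of_token tok))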
val string_of_token : token -> string

module Languages : sig .. end