- (<|>) : Tokenizer t -> Lazy (Tokenizer t) -> Tokenizer t
Alternative tokenizer rules: try the left rule first, falling back to the
second (lazily evaluated) rule if it fails.
Totality: total
Fixity Declaration: infixr operator, level 2
- data StopReason : Type
The reason why the tokenizer could not make further progress.
@ ComposeNotClosing carries the span of the composition's begin token in the
form `(startLine, startCol), (endLine, endCol)`.
Totality: total
Constructors:
- EndInput : StopReason
- NoRuleApply : StopReason
- ComposeNotClosing : (Int, Int) -> (Int, Int) -> StopReason
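As a hedged sketch, the stop reason returned by `lex` can be rendered as a
diagnostic message. `describeStop` is a hypothetical helper, not part of this
module; it only uses the constructors listed above:

```idris
-- Hypothetical helper: render a StopReason as a diagnostic string.
describeStop : StopReason -> String
describeStop EndInput = "reached the end of the input"
describeStop NoRuleApply = "no tokenizer rule applies"
describeStop (ComposeNotClosing (startLine, startCol) (endLine, endCol)) =
  "composition opened at " ++ show (startLine, startCol)
    ++ "-" ++ show (endLine, endCol) ++ " was never closed"
```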
- data Tokenizer : Type -> Type
Description of a language's tokenization rule.
Totality: total
Constructors:
- Match : Lexer -> (String -> tokenType) -> Tokenizer tokenType
- Compose : Lexer -> (String -> tokenType) -> (String -> tag) -> Inf (tag -> Tokenizer tokenType) -> (tag -> Lexer) -> (String -> tokenType) -> Tokenizer tokenType
- Alt : Tokenizer tokenType -> Lazy (Tokenizer tokenType) -> Tokenizer tokenType
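These constructors are normally produced via the `match`, `compose`, and
`(<|>)` smart constructors rather than used directly. A minimal, hedged sketch
of a `compose`-based tokenizer for brace-delimited blocks, assuming the `is`,
`oneOf`, `some`, and `non` lexers from `Text.Lexer` (the `BlockTok` type is
made up for illustration):

```idris
data BlockTok = Open | Close | Text String

-- Sketch: tokenize `{ ... }` blocks, recursing into nested braces.
blocks : Tokenizer BlockTok
blocks = compose (is '{')            -- begin lexer
                 (const Open)        -- token for the begin text
                 (const ())          -- tag derived from the begin text
                 (\_ => blocks <|> match (some (non (oneOf "{}"))) Text)
                 (\_ => is '}')      -- end lexer selected by the tag
                 (const Close)       -- token for the end text
```

The `Inf` argument makes the recursive reference to `blocks` productive, so
arbitrarily nested blocks can be tokenized.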
- compose : Lexer -> (String -> a) -> (String -> tag) -> Inf (tag -> Tokenizer a) -> (tag -> Lexer) -> (String -> a) -> Tokenizer a
Compose another tokenizer. The composed language must be delimited by a
begin lexer and an end lexer. The begin token can be used to generate the
composed tokenizer and the end lexer.
Totality: total
- lex : Tokenizer a -> String -> (List (WithBounds a), (StopReason, (Int, (Int, String))))
Given a tokenizer and an input string, return a list of recognised tokens,
together with the reason the tokenizer stopped and the line, column, and
remainder of the input at the first point in the string where there are no
recognised tokens.
Totality: total
- lexTo : Lexer -> Tokenizer a -> String -> (List (WithBounds a), (StopReason, (Int, (Int, String))))
Totality: total
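A hedged usage sketch for `lex`: accept the token list only when the whole
input was consumed. `lexAll` is a hypothetical helper; it relies only on the
result shape documented above (tuples nest to the right, so the flat pattern
matches the nested pairs):

```idris
-- Hypothetical helper: succeed only if lexing consumed the entire input.
lexAll : Tokenizer a -> String -> Maybe (List (WithBounds a))
lexAll tokenizer input =
  case lex tokenizer input of
    (toks, EndInput, _, _, _) => Just toks
    _                         => Nothing
```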
- match : Lexer -> (String -> a) -> Tokenizer a
Match on a recogniser and cast the string to a token.
Totality: total
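Putting `match`, `(<|>)`, and `lex` together, a minimal sketch of a complete
tokenizer, assuming the `alphas`, `digits`, and `spaces` lexers from
`Text.Lexer` (the `Tok` type is made up for illustration):

```idris
data Tok = Ident String | Num Nat | Space

simple : Tokenizer Tok
simple = match alphas Ident
     <|> match digits (Num . cast)
     <|> match spaces (const Space)

-- `lex simple "foo 42"` should stop with EndInput and yield tokens for
-- "foo", the space, and "42", each wrapped in WithBounds.
```

Because `(<|>)` tries rules left to right, more specific rules (e.g. keywords)
should be listed before more general ones (e.g. identifiers).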