Idris2Doc : Text.Lexer.Tokenizer

Text.Lexer.Tokenizer

(<|>) : Tokenizer t -> Lazy (Tokenizer t) -> Tokenizer t
  Alternative tokenizer rules.

Totality: total
Fixity Declaration: infixr operator, level 2
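
  For example, a minimal sketch combining two `match` rules (the token type
  `Tok` and the rules themselves are hypothetical; `digits` and `alphas` are
  recognisers from `Text.Lexer`):

    import Text.Lexer
    import Text.Lexer.Tokenizer

    data Tok = Number Integer | Word String

    -- Try the number rule first; if it fails, fall back to identifiers.
    tok : Tokenizer Tok
    tok = match digits (Number . cast)
      <|> match alphas Word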
data StopReason : Type
  Reason why the tokenizer can't make more progress.
`ComposeNotClosing` carries the span of the composition's begin token in the
form `(startLine, startCol), (endLine, endCol)`.

Totality: total
Constructors:
EndInput : StopReason
NoRuleApply : StopReason
ComposeNotClosing : (Int, Int) -> (Int, Int) -> StopReason
data Tokenizer : Type -> Type
  Description of a language's tokenization rule.

Totality: total
Constructors:
Match : Lexer -> (String -> tokenType) -> Tokenizer tokenType
Compose : Lexer -> (String -> tokenType) -> (String -> tag) -> Inf (tag -> Tokenizer tokenType) -> (tag -> Lexer) -> (String -> tokenType) -> Tokenizer tokenType
Alt : Tokenizer tokenType -> Lazy (Tokenizer tokenType) -> Tokenizer tokenType
compose : Lexer -> (String -> a) -> (String -> tag) -> Inf (tag -> Tokenizer a) -> (tag -> Lexer) -> (String -> a) -> Tokenizer a
  Compose another tokenizer. The composed language must be quoted between
a begin lexer and an end lexer. The begin token can be used to generate
the composition tokenizer and the end lexer.

Totality: total
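
  As a hedged sketch (the token type `BTok` and the rule are hypothetical;
  `is`, `some` and `non` are recognisers from `Text.Lexer`), a tokenizer for
  text enclosed in braces could be composed like this:

    import Text.Lexer
    import Text.Lexer.Tokenizer

    data BTok = Open | Close | Content String

    -- The begin lexer '{' yields an Open token and a tag (here just ());
    -- the tag selects both the inner tokenizer and the end lexer, and the
    -- end lexer '}' yields a Close token.
    braced : Tokenizer BTok
    braced = compose (is '{')
                     (const Open)
                     (const ())
                     (\_ => match (some (non (is '}'))) Content)
                     (\_ => is '}')
                     (const Close)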
lex : Tokenizer a -> String -> (List (WithBounds a), (StopReason, (Int, (Int, String))))
  Given a tokenizer and an input string, return a list of recognised tokens,
and the line, column, and remainder of the input at the first point in the string
where there are no recognised tokens.

Totality: total
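
  For example, a sketch reusing a hypothetical `tok` rule built from `match`
  and `<|>` as above:

    import Text.Bounded
    import Text.Lexer
    import Text.Lexer.Tokenizer

    data Tok = Number Integer | Word String

    tok : Tokenizer Tok
    tok = match digits (Number . cast) <|> match alphas Word

    -- "42abc!" should lex to [Number 42, Word "abc"], stopping with
    -- NoRuleApply and "!" as the remaining input.
    result : (List (WithBounds Tok), (StopReason, (Int, (Int, String))))
    result = lex tok "42abc!"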
lexTo : Lexer -> Tokenizer a -> String -> (List (WithBounds a), (StopReason, (Int, (Int, String))))
Totality: total
match : Lexer -> (String -> a) -> Tokenizer a
  Match on a recogniser and cast the string to a token.

Totality: total
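
  A minimal sketch (the token type `Tok` is hypothetical; `digits` is a
  recogniser from `Text.Lexer`):

    import Text.Lexer
    import Text.Lexer.Tokenizer

    data Tok = Number Integer

    -- Each maximal run of digits becomes a single Number token.
    number : Tokenizer Tok
    number = match digits (Number . cast)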