Package-level declarations

Types

class ContextTracker<T>(val start: T, val shift: (context: T, term: Int, stack: Stack, input: InputStream) -> T = { c, _, _, _ -> c }, val reduce: (context: T, term: Int, stack: Stack, input: InputStream) -> T = { c, _, _, _ -> c }, val reuse: (context: T, node: Tree, stack: Stack, input: InputStream) -> T = { c, _, _, _ -> c }, val hash: (context: T) -> Int = { 0 }, val strict: Boolean = true)
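To show how the context callbacks thread a value through a parse, here is a self-contained sketch. Only `ContextTracker`'s constructor signature comes from the declaration above; `Stack` and `InputStream` are empty stand-ins for the real parser types, and the `Tree`-based `reuse` callback is omitted for brevity.

```kotlin
// Stand-ins for the parser types referenced by the callbacks; the real
// definitions come from this package.
class Stack
class InputStream

// Constructor signature copied from the declaration above (minus `reuse`).
class ContextTracker<T>(
    val start: T,
    val shift: (context: T, term: Int, stack: Stack, input: InputStream) -> T = { c, _, _, _ -> c },
    val reduce: (context: T, term: Int, stack: Stack, input: InputStream) -> T = { c, _, _, _ -> c },
    val hash: (context: T) -> Int = { 0 },
    val strict: Boolean = true,
)

fun main() {
    // Illustrative tracker whose context is simply a count of shifted tokens.
    val tracker = ContextTracker(
        start = 0,
        shift = { count, _, _, _ -> count + 1 },
        hash = { it },
    )
    var ctx = tracker.start
    repeat(3) { ctx = tracker.shift(ctx, 1, Stack(), InputStream()) }
    println(ctx) // 3
}
```

A real tracker would inspect the shifted term and the input to maintain context such as indentation depth; unimplemented callbacks default to passing the context through unchanged.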
class Dialect(val source: String?, val flags: List<Boolean>, val disabled: IntArray?)
class ExternalTokenizer(tokenFn: (input: InputStream, stack: Stack) -> Unit, val contextual: Boolean = false, val fallback: Boolean = false, val extend: Boolean = false) : Tokenizer

@external tokens declarations in the grammar should resolve to an instance of this class.
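A minimal sketch of what such an instance might look like, assuming only the constructor signature above; `Tokenizer`, `InputStream`, and `Stack` are stand-in declarations for the real types in this package, and the tokenizing body is left empty for illustration.

```kotlin
// Stand-ins for the library types; the real ones come from this package.
interface Tokenizer
class InputStream
class Stack

// Constructor signature copied from the declaration above.
class ExternalTokenizer(
    tokenFn: (input: InputStream, stack: Stack) -> Unit,
    val contextual: Boolean = false,
    val fallback: Boolean = false,
    val extend: Boolean = false,
) : Tokenizer

fun main() {
    // A grammar's @external tokens declaration would resolve to a value
    // like this; a real tokenFn reads characters from `input` and emits
    // a token when it recognizes one.
    val insertSemicolon = ExternalTokenizer(
        tokenFn = { _, _ -> /* inspect input, accept a token on a match */ },
        contextual = true,
    )
    println(insertSemicolon.contextual) // true
}
```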


Tokenizers interact with the input through this interface. It presents the input as a stream of characters, tracking lookahead and hiding the complexity of ranges from tokenizer code.

class LocalTokenGroup(data: Any, val precTable: Int, val elseToken: Int? = null) : Tokenizer

The main LR parser class. Instances are typically created via LRParser.deserialize from a serialized ParserSpec produced by the lezer generator.
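The deserialization flow can be sketched as follows. This is illustrative, not the real implementation: `LRParser.deserialize` taking a `ParserSpec` is taken from the description above, but the cut-down `ParserSpec` here (the real one carries the full state tables) and the field values are hypothetical.

```kotlin
// Cut-down stand-in; the real ParserSpec declared on this page carries
// the complete string-encoded parse tables.
data class ParserSpec(val version: Int, val states: String, val maxTerm: Int)

class LRParser private constructor(val spec: ParserSpec) {
    companion object {
        // The real deserialize decodes the spec's string-encoded tables.
        fun deserialize(spec: ParserSpec): LRParser = LRParser(spec)
    }
}

fun main() {
    // The lezer generator emits the spec; parsing starts from the
    // deserialized parser instance.
    val spec = ParserSpec(version = 14, states = "...", maxTerm = 100)
    val parser = LRParser.deserialize(spec)
    println(parser.spec.version) // 14
}
```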

data class ParserConfig(val props: List<NodePropSource<*>>? = null, val top: String? = null, val dialect: String? = null, val tokenizers: List<TokenizerReplacement>? = null, val specializers: List<SpecializerReplacement>? = null, val contextTracker: ContextTracker<Any?>? = null, val strict: Boolean? = null, val wrap: ParseWrapper? = null, val bufferLength: Int? = null)

Configuration options for reconfiguring an existing parser.
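A hypothetical sketch of how such a config might be applied. The `configure` method and the trimmed-down types below are assumptions modeled on the upstream lezer library, not part of this page; the point is that every field defaults to null, so unset fields leave the current configuration untouched.

```kotlin
// Trimmed stand-in for the full ParserConfig declared above.
data class ParserConfig(
    val top: String? = null,
    val dialect: String? = null,
    val strict: Boolean? = null,
    val bufferLength: Int? = null,
)

class LRParser(val config: ParserConfig = ParserConfig()) {
    // Hypothetical: merge non-null fields over the current configuration,
    // returning a reconfigured parser.
    fun configure(c: ParserConfig) = LRParser(
        config.copy(
            top = c.top ?: config.top,
            dialect = c.dialect ?: config.dialect,
            strict = c.strict ?: config.strict,
            bufferLength = c.bufferLength ?: config.bufferLength,
        )
    )
}

fun main() {
    val base = LRParser()
    val reconfigured = base.configure(ParserConfig(strict = true, dialect = "ts"))
    println(reconfigured.config.strict)  // true
    println(reconfigured.config.dialect) // ts
}
```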

data class ParserSpec(val version: Int, val states: String, val stateData: String, val goto: String, val nodeNames: String, val maxTerm: Int, val repeatNodeCount: Int, val nodeProps: List<List<Any>>? = null, val propSources: List<NodePropSource<*>>? = null, val skippedNodes: List<Int>? = null, val tokenData: String, val tokenizers: List<Any>, val topRules: Map<String, List<Int>>, val context: ContextTracker<Any?>? = null, val dialects: Map<String, Int>? = null, val dynamicPrecedences: Map<Int, Int>? = null, val specialized: List<SpecializerSpec>? = null, val tokenPrec: Int, val termNames: Map<Int, String>? = null)

Serialized parser specification.

data class SpecializerReplacement(val from: (String, Stack) -> Int, val to: (String, Stack) -> Int)
data class SpecializerSpec(val term: Int, val get: (value: String, stack: Stack) -> Int? = null, val external: (value: String, stack: Stack) -> Int? = null, val extend: Boolean = false)

Specification of a single token specialization.

class Stack
interface Tokenizer

Functions


Decode a string-encoded integer array.