Ecosyste.ms: Packages

An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.

Top 9.5% on proxy.golang.org
Top 2.5% dependent packages on proxy.golang.org
Top 3.5% dependent repos on proxy.golang.org

proxy.golang.org: github.com/tekwizely/go-parsing/lexer

Package lexer implements the base components of a lexical analyzer, enabling the creation of hand-written lexers for tokenizing textual content.

Lexing is initiated through various Lex* methods, each accepting a different type of input to lex from. In addition to the input data, each Lex function accepts a function that serves as the starting point for your lexer; the main Lexer process calls into this function to initiate lexing.

The `Lexer.Fn` return type is another `Lexer.Fn`, which allows for simplified flow control of your lexer function. Your lexer function only needs to concern itself with matching the very next rune(s) of input, alleviating the need to manage complex looping / restart logic: simply return from your function after (possibly) emitting a token, and the Lexer manages the looping. Switching contexts is as easy as returning a reference to another `Lexer.Fn`. For example, if your main lexer function encounters a `"`, it can return a reference to your `quotedStringLexer` function, and the Lexer will transfer control to it. Once finished, the quoted-string lexer can return control by returning a reference to your `mainLexer` function.

You can shut down the main Lexer loop from within your `Lexer.Fn` by returning `nil`. All previously emitted tokens remain available for pickup, but the lexer stops making further `Lexer.Fn` calls.

Your lexer function receives a `*Lexer` when called and can use its methods to inspect and match runes. Once you've determined what the matched rune(s) represent, you can emit a token for further processing (for example, by a parser). NOTE: See the "Token Types" section for details on defining tokens for your lexer.
Sometimes you may match a series of runes that you simply wish to discard rather than emit.

The Lexer also allows you to create save points (markers) and reset to them if you decide to re-try matching runes in a different context. A marker is good up until the next `Emit()` or `Clear()` action, so confirm that a marker is still valid before using it. NOTE: Resetting a marker does not reset the lexer function that was active when the marker was created; it simply returns the function reference. If you want to return control to the function saved in the marker, return that reference from your current lexer function.

Lexer defines a few pre-defined token values; you define your own token types starting from `TStart`.

When called, the `Lex*` functions return a `token.Nexter`, which provides a means of retrieving the tokens (and errors) emitted from the lexer.

Lexer tracks lines and columns as runes are consumed and exposes them in the emitted tokens, using `'\n'` as the newline separator when tracking line counts. NOTE: Error messages with line/column information may reference the start of an attempted token match rather than the position of the rune(s) that generated the error.

See the `examples` folder for programs that demonstrate the lexer functionality.

purl: pkg:golang/github.com/tekwizely/go-parsing/lexer
Keywords: golang, mit-license, multi-package, parsing-library
License: MIT
Latest release: over 1 year ago
First release: over 1 year ago
Namespace: github.com/tekwizely/go-parsing
Dependent packages: 5
Dependent repositories: 2
Stars: 4 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 17 days ago
