github.com/otm/lex
Package lex provides a skeleton for writing lexers and tokenizers. The lexer is a state machine that emits tokens, and to get started with the package only token types and state functions need to be defined.

Tokens are identified by their TokenType. Custom TokenTypes are defined as constants starting at `iota + lex.TokenTypes`; the offset is needed because lex already predefines constants of its own.

Since the lexer is a state machine, the next step is defining state functions of the type StateFn. A state function can emit an error token to abort the lexing process with an error; returning nil from a state function aborts the lexer without an error.

A simple lexer that only emits alphanumeric or special-character tokens could look like the example below. Note that the example is meant to showcase functionality rather than be optimized, and that the power of a lexer does not show in such a small example.

The lexer's output uses the syntax [TokenType() <line>:<start byte>+<byte width>].

References:
1. "Lexical Scanning in Go" by Rob Pike: https://www.youtube.com/watch?v=HxaD_trXwRE
2. GoDoc for the template package lexer: http://golang.org/src/text/template/parse/lex.go
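The package's own example code did not survive extraction here, but the state-machine design it describes (Rob Pike's lexer pattern: state functions that return the next state, nil to stop) can be sketched in plain Go with no dependency on the package. All names below (lexer, lexText, lexAll, the token type constants) are illustrative, not the package's actual API.

```go
package main

import (
	"fmt"
	"unicode"
	"unicode/utf8"
)

// TokenType identifies the kind of token emitted by the lexer.
// In the real package, custom types would start at iota + lex.TokenTypes
// to avoid clashing with its predefined constants.
type TokenType int

const (
	tokenEOF TokenType = iota
	tokenWord            // a run of alphanumeric characters
	tokenSymbol          // any single non-alphanumeric character
)

// Token is a lexeme: its type plus the literal text it covers.
type Token struct {
	Typ TokenType
	Val string
}

// stateFn mirrors the StateFn idea: each state returns the next state,
// and returning nil stops the lexer.
type stateFn func(*lexer) stateFn

type lexer struct {
	input  string
	start  int // start byte of the token in progress
	pos    int // current scan position
	tokens []Token
}

const eof = -1

// next consumes and returns the next rune, or eof at end of input.
func (l *lexer) next() rune {
	if l.pos >= len(l.input) {
		return eof
	}
	r, w := utf8.DecodeRuneInString(l.input[l.pos:])
	l.pos += w
	return r
}

// backup steps back one rune.
func (l *lexer) backup() {
	_, w := utf8.DecodeLastRuneInString(l.input[:l.pos])
	l.pos -= w
}

// emit records the text scanned since the last emit as a token.
func (l *lexer) emit(t TokenType) {
	l.tokens = append(l.tokens, Token{t, l.input[l.start:l.pos]})
	l.start = l.pos
}

// lexText is the single state: it groups alphanumeric runs into word
// tokens and emits every other rune as a one-character symbol token.
func lexText(l *lexer) stateFn {
	r := l.next()
	switch {
	case r == eof:
		l.emit(tokenEOF)
		return nil // stop the state machine
	case unicode.IsLetter(r) || unicode.IsDigit(r):
		for {
			r = l.next()
			if r == eof || !(unicode.IsLetter(r) || unicode.IsDigit(r)) {
				break
			}
		}
		if r != eof {
			l.backup() // the non-word rune belongs to the next token
		}
		l.emit(tokenWord)
	default:
		l.emit(tokenSymbol)
	}
	return lexText
}

// lexAll runs the state machine to completion and returns all tokens.
func lexAll(input string) []Token {
	l := &lexer{input: input}
	for state := lexText; state != nil; {
		state = state(l)
	}
	return l.tokens
}

func main() {
	for _, t := range lexAll("foo+bar1") {
		fmt.Printf("%d %q\n", t.Typ, t.Val)
	}
}
```

Error handling fits the same shape: an error state would emit a token with an error type and return nil, while returning nil directly (as lexText does at eof) stops the lexer without an error.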
proxy.golang.org
v0.0.0-20150726085427-7ef7072285f2
Links
| Registry | proxy.golang.org |
| Source | Repository |
| Docs | Documentation |
| JSON API | View JSON |
| CodeMeta | codemeta.json |
Package Details
| PURL | pkg:golang/github.com/otm/lex |
| License | MIT |
| Namespace | github.com/otm |
| First Release | over 10 years ago |
| Last Synced | 7 days ago |
Repository
| Stars | 1 on GitHub |
| Forks | 1 on GitHub |