#lexer #tokens #parser #json #derived #calculator #lex

bin+lib lexer-generator

Lexer derived from Regex patterns with user-customizable tokens

5 releases

0.1.4 Mar 31, 2022
0.1.3 Mar 29, 2022
0.1.2 Mar 28, 2022
0.1.1 Mar 28, 2022
0.1.0 Mar 28, 2022

#1583 in Text processing

MIT license

13KB
255 lines

lexer-generator

Lexer crate derived from Regex patterns with user-customizable tokens

Example: Basic Tokenizing

Potential code one might use to lex tokens for a calculator

key.json:

{
    "literals": {
        "number": "[0-9]*(\\.[0-9]*){0, 1}",
        "subtract": "-",
        "add": "\\+",
        "divide": "/",
        "multiply": "\\*" 
    },
    "whitespace": "\n| |\r|\t"
}
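
To see what an individual pattern accepts, one can test it with the regex crate on its own. A minimal sketch, assuming the patterns use ordinary Rust regex syntax (note the JSON escaping: "\\." in key.json is the regex \.):

use regex::Regex;

// Check the `number` pattern by itself: it should grab an integer or a
// decimal at the start of the input.
let number = Regex::new(r"[0-9]*(\.[0-9]*){0,1}").unwrap();
assert_eq!(number.find("123 + 456").unwrap().as_str(), "123");
assert_eq!(number.find("3.14").unwrap().as_str(), "3.14");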

main.rs:

use lexer_generator::Lexer; // assuming Lexer is exported at the crate root

let json: String = std::fs::read_to_string("key.json").unwrap();
let source: String = String::from("123 + 456 * 789");

let mut lexer = Lexer::from(json, source);
while !lexer.done() {
    println!("{}", lexer.next_token().unwrap());
}
Output:

number(123)
add(+)
number(456)
multiply(*)
number(789)
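
A small variation on the loop above, as a sketch: collect the rendered tokens into a Vec instead of printing them, using only the calls already shown (done, next_token, and the Display impl behind println!); the import path is assumed.

use lexer_generator::Lexer; // assumed import path

let json: String = std::fs::read_to_string("key.json").unwrap();
let source: String = String::from("123 + 456 * 789");

let mut lexer = Lexer::from(json, source);
let mut rendered: Vec<String> = Vec::new();
while !lexer.done() {
    // Each token renders as "name(value)", matching the output above
    rendered.push(lexer.next_token().unwrap().to_string());
}
assert_eq!(
    rendered,
    vec!["number(123)", "add(+)", "number(456)", "multiply(*)", "number(789)"]
);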

lib.rs:

lexer-generator

This crate is a small-scale lexer package whose token rules are parsed from JSON

Example: Basic Tokenizing

The key.json patterns and the main.rs setup are the same as in the README example above. Once the Lexer is built, the tokens can be handed to a parser, an interpreter, or whatever else one wants to do with them. Conceptually, the lexer maps

"123 + 456 * 789" -> Token("number", "123"), Token("add", "+"), Token("number", "456"), Token("multiply", "*"), Token("number", "789")

(ignoring line position and the incremental nature of the lexer).

Dependencies

~2.8–4.5MB
~87K SLoC