regex-lexer


A regex-based lexer (tokenizer) in Rust.

Basic Usage

```rust
enum Token {
    Num(usize),
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[0-9]+", |num| Some(Token::Num(num.parse().unwrap())))
    .token(r"\s+", |_| None) // skip whitespace
    // ...
    .build()?; // compiling the token regexes can fail, so this returns a Result

let tokens = lexer.tokens(/* source */);
```
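To illustrate what a rule-based lexer like this does under the hood, here is a self-contained, hypothetical sketch. It does not use the crate and its names (`Token`, `tokenize`) are illustrative only: each rule is tried at the current position, a match produces a token, and whitespace is consumed without emitting anything, like the `\s+` rule above returning `None`.

```rust
#[derive(Debug, PartialEq)]
enum Token {
    Num(usize),
    Plus,
}

// Hand-rolled stand-in for regex rules: digits -> Num, '+' -> Plus,
// whitespace -> skipped (no token emitted).
fn tokenize(src: &str) -> Vec<Token> {
    let mut tokens = Vec::new();
    let mut rest = src;
    while let Some(c) = rest.chars().next() {
        if c.is_ascii_digit() {
            // Consume the longest run of digits, like the `[0-9]+` rule.
            let end = rest
                .find(|c: char| !c.is_ascii_digit())
                .unwrap_or(rest.len());
            tokens.push(Token::Num(rest[..end].parse().unwrap()));
            rest = &rest[end..];
        } else if c == '+' {
            tokens.push(Token::Plus);
            rest = &rest[1..];
        } else if c.is_whitespace() {
            // Matched but mapped to no token, like `\s+` returning None.
            rest = &rest[c.len_utf8()..];
        } else {
            panic!("unexpected character: {c}");
        }
    }
    tokens
}

fn main() {
    let tokens = tokenize("1 + 23");
    assert_eq!(tokens, vec![Token::Num(1), Token::Plus, Token::Num(23)]);
}
```

In the crate, the per-rule closures play the role of the match arms here: they decide whether a matched span becomes a token or is dropped.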

License

Licensed under either of

 * Apache License, Version 2.0
 * MIT license

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
