#lexer #lexical #tokenizer

tusk_lexer

The lexical analysis component of Tusk

21 unstable releases (3 breaking)

0.4.7 Jun 1, 2021
0.4.6 Jun 1, 2021
0.4.5 May 29, 2021
0.3.4 May 27, 2021
0.1.0 May 26, 2021

#70 in Parser tooling


122 downloads per month
Used in tusk_parser

MIT license

8KB
303 lines


Lexer

The lexical analysis component of Tusk.

About

This crate provides the Lexer and Token implementations used in Tusk. It allows you to provide a &str of input and stream Token instances on demand.

Usage

To use the crate, first add it to your Cargo.toml:

[dependencies]
tusk_lexer = "0.4.*"

To create a new Lexer, import the struct and use the Lexer::new() method.

use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");
}

To get the next token from the input, use the Lexer::next() method:

use tusk_lexer::Lexer;

fn main() {
    let mut lexer = Lexer::new("$hello = 'cool'");

    let maybe_some_token = lexer.next();
}
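The variable name maybe_some_token hints at the on-demand streaming pattern: the lexer hands back one token per call until the input runs out. The sketch below is not the tusk_lexer implementation; it is a self-contained, hypothetical lexer using only the standard library that splits on whitespace, just to illustrate the Lexer::new() / next() shape described above.

```rust
// Hypothetical sketch (NOT the real tusk_lexer): a minimal lexer that
// streams whitespace-separated tokens from a &str on demand.
struct SimpleLexer<'a> {
    rest: &'a str, // the unconsumed remainder of the input
}

impl<'a> SimpleLexer<'a> {
    fn new(input: &'a str) -> Self {
        SimpleLexer { rest: input }
    }

    // Returns the next token, or None once the input is exhausted.
    fn next(&mut self) -> Option<&'a str> {
        let trimmed = self.rest.trim_start();
        if trimmed.is_empty() {
            self.rest = trimmed;
            return None;
        }
        // Token ends at the next whitespace character (or end of input).
        let end = trimmed.find(char::is_whitespace).unwrap_or(trimmed.len());
        let (token, rest) = trimmed.split_at(end);
        self.rest = rest;
        Some(token)
    }
}

fn main() {
    let mut lexer = SimpleLexer::new("$hello = 'cool'");
    while let Some(token) = lexer.next() {
        println!("{token}");
    }
}
```

The while let loop is the idiomatic way to drain such a stream: it runs until next() yields None.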

This method returns an Option<Token>, yielding None once the input is exhausted. The Token struct has 3 fields:

struct Token<'a> {
    pub kind: TokenType,
    pub slice: &'a str,
    pub range: TextRange,
}
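Because slice borrows from the lexer's original input, the struct carries a lifetime parameter. The snippet below is a self-contained sketch of that pattern with the same field names; the TokenType variants shown and the use of std::ops::Range<usize> in place of TextRange are assumptions for illustration, not the crate's actual definitions.

```rust
use std::ops::Range;

// Assumed variants for illustration only; tusk_lexer's TokenType differs.
#[derive(Debug, PartialEq)]
enum TokenType {
    Variable,
    Equals,
    String,
}

// Same shape as the Token above; TextRange is approximated here by
// Range<usize> (byte offsets into the input), which is an assumption.
#[derive(Debug, PartialEq)]
struct Token<'a> {
    kind: TokenType,
    slice: &'a str,       // borrows from the original input string
    range: Range<usize>,  // where in the input the token was found
}

fn main() {
    let input = "$hello = 'cool'";
    // A token covering the `$hello` variable at the start of the input.
    let token = Token {
        kind: TokenType::Variable,
        slice: &input[0..6],
        range: 0..6,
    };
    assert_eq!(token.slice, "$hello");
    println!("{token:?}");
}
```

The lifetime 'a ties each Token to the input &str, so tokens stay cheap: they are views into the source rather than owned copies.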

Contributing

For more information, please read the CONTRIBUTING document.

License

This repository is distributed under the MIT license. For more information, please read the LICENSE document.

Dependencies

~53KB