#nlp #tokenizers #natural #processing #language

nlp-tokenize (yanked)

Tokenizers for Natural Language Processing

Uses old Rust 2015

0.1.0 Jul 8, 2016

#8 in #tokenizers

MIT/Apache

5KB
105 lines

NLP-tokenize

A collection of tokenizers for Natural Language Processing.
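The crate's actual API is not shown on this page, so as a rough illustration of what a basic NLP tokenizer does, here is a minimal, hypothetical sketch in Rust: a whitespace tokenizer that lowercases each token and strips surrounding punctuation. The function name `tokenize` is assumed for illustration and is not taken from the crate.

```rust
/// Hypothetical sketch (not the crate's real API): split on
/// whitespace, strip surrounding punctuation, and lowercase.
fn tokenize(text: &str) -> Vec<String> {
    text.split_whitespace()
        .map(|word| {
            // Trim any leading/trailing non-alphanumeric characters,
            // so "Hello," becomes "hello".
            word.trim_matches(|c: char| !c.is_alphanumeric())
                .to_lowercase()
        })
        // Drop tokens that were pure punctuation.
        .filter(|token| !token.is_empty())
        .collect()
}

fn main() {
    let tokens = tokenize("Hello, world! NLP in Rust.");
    println!("{:?}", tokens); // ["hello", "world", "nlp", "in", "rust"]
}
```

Real-world tokenizers go further (handling contractions, Unicode segmentation, subword units), but the shape above — string in, `Vec<String>` out — is the common core.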

License

This crate is distributed under the terms of both the MIT License and the Apache License (Version 2.0). See the license files for details.



Dependencies

~3.5MB
~76K SLoC