## 4 releases (breaking)

| Version | Date |
|---|---|
| 0.4.0 | Nov 6, 2023 |
| 0.3.0 | Oct 8, 2020 |
| 0.2.0 | Apr 9, 2020 |
| 0.1.0 | Oct 11, 2019 |
# alpino-tokenizer
This Rust crate provides a tokenizer based on finite state transducers. It is primarily designed to use the Alpino tokenizer for Dutch, but in principle, you could load a tokenizer for any language.
The transducer of the Alpino tokenizer can be downloaded. We will synchronize the transducer regularly as the tokenizer in Alpino is updated.
You can use the `alpino-tokenizer` crate to integrate the tokenizer into your Rust programs. For convenience, an `alpino-tokenize` command-line utility is provided for tokenizing text from the shell or in shell scripts.
## Installing the `alpino-tokenize` command-line utility

### cargo

The `alpino-tokenize` utility can be installed with `cargo`:

```shell
$ cargo install alpino-tokenize
```
### Nix

This repository is also a Nix flake. If you use a Nix version that supports flakes, you can start a shell with `alpino-tokenize` as follows:

```shell
$ nix shell github:danieldk/alpino-tokenizer
```
## License
Copyright 2019-2020 Daniël de Kok
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
## lib.rs
Wrapper for the Alpino tokenizer for Dutch.
This crate provides a wrapper around the Alpino tokenizer for Dutch. Since the tokenizer itself is included through the `alpino-tokenizer-sys` crate, this crate can be used without any external dependencies.

This crate exposes a single function, `tokenize`, that takes a single paragraph as a string and returns a `Vec<Vec<String>>` holding the sentences and tokens. For example:
```rust
use std::fs::File;
use std::io::BufReader;

use alpino_tokenizer::{AlpinoTokenizer, Tokenizer};

let read = BufReader::new(File::open("testdata/toy.proto").unwrap());
let tokenizer = AlpinoTokenizer::from_buf_read(read).unwrap();

assert_eq!(
    tokenizer.tokenize("Groningen is een Hanzestad. Groningen heeft veel bezienswaardigheden.").unwrap(),
    vec![vec!["Groningen", "is", "een", "Hanzestad", "."],
         vec!["Groningen", "heeft", "veel", "bezienswaardigheden", "."]]);
```
## Dependencies

~3–5MB, ~87K SLoC