6 releases

0.3.1 Sep 1, 2024
0.3.0 Jan 7, 2024
0.2.2 Sep 16, 2019
0.1.0 Sep 14, 2019

#11 in #wikipedia

Download history (downloads per week): 150 (2024-08-26), 32 (2024-09-02), 15 (2024-09-16), 41 (2024-09-23), 19 (2024-09-30)

272 downloads per month
Used in tantivy-object-store

MIT license

22KB
364 lines

wikidump

This crate processes MediaWiki XML dump files and turns them into easily consumed pieces of data for language analysis, natural language processing, and other applications.

Example

use wikidump::{config, Parser};

let parser = Parser::new()
    .use_config(config::wikipedia::english());

let site = parser
    .parse_file("tests/enwiki-articles-partial.xml")
    .expect("Could not parse wikipedia dump file.");

assert_eq!(site.name, "Wikipedia");
assert_eq!(site.url, "https://en.wikipedia.org/wiki/Main_Page");
assert!(!site.pages.is_empty());

for page in site.pages {
    println!("Title: {}", page.title);

    for revision in page.revisions {
        println!("\t{}", revision.text);
    }
}
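
Since the crate is aimed at language analysis, a natural next step is to aggregate the revision text into something countable. The sketch below builds a simple word-frequency table from the parsed pages using only the API shown above plus the standard library; the tokenization (splitting on whitespace and lowercasing) is illustrative and not something the crate itself provides.

use std::collections::HashMap;
use wikidump::{config, Parser};

let parser = Parser::new().use_config(config::wikipedia::english());
let site = parser
    .parse_file("tests/enwiki-articles-partial.xml")
    .expect("Could not parse wikipedia dump file.");

// Count how often each whitespace-separated token appears across all revisions.
let mut word_counts: HashMap<String, usize> = HashMap::new();
for page in site.pages {
    for revision in page.revisions {
        for word in revision.text.split_whitespace() {
            *word_counts.entry(word.to_lowercase()).or_insert(0) += 1;
        }
    }
}

println!("Distinct tokens: {}", word_counts.len());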

lib.rs:

This crate can process MediaWiki dump (backup) files in XML format and lets you extract whatever data you desire.

Example

use wikidump::{config, Parser};

let parser = Parser::new().use_config(config::wikipedia::english());
let site = parser
    .parse_file("tests/enwiki-articles-partial.xml")
    .expect("Could not parse wikipedia dump file.");

assert_eq!(site.name, "Wikipedia");
assert_eq!(site.url, "https://en.wikipedia.org/wiki/Main_Page");
assert!(!site.pages.is_empty());

for page in site.pages {
    println!("\nTitle: {}", page.title);

    for revision in page.revisions {
        println!("\t{}", revision.text);
    }
}
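
The parsed structure can also be written out for downstream tooling. The sketch below saves one revision of each page to a text file; the output directory, the file-naming scheme, and the assumption that the last entry in revisions is the newest are all illustrative choices, not part of the crate.

use std::fs;
use wikidump::{config, Parser};

let parser = Parser::new().use_config(config::wikipedia::english());
let site = parser
    .parse_file("tests/enwiki-articles-partial.xml")
    .expect("Could not parse wikipedia dump file.");

fs::create_dir_all("extracted").expect("Could not create output directory");

for page in site.pages {
    // Take the last revision in the list, assuming it is the most recent one.
    if let Some(revision) = page.revisions.last() {
        // Replace path separators so the title is usable as a file name.
        let file_name = format!("extracted/{}.txt", page.title.replace('/', "_"));
        fs::write(&file_name, &revision.text).expect("Could not write page text");
    }
}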

Dependencies

~4MB
~67K SLoC