1 unstable release: 0.1.0 (Dec 22, 2023)

#47 in #web-crawler

License: MIT/Apache

110KB, 2.5K SLoC

Sitemap Web Scraper

Sitemap Web Scraper (sws) is a tool for simple, flexible, yet performant web-page scraping.

It consists of a CLI written in Rust that crawls web pages and executes a LuaJIT script to scrape them, writing the results to a CSV file.

sws crawl --script examples/fandom_mmh7.lua -o result.csv
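As an illustration, a scrape script passed via `--script` might look like the sketch below. The function and method names used here (`scrapPage`, `page:select`, `context:sendRecord`) are assumptions made for illustration, not taken from the sws documentation; consult the project docs for the actual Lua API.

```lua
-- Illustrative sketch only: the API names below are assumptions,
-- not the documented sws interface.

-- Called by the crawler for each fetched page.
function scrapPage(page, context)
   -- Match elements with a CSS selector (sws leverages CSS selectors
   -- via sws_scraper).
   for row in page:select("table.wikitable tr"):iter() do
      -- Emit one CSV record per matched element.
      context:sendRecord({ row:innerText() })
   end
end
```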

Check out the documentation for more details.


lib.rs:

An implementation of sws_crawler::Scrapable that leverages sws_scraper CSS selectors and is scriptable in Lua.

Dependencies

~16–30MB
~482K SLoC