

A command-line tool for streaming Parquet as line-delimited JSON.

It reads only the required byte ranges from local file, HTTP, or S3 locations, and supports offset/limit and column selection.

It uses the official Apache Parquet native Rust implementation, which has excellent support for compression formats and complex types.

How to use

Install from crates.io and execute from the command line, e.g.:

$ cargo install parquet2json
$ parquet2json --help

    parquet2json [OPTIONS] <FILE> <SUBCOMMAND>

    <FILE>    Location of Parquet input file (file path, HTTP or S3 URL)

    -t, --timeout <TIMEOUT>    Request timeout in seconds [default: 60]
    -h, --help                 Print help information
    -V, --version              Print version information

    cat         Outputs data as JSON lines
    schema      Outputs the Thrift schema
    rowcount    Outputs only the total row count
    help        Print this message or the help of the given subcommand(s)
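As a quick illustration of the non-cat subcommands (assuming a hypothetical local file ./myfile.pq):

```shell
# Print the file's Thrift schema
$ parquet2json ./myfile.pq schema
# Print only the total row count
$ parquet2json ./myfile.pq rowcount
```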

$ parquet2json cat --help

    parquet2json <FILE> cat [OPTIONS]

    -o, --offset <OFFSET>      Starts outputting from this row (first row: 0, last row: -1)
                               [default: 0]
    -l, --limit <LIMIT>        Maximum number of rows to output [default: -1]
    -c, --columns <COLUMNS>    Select columns by name (comma,separated,?prefixed_optional)
    -h, --help                 Print help information
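For instance (hypothetical file and column names), the options can be combined to page through a subset of columns; per the help text above, a leading ? appears to mark a column as optional:

```shell
# Skip the first 100 rows, emit at most 10, and select two columns
# ("?comment" is treated as optional, assuming the ?-prefix semantics above).
$ parquet2json ./myfile.pq cat --offset 100 --limit 10 --columns 'id,?comment'
```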

S3 Settings

Credentials are resolved via the standard AWS toolchain, i.e. environment variables (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY), the AWS credentials file, or an IAM ECS container/instance profile.

The default AWS region must be set via the AWS_DEFAULT_REGION environment variable or in the AWS credentials file, and must match the region of the object's bucket.
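A minimal environment setup for an S3 read might look like this (the credential values below are the placeholder examples from the AWS documentation, not real secrets):

```shell
# Placeholder credentials; substitute your own, and set the
# region to match the bucket you are reading from.
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFY/K7MDENG/bPxRfiCYEXAMPLEKEY
export AWS_DEFAULT_REGION=us-east-1
```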


Examples

Use it to stream output to files and to other tools such as grep and jq.

Output to a file

$ parquet2json ./myfile.pq cat > output.jsonl

From S3 or HTTP

$ parquet2json s3://noaa-ghcn-pds/parquet/by_year/YEAR=2022/ELEMENT=ADPT/c771f8c0ea844998a1c8a9d5b8f269db_0.snappy.parquet cat
$ parquet2json https://noaa-ghcn-pds.s3.amazonaws.com/parquet/by_year/YEAR%3D2022/ELEMENT%3DADPT/c771f8c0ea844998a1c8a9d5b8f269db_0.snappy.parquet cat

Filter selected columns with jq

$ parquet2json ./myfile.pq cat --columns=url,level | jq 'select(.level==3) | .url'
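Because the tool emits one JSON object per line, the jq filter above can be tried against simulated output (the two records here are hypothetical stand-ins for real rows):

```shell
# Hypothetical stand-in for `parquet2json ./myfile.pq cat --columns=url,level` output.
printf '{"url":"a","level":3}\n{"url":"b","level":1}\n' \
  | jq -r 'select(.level==3) | .url'
# Prints: a
```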
