#rpsl #irr #routes #bgp #routing #parser #policy

route_verification_parse

Parse RPSL in the IRR to verify observed BGP routes

2 unstable releases

0.2.0 Mar 10, 2024
0.1.0 Nov 6, 2023

#10 in #rpsl


341 downloads per month
Used in 4 crates (2 directly)

MIT license

125KB
2.5K SLoC

Parse RPSL Policy

WIP

TODO: update this README.

Debugging

  • Enable logging:

    export RUST_LOG=route_verification=trace
    
  • Enable backtrace in error messages:

    export RUST_BACKTRACE=1
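
Both variables are read by the parser binary at startup. As an illustration only, a small wrapper script like the sketch below sets them for a single run; the cargo invocation mirrors the one used later in this README, and the paths are assumptions about your local layout.

    # debug_run.py - hedged sketch: run the parser once with tracing and backtraces enabled.
    import os
    import subprocess
    
    env = os.environ.copy()
    env["RUST_LOG"] = "route_verification=trace"   # enable trace-level logging
    env["RUST_BACKTRACE"] = "1"                    # include backtraces in error messages
    
    # Assumed invocation; adjust the subcommand and paths to match your setup.
    subprocess.run(
        ["cargo", "r", "--release", "--", "parse", "../data/ripe.db", "../parsed"],
        cwd="route_verification",
        env=env,
        check=True,
    )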
    

Produce a parsed dump using both lexer and parser

  • Put the database file at data/ripe.db.

  • Make sure pypy3 is in the PATH.

  • Make sure rpsl-lexer is installed and that pypy3 can find it (e.g., via PYTHONPATH); a preflight check sketch follows these steps.

    python3 -m pip install rpsl-lexer
    
  • Run at route_verification/:

    cargo r --release -- parse ../data/ripe.db ../parsed
    

    The parsed dump will be written to parsed/.
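
If any of the prerequisites above are unclear, the following is a minimal preflight sketch (not part of the crate) that checks them before running the parser. The script name is hypothetical and the paths match the defaults used in this README.

    # preflight.py - hedged sketch: verify the prerequisites listed above.
    import shutil
    import subprocess
    from pathlib import Path
    
    # 1. The database file is expected at data/ripe.db.
    assert Path("data/ripe.db").is_file(), "put the RIPE database dump at data/ripe.db"
    
    # 2. pypy3 must be on the PATH.
    assert shutil.which("pypy3") is not None, "pypy3 is not on the PATH"
    
    # 3. rpsl-lexer must be importable by pypy3 (installed or on PYTHONPATH).
    result = subprocess.run(["pypy3", "-c", "import rpsl_lexer"])
    assert result.returncode == 0, "pypy3 cannot import rpsl_lexer"
    
    print("preflight OK - run `cargo r --release -- parse ../data/ripe.db ../parsed` in route_verification/")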

Produce a spread parsed dump from both priority and backup registries

Obtain IRR data

Download the data from all of the FTP servers listed on the IRR List of Routing Registries.

Download priority registries to data/irrs/priority/:

ftp://ftp.afrinic.net/pub/dbase/
ftp://ftp.altdb.net/pub/altdb/
ftp://ftp.apnic.net/pub/apnic/whois/
ftp://ftp.arin.net/pub/rr/
ftp://irr.bboi.net/
https://whois.canarie.ca/dbase/
ftp://irr-mirror.idnic.net/
ftp://ftp.nic.ad.jp/jpirr/
ftp://irr.lacnic.net/
ftp://ftp.nestegg.net/irr
ftp://rr1.ntt.net/nttcomRR/
ftp://ftp.panix.com/pub/rrdb
ftp://ftp.ripe.net/ripe/dbase/

Download backup registries to data/irrs/backup/:

ftp://ftp.radb.net/radb/dbase/

Decompress all downloaded files; a download-and-decompress sketch follows.
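
As an illustration only, the sketch below mirrors one registry's FTP directory and decompresses any .gz files it finds. The host, directory, and file names are examples; each registry serves different file names, the canarie mirror is HTTPS rather than FTP, and looping over all registries is left to you.

    # fetch_registry.py - hedged sketch: mirror one FTP registry directory and unpack .gz files.
    import gzip
    import shutil
    from ftplib import FTP
    from pathlib import Path
    
    HOST, REMOTE_DIR = "ftp.ripe.net", "/ripe/dbase/"   # example registry
    DEST = Path("data/irrs/priority")
    DEST.mkdir(parents=True, exist_ok=True)
    
    with FTP(HOST) as ftp:
        ftp.login()            # anonymous login
        ftp.cwd(REMOTE_DIR)
        # Assumes the listing contains only regular files; skip or handle subdirectories as needed.
        for name in ftp.nlst():
            target = DEST / name
            with open(target, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
            if target.suffix == ".gz":   # decompress in place, keeping the .db name
                with gzip.open(target, "rb") as src, open(target.with_suffix(""), "wb") as dst:
                    shutil.copyfileobj(src, dst)
                target.unlink()
    
    print("done:", sorted(p.name for p in DEST.iterdir()))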

Run the parser with parse_priority

Run at route_verification/:

cargo r --release -- parse_priority ../data/irrs/priority/ ../data/irrs/backup/ ../parsed_all/

The above command parses all IRR DB files in data/irrs/priority/ and data/irrs/backup/, overrides any duplicated information with the version from the former, and writes the result to multiple JSON files in parsed_all/.
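
To get a quick overview of what was written, a sketch like the one below lists the JSON files in parsed_all/ with their sizes and top-level shapes. The internal layout of those files is not documented here, so this only reports what the standard json module can see.

    # inspect_parsed.py - hedged sketch: summarize the JSON files produced in parsed_all/.
    import json
    from pathlib import Path
    
    for path in sorted(Path("parsed_all").glob("*.json")):
        with open(path) as f:
            data = json.load(f)
        shape = (
            f"{type(data).__name__} of {len(data)} entries"
            if isinstance(data, (list, dict))
            else type(data).__name__
        )
        print(f"{path.name}: {path.stat().st_size / 1_000_000:.1f} MB, {shape}")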

Produce a parsed IR dump using IRRs in ordered priorities

Run at route_verification/:

cargo r --release -- parse_ordered \
../data/irrs/priority/apnic.db.* ../data/irrs/priority/afrinic.db ../data/irrs/priority/arin.db ../data/irrs/priority/lacnic.db \
../data/irrs/priority/ripe.db ../data/irrs/backup/radb.db \
../data/irrs/backup/altdb.db ../data/irrs/backup/idnic.db ../data/irrs/backup/jpirr.db ../data/irrs/backup/level3.db ../data/irrs/backup/nttcom.db ../data/irrs/backup/reach.db ../data/irrs/backup/tc.db \
../parsed_all/
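
Judging by the command above, the positional arguments are the input databases from highest to lowest priority, followed by the output directory. If you prefer not to hand-maintain that long command, a sketch along these lines can assemble the same ordered argument list; the registry ordering simply mirrors the command above and the script name is hypothetical.

    # run_parse_ordered.py - hedged sketch: build the ordered argument list and invoke the parser.
    import subprocess
    from glob import glob
    
    # Highest-priority first, mirroring the command above. Run this script from the repository root.
    ordered = (
        sorted(glob("data/irrs/priority/apnic.db.*"))
        + [f"data/irrs/priority/{db}" for db in ("afrinic.db", "arin.db", "lacnic.db", "ripe.db")]
        + [
            f"data/irrs/backup/{db}"
            for db in ("radb.db", "altdb.db", "idnic.db", "jpirr.db", "level3.db", "nttcom.db", "reach.db", "tc.db")
        ]
    )
    
    # cargo runs inside route_verification/, so inputs and output are one level up.
    args = ["cargo", "r", "--release", "--", "parse_ordered"] + [f"../{p}" for p in ordered] + ["../parsed_all/"]
    subprocess.run(args, cwd="route_verification", check=True)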

Running interactively in Jupyter Notebook

  • Finish the previous section. Your parsed dump should be cached in parsed/.
  • Install Evcxr Jupyter Kernel.
  • Launch Jupyter Notebook from the repository root (./) and try out parse_test.ipynb.

Produce a lexed dump using lexer

If the database file is at data/ripe.db, for example:

pypy3 -m rpsl_lexer.dump data/ripe.db > dump.json

The command above uses PyPy for speed and redirects the dumped JSON to dump.json.
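
The exact layout of dump.json is not documented here, so the following is only a hedged sketch for peeking at the result; it handles both a single JSON document and line-delimited JSON without assuming which one the lexer emits.

    # peek_dump.py - hedged sketch: inspect dump.json without assuming its exact layout.
    import json
    
    with open("dump.json") as f:
        text = f.read()
    
    try:
        data = json.loads(text)              # a single JSON document
        print("single document:", type(data).__name__)
    except json.JSONDecodeError:
        records = [json.loads(line) for line in text.splitlines() if line.strip()]
        print("line-delimited JSON:", len(records), "records")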

Test lexer

Run at ./:

pytest

Test lexer against ripe.db

To test against ripe.db using rpsl_lexer/tests/mp_import_w_db.py, put the database file at data/ripe.db:

python3 -m rpsl_lexer.tests.mp_import_w_db

Similarly, to test with rpsl_lexer/tests/mp_export_w_db.py or one of the other lexer tests (a loop sketch follows this list):

python3 -m rpsl_lexer.tests.mp_export_w_db
python3 -m rpsl_lexer.tests.mp_peering_w_db
python3 -m rpsl_lexer.tests.mp_filter_w_db
python3 -m rpsl_lexer.tests.action_w_db
python3 -m rpsl_lexer.tests.import_w_db
python3 -m rpsl_lexer.tests.export_w_db
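
For convenience, the sketch below runs all of the database-backed lexer tests listed above in sequence and reports which ones fail; it simply wraps the python3 -m invocations shown here, and the script name is hypothetical.

    # run_db_tests.py - hedged sketch: run each database-backed lexer test module in turn.
    import subprocess
    import sys
    
    MODULES = [
        "rpsl_lexer.tests.mp_import_w_db",
        "rpsl_lexer.tests.mp_export_w_db",
        "rpsl_lexer.tests.mp_peering_w_db",
        "rpsl_lexer.tests.mp_filter_w_db",
        "rpsl_lexer.tests.action_w_db",
        "rpsl_lexer.tests.import_w_db",
        "rpsl_lexer.tests.export_w_db",
    ]
    
    failures = [m for m in MODULES if subprocess.run([sys.executable, "-m", m]).returncode != 0]
    print("failed:" if failures else "all passed", ", ".join(failures))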

Test parser and comparator

Run at route_verification/:

cargo t --workspace

Dependencies

~7–11MB
~203K SLoC