mdurl

web demo github docs.rs crates.io coverage

URL parser and formatter that gracefully handles invalid input.

It is a Rust port of the mdurl.js library, created specifically for URL rendering in the markdown-it parser.

URL formatter

This function takes a URL, decodes it, and fits it into N characters, replacing the rest with the "…" symbol (this is called "URL elision").

This is similar to what Chromium shows in the status bar when you hover your mouse over a link.

use mdurl::format_url_for_humans as format;
let url = "https://www.reddit.com/r/programming/comments/vxttiq/\
comment/ifyqsqt/?utm_source=reddit&utm_medium=web2x&context=3";

assert_eq!(format(url, 20), "reddit.com/…/ifyqsq…");
assert_eq!(format(url, 30), "www.reddit.com/r/…/ifyqsqt/?u…");
assert_eq!(format(url, 50), "www.reddit.com/r/programming/comments/…/ifyqsqt/?…");
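
Because the formatter decodes the URL before eliding it, percent-escapes come out human-readable as well. Below is a minimal sketch that prints one URL at several widths; the URL and lengths are made up for illustration, and no output is asserted since the exact elision points depend on the algorithm.

use mdurl::format_url_for_humans as format;

let url = "https://en.wikipedia.org/wiki/Dvo%C5%99%C3%A1k_(disambiguation)?oldid=1";
for max in [20, 40, 60] {
    // each call returns a string fitted to at most `max` characters
    println!("{:>2}: {}", max, format(url, max));
}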

Check out this demo to play around with different URLs and lengths.

The humanize-url crate tries to achieve similar goals; let me know if there are others.

URL parser

To accomplish the task above, a new URL parser had to be written, so here it is:

let url = "https://www.reddit.com/r/programming/comments/vxttiq/\
comment/ifyqsqt/?utm_source=reddit&utm_medium=web2x&context=3";
let u = mdurl::parse_url(url);

assert_eq!(u.hostname, Some("www.reddit.com".into()));
assert_eq!(u.pathname, Some("/r/programming/comments/vxttiq/comment/ifyqsqt/".into()));
assert_eq!(u.search, Some("?utm_source=reddit&utm_medium=web2x&context=3".into()));

This function uses a non-standard parsing algorithm derived from the Node.js legacy URL parser.
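
Since the parser is meant for rendering rather than validation, it does not return a Result: malformed input should still yield whatever parts can be recognized. A minimal sketch under that assumption (the exact field values for this input are not asserted):

// the input below is intentionally broken
let u = mdurl::parse_url("http://exa mple.com:99999/%%%");
// no panic and no error type; parts that can't be recognized are left unset
println!("hostname = {:?}", u.hostname);
println!("pathname = {:?}", u.pathname);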

You should probably be using the rust-url crate instead. Unfortunately, it isn't suitable for pretty-printing URLs because you can't customize the parts of the Url returned by that library (for example, rust-url always encodes a non-ASCII hostname with Punycode, whereas this implementation does not).
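
To see the difference, here is a sketch (not from the original README; the Cyrillic hostname is only an example, and the `url` crate would be an extra dependency here):

// rust-url normalizes a non-ASCII host to its ASCII (Punycode) form
let standard = url::Url::parse("https://пример.рф/path").unwrap();
println!("{:?}", standard.host_str()); // prints the xn--… encoded host

// mdurl keeps the hostname as it appeared in the input,
// which is what you want when printing a URL back to a human
let human = mdurl::parse_url("https://пример.рф/path");
println!("{:?}", human.hostname);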
