
Linkify

Linkify is a Rust library to find links such as URLs and email addresses in plain text. It's smart about where a link ends, such as with trailing punctuation.


Introduction

Your reaction might be: "Do I need a library for this? Why not a regex?" Let's look at a few cases:

  • In http://example.com/. the link should not include the trailing dot
  • http://example.com/, should not include the trailing comma
  • (http://example.com/) should not include the parens

Seems simple enough. But then we also have these cases:

  • https://en.wikipedia.org/wiki/Link_(The_Legend_of_Zelda) should include the trailing paren
  • http://üñîçøðé.com/ä should also work for Unicode (including Emoji and Punycode)
  • <http://example.com/> should not include angle brackets

This library behaves as you'd expect in the above cases and many more. It uses a simple scan with linear runtime.

In addition to URLs, it can also find email addresses.

Demo 🧑‍🔬

Try it out online on the demo playground (Rust compiled to WebAssembly): https://robinst.github.io/linkify/

If you want to use it on the command line, try lychee. It uses linkify to extract all links and checks if they're valid, but it can also just print them like this:

$ echo 'Test https://example.org (and https://example.com)' | lychee --dump -
https://example.org/
https://example.com/

Usage

Basic usage:

use linkify::{LinkFinder, LinkKind};

let input = "Have you seen http://example.com?";
let finder = LinkFinder::new();
let links: Vec<_> = finder.links(input).collect();

assert_eq!(1, links.len());
let link = &links[0];

assert_eq!("http://example.com", link.as_str());
assert_eq!(14, link.start());
assert_eq!(32, link.end());
assert_eq!(&LinkKind::Url, link.kind());

Option to allow URLs without schemes:

use linkify::LinkFinder;

let input = "Look, no scheme: example.org/foo";
let mut finder = LinkFinder::new();

// By default a scheme is required; allow schemeless URLs too
finder.url_must_have_scheme(false);

let links: Vec<_> = finder.links(input).collect();
assert_eq!(links[0].as_str(), "example.org/foo");

Restrict the kinds of links:

use linkify::{LinkFinder, LinkKind};

let input = "http://example.com and foo@example.com";
let mut finder = LinkFinder::new();
finder.kinds(&[LinkKind::Email]);
let links: Vec<_> = finder.links(input).collect();

assert_eq!(1, links.len());
let link = &links[0];
assert_eq!("foo@example.com", link.as_str());
assert_eq!(&LinkKind::Email, link.kind());

See full documentation on docs.rs.

Conformance

This crate makes an effort to respect the relevant standards for URLs and email addresses.

At the same time, it does not guarantee that the returned links are valid. When in doubt, it returns a link rather than skipping it.

If you need to validate URLs, e.g. for checking TLDs, use another library on the returned links.

Contributing

Pull requests, issues and comments welcome! Make sure to add tests for new features and bug fixes.

License

Linkify is distributed under the terms of both the MIT license and the Apache License (Version 2.0). See LICENSE-APACHE and LICENSE-MIT for details. Opening a pull request is assumed to signal agreement with these licensing terms.
