2 releases

0.1.1 Mar 17, 2024
0.1.0 Jan 11, 2024

#323 in Machine learning


56 downloads per month

MIT OR Apache-2.0

250KB
5.5K SLoC

Oxidized Transformers

Oxidized Transformers is a Rust transformers library that started out as a port of Curated Transformers. The foundations are in place and some popular models are implemented, but the API is still too volatile to rely on in projects. Keep an eye on the repository, as development is currently moving fast.
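To experiment with the library despite its volatility, it can be added as a Cargo dependency. A minimal sketch, assuming the crate is published under the name `oxidized-transformers` (matching the project name) and using the latest release listed above; pinning the exact patch version is prudent while the API is changing:

```toml
# Cargo.toml — pin the exact 0.1.1 release to avoid breakage
# from future API changes while the crate is still volatile.
[dependencies]
oxidized-transformers = "=0.1.1"
```

Once the API stabilizes, the pin can be relaxed to a caret requirement (`"0.1"`) to pick up compatible updates automatically.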

🧰 Supported Model Architectures

Supported encoder-only models:

  • ALBERT
  • BERT
  • RoBERTa
  • XLM-RoBERTa

Supported decoder-only models:

  • GPT-NeoX
  • Llama 1/2

All supported models can be loaded from the Hugging Face Hub. Models in float16/bfloat16 can use FlashAttention v2 on recent CUDA GPUs.

Dependencies

~19–33MB
~518K SLoC