#cbor #serialization #binary #no-std

minicbor (no-std, bin+lib)

A small CBOR codec suitable for no_std environments

42 releases

0.20.0 Sep 24, 2023
0.19.1 Mar 10, 2023
0.19.0 Dec 17, 2022
0.18.0 Jun 19, 2022
0.1.1 Mar 29, 2020

#61 in Encoding


81,792 downloads per month
Used in 211 crates (62 directly)




A small CBOR codec suitable for no_std environments.


Documentation is available at


This software is licensed under the Blue Oak Model License Version 1.0.0. If you are interested in contributing to this project, please read the file CONTRIBUTING.md first.



The crate is organised around the following entities:

  • Encoder and Decoder for type-directed encoding and decoding of values.

  • Encode and Decode traits which can be implemented for any type that should be encoded to or decoded from CBOR. They are similar to serde's Serialize and Deserialize traits but do not abstract over the encoder/decoder.

Encoding and decoding proceed in a type-directed way, i.e. by calling methods for the expected data item types, e.g. Decoder::u32 or Encoder::str. In addition, there is support for data type inspection: the Decoder can be queried for the current data type, which returns a data::Type that can represent every possible CBOR type, so decoding can proceed based on this information. It is also possible to just tokenize the input bytes using a Tokenizer, i.e. an Iterator over CBOR Tokens. Finally, the length in bytes of a value's CBOR representation can be calculated if the value's type implements the CborLen trait.
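Data type inspection is possible because CBOR is self-describing: per RFC 8949, the initial byte of every data item carries its major type in the top three bits. A minimal, crate-independent sketch of this framing (major_type is a hypothetical helper for illustration, not part of minicbor's API):

```rust
// Classify a CBOR data item by the major type in the top three bits of
// its initial byte (RFC 8949, section 3).
fn major_type(initial: u8) -> &'static str {
    match initial >> 5 {
        0 => "unsigned integer",
        1 => "negative integer",
        2 => "byte string",
        3 => "text string",
        4 => "array",
        5 => "map",
        6 => "tag",
        _ => "simple value / float",
    }
}

fn main() {
    // 0x83 begins an array of length 3; 0x74 a text string of length 20.
    assert_eq!(major_type(0x83), "array");
    assert_eq!(major_type(0x74), "text string");
}
```

This is essentially what inspecting the current data type via the Decoder gives you, except that data::Type also distinguishes widths and indefinite-length items.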

Optionally, Encode and Decode can be derived for structs and enums using the respective derive macros (requires feature "derive"). See minicbor_derive for details.

For I/O support see minicbor-io.

Feature flags

The following feature flags are supported:

  • "alloc": Enables most collection types in a no_std environment.

  • "std": Implies "alloc" and enables more functionality that depends on the std crate.

  • "derive": Allows deriving Encode and Decode traits.
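These flags are opt-in via Cargo. A minimal sketch of a dependency declaration (the version number follows the 0.20.0 release listed above; adjust to your needs):

```toml
[dependencies]
# "alloc" enables collection types without std; "derive" enables the
# Encode/Decode derive macros.
minicbor = { version = "0.20", features = ["alloc", "derive"] }
```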

Example: generic encoding and decoding

use minicbor::{Encode, Decode};

let input = ["hello", "world"];
let mut buffer = [0u8; 128];

minicbor::encode(&input, buffer.as_mut())?;
let output: [&str; 2] = minicbor::decode(buffer.as_ref())?;
assert_eq!(input, output);

Example: ad-hoc encoding

use minicbor::Encoder;

let mut buffer = [0u8; 128];
let mut encoder = Encoder::new(&mut buffer[..]);

encoder.begin_map()? // using an indefinite map here
    .str("hello")?.str("world")?
    .end()?;

Example: ad-hoc decoding

use minicbor::Decoder;
use minicbor::data::Tag;

let input = [
    0xc0, 0x74, 0x32, 0x30, 0x31, 0x33, 0x2d, 0x30,
    0x33, 0x2d, 0x32, 0x31, 0x54, 0x32, 0x30, 0x3a,
    0x30, 0x34, 0x3a, 0x30, 0x30, 0x5a
];

let mut decoder = Decoder::new(&input);
assert_eq!(Tag::DateTime, decoder.tag()?);
assert_eq!("2013-03-21T20:04:00Z", decoder.str()?);
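For reference, the wire framing that decoder.tag() and decoder.str() consume above can be checked by hand, since each initial byte packs a 3-bit major type and 5-bit additional info (RFC 8949):

```rust
fn main() {
    // The first two bytes of the input above.
    let input: [u8; 2] = [0xc0, 0x74];

    // 0xc0: major type 6 (tag) with additional info 0, i.e. tag number 0,
    // which RFC 8949 assigns to standard date/time strings.
    assert_eq!(input[0] >> 5, 6);
    assert_eq!(input[0] & 0x1f, 0);

    // 0x74: major type 3 (text string) of length 20 -- the 20 UTF-8 bytes
    // of "2013-03-21T20:04:00Z" follow.
    assert_eq!(input[1] >> 5, 3);
    assert_eq!(input[1] & 0x1f, 20);
}
```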

Example: tokenization

use minicbor::display;
use minicbor::decode::{Token, Tokenizer};

let input  = [0x83, 0x01, 0x9f, 0x02, 0x03, 0xff, 0x82, 0x04, 0x05];

assert_eq!("[1, [_ 2, 3], [4, 5]]", format!("{}", display(&input)));

let tokens = Tokenizer::new(&input).collect::<Result<Vec<Token>, _>>()?;

assert_eq! { &tokens[..],
    &[Token::Array(3),
      Token::U8(1),
      Token::BeginArray,
      Token::U8(2),
      Token::U8(3),
      Token::Break,
      Token::Array(2),
      Token::U8(4),
      Token::U8(5)]
};