#cbor #serialization #encoding #binary #no-std

minicbor (no-std, bin+lib)

A small CBOR codec suitable for no_std environments

35 releases (15 breaking)

Uses Rust 2021 edition

0.16.0 May 8, 2022
0.15.0 Mar 13, 2022
0.12.0 Nov 29, 2021
0.9.1 Jul 25, 2021
0.1.1 Mar 29, 2020

#51 in Encoding


24,863 downloads per month
Used in 119 crates (29 directly)


3.5K SLoC




Documentation is available at


This software is licensed under the Blue Oak Model License Version 1.0.0. If you are interested in contributing to this project, please read the file CONTRIBUTING.md first.



The crate is organised around the following entities:

  • [Encoder] and [Decoder] for type-directed encoding and decoding of values.

  • [Encode] and [Decode] traits which can be implemented for any type that should be encoded to or decoded from CBOR. They are similar to serde's Serialize and Deserialize traits but do not abstract over the encoder/decoder.

Encoding and decoding proceed in a type-directed way, i.e. by calling methods for the expected data item types, e.g. [Decoder::u32] or [Encoder::str]. In addition, there is support for data type inspection: the [Decoder] can be queried for the current data type, which returns a [data::Type] that can represent every possible CBOR type, so decoding can proceed based on this information. It is also possible to just tokenize the input bytes using a [Tokenizer], i.e. an Iterator over CBOR [Token]s.

Optionally, Encode and Decode can be derived for structs and enums using the respective derive macros (requires feature "derive"). See [minicbor_derive] for details.

For I/O support see minicbor-io.

Feature flags

The following feature flags are supported:

  • "alloc": Enables most collection types in a no_std environment.

  • "std": Implies "alloc" and enables more functionality that depends on the std crate.

  • "derive": Implies "alloc" and allows deriving [Encode] and [Decode] traits.

  • "partial-skip-support": Enables the method [Decoder::skip] to skip over any CBOR item other than indefinite-length arrays or maps inside of regular maps or arrays. Support for skipping over any CBOR item is enabled by "alloc" but without"alloc" or "partial-skip-support" Decoder::skip is not available at all.

  • "partial-derive-support": Implies "partial-skip-support" and allows deriving [Encode] and [Decode] traits, but does not support indefinite-length CBOR maps and arrays inside of regular CBOR maps and arrays.

Example: generic encoding and decoding

use minicbor::{Encode, Decode};

let input = ["hello", "world"];
let mut buffer = [0u8; 128];

minicbor::encode(&input, buffer.as_mut())?;
let output: [&str; 2] = minicbor::decode(buffer.as_ref())?;
assert_eq!(input, output);

# Ok::<_, Box<dyn std::error::Error>>(())

Example: ad-hoc encoding

use minicbor::Encoder;

let mut buffer = [0u8; 128];
let mut encoder = Encoder::new(&mut buffer[..]);

encoder.begin_map()? // using an indefinite map here
    .str("hello")?.str("world")?
    .end()?;

# Ok::<_, Box<dyn std::error::Error>>(())

Example: ad-hoc decoding

use minicbor::Decoder;
use minicbor::data::Tag;

let input = [
    0xc0, 0x74, 0x32, 0x30, 0x31, 0x33, 0x2d, 0x30,
    0x33, 0x2d, 0x32, 0x31, 0x54, 0x32, 0x30, 0x3a,
    0x30, 0x34, 0x3a, 0x30, 0x30, 0x5a
];

let mut decoder = Decoder::new(&input);
assert_eq!(Tag::DateTime, decoder.tag()?);
assert_eq!("2013-03-21T20:04:00Z", decoder.str()?);
# Ok::<_, Box<dyn std::error::Error>>(())

Example: tokenization

use minicbor::display;
use minicbor::decode::{Token, Tokenizer};

let input  = [0x83, 0x01, 0x9f, 0x02, 0x03, 0xff, 0x82, 0x04, 0x05];

assert_eq!("[1, [_ 2, 3], [4, 5]]", format!("{}", display(&input)));

let tokens = Tokenizer::new(&input).collect::<Result<Vec<Token>, _>>()?;

assert_eq! { &tokens[..],
    &[Token::Array(3),
      Token::U8(1),
      Token::BeginArray,
      Token::U8(2),
      Token::U8(3),
      Token::Break,
      Token::Array(2),
      Token::U8(4),
      Token::U8(5)]
}

# Ok::<_, Box<dyn std::error::Error>>(())