#speech-recognition #real-time #streaming #client-library #api-bindings

aristech-stt-client

A Rust client library for the Aristech Speech-to-Text API

7 stable releases

2.1.3 Dec 5, 2024
2.1.2 Dec 4, 2024
2.1.1 Nov 21, 2024
2.1.0 Oct 31, 2024
1.0.0 Oct 21, 2024

#334 in Audio


505 downloads per month

MIT license

27KB
196 lines

Aristech STT-Client for Rust

This is the Rust client implementation for the Aristech STT-Server.

Installation

To use the client in your project, add it to your Cargo.toml or use cargo to add it:

cargo add aristech-stt-client
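Equivalently, you can add the dependency to Cargo.toml by hand (the version shown here is the latest release listed above):

```toml
[dependencies]
aristech-stt-client = "2.1.3"
```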

Usage

use aristech_stt_client::{get_client, recognize_file, TlsOptions, Auth};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    let mut client = get_client(
        "https://stt.example.com",
        Some(TlsOptions {
            ca_certificate: None,
            auth: Some(Auth {
                token: "your-token",
                secret: "your-secret",
            }),
        }),
    )
    .await?;

    let results = recognize_file(&mut client, "path/to/audio/file.wav", None).await?;
    for result in results {
        println!(
            "{}",
            result
                .chunks
                .get(0)
                .unwrap()
                .alternatives
                .get(0)
                .unwrap()
                .text
        );
    }
    Ok(())
}
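The nested get(0).unwrap() calls above panic if a result has no chunks or no alternatives. A more defensive extraction can be sketched as follows; the stand-in types below only mirror the response shape used in the example and are not the crate's actual definitions:

```rust
// Hypothetical stand-in types mirroring the fields accessed in the
// usage example (chunks -> alternatives -> text); the real types come
// from the aristech-stt-client crate.
struct Alternative { text: String }
struct Chunk { alternatives: Vec<Alternative> }
struct RecognitionResult { chunks: Vec<Chunk> }

/// Collect the top alternative of each result, silently skipping
/// results with no chunks or no alternatives instead of panicking.
fn top_texts(results: &[RecognitionResult]) -> Vec<&str> {
    results
        .iter()
        .filter_map(|r| r.chunks.first())
        .filter_map(|c| c.alternatives.first())
        .map(|a| a.text.as_str())
        .collect()
}

fn main() {
    let results = vec![
        RecognitionResult {
            chunks: vec![Chunk {
                alternatives: vec![Alternative { text: "hello world".into() }],
            }],
        },
        // A result with no chunks is skipped rather than causing a panic.
        RecognitionResult { chunks: vec![] },
    ];
    assert_eq!(top_texts(&results), vec!["hello world"]);
}
```

The same filter_map chain works directly on the real result type, since it only relies on the fields the usage example already accesses.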

There are several examples in the examples directory:

  • file.rs: Demonstrates how to perform recognition on a file.
  • live.rs: Demonstrates how to perform live recognition using the microphone.
  • models.rs: Demonstrates how to get the available models from the server.
  • nlpFunctions.rs: Demonstrates how to list the configured NLP-Servers and the corresponding functions.
  • nlpProcess.rs: Demonstrates how to perform NLP processing on a text by using the STT-Server as a proxy.
  • account.rs: Demonstrates how to retrieve the account information from the server.

You can run the examples directly using cargo like this:

  1. Create a .env file in the rust directory:
HOST=stt.example.com
# The credentials are optional but probably required for most servers:
TOKEN=your-token
SECRET=your-secret

# The following are optional:
# ROOT_CERT=your-root-cert.pem # If the server uses a self-signed certificate
# If neither credentials nor an explicit root certificate are provided,
# you can still enable SSL by setting the SSL environment variable to true:
# SSL=true
# MODEL=some-available-model
# NLP_SERVER=some-config
# NLP_PIPELINE=function1,function2
  2. Run the examples, e.g.:
cargo run --example live
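The examples pick these settings up from the environment. As an illustrative sketch (not the crate's own configuration code, which may differ), reading HOST and SSL from the environment and building a server URL could look like this:

```rust
use std::env;

// Build a server URL from the HOST and SSL variables shown in the
// .env example above. The scheme choice here is an assumption for
// illustration; the bundled examples may handle this differently.
fn server_url(host: &str, ssl: bool) -> String {
    let scheme = if ssl { "https" } else { "http" };
    format!("{scheme}://{host}")
}

fn main() {
    let host = env::var("HOST").unwrap_or_else(|_| "stt.example.com".to_string());
    let ssl = env::var("SSL").map(|v| v == "true").unwrap_or(false);
    println!("{}", server_url(&host, ssl));
}
```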

Build

To build the library, run:

cargo build

Dependencies

~13–22MB
~403K SLoC