13 releases (6 stable)

1.3.1 Apr 26, 2024
1.2.1 Feb 10, 2024
1.0.1 Jan 2, 2024
0.5.3 Dec 31, 2023
0.1.0 Dec 20, 2023

#340 in Network programming

68 downloads per month

Custom license

35KB
438 lines

girolle

Description

A nameko-rpc-like library in Rust. Check the To-Do section to see the current limitations.

Do not use in production!

Girolle uses the Nameko architecture to send requests and receive responses.

Documentation

User documentation and Rust documentation

Installation

cargo add girolle

Stack

Girolle uses lapin as its AMQP client/server library.

Setup

You need to set these environment variables:

  • RABBITMQ_USER: The RabbitMQ user
  • RABBITMQ_PASSWORD: The RabbitMQ password
  • RABBITMQ_HOST: The RabbitMQ host address
  • Optional: RABBITMQ_PORT: The RabbitMQ port (default: 5672)
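
Girolle reads these variables from the process environment at startup. As a quick sanity check before starting a service, a small stand-alone sketch like the one below (plain std::env, not a girolle API) can confirm they are set:

use std::env;

fn main() {
    // Required variables read by girolle
    for var in ["RABBITMQ_USER", "RABBITMQ_PASSWORD", "RABBITMQ_HOST"] {
        if env::var(var).is_err() {
            eprintln!("{var} is not set");
        }
    }
    // Optional variable, falling back to the default AMQP port
    let port = env::var("RABBITMQ_PORT").unwrap_or_else(|_| "5672".to_string());
    println!("RABBITMQ_PORT = {port}");
}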

How to use it

The core concept is to remove the pain of queue creation and reply by mimicking the Nameko architecture, and to use the abstract type serde_json::Value to manipulate serializable data.

If you do not use the #[girolle] macro, you need to create a function that extracts the data from a &[Value], like this:

fn fibonacci_reccursive(s: &[Value]) -> Result<Value> {
    // Deserialize the first argument from JSON into a u64
    let n: u64 = serde_json::from_value(s[0].clone())?;
    // Compute the result and re-serialize it into a Value
    let result: Value = serde_json::to_value(fibonacci(n))?;
    Ok(result)
}
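
Such a hand-written handler can then be registered under a method name in the same way as the macro-generated ones. A minimal sketch, assuming RpcTask::new accepts any function with the fn(&[Value]) -> Result<Value> shape shown above:

use girolle::prelude::*;

fn main() {
    // Register the manual handler under the "fibonacci" method name
    let rpc_task = RpcTask::new("fibonacci", fibonacci_reccursive);
    let _ = RpcService::new("video").register(rpc_task).start();
}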

Example

Create a simple service

use girolle::prelude::*;

#[girolle]
fn hello(s: String) -> String {
    format!("Hello, {}!", s)
}

fn fibonacci(n: u64) -> u64 {
    if n <= 1 {
        return n;
    }
    return fibonacci(n - 1) + fibonacci(n - 2);
}

// Because the function is recursive, it needs to be wrapped in another function
#[girolle]
fn fib_wrap(n: u64) -> u64 {
    fibonacci(n)
}

fn main() {
    let rpc_task = RpcTask::new("hello", hello);
    let rpc_task_fib = RpcTask::new("fibonacci", fib_wrap);
    let _ = RpcService::new("video")
        .register(rpc_task)
        .register(rpc_task_fib)
        .start();
}

Create multiple sync and async calls to service methods

use std::vec;
use std::{thread, time};
use girolle::{JsonValue::Value, RpcClient};
use serde_json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create the rpc call struct
    let rpc_client = RpcClient::new();
    // Transform the number into a JsonValue
    let t: serde_json::Number = serde_json::from_str("30").unwrap();
    // Create the payload
    let new_payload = vec![t.into()];
    // Send the payload
    let new_result = rpc_client
        .send("video".to_string(), "fibonacci".to_string(), new_payload)
        .await?;
    let fib_result: u64 = serde_json::from_value(new_result).unwrap();
    // Print the result
    println!("fibonacci :{:?}", fib_result);
    assert_eq!(fib_result, 832040);
    // Create a future result
    let future_result = rpc_client.call_async(
        "video".to_string(),
        "hello".to_string(),
        vec![Value::String("Toto".to_string())],
    );
    // Send a message
    let result = rpc_client
        .send(
            "video".to_string(),
            "hello".to_string(),
            vec![Value::String("Girolle".to_string())],
        )
        .await?;
    // Print the result
    println!("{:?}", result);
    assert_eq!(
        result,
        Value::String("Hello, Girolle!, by nameko".to_string())
    );
    // Wait for the future result
    let mut consumer = future_result.await?;
    // Give the async call a moment to complete
    let two_sec = time::Duration::from_secs(2);
    thread::sleep(two_sec);
    // Print the result
    let async_result = rpc_client.result(&mut consumer).await;
    println!("{:?}", async_result);
    assert_eq!(
        async_result,
        Value::String("Hello, Toto!, by nameko".to_string())
    );
    Ok(())
}

To-Do

  • Handle errors
  • Write tests
  • Create a proxy service in Rust to interact with another nameko-rpc service
  • Add a macro to simplify the creation of a service
    • Add basic macro
    • Fix macro to handle return values
    • Fix macro to handle recursive functions
  • Listen to a pub/sub queue

Limitation

The current code has been tested with the nameko and girolle examples in this repository.

                   nameko_test.py   simple_senders
simple_service     x                x
nameko_service     x                x
simple_macro       x                x

Benchmark

The benchmark measures the overhead introduced by the macro.

[benchmark results figure]
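
A rough way to reproduce this kind of measurement is to compare the plain function with a Value-based wrapper of the same shape the macro generates, using criterion as a dev-dependency; the bench names below are illustrative assumptions, not part of girolle:

use criterion::{black_box, criterion_group, criterion_main, Criterion};
use serde_json::Value;

fn hello(s: String) -> String {
    format!("Hello, {}!", s)
}

// Same shape as the hand-written handler shown earlier:
// deserialize the argument, call the function, re-serialize the result
fn hello_handler(s: &[Value]) -> serde_json::Result<Value> {
    let name: String = serde_json::from_value(s[0].clone())?;
    serde_json::to_value(hello(name))
}

fn bench_wrapper_overhead(c: &mut Criterion) {
    // Direct call, no JSON conversion
    c.bench_function("hello_direct", |b| {
        b.iter(|| hello(black_box("Girolle".to_string())))
    });
    // Call through the Value-based wrapper, as the service would
    let args = vec![Value::String("Girolle".to_string())];
    c.bench_function("hello_wrapped", |b| {
        b.iter(|| hello_handler(black_box(args.as_slice())))
    });
}

criterion_group!(benches, bench_wrapper_overhead);
criterion_main!(benches);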

Dependencies

~10–21MB
~324K SLoC