OpenAI Mock
OpenAI Mock is a tool designed to simulate the behavior of the OpenAI API for testing and development purposes without incurring costs or using actual API tokens.
Table of Contents
- Installation
- Usage
- Running Tests
- Contributing
- License
- Contact
Installation
To install and set up the OpenAI Mock project, follow these steps:
- Clone the Repository
  git clone https://github.com/socrates8300/openai-mock.git
  cd openai-mock
- Build the Project
  Ensure you have Rust installed. Then, build the project using Cargo:
  cargo build --release
- Run the Mock Server
  cargo run --release
  By default, the server runs on http://localhost:8000. You can customize the port using environment variables or configuration files as needed; one way to do this is sketched below.
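If you embed the mock in your own binary (as in Example 3 below), you control the bind address directly. The following is only a minimal sketch: it assumes the configure_completion_routes helper shown later in this README, and the MOCK_PORT environment variable is purely illustrative, not something the stock binary necessarily reads.

use actix_web::{App, HttpServer};
use openai_mock::routes::configure_completion_routes;

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // Illustrative only: read a port from an environment variable of your choosing,
    // falling back to the default 8000.
    let port: u16 = std::env::var("MOCK_PORT")
        .ok()
        .and_then(|p| p.parse().ok())
        .unwrap_or(8000);

    HttpServer::new(|| App::new().configure(configure_completion_routes))
        .bind(("127.0.0.1", port))?
        .run()
        .await
}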
Usage
Example 1: Basic Setup
This example demonstrates how to set up and run the OpenAI Mock server with default settings.
- Start the Server
  cargo run --release
- Send a Request to the Mock Endpoint
  You can use curl or any HTTP client to interact with the mock API (a Rust alternative is sketched after this example):
  curl -X POST http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{
      "model": "gpt-3.5-turbo",
      "prompt": "Hello, world!",
      "max_tokens": 50
    }'
- Expected Response
  {
    "id": "cmpl-mock-id-<uuid>",
    "object": "text_completion",
    "created": <timestamp>,
    "model": "gpt-3.5-turbo",
    "choices": [
      {
        "text": "Hello! How can I assist you today?",
        "index": 0,
        "logprobs": null,
        "finish_reason": "stop"
      }
    ],
    "usage": {
      "prompt_tokens": 2,
      "completion_tokens": 10,
      "total_tokens": 12
    }
  }
Example 2: Custom Responses
Customize the responses of the mock server to simulate different scenarios.
- Modify the Handler
  Update the completions_handler in src/handlers/completion_handler.rs to return a custom response based on the input prompt.
  // Inside the completions_handler function, where `prompt` holds the request's
  // prompt text extracted earlier in the handler:
  let response = if req.prompt.as_deref() == Some("test") {
      CompletionResponse {
          id: "cmpl-custom-id".to_string(),
          object: "text_completion".to_string(),
          created: get_current_timestamp().timestamp() as u64,
          model: req.model.clone(),
          choices: vec![
              Choice {
                  text: "This is a custom response for testing.".to_string(),
                  index: 0,
                  logprobs: None,
                  finish_reason: "stop".to_string(),
              }
          ],
          usage: Usage {
              prompt_tokens: count_tokens(&prompt),
              completion_tokens: 7,
              total_tokens: count_tokens(&prompt) + 7,
          },
      }
  } else {
      // Default mock response
      // ...
  };
- Restart the Server
  cargo run --release
- Send a Custom Request
  curl -X POST http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{
      "model": "gpt-3.5-turbo",
      "prompt": "test",
      "max_tokens": 50
    }'
- Expected Custom Response
  {
    "id": "cmpl-custom-id",
    "object": "text_completion",
    "created": <timestamp>,
    "model": "gpt-3.5-turbo",
    "choices": [
      {
        "text": "This is a custom response for testing.",
        "index": 0,
        "logprobs": null,
        "finish_reason": "stop"
      }
    ],
    "usage": {
      "prompt_tokens": 1,
      "completion_tokens": 7,
      "total_tokens": 8
    }
  }
Example 3: Integrating with Actix-Web
Integrate the OpenAI Mock library into an existing Actix-Web application.
- Add Dependency
  In your project's Cargo.toml, add openai-mock as a dependency:
  [dependencies]
  openai-mock = { path = "../openai-mock" }  # Adjust the path as necessary
  actix-web = "4"
  serde_json = "1.0"
- Configure Routes
  Update your src/main.rs or equivalent to include the mock routes.
  use actix_web::{App, HttpServer};
  use openai_mock::routes::configure_completion_routes;

  #[actix_web::main]
  async fn main() -> std::io::Result<()> {
      HttpServer::new(|| {
          App::new()
              .configure(configure_completion_routes)
      })
      .bind(("127.0.0.1", 8000))?
      .run()
      .await
  }
- Run Your Application
  cargo run
- Interact with the Mock API
  Send requests to your Actix-Web application as shown in Example 1.
Running Tests
OpenAI Mock includes a suite of tests to ensure its functionality. To run the tests:
cargo test
Example Test Case
Here's an example test case, located in src/tests.rs, that verifies the completions handler.
#[cfg(test)]
mod tests {
    use actix_web::{test, App};
    use crate::handlers::completions_handler;
    use crate::models::completion::CompletionRequest;
    use serde_json::json;

    #[actix_web::test]
    async fn test_completions_handler() {
        // Initialize the mock service
        let app = test::init_service(
            App::new().service(
                actix_web::web::resource("/v1/completions")
                    .route(actix_web::web::post().to(completions_handler)),
            ),
        )
        .await;

        // Create a sample CompletionRequest
        let req_payload = CompletionRequest {
            model: "gpt-3.5-turbo".to_string(),
            prompt: Some(json!("Hello, world!")),
            ..Default::default()
        };

        // Create POST request
        let req = test::TestRequest::post()
            .uri("/v1/completions")
            .set_json(&req_payload)
            .to_request();

        // Send request and get the response
        let resp = test::call_service(&app, req).await;

        // Assert the response status is 200 OK
        assert!(resp.status().is_success());

        // Parse the response body
        let response_body: serde_json::Value = test::read_body_json(resp).await;

        // Assert the response contains expected fields
        assert_eq!(response_body["model"], "gpt-3.5-turbo");
        assert!(response_body["choices"].is_array());
        // Add more assertions as needed
    }
}
Adding New Tests
To add new tests:
- Create a New Test Function
  Add a new #[actix_web::test] function in src/tests.rs, or create additional test modules as needed.
- Write Test Logic
  Utilize Actix-Web's testing utilities to initialize the service, send requests, and assert responses (a sketch of such a test follows this list).
- Run Tests
  cargo test
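For instance, a new test that exercises the custom "test" prompt from Example 2 might look like the following. This is only a sketch: it reuses the handler, request model, and imports from the example test above, and the assertion about a non-empty choices array assumes the default mock behavior shown earlier.

#[actix_web::test]
async fn test_completions_handler_custom_prompt() {
    // Same service setup as the example test above.
    let app = test::init_service(
        App::new().service(
            actix_web::web::resource("/v1/completions")
                .route(actix_web::web::post().to(completions_handler)),
        ),
    )
    .await;

    // Send the "test" prompt that the customized handler from Example 2 recognizes.
    let req_payload = CompletionRequest {
        model: "gpt-3.5-turbo".to_string(),
        prompt: Some(json!("test")),
        ..Default::default()
    };

    let req = test::TestRequest::post()
        .uri("/v1/completions")
        .set_json(&req_payload)
        .to_request();

    let resp = test::call_service(&app, req).await;
    assert!(resp.status().is_success());

    // The mock should always return at least one choice.
    let body: serde_json::Value = test::read_body_json(resp).await;
    assert!(!body["choices"].as_array().unwrap().is_empty());
}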
Contributing
Contributions are welcome! Please follow these steps to contribute:
- Fork the Repository
  Click the Fork button at the top right corner of the repository page.
- Clone Your Fork
  git clone https://github.com/your-username/openai-mock.git
  cd openai-mock
- Create a New Branch
  git checkout -b feature/your-feature-name
- Make Your Changes
  Implement your feature or bug fix.
- Commit Your Changes
  git commit -m "Add feature: your feature description"
- Push to Your Fork
  git push origin feature/your-feature-name
- Create a Pull Request
  Navigate to the original repository and click the "Compare & pull request" button to submit your changes for review.
License
This project is licensed under the MIT License.
Contact
For any questions or suggestions, please contact:
- James T. Ray
- Email: raymac@ievolution.com
- GitHub: socrates8300
Thank you for using OpenAI Mock! We hope it aids in your development and testing endeavors.