OpenServe
OpenServe is a modern, high-performance, AI-enhanced file server built in Rust. It combines the speed and safety of Rust with intelligent features powered by OpenAI, providing a comprehensive solution for file management, search, and collaboration.
Features
Core Features
- High-Performance File Server: Built with Axum and Tokio for maximum performance
- RESTful API: Complete REST API for file operations
- WebSocket Support: Real-time file change notifications (see the example after this list)
- Authentication & Authorization: JWT-based auth with role-based access control
- Search Engine: Full-text search powered by Tantivy
- File Upload/Download: Secure file operations with validation
- Directory Management: Complete directory operations
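As a quick illustration of the WebSocket feature, a connection can be opened with any generic client such as websocat; the /ws path below is an assumption, not a documented endpoint:
# Connect and print file-change events as they arrive (the /ws path is assumed; check the API docs)
websocat ws://localhost:8080/ws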
AI-Powered Features
- Intelligent File Classification: Automatic file categorization
- Content Analysis: AI-powered content summarization and tagging
- Smart Search: Semantic search capabilities
- File Organization Suggestions: AI recommendations for file structure
- Chat Interface: Query your files using natural language
Security & Safety
- Path Traversal Protection: Secure file access controls
- Input Validation: Comprehensive request validation
- Rate Limiting: Built-in rate limiting for API endpoints
- CORS Support: Configurable cross-origin resource sharing
- TLS/SSL Support: HTTPS encryption support
Monitoring & Observability
- Structured Logging: JSON and text logging with tracing
- Metrics Collection: Prometheus metrics integration
- Health Checks: Built-in health monitoring
- Performance Monitoring: Request timing and performance metrics
DevOps Ready
- Docker Support: Multi-stage Dockerfile for production
- Docker Compose: Complete stack with Redis, Prometheus, Grafana
- Configuration Management: Environment-based configuration
- Automated Testing: Comprehensive test suite
- CI/CD Ready: GitHub Actions integration
Quick Start
Prerequisites
- Rust 1.75 or later
- Docker & Docker Compose (optional)
- OpenAI API Key (for AI features)
Installation
Option 1: Using Cargo
cargo install openserve
openserve --help
Option 2: From Source
git clone https://github.com/nikjois/openserve-rs.git
cd openserve-rs
cargo build --release
Option 3: Using Docker
git clone https://github.com/nikjois/openserve-rs.git
cd openserve-rs
docker-compose up -d
Basic Usage
# Start the server
openserve --port 8080 --host 0.0.0.0 /path/to/files
# With AI features enabled
openserve --ai --openai-api-key your-key-here /path/to/files
# With custom configuration
openserve --config config.yml /path/to/files
Configuration
Environment Variables
# Server Configuration
OPENSERVE_HOST=0.0.0.0
OPENSERVE_PORT=8080
OPENSERVE_SERVE_PATH=/app/files
# AI Configuration
OPENAI_API_KEY=your-api-key-here
OPENSERVE_AI_ENABLED=true
# Database Configuration
DATABASE_URL=sqlite:///app/data/openserve.db
REDIS_URL=redis://localhost:6379
# Logging
RUST_LOG=info
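As a minimal sketch, assuming these variables are read at startup when no CLI flags are given, a non-AI setup might look like:
# Minimal non-AI setup via environment variables (assumes env vars take effect without CLI flags)
export OPENSERVE_HOST=127.0.0.1
export OPENSERVE_PORT=8080
export OPENSERVE_SERVE_PATH=./files
export RUST_LOG=info
openserve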
Configuration File (config.yml)
server:
  host: "0.0.0.0"
  port: 8080
  serve_path: "./files"
  max_upload_size: 104857600  # 100MB
  enable_tls: false
ai:
  enabled: true
  api_key: "your-openai-api-key"
  model: "gpt-4o-mini"
  max_tokens: 2048
  temperature: 0.7
auth:
  enabled: true
  jwt_secret: "your-secret-key"
  session_timeout: 3600
  allow_registration: false
storage:
  database_url: "sqlite://./data.db"
  redis_url: "redis://localhost:6379"
  cache_size: 1000
  index_path: "./index"
telemetry:
  log_level: "info"
  log_format: "json"
  metrics_enabled: true
  tracing_enabled: false
API Reference
File Operations
# List directory contents
GET /api/files?path=/some/path
# Upload file
POST /api/files/upload
Content-Type: multipart/form-data
# Download file
GET /api/files/download?path=/file.txt
# Delete file
DELETE /api/files?path=/file.txt
# Get file metadata
GET /api/files/metadata?path=/file.txt
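Assuming the default host and port, the endpoints above can be exercised with curl; add an Authorization: Bearer header when auth is enabled, and note that the multipart field name below is an assumption:
# List the root of the served directory
curl "http://localhost:8080/api/files?path=/"
# Upload a file (the form field name "file" is an assumption)
curl -X POST -F "file=@report.pdf" http://localhost:8080/api/files/upload
# Download a file, keeping the server-supplied filename
curl -OJ "http://localhost:8080/api/files/download?path=/report.pdf"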
Search Operations
# Search files
GET /api/search?q=query&limit=10
# Semantic search
GET /api/search?q=query&semantic=true
# Get search statistics
GET /api/search/stats
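For example, the same query can be run as a plain keyword search and then with semantic ranking (query strings must be URL-encoded):
# Keyword search, limited to 10 hits
curl "http://localhost:8080/api/search?q=quarterly%20report&limit=10"
# Same query with semantic ranking enabled
curl "http://localhost:8080/api/search?q=quarterly%20report&semantic=true"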
AI Operations
# Analyze file content
POST /api/ai/analyze
{
  "path": "/document.txt",
  "options": {
    "summarize": true,
    "extract_entities": true,
    "generate_tags": true
  }
}
# Chat with files
POST /api/ai/chat
{
  "message": "What is this document about?",
  "context_paths": ["/document.txt"]
}
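The JSON bodies above can be posted directly; a sketch for the chat endpoint, assuming auth is enabled and a token is already in $TOKEN:
# Ask a question about a served document
curl -X POST http://localhost:8080/api/ai/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"message": "What is this document about?", "context_paths": ["/document.txt"]}'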
Authentication
# Login
POST /api/auth/login
{
  "username": "user",
  "password": "password"
}
# Register (if enabled)
POST /api/auth/register
{
  "username": "user",
  "email": "user@example.com",
  "password": "password"
}
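A login round trip with curl might look like the following; the name of the response field carrying the JWT is not documented here, so inspect the response before scripting against it:
# Obtain a session token (response shape is crate-specific; check the output)
curl -X POST http://localhost:8080/api/auth/login \
  -H "Content-Type: application/json" \
  -d '{"username": "user", "password": "password"}'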
Library Usage
OpenServe can also be used as a library in your Rust projects.
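To try this, first add the crate to your project; cargo add ships with recent Cargo releases (any feature flags the crate exposes are not covered here):
cargo add openserve
A minimal embedding then looks like this: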
use openserve::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = Config::default();
    let server = Server::new(config).await?;
    server.run().await?;
    Ok(())
}
Testing
# Run all tests
cargo test
# Run with coverage
cargo tarpaulin --out Html
# Run benchmarks
cargo bench
# Integration tests
cargo test --test integration
Development
Project Structure
openserve-rs/
├── src/
│   ├── ai/           # AI service integration
│   ├── config/       # Configuration management
│   ├── error/        # Error handling
│   ├── handlers/     # HTTP request handlers
│   ├── middleware/   # Custom middleware
│   ├── models/       # Data models
│   ├── services/     # Business logic
│   ├── utils/        # Utility functions
│   ├── lib.rs        # Library root
│   └── main.rs       # Application entry point
├── tests/            # Integration tests
├── benches/          # Benchmarks
├── docker/           # Docker configuration
└── docs/             # Documentation
Building from Source
# Development build
cargo build
# Release build
cargo build --release
# With all features
cargo build --all-features
# Cross-compilation
cargo build --target x86_64-unknown-linux-musl
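For the musl cross-compilation shown above, the toolchain target must be installed first:
# Install the musl target before cross-compiling
rustup target add x86_64-unknown-linux-musl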
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Deployment
Docker Deployment
# Build and deploy
docker-compose up -d
# Scale services
docker-compose up -d --scale openserve=3
# View logs
docker-compose logs -f openserve
Systemd Service
# Install service
sudo cp scripts/openserve.service /etc/systemd/system/
sudo systemctl enable openserve
sudo systemctl start openserve
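Once started, the unit can be checked with the usual systemd tooling:
# Verify the service and follow its logs
systemctl status openserve
journalctl -u openserve -f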
Monitoring
Metrics
- Request count and latency
- File operation statistics
- Search performance metrics
- AI service usage
- System resource usage
Dashboards
- Grafana dashboards included
- Prometheus metrics collection
- Real-time monitoring
- Alerting rules
Health Checks
# Health endpoint
curl http://localhost:8080/health
# Metrics endpoint
curl http://localhost:8080/metrics
# Ready endpoint
curl http://localhost:8080/ready
Troubleshooting
Common Issues
Port already in use
# Find process using port
lsof -i :8080
# Kill process
kill -9 <PID>
Permission denied
# Check file permissions
ls -la /path/to/files
# Fix permissions
chmod -R 755 /path/to/files
AI features not working
# Check API key
echo $OPENAI_API_KEY
# Test API connection
curl -H "Authorization: Bearer $OPENAI_API_KEY" https://api.openai.com/v1/models
Logs
# View logs
docker-compose logs -f openserve
# Increase log level
RUST_LOG=debug cargo run
# Structured logging
RUST_LOG=info,openserve=debug cargo run
Performance
OpenServe is designed for high performance:
- Built with async Rust using Tokio
- Efficient file I/O with memory mapping
- Connection pooling and caching
- Optimized search indexing
- Zero-copy operations where possible
Security
Security is a top priority:
- Path traversal protection
- Input validation and sanitization
- JWT token authentication
- Rate limiting
- CORS protection
- TLS/SSL support
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Issues: GitHub Issues
- Documentation: docs.rs/openserve
Acknowledgments
- Axum - Web framework
- Tantivy - Search engine
- OpenAI - AI capabilities
- Tokio - Async runtime
- Serde - Serialization
Made with Rust