# openrouter-rs

A type-safe, async Rust SDK for the OpenRouter API - access 200+ AI models with ease.
## Features

- 🔒 Type Safety: Leverages Rust's type system for compile-time error prevention
- ⚡ Async/Await: Built on `tokio` for high-performance concurrent operations
- 🧠 Reasoning Tokens: First-class support for chain-of-thought reasoning
- 📡 Streaming: Real-time response streaming with `futures`
- 🏗️ Builder Pattern: Ergonomic client and request construction
- ⚙️ Smart Presets: Curated model groups for programming, reasoning, and free tiers
- 🎯 Complete Coverage: All OpenRouter API endpoints supported
## Installation

Add to your `Cargo.toml`:

```toml
[dependencies]
openrouter-rs = "0.4.5"
tokio = { version = "1", features = ["full"] }
```

## Quick Start

```rust
use openrouter_rs::{OpenRouterClient, api::chat::*, types::Role};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create the client
    let client = OpenRouterClient::builder()
        .api_key("your_api_key")
        .build()?;

    // Send a chat completion request
    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![
            Message::new(Role::User, "Explain Rust ownership in simple terms"),
        ])
        .build()?;

    let response = client.send_chat_completion(&request).await?;
    println!("{}", response.choices[0].content().unwrap_or(""));

    Ok(())
}
```

## Reasoning Tokens

Leverage chain-of-thought processing with reasoning tokens:

```rust
use openrouter_rs::types::Effort;
let request = ChatCompletionRequest::builder()
    .model("deepseek/deepseek-r1")
    .messages(vec![Message::new(Role::User, "What's bigger: 9.9 or 9.11?")])
    .reasoning_effort(Effort::High) // Enable deep reasoning
    .reasoning_max_tokens(2000)     // Control reasoning depth
    .build()?;
let response = client.send_chat_completion(&request).await?;
// Access both reasoning and final answer
println!("🧠 Reasoning: {}", response.choices[0].reasoning().unwrap_or(""));
println!("💡 Answer: {}", response.choices[0].content().unwrap_or(""));Process responses as they arrive:
use futures_util::StreamExt;
let stream = client.stream_chat_completion(&request).await?;
stream
    .filter_map(|event| async { event.ok() })
    .for_each(|chunk| async move {
        if let Some(content) = chunk.choices[0].content() {
            print!("{}", content); // Print as it arrives
        }
    })
    .await;
```

## Model Presets

Use curated model collections:

```rust
use openrouter_rs::config::OpenRouterConfig;
let config = OpenRouterConfig::default();
// Three built-in presets:
// • programming: Code generation and development
// • reasoning: Advanced problem-solving models
// • free: Free-tier models for experimentation
println!("Available models: {:?}", config.get_resolved_models());use openrouter_rs::error::OpenRouterError;
## Error Handling

```rust
use openrouter_rs::error::OpenRouterError;

match client.send_chat_completion(&request).await {
    Ok(_) => println!("Success!"),
    Err(OpenRouterError::ModerationError { reasons, .. }) => {
        eprintln!("Content flagged: {:?}", reasons);
    }
    Err(OpenRouterError::ApiError { code, message }) => {
        eprintln!("API error {}: {}", code, message);
    }
    Err(e) => eprintln!("Other error: {}", e),
}
```

## API Coverage

| Feature | Status | Module |
|---|---|---|
| Chat Completions | ✅ | `api::chat` |
| Text Completions | ✅ | `api::completion` |
| Reasoning Tokens | ✅ | `api::chat` |
| Streaming Responses | ✅ | `api::chat` |
| Model Information | ✅ | `api::models` |
| API Key Management | ✅ | `api::api_keys` |
| Credit Management | ✅ | `api::credits` |
| Authentication | ✅ | `api::auth` |
## Advanced Usage

### List Models by Category

```rust
use openrouter_rs::types::ModelCategory;

let models = client
    .list_models_by_category(ModelCategory::Programming)
    .await?;
println!("Found {} programming models", models.len());let client = OpenRouterClient::builder()
.api_key("your_key")
.http_referer("https://yourapp.com")
.x_title("My AI App")
.base_url("https://openrouter.ai/api/v1") // Custom endpoint
.build()?;let stream = client.stream_chat_completion(
&ChatCompletionRequest::builder()
.model("anthropic/claude-sonnet-4")
.messages(vec![Message::new(Role::User, "Solve this step by step: 2x + 5 = 13")])
.reasoning_effort(Effort::High)
.build()?
).await?;
// A `for_each` closure can't mutably borrow outer buffers across an `.await`,
// so thread the two buffers through the stream with `fold` instead.
let (reasoning_buffer, content_buffer) = stream
    .filter_map(|event| async { event.ok() })
    .fold(
        (String::new(), String::new()),
        |(mut reasoning_buffer, mut content_buffer), chunk| async move {
            if let Some(reasoning) = chunk.choices[0].reasoning() {
                reasoning_buffer.push_str(reasoning);
                print!("🧠"); // Show reasoning progress
            }
            if let Some(content) = chunk.choices[0].content() {
                content_buffer.push_str(content);
                print!("💬"); // Show content progress
            }
            (reasoning_buffer, content_buffer)
        },
    )
    .await;
println!("\n🧠 Reasoning: {}", reasoning_buffer);
println!("💡 Answer: {}", content_buffer);- 📖 API Documentation - Complete API reference
- 🎯 Examples Repository - Comprehensive usage examples
- 🔧 Configuration Guide - Model presets and configuration
- ⚡ OpenRouter API Docs - Official OpenRouter documentation
## Running the Examples

```bash
# Set your API key
export OPENROUTER_API_KEY="your_key_here"
# Basic chat completion
cargo run --example send_chat_completion
# Reasoning tokens demo
cargo run --example chat_with_reasoning
# Streaming responses
cargo run --example stream_chat_completion
# Run with reasoning
cargo run --example stream_chat_with_reasoning
```

## Bug Reports

Please open an issue with:
- Your Rust version (`rustc --version`)
- SDK version you're using
- Minimal code example
- Expected vs actual behavior
## Feature Requests

We love hearing your ideas! Start a discussion to:
- Suggest new features
- Share use cases
- Get help with implementation
## Contributing

Contributions are welcome! Please see our contributing guidelines:
1. Fork the repository
2. Create a feature branch
3. Add tests for new functionality
4. Follow the existing code style
5. Submit a pull request
## Support the Project

If this SDK helps your project, consider:
- ⭐ Starring the repository
- 🐦 Sharing on social media
- 📝 Writing about your experience
- 🤝 Contributing improvements
## Requirements

- Rust: 1.85+ (2024 edition)
- Tokio: 1.0+ (for async runtime)
- OpenRouter API Key: Get yours here
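A minimal sketch of wiring the API key from the environment into the client (assuming `OPENROUTER_API_KEY` is set, as in the examples above):

```rust
use openrouter_rs::OpenRouterClient;

// Read the key from the environment instead of hard-coding it.
let api_key = std::env::var("OPENROUTER_API_KEY")?;
let client = OpenRouterClient::builder()
    .api_key(&api_key)
    .build()?;
```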
## Roadmap

- WebSocket Support - Real-time bidirectional communication
- Retry Strategies - Automatic retry with exponential backoff (see the sketch after this list)
- Caching Layer - Response caching for improved performance
- CLI Tool - Command-line interface for quick testing
- Middleware System - Request/response interceptors
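Until built-in retries land, a caller-side helper like the following works today. This is a sketch, not SDK API; the `ChatCompletionResponse` type name is an assumption, so adjust it to whatever `send_chat_completion` actually returns:

```rust
use std::time::Duration;
use openrouter_rs::{OpenRouterClient, api::chat::*, error::OpenRouterError};

// Hypothetical helper: retry a chat completion with exponential backoff.
// Assumes `max_attempts >= 1` and that the response type is named
// `ChatCompletionResponse` (adjust to the real type if it differs).
async fn send_with_retry(
    client: &OpenRouterClient,
    request: &ChatCompletionRequest,
    max_attempts: u32,
) -> Result<ChatCompletionResponse, OpenRouterError> {
    let mut delay = Duration::from_millis(250);
    for attempt in 1..=max_attempts {
        match client.send_chat_completion(request).await {
            Ok(response) => return Ok(response),
            Err(e) if attempt < max_attempts => {
                eprintln!("Attempt {} failed: {}; retrying in {:?}", attempt, e, delay);
                tokio::time::sleep(delay).await;
                delay *= 2; // double the backoff after each failure
            }
            Err(e) => return Err(e),
        }
    }
    unreachable!("the final attempt always returns above")
}
```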
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Disclaimer

This is a third-party SDK not officially affiliated with OpenRouter. Use at your own discretion.
## Changelog

- 🧠 New: Complete reasoning tokens implementation with chain-of-thought support
- ⚙️ Updated: Model presets restructured into `programming`, `reasoning`, and `free` categories
- 📚 Enhanced: Professional-grade documentation with comprehensive examples
- 🏗️ Improved: Configuration system with better model management
- Added: Support for listing models by supported parameters
- Note: OpenRouter API limitations on simultaneous category and parameter filtering
- Added: Support for listing models by category
- Thanks to OpenRouter team for the API enhancement!
Made with ❤️ for the Rust community
⭐ Star us on GitHub | 📦 Find us on Crates.io | 📚 Read the Docs