# Installation

Install the `vllora_llm` crate from crates.io using Cargo.
## Add to your project
Add the crate to your project with `cargo add`:

```bash
cargo add vllora_llm
```
Or add it manually to your `Cargo.toml`:

```toml
[dependencies]
vllora_llm = "0.1"
```
## API key configuration
By default, `VlloraLLMClient::new()` reads API keys from environment variables that follow the pattern `VLLORA_{PROVIDER_NAME}_API_KEY`. For OpenAI, for example, it looks for `VLLORA_OPENAI_API_KEY`.
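
The sketch below illustrates this default, environment-based setup. It assumes `VlloraLLMClient` is exported at the crate root and that `new()` returns the client directly rather than a `Result`; check the crate docs for the exact signature.

```rust
// Export the key in your shell first, following the pattern above, e.g.:
//   export VLLORA_OPENAI_API_KEY="..."

use vllora_llm::VlloraLLMClient; // assumed import path

fn main() {
    // new() reads VLLORA_OPENAI_API_KEY (and other provider keys) from the
    // environment, so no credentials are passed in code.
    let client = VlloraLLMClient::new();

    // `client` is now ready to send requests; see the quick start.
    let _ = client;
}
```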
You can also provide credentials directly when constructing the client (see the Quick Start for an example with explicit OpenAI credentials).
## Next steps
- Quick Start - Get started with your first request
- Usage Guide - Learn about gateway-native types, streaming, and supported parameters