Mistral AI Rust Client
Rust client for the Mistral AI API.
Supported APIs
- Chat without streaming
- Chat without streaming (async)
- Chat with streaming
- Embedding
- Embedding (async)
- List models
- List models (async)
- Function Calling
- Function Calling (async)
Installation
You can install the library in your project using:
cargo add mistralai-client
Mistral API Key
You can get your Mistral API key here: https://docs.mistral.ai/#api-access.
As an environment variable
Set the MISTRAL_API_KEY environment variable; when you pass None as the API key argument, the client reads it from the environment.
use mistralai_client::v1::client::Client;
fn main() {
    // Reads the MISTRAL_API_KEY environment variable.
    let client = Client::new(None, None, None, None).unwrap();
}
MISTRAL_API_KEY=your_api_key cargo run
As a client argument
use mistralai_client::v1::client::Client;
fn main() {
let api_key = "your_api_key";
    let client = Client::new(Some(api_key.to_string()), None, None, None).unwrap();
}
Usage
Chat
examples/chat.rs
Chat (async)
examples/chat_async.rs
Chat with streaming (async)
examples/chat_with_streaming.rs
Chat with Function Calling
examples/chat_with_function_calling.rs
Chat with Function Calling (async)
examples/chat_with_function_calling_async.rs
Embeddings
examples/embeddings.rs
Embeddings (async)
examples/embeddings_async.rs
List models
examples/list_models.rs
List models (async)
examples/list_models_async.rs
Contributing
Please read CONTRIBUTING.md for details on how to contribute to this library.