Mistral AI Rust Client
Rust client for the Mistral AI API.
Important
While this library is in v0, minor versions may introduce breaking changes.
Please refer to the CHANGELOG.md for details.
Supported APIs
- Chat without streaming
- Chat without streaming (async)
- Chat with streaming
- Embeddings
- Embeddings (async)
- List models
- List models (async)
- Function Calling
- Function Calling (async)
Installation
You can install the library in your project using:
cargo add mistralai-client
Mistral API Key
You can get your Mistral API Key here: https://docs.mistral.ai/#api-access.
As an environment variable
Just set the MISTRAL_API_KEY environment variable.
use mistralai_client::v1::client::Client;

fn main() {
    // Reads the API key from the `MISTRAL_API_KEY` environment variable.
    let client = Client::new(None, None, None, None).unwrap();
}
MISTRAL_API_KEY=your_api_key cargo run
As a client argument
use mistralai_client::v1::client::Client;

fn main() {
    let api_key = "your_api_key".to_string();
    let client = Client::new(Some(api_key), None, None, None).unwrap();
}
Usage
Chat
examples/chat.rs
Chat (async)
examples/chat_async.rs
Chat with streaming (async)
examples/chat_with_streaming.rs
Chat with Function Calling
examples/chat_with_function_calling.rs
Chat with Function Calling (async)
examples/chat_with_function_calling_async.rs
Embeddings
examples/embeddings.rs
Embeddings (async)
examples/embeddings_async.rs
List models
examples/list_models.rs
List models (async)
examples/list_models_async.rs
Contributing
Please read CONTRIBUTING.md for details on how to contribute to this library.