
Mistral AI Rust Client


Rust client for the Mistral AI API.



Supported APIs

  • Chat without streaming
  • Chat with streaming
  • Embeddings
  • List models
  • Function Calling

Installation

You can install the library in your project using:

cargo add mistralai-client

Mistral API Key

You can get your Mistral API key here: https://docs.mistral.ai/#api-access.

As an environment variable

Just set the MISTRAL_API_KEY environment variable.
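For example, in a POSIX shell (the key value below is a placeholder):

```shell
# Set the API key for the current shell session (replace with your real key).
export MISTRAL_API_KEY="your_api_key"
```

To persist it across sessions, add the line to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`).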

As a client argument

use mistralai_client::v1::client::Client;

fn main() {
    let api_key = "your_api_key".to_string();

    let client = Client::new(Some(api_key), None, None, None);
}

Usage

Chat without streaming

use mistralai_client::v1::{
    chat_completion::{
        ChatCompletionMessage, ChatCompletionMessageRole, ChatCompletionRequest,
        ChatCompletionRequestOptions,
    },
    client::Client,
    constants::OPEN_MISTRAL_7B,
};

fn main() {
    // This example assumes you have set the `MISTRAL_API_KEY` environment variable.
    let client = Client::new(None, None, None, None);

    let model = OPEN_MISTRAL_7B.to_string();
    let messages = vec![ChatCompletionMessage {
        role: ChatCompletionMessageRole::user,
        content: "Just guess the next word: \"Eiffel ...\"?".to_string(),
    }];
    let options = ChatCompletionRequestOptions {
        temperature: Some(0.0),
        random_seed: Some(42),
        ..Default::default()
    };

    let chat_completion_request = ChatCompletionRequest::new(model, messages, Some(options));
    let result = client.chat(chat_completion_request).unwrap();
    println!("Assistant: {}", result.choices[0].message.content);
    // => "Assistant: Tower. [...]"
}

Chat with streaming

In progress.

Embeddings

In progress.

List models

In progress.