Files
mistralai-client-rs/Cargo.toml
Sienna Meridian Satterwhite 4c7f1cde0a feat: streaming Conversations API support (v1.2.0)
Add conversation_stream module with full streaming support for the
Mistral Conversations API:

- ConversationEvent enum matching API SSE event types: ResponseStarted,
  MessageOutput (text deltas), FunctionCall, ResponseDone (with usage),
  ResponseError, tool execution, agent handoff events
- parse_sse_line() handles SSE format (skips event: lines, parses
  data: JSON, handles [DONE] and comments)
- accumulate() collects streaming events into a ConversationResponse
- create_conversation_stream_async() and
  append_conversation_stream_async() client methods
- Byte-boundary buffering in sse_to_conversation_events: reassembles
  JSON payloads split across TCP segments
- Integration tests hit the real Mistral API: create stream, append
  stream, stream/non-stream output equivalence
2026-03-24 21:16:39 +00:00
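The line handling described above (skip `event:` lines, parse `data:` JSON, drop `[DONE]` and comments) can be sketched with a stdlib-only stand-in. The real parse_sse_line additionally deserializes the payload into a ConversationEvent via serde; the function name `sse_data_payload` below is hypothetical, not the crate's API:

```rust
/// Hypothetical stand-in for the line handling in parse_sse_line:
/// return the JSON payload of a `data:` line, or None for lines the
/// stream should skip (blanks, `:` comments, `event:` names, `[DONE]`).
fn sse_data_payload(line: &str) -> Option<&str> {
    let line = line.trim();
    if line.is_empty() || line.starts_with(':') || line.starts_with("event:") {
        return None;
    }
    let data = line.strip_prefix("data:")?.trim();
    if data == "[DONE]" {
        return None; // end-of-stream sentinel, not a JSON event
    }
    Some(data)
}
```

The real function would then feed the returned payload to serde_json and surface deserialization failures as errors rather than silently skipping them.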
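The byte-boundary buffering idea reduces to a small line buffer: network chunks arrive at arbitrary byte offsets, so the stream must accumulate bytes and only hand complete lines to the SSE parser. A minimal sketch of that pattern (the `LineBuffer` name and shape are illustrative, not the actual internals of sse_to_conversation_events):

```rust
/// Illustrative sketch of byte-boundary buffering: accumulate raw
/// chunks and emit only complete `\n`-terminated lines, so a JSON
/// payload split across TCP segments is reassembled before parsing.
struct LineBuffer {
    buf: Vec<u8>,
}

impl LineBuffer {
    fn new() -> Self {
        Self { buf: Vec::new() }
    }

    /// Feed one network chunk; return every line it completes.
    /// A trailing partial line stays buffered until the next chunk.
    fn push(&mut self, chunk: &[u8]) -> Vec<String> {
        self.buf.extend_from_slice(chunk);
        let mut lines = Vec::new();
        while let Some(pos) = self.buf.iter().position(|&b| b == b'\n') {
            let line: Vec<u8> = self.buf.drain(..=pos).collect();
            lines.push(String::from_utf8_lossy(&line).trim_end().to_string());
        }
        lines
    }
}
```

Each completed line would then go through the SSE line parser, with accumulate() folding the resulting events into a ConversationResponse.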


[package]
name = "mistralai-client"
description = "Mistral AI API client library for Rust (unofficial)."
license = "Apache-2.0"
version = "1.2.0"
edition = "2021"
rust-version = "1.76.0"
authors = ["Sunbeam Studios <hello@sunbeam.pt>"]
categories = ["api-bindings"]
homepage = "https://sunbeam.pt"
keywords = ["mistral", "mistralai", "client", "api", "llm"]
readme = "README.md"
repository = "https://src.sunbeam.pt/studio/mistralai-client-rs"
publish = ["sunbeam"]

[dependencies]
async-stream = "0.3"
async-trait = "0.1"
env_logger = "0.11"
futures = "0.3"
log = "0.4"
reqwest = { version = "0.12", features = ["json", "blocking", "stream", "multipart"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
thiserror = "2"
tokio = { version = "1", features = ["full"] }
tokio-stream = "0.1"

[dev-dependencies]
jrest = "0.2"