feat: streaming Conversations API support (v1.2.0)
Add conversation_stream module with full streaming support for the Mistral Conversations API:

- ConversationEvent enum matching API SSE event types: ResponseStarted, MessageOutput (text deltas), FunctionCall, ResponseDone (with usage), ResponseError, tool execution, and agent handoff events
- parse_sse_line() handles the SSE format (skips event: lines, parses data: JSON, handles [DONE] and comments)
- accumulate() collects streaming events into a ConversationResponse
- create_conversation_stream_async() and append_conversation_stream_async() client methods
- Byte-boundary buffering in sse_to_conversation_events (handles JSON split across TCP frames)
- Integration tests hit the real Mistral API: create stream, append stream, stream/non-stream output equivalence
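The SSE handling described above can be sketched as follows. This is a minimal illustration, not the crate's actual code: the `SseLine` enum and the exact signature of `parse_sse_line` are assumptions; the commit only states that `event:` lines and comments are skipped, `data:` payloads are parsed, and the `[DONE]` sentinel is recognized.

```rust
/// Hypothetical classification of a single SSE line (illustrative only;
/// the real module deserializes the payload into ConversationEvent).
#[derive(Debug, PartialEq)]
enum SseLine {
    /// A `data:` payload carrying JSON to be deserialized.
    Event(String),
    /// The `[DONE]` sentinel terminating the stream.
    Done,
    /// `event:` lines, `:` comments/heartbeats, and blank lines.
    Ignored,
}

fn parse_sse_line(line: &str) -> SseLine {
    // Tolerate trailing CR/LF left over from network framing.
    let line = line.trim_end_matches(|c| c == '\r' || c == '\n');
    if let Some(data) = line.strip_prefix("data:") {
        let data = data.trim_start();
        if data == "[DONE]" {
            SseLine::Done
        } else {
            SseLine::Event(data.to_string())
        }
    } else {
        // `event: ...` lines, `: comment` heartbeats, and blanks are skipped;
        // the event type is assumed to be carried in the JSON payload itself.
        SseLine::Ignored
    }
}

fn main() {
    println!("{:?}", parse_sse_line("data: {\"type\":\"response.started\"}"));
    println!("{:?}", parse_sse_line(": keep-alive"));
    println!("{:?}", parse_sse_line("data: [DONE]"));
}
```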
Cargo.toml

@@ -2,7 +2,7 @@
 name = "mistralai-client"
 description = "Mistral AI API client library for Rust (unofficial)."
 license = "Apache-2.0"
-version = "1.1.0"
+version = "1.2.0"
 edition = "2021"
 rust-version = "1.76.0"
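The byte-boundary buffering the commit message attributes to sse_to_conversation_events can be illustrated with a small sketch. The helper name `drain_complete_lines` is hypothetical; the idea is only that incoming bytes are appended to a buffer and lines are emitted solely when a terminating newline has arrived, so JSON split across TCP frames is reassembled before parsing.

```rust
/// Hypothetical helper: append a network chunk to `buffer` and drain every
/// complete line (terminated by '\n'). A trailing partial line stays in the
/// buffer until the next chunk arrives, so split JSON is never parsed early.
fn drain_complete_lines(buffer: &mut Vec<u8>, chunk: &[u8]) -> Vec<String> {
    buffer.extend_from_slice(chunk);
    let mut lines = Vec::new();
    while let Some(pos) = buffer.iter().position(|&b| b == b'\n') {
        // Drain up to and including the newline, then strip it.
        let line: Vec<u8> = buffer.drain(..=pos).collect();
        lines.push(String::from_utf8_lossy(&line).trim_end().to_string());
    }
    lines
}

fn main() {
    let mut buf = Vec::new();
    // A single SSE data line split across two TCP frames:
    let first = drain_complete_lines(&mut buf, b"data: {\"type\":\"message");
    let second = drain_complete_lines(&mut buf, b".output.delta\"}\n");
    println!("{:?} then {:?}", first, second);
}
```

A real implementation would also have to split on the blank line that separates SSE events and cope with UTF-8 sequences cut mid-character, which this sketch sidesteps via `from_utf8_lossy`.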