(summary) add Langfuse observability for LLM API calls

Implement Langfuse tracing for LLM service calls, capturing prompts,
responses, latency, token usage, and errors. This enables monitoring
and debugging of AI model interactions for performance analysis and
cost optimization.
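The commit body describes a wrap-and-record pattern around each LLM call. The sketch below illustrates that pattern with a plain Python decorator; the `traces` list stands in for the Langfuse ingestion client, and all names and record fields are illustrative, not the actual Langfuse SDK API.

```python
import functools
import time

traces = []  # stand-in for the Langfuse client; illustrative only


def trace_llm_call(name):
    """Record prompt, response, token usage, latency, and errors
    for a wrapped LLM call. A generic sketch of the observability
    pattern this commit adds; not the real Langfuse API."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(prompt, **kwargs):
            record = {"name": name, "prompt": prompt, "error": None}
            start = time.perf_counter()
            try:
                response = fn(prompt, **kwargs)
                record["response"] = response.get("text")
                record["token_usage"] = response.get("usage")
                return response
            except Exception as exc:
                # Failures are captured in the trace, then re-raised
                record["error"] = repr(exc)
                raise
            finally:
                record["latency_ms"] = (time.perf_counter() - start) * 1000
                traces.append(record)
        return wrapper
    return decorator


@trace_llm_call("summarize")
def call_llm(prompt):
    # Hypothetical model call for illustration
    return {"text": "summary of: " + prompt, "usage": {"total_tokens": 12}}


call_llm("hello")
```

Because latency is measured in a `finally` block, both successful calls and raised exceptions produce a trace record, matching the commit's goal of capturing errors alongside normal responses.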
Author: lebaudantoine
Date: 2025-12-11 22:51:32 +01:00
Committed by: aleb_the_flash
parent c81ef38005
commit aff87d4953
4 changed files with 139 additions and 19 deletions


@@ -11,6 +11,7 @@ and this project adheres to
### Added
- ✨(backend) enable user creation via email for external integrations
- ✨(summary) add Langfuse observability for LLM API calls
## [1.0.1] - 2025-12-17