🪢 Langfuse Python SDK - Instrument your LLM app with decorators or low-level SDK and get detailed tracing/observability. Works with any LLM or framework
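For context, a minimal sketch of the decorator-based instrumentation this SDK describes (assuming Langfuse credentials are provided via environment variables; in older SDK versions the decorator is imported from langfuse.decorators):

```python
# Minimal tracing sketch with the Langfuse Python SDK's observe decorator.
# Assumes LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST are set
# in the environment; older SDK versions use `from langfuse.decorators import observe`.
from langfuse import observe


@observe()  # records this call as a trace; nested decorated calls become child spans
def summarize(text: str) -> str:
    # call any LLM or framework here; inputs and the return value are captured
    return text[:80]


if __name__ == "__main__":
    print(summarize("Langfuse traces every decorated function call."))
```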
🪢 Langfuse documentation - Langfuse is the open-source LLM Engineering Platform: observability, evals, prompt management, playground, and metrics to debug and improve LLM apps
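As an illustration of the prompt-management piece, a short sketch using the Python client (the prompt name "movie-critic" and its {{movie}} variable are hypothetical; credentials are read from the environment):

```python
# Prompt-management sketch: fetch a managed prompt from Langfuse and compile it.
# The prompt name "movie-critic" and its {{movie}} variable are hypothetical examples.
from langfuse import Langfuse

langfuse = Langfuse()  # reads LANGFUSE_* credentials from the environment
prompt = langfuse.get_prompt("movie-critic")  # fetches the currently deployed version
text = prompt.compile(movie="Dune")           # substitutes {{movie}} in the template
print(text)
```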
🪢 Langfuse JS/TS SDKs - Instrument your LLM app and get detailed tracing/observability. Works with any LLM or framework
Self-hosting Langfuse on Amazon ECS with Fargate using CDK Python
SynthGenAI - Package for Generating Synthetic Datasets using LLMs.
This repository provides resources and guidelines for integrating Open-WebUI with Langfuse to monitor and manage AI model usage statistics.
This Terraform module provides infrastructure components for deploying Langfuse v3 self-hosted on Amazon Web Services (AWS).
技術書典 #17 (Techbookfest 17) - Sample application used in the book 『俺たちと探究するLLM Observabilityアプリケーションのオブザーバビリティ』 (roughly, "Exploring LLM Observability with Us: Observability for LLM Applications")
Blueprint for running AWS Bedrock Multi-Agent AI collaboration with CDK, Streamlit and Langfuse
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
Modular FastAPI-based API gateway for OpenAI compatible APIs.
Monitoring and evaluating LLM apps with Langfuse. Presented at PyConZA 2024.
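A rough sketch of one evaluation pattern (not taken from the talk itself): attaching a score to a previously recorded trace with the Python client. The trace ID and score name below are placeholders, and newer SDK versions expose this as create_score rather than score:

```python
# Scoring sketch: attach an evaluation score to an existing trace.
# "some-trace-id" and "helpfulness" are placeholders; newer SDK versions
# name this method create_score instead of score.
from langfuse import Langfuse

langfuse = Langfuse()
langfuse.score(
    trace_id="some-trace-id",  # ID of a previously recorded trace
    name="helpfulness",        # evaluation metric name
    value=0.9,                 # numeric score, e.g. from an LLM-as-judge or human review
)
langfuse.flush()  # send buffered events before the process exits
```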
Langfuse integration for pydantic-ai
Ollama Proxy Server with extensions
Langfuse-Kong-Plugin: Langfuse is an open-source LLM Engineering Platform for logging traces, evals, prompt management, and metrics to debug and improve your LLM application.