Assistants Evaluation (Langfuse)

This guide explains how to install and configure Langfuse with Helm, using either an automated or a manual deployment method.

Overview

Langfuse is an open-source LLM observability platform that provides:

  • Tracing: Track and analyze LLM calls and their performance
  • Evaluation: Assess and score AI assistant responses
  • Analytics: Gain insights into usage patterns and costs
  • Debugging: Identify and troubleshoot issues in LLM applications

Deployment Options

This guide provides two deployment methods: automated and manual.

The automated method uses the deploy-langfuse.sh script to handle:

  • Kubernetes secret creation
  • Helm repository configuration
  • Langfuse deployment
  • Integration secret creation for CodeMie
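The manual method performs the same four steps by hand. The sketch below outlines what they might look like; the namespace, secret names, key names, and values file are illustrative assumptions, not the exact contents of deploy-langfuse.sh, so adapt them to your environment.

```shell
#!/usr/bin/env bash
# Hedged sketch of the steps deploy-langfuse.sh automates.
# All names below (namespace, secret names, keys) are assumptions.
set -euo pipefail

NAMESPACE="langfuse"

# 1. Kubernetes secret creation (illustrative keys)
kubectl create namespace "${NAMESPACE}" --dry-run=client -o yaml | kubectl apply -f -
kubectl create secret generic langfuse-secrets \
  --namespace "${NAMESPACE}" \
  --from-literal=NEXTAUTH_SECRET="$(openssl rand -base64 32)" \
  --from-literal=SALT="$(openssl rand -base64 32)"

# 2. Helm repository configuration
# (repo URL assumed to be the Langfuse community chart; verify before use)
helm repo add langfuse https://langfuse.github.io/langfuse-k8s
helm repo update

# 3. Langfuse deployment (values.yaml holds your chart overrides)
helm upgrade --install langfuse langfuse/langfuse \
  --namespace "${NAMESPACE}" \
  -f values.yaml

# 4. Integration secret for CodeMie (illustrative keys and placeholders)
kubectl create secret generic codemie-langfuse-integration \
  --namespace "${NAMESPACE}" \
  --from-literal=LANGFUSE_HOST="https://langfuse.example.com" \
  --from-literal=LANGFUSE_PUBLIC_KEY="pk-..." \
  --from-literal=LANGFUSE_SECRET_KEY="sk-..."
```

Running these commands requires a reachable Kubernetes cluster and Helm 3; the `--dry-run=client | kubectl apply` idiom makes the namespace creation idempotent so the script can be re-run safely.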

See Deployment for both automated and manual deployment options.

Documentation Structure

Follow these sections in order for a successful deployment:

  1. Prerequisites - Required tools and infrastructure
  2. System Requirements - Resource specifications and architecture
  3. Deployment Prerequisites - Configuration steps before deployment
  4. Deployment - Automated or manual deployment options
  5. Post-Deployment Configuration - Configure CodeMie integration
  6. Troubleshooting - Common issues and solutions

Next Steps

Start with Prerequisites to ensure your environment is ready for Langfuse deployment.