Langfuse

Using Infron AI with Langfuse

Langfuse provides observability and analytics for LLM applications. Since Infron AI uses the OpenAI API schema, you can use Langfuse's native integration with the OpenAI SDK to automatically trace and monitor your Infron AI API calls.

Installation

pip install langfuse openai

Configuration

Set up your environment variables:

import os

# Set your Langfuse API keys
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
# EU region
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
# US region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com"

# Set your Infron AI API key
os.environ["OPENAI_API_KEY"] = "${API_KEY}"

Simple LLM Call

Since Infron AI provides an OpenAI-compatible API, you can use the Langfuse OpenAI SDK wrapper to automatically log Infron AI calls as generations in Langfuse:
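A minimal sketch of such a call is shown below. It assumes a hypothetical Infron AI base URL and model identifier; substitute the actual values from your Infron AI account. The only change from a plain OpenAI SDK call is the import: `langfuse.openai` is a drop-in replacement that logs each completion as a generation in Langfuse.

```python
# Import the Langfuse drop-in wrapper instead of the plain OpenAI SDK
from langfuse.openai import openai

# base_url is an assumption -- replace with the actual Infron AI endpoint
client = openai.OpenAI(
    base_url="https://api.infron.ai/v1",
)

completion = client.chat.completions.create(
    name="infron-ai-demo",  # optional: the trace name shown in Langfuse
    model="your-model-name",  # assumption: use an Infron AI model identifier
    messages=[
        {"role": "user", "content": "Why is observability important for LLM apps?"},
    ],
)
print(completion.choices[0].message.content)
```

After the call completes, the generation (including model, prompt, response, token usage, and latency) appears in your Langfuse project.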

Advanced Tracing with Nested Calls

Use the @observe() decorator to capture execution details of functions with nested LLM calls:
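A sketch of nested tracing is shown below, again assuming a hypothetical Infron AI base URL and model name. Each `@observe()`-decorated function becomes a span in the trace, and the wrapped OpenAI call inside it is logged as a nested generation. Note that the `observe` import path may differ between Langfuse SDK versions (`from langfuse.decorators import observe` in v2).

```python
from langfuse import observe
from langfuse.openai import openai  # drop-in wrapper that logs LLM calls

# base_url is an assumption -- replace with the actual Infron AI endpoint
client = openai.OpenAI(base_url="https://api.infron.ai/v1")

@observe()  # recorded as a nested span containing the LLM generation
def summarize(text: str) -> str:
    response = client.chat.completions.create(
        model="your-model-name",  # assumption: use an Infron AI model identifier
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return response.choices[0].message.content

@observe()  # recorded as the top-level trace
def pipeline(article: str) -> str:
    summary = summarize(article)
    return summary

print(pipeline("Langfuse traces nested function calls and LLM generations."))
```

In the Langfuse UI, this appears as one trace for `pipeline` with a child span for `summarize` and the Infron AI generation nested inside it.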
