LangfuseSharp.Core 1.1.0

Install via your preferred package manager:

- .NET CLI: `dotnet add package LangfuseSharp.Core --version 1.1.0`
- Package Manager: `NuGet\Install-Package LangfuseSharp.Core -Version 1.1.0`
- PackageReference: `<PackageReference Include="LangfuseSharp.Core" Version="1.1.0" />`
- Central Package Management: `<PackageVersion Include="LangfuseSharp.Core" Version="1.1.0" />` with `<PackageReference Include="LangfuseSharp.Core" />`
- Paket: `paket add LangfuseSharp.Core --version 1.1.0`
- F# Interactive: `#r "nuget: LangfuseSharp.Core, 1.1.0"`
- File-based apps: `#:package LangfuseSharp.Core@1.1.0`
- Cake addin: `#addin nuget:?package=LangfuseSharp.Core&version=1.1.0`
- Cake tool: `#tool nuget:?package=LangfuseSharp.Core&version=1.1.0`
Langfuse .NET SDK (Unofficial)
Unofficial .NET SDK for Langfuse - the open-source LLM engineering platform.
Packages
| Package | Description | Install |
|---|---|---|
| LangfuseSharp.OpenTelemetry | Export OTEL traces to Langfuse | dotnet add package LangfuseSharp.OpenTelemetry |
| LangfuseSharp.Client | Prompt management with caching | dotnet add package LangfuseSharp.Client |
| LangfuseSharp.Core | Core library with shared types and configuration | Automatically included with other packages |
Note: `LangfuseSharp.Core` is a shared library used by other Langfuse packages. When you install `LangfuseSharp.Client` or `LangfuseSharp.OpenTelemetry`, `LangfuseSharp.Core` is automatically included as a dependency. You typically don't need to install it directly unless you're building custom integrations.
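For example, a project that only needs prompt management can reference `LangfuseSharp.Client` alone and let restore pull in `LangfuseSharp.Core` transitively. A minimal project-file sketch (the target framework and the `LangfuseSharp.Client` version shown are illustrative, not prescribed):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- LangfuseSharp.Core is restored transitively; no direct reference needed -->
    <PackageReference Include="LangfuseSharp.Client" Version="1.1.0" />
  </ItemGroup>
</Project>
```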
LangfuseSharp.OpenTelemetry
Export .NET OpenTelemetry traces to Langfuse. Works with any OTEL-instrumented library including Semantic Kernel.
Quick Start
1. Install
dotnet add package LangfuseSharp.OpenTelemetry
2. Set environment variables
```bash
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=https://cloud.langfuse.com     # EU region (default)
# LANGFUSE_BASE_URL=https://us.cloud.langfuse.com  # US region
```
3. Add to your app
```csharp
using Langfuse.OpenTelemetry;
using Microsoft.SemanticKernel;
using OpenTelemetry;
using OpenTelemetry.Trace;

// Enable GenAI diagnostics (prompts, tokens, completions)
AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);

// Setup OpenTelemetry with the Langfuse exporter
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("Microsoft.SemanticKernel*")
    .AddLangfuseExporter()
    .Build();

// Use Semantic Kernel as normal
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o-mini", apiKey)
    .Build();

var result = await kernel.InvokePromptAsync("Hello!");
```
Configuration Options
```csharp
// Option 1: Environment variables (recommended)
.AddLangfuseExporter()

// Option 2: Manual configuration
.AddLangfuseExporter(options =>
{
    options.PublicKey = "pk-lf-...";
    options.SecretKey = "sk-lf-...";
    options.BaseUrl = "https://cloud.langfuse.com";
})

// Option 3: From IConfiguration (appsettings.json)
.AddLangfuseExporter(configuration)
```
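For Option 3, the bound settings would live in `appsettings.json`. A sketch of what that file might look like — the section name and key names (`Langfuse`, `PublicKey`, `SecretKey`, `BaseUrl`) are assumed here from the option names above, not confirmed against the SDK's binder:

```json
{
  "Langfuse": {
    "PublicKey": "pk-lf-...",
    "SecretKey": "sk-lf-...",
    "BaseUrl": "https://cloud.langfuse.com"
  }
}
```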
LangfuseSharp.Client
Access Langfuse features like Prompt Management directly from .NET with built-in caching.
Quick Start
1. Install
dotnet add package LangfuseSharp.Client
2. Set environment variables
```bash
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=https://cloud.langfuse.com
```
3. Use prompts
```csharp
using Langfuse.Client;

var client = new LangfuseClient();

// Fetch a text prompt (cached for 60s by default)
var prompt = await client.GetPromptAsync("movie-critic");

// Compile with variables
var compiled = prompt.Compile(new Dictionary<string, string>
{
    ["criticlevel"] = "expert",
    ["movie"] = "Dune 2"
});
// -> "As an expert movie critic, do you like Dune 2?"

// Fetch a chat prompt
var chatPrompt = await client.GetChatPromptAsync("movie-critic-chat");
var messages = chatPrompt.Compile(("criticlevel", "expert"), ("movie", "Dune 2"));
// -> [{ role: "system", content: "..." }, { role: "user", content: "..." }]
```
Features
- Text & Chat prompts - Full support for both prompt types
- Variable compilation - `{{variable}}` syntax support
- Version/Label selection - Fetch specific versions or labels (production, staging)
- Client-side caching - 60s TTL by default, configurable
- Fallback prompts - Graceful degradation when the API fails
- Config access - Access prompt config (model, temperature, etc.)
```csharp
// Get a specific version
var v1 = await client.GetPromptAsync("my-prompt", version: 1);

// Get by label
var staging = await client.GetPromptAsync("my-prompt", label: "staging");

// With fallback
var fallback = TextPrompt.CreateFallback("default", "Fallback prompt text");
var prompt = await client.GetPromptAsync("my-prompt", fallback: fallback);

// Access config
var model = prompt.GetConfigValue<string>("model");
var temperature = prompt.GetConfigValue<double>("temperature", 0.7);
```
Documentation
- Testing Guide - How to run tests
- Features - Implemented features with Langfuse docs links
- Contributing - How to contribute
Running the Sample
```bash
# Set environment variables
export OPENAI_API_KEY="sk-..."
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"

# Run the sample
cd samples/SemanticKernel.Sample
dotnet run
```
Check your Langfuse dashboard to see the traces.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Links
- Langfuse - Open-source LLM engineering platform
- Langfuse Docs - Official documentation
- OpenTelemetry Integration - OTEL docs
- Prompt Management - Prompts docs
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net8.0 and net9.0 are compatible. net10.0 and the platform-specific targets (android, browser, ios, maccatalyst, macos, tvos, and windows variants of net8.0, net9.0, and net10.0) were computed. |
Dependencies

net8.0
- Microsoft.Extensions.Configuration.Abstractions (>= 8.0.0)
- Microsoft.Extensions.Configuration.Binder (>= 8.0.0)
- Microsoft.Extensions.Logging.Abstractions (>= 8.0.2)

net9.0
- Microsoft.Extensions.Configuration.Abstractions (>= 8.0.0)
- Microsoft.Extensions.Configuration.Binder (>= 8.0.0)
- Microsoft.Extensions.Logging.Abstractions (>= 8.0.2)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on LangfuseSharp.Core:
| Package | Description |
|---|---|
| LangfuseSharp.Client | Unofficial .NET client for Langfuse. Provides access to Langfuse features like Prompt Management, with client-side caching. |
| LangfuseSharp.OpenTelemetry | A bridge between .NET OpenTelemetry (OTEL) and Langfuse. Automatically exports OTEL traces to Langfuse with a single line of code. |
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 1.1.0 | 241 | 12/5/2025 |