
Langfuse Ruby SDK


Ruby SDK for Langfuse - Open-source LLM observability and prompt management.

Installation

Add the gem to your Gemfile:

gem "langfuse-rb"

Quick Start

Langfuse.configure do |config|
  config.public_key = ENV["LANGFUSE_PUBLIC_KEY"]
  config.secret_key = ENV["LANGFUSE_SECRET_KEY"]
  config.base_url = ENV.fetch("LANGFUSE_BASE_URL", "https://cloud.langfuse.com")

  # Optional: sample traces and trace-linked scores deterministically
  config.sample_rate = 1.0
end

message = Langfuse.client.compile_prompt(
  "greeting",
  variables: { name: "Alice" }
)
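To illustrate what compile_prompt does with the variables: hash, here is a hedged, pure-Ruby sketch of Langfuse-style {{variable}} substitution. The compile helper below is a hypothetical stand-in for illustration only; the real SDK fetches the named template from the Langfuse server before compiling it.

```ruby
# Hedged sketch of {{variable}} template substitution, the style Langfuse
# prompts use. `compile` is a hypothetical local helper, not the SDK API.
def compile(template, variables)
  # Replace each {{name}} placeholder with the matching hash value.
  template.gsub(/\{\{\s*(\w+)\s*\}\}/) do
    variables[Regexp.last_match(1).to_sym].to_s
  end
end

compile("Hello, {{name}}!", name: "Alice")
# => "Hello, Alice!"
```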

Langfuse tracing is isolated by default. Langfuse.configure stores configuration only; it does not replace OpenTelemetry.tracer_provider.

sample_rate applies to traces and trace-linked scores. Sampling is resolved when the client is built, so call Langfuse.reset! to rebuild the client before expecting a runtime change to the rate to take effect.
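Deterministic sampling means the keep/drop decision is a stable function of the trace identity rather than a fresh random roll. The sketch below illustrates the general technique in plain Ruby; it is an assumption for illustration, not the SDK's actual hashing scheme.

```ruby
require "digest"

# Hedged sketch of deterministic sampling: hash the trace ID to a stable
# bucket in [0, 1) and keep the trace when the bucket falls under the
# configured rate. Same ID + same rate always yields the same decision.
def sampled?(trace_id, sample_rate)
  bucket = Digest::SHA256.hexdigest(trace_id)[0, 8].to_i(16) / 2.0**32
  bucket < sample_rate
end
```

With a rate of 1.0 every trace is kept, and with 0.0 none are; in between, each trace ID lands consistently on one side of the threshold across processes and restarts.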

Trace an LLM Call

Langfuse.observe("chat-completion", as_type: :generation) do |gen|
  gen.model = "gpt-4.1-mini"
  gen.input = [{ role: "user", content: "Hello!" }]

  response = openai_client.chat(
    parameters: {
      model: "gpt-4.1-mini",
      messages: [{ role: "user", content: "Hello!" }]
    }
  )

  gen.update(
    output: response.dig("choices", 0, "message", "content"),
    usage_details: {
      prompt_tokens: response.dig("usage", "prompt_tokens"),
      completion_tokens: response.dig("usage", "completion_tokens")
    }
  )
end


License

MIT