Sentry Agents

Sentry Gen AI instrumentation for AI/LLM agents in Ruby applications.

Provides Sentry's AI Agents monitoring capabilities for Ruby, supporting multiple LLM providers (Anthropic, OpenAI, Cohere, Google Gemini, etc.), with auto-instrumentation for RubyLLM and LangChain.rb.

Installation

Add this line to your application's Gemfile:

gem 'sentry-agents'

And then execute:

bundle install

Or install it yourself as:

gem install sentry-agents

Requirements

  • Ruby >= 3.1.0
  • sentry-ruby >= 5.0.0
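
The spans this gem creates are only sent to Sentry when tracing is enabled in sentry-ruby. A minimal setup might look like the following (standard sentry-ruby configuration, not specific to this gem):

# config/initializers/sentry.rb (Rails) or early in your app's boot
require "sentry-ruby"

Sentry.init do |config|
  config.dsn = ENV["SENTRY_DSN"]
  # Tracing must be enabled for gen_ai spans to be recorded
  config.traces_sample_rate = 1.0
end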

Configuration

Sentry::Agents.configure do |config|
  # Default LLM provider (default: "anthropic")
  config.default_system = "anthropic"

  # Maximum string length for serialized data (default: 1000)
  config.max_string_length = 1000

  # Enable debug logging (default: false)
  config.debug = false

  # Custom data filtering (optional)
  config.data_filter = ->(data) do
    # Remove sensitive keys in production
    data.delete("gen_ai.request.messages") if ENV["SENTRY_SKIP_MESSAGES"]
    data
  end
end

Usage

Manual Instrumentation

Include the Sentry::Agents::Instrumentation module in any class:

class MyAgent
  include Sentry::Agents::Instrumentation

  def process_request(user_message)
    with_agent_span(agent_name: "MyAgent", model: "claude-3-5-sonnet") do
      # Get LLM response
      response = with_chat_span(model: "claude-3-5-sonnet") do
        client.messages.create(
          model: "claude-3-5-sonnet-20241022",
          messages: [{ role: "user", content: user_message }]
        )
      end

      # Execute tool if needed
      if response.stop_reason == "tool_use"
        with_tool_span(
          tool_name: "search",
          tool_input: { query: response.tool_input["query"] }
        ) do
          search_service.search(response.tool_input["query"])
        end
      end

      # Track stage transition
      with_handoff_span(from_stage: "processing", to_stage: "complete") do
        update_status!(:complete)
      end

      response
    end
  end
end

Custom Provider Override

Override the default provider on a per-span basis:

class OpenAIAgent
  include Sentry::Agents::Instrumentation

  def process(message)
    with_chat_span(model: "gpt-4", system: "openai") do
      openai_client.chat(model: "gpt-4", messages: [message])
    end
  end
end

Span Types

Agent Invocation (gen_ai.invoke_agent)

Wraps the overall agent execution lifecycle.

with_agent_span(agent_name: "Emily", model: "claude-3-5-sonnet") do
  # Full agent conversation logic
end

Chat Completion (gen_ai.chat)

Wraps individual LLM API calls. Automatically captures:

  • Token usage (input/output tokens)
  • Response text

with_chat_span(model: "claude-3-5-sonnet", messages: conversation_history) do
  llm_client.chat(messages)
end

Tool Execution (gen_ai.execute_tool)

Wraps tool/function executions. Captures:

  • Tool name
  • Tool input
  • Tool output

with_tool_span(tool_name: "weather_lookup", tool_input: { city: "NYC" }) do
  weather_api.get_forecast("NYC")
end

Handoff (gen_ai.handoff)

Tracks stage transitions or agent handoffs.

with_handoff_span(from_stage: "greeting", to_stage: "qualification") do
  update_conversation_stage!
end

Graceful Degradation

All instrumentation methods gracefully degrade when Sentry is not available or tracing is disabled. Your code will continue to work normally without any errors.

# Works fine even without Sentry initialized
with_chat_span(model: "claude-3-5-sonnet") do
  llm_client.chat(messages)  # Still executes, just without tracing
end
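
This mirrors sentry-ruby's own behaviour: Sentry.with_child_span yields the wrapped block even when no transaction is active. As an illustrative sketch (not the gem's actual source), a degradable helper can be as small as:

def with_example_span(op:, description: nil, &block)
  # Sentry.with_child_span starts a child span when a transaction is active;
  # if Sentry is not initialized or tracing is disabled it simply yields,
  # so the wrapped code always runs.
  Sentry.with_child_span(op: op, description: description, &block)
end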

Development

After checking out the repo, run:

bundle install
rake test      # Run tests
rake rubocop   # Run linter
rake           # Run both

Releasing

Releases are automated via GitHub Actions. To publish a new version:

  1. Update the version in lib/sentry/agents/version.rb (see the sketch after this list)
  2. Update CHANGELOG.md with the new version's changes
  3. Commit the changes:
    git add -A && git commit -m "Bump version to X.Y.Z"
  4. Create and push a version tag:
    git tag vX.Y.Z
    git push origin main --tags
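
Step 1 touches lib/sentry/agents/version.rb; assuming the conventional gem layout, that file is just a nested constant, so the bump is a one-line change:

module Sentry
  module Agents
    VERSION = "X.Y.Z"
  end
end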

The release workflow will automatically:

  • Run the test suite
  • Build the gem
  • Publish to RubyGems
  • Create a GitHub Release with auto-generated release notes

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/ihoka/sentry-agents/.

Sponsors

This project is sponsored by:

License

The gem is available as open source under the terms of the MIT License.