
LlmProviders


A lightweight, unified interface for multiple LLM providers. Only depends on faraday — no ActiveSupport required.

README in Japanese (日本語版 README)

Features

  • Lightweight — Single dependency (faraday), no ActiveSupport
  • Unified interface — Same API for all providers
  • Streaming — Real-time token streaming with block syntax
  • Tool calling — Consistent tool/function calling across providers
  • Token tracking — Usage stats (input, output, cached) in every response

Supported Providers

Provider     ENV Variable          Default Model
anthropic    ANTHROPIC_API_KEY     claude-sonnet-4-5-20250929
openai       OPENAI_API_KEY        gpt-5-mini
google       GOOGLE_API_KEY        gemini-2.5-flash
openrouter   OPENROUTER_API_KEY    anthropic/claude-sonnet-4.5

Installation

Add to your Gemfile:

gem "llm_providers"

Quick Start

require "llm_providers"

provider = LlmProviders::Providers.build(:anthropic)

# Synchronous
result = provider.chat(
  messages: [{ role: "user", content: "Hello!" }],
  system: "You are a helpful assistant."
)
puts result[:content]

# Streaming
provider.chat(messages: [{ role: "user", content: "Hello!" }]) do |chunk|
  print chunk[:content]
end

Usage

Configuration

LlmProviders.configure do |config|
  config.logger = Rails.logger  # or any Logger instance
end
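Any object that implements the stdlib `Logger` interface can be assigned. A minimal sketch for a non-Rails app (the timestamp format here is only an example, not something the gem requires):

```ruby
require "logger"
require "time"

# A plain stdlib Logger works outside Rails; the custom formatter is optional.
logger = Logger.new($stdout)
logger.level = Logger::INFO
logger.formatter = proc do |severity, time, _progname, msg|
  "#{time.utc.iso8601} #{severity} #{msg}\n"
end

# Then hand it to the gem:
# LlmProviders.configure { |config| config.logger = logger }
```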

Provider Options

provider = LlmProviders::Providers.build(
  :openai,
  model: "gpt-4.1",
  temperature: 0.7,
  max_tokens: 4096
)

Tool Calling

tools = [
  {
    name: "get_weather",
    description: "Get the current weather",
    parameters: {
      type: "object",
      properties: {
        location: { type: "string", description: "City name" }
      },
      required: ["location"]
    }
  }
]

result = provider.chat(
  messages: [{ role: "user", content: "What's the weather in Tokyo?" }],
  tools: tools
)

result[:tool_calls].each do |tc|
  puts "#{tc[:name]}: #{tc[:input]}"
end
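Executing a requested tool is application code, not part of the gem. A minimal dispatch sketch over the documented `tool_calls` shape; the handler table, the stub weather lookup, and the string keys inside `input` are all assumptions for illustration:

```ruby
# Hypothetical handlers keyed by tool name; get_weather is a stub.
TOOL_HANDLERS = {
  "get_weather" => ->(input) { "Sunny in #{input["location"]}" }
}

def run_tool_calls(tool_calls)
  tool_calls.map do |tc|
    handler = TOOL_HANDLERS.fetch(tc[:name]) do
      raise ArgumentError, "unknown tool: #{tc[:name]}"
    end
    { id: tc[:id], output: handler.call(tc[:input]) }
  end
end

results = run_tool_calls([
  { id: "call_1", name: "get_weather", input: { "location" => "Tokyo" } }
])
```

How the tool output is fed back into the next `chat` call depends on each provider's message format, which this README does not specify.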

Response Format

Every chat call returns a hash with:

{
  content: "Response text",
  tool_calls: [
    { id: "...", name: "...", input: {...} }
  ],
  usage: {
    input: 100,           # Input tokens
    output: 50,           # Output tokens
    cached_input: 80      # Cached input tokens (Anthropic only)
  },
  latency_ms: 1234,
  raw_response: {...}
}
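The `usage` hash makes per-call cost tracking straightforward. A sketch, with made-up per-million-token rates (substitute your provider's real pricing, and note that whether cached tokens are also counted in `input` is provider-specific):

```ruby
# Placeholder USD rates per 1M tokens, keyed by the usage hash's fields.
PRICING = { input: 3.00, output: 15.00, cached_input: 0.30 }

def estimate_cost(usage)
  usage.sum { |kind, tokens| tokens * PRICING.fetch(kind, 0.0) / 1_000_000.0 }
end

usage = { input: 100, output: 50, cached_input: 80 }
puts format("$%.6f", estimate_cost(usage))
```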

OpenRouter

OpenRouter gives you access to 300+ models through a single API — ideal for providers not directly supported by this gem (DeepSeek, Meta Llama, Mistral, Qwen, etc.).

provider = LlmProviders::Providers.build(
  :openrouter,
  model: "deepseek/deepseek-chat",
  app_name: "MyApp",
  app_url: "https://myapp.example.com",
  provider: { order: ["DeepSeek", "Together"], allow_fallbacks: true }
)

# List available models
models = LlmProviders::Providers::Openrouter.models
models.each { |m| puts "#{m[:id]} (ctx: #{m[:context_length]})" }
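The model list can be filtered like any array of hashes. A sketch using sample data in place of a live call; only the `:id` and `:context_length` fields shown above are assumed, and the context lengths below are illustrative, not real values:

```ruby
# Live call would be: models = LlmProviders::Providers::Openrouter.models
models = [
  { id: "deepseek/deepseek-chat",            context_length: 65_536 },
  { id: "meta-llama/llama-3.3-70b-instruct", context_length: 131_072 },
  { id: "mistralai/mistral-7b-instruct",     context_length: 32_768 }
]

# Keep only long-context models.
long_context = models.select { |m| m[:context_length] >= 100_000 }
                     .map { |m| m[:id] }
puts long_context
```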

Error Handling

begin
  result = provider.chat(messages: messages)
rescue LlmProviders::ProviderError => e
  puts "Error: #{e.message}"  # OpenRouter: "[DeepSeek] Model is unavailable"
  puts "Code: #{e.code}"      # e.g., "anthropic_error", "openrouter_error"
end
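Since `ProviderError` covers transient failures such as an unavailable model, a retry wrapper is a natural companion. A sketch with exponential backoff; `with_retries` is not part of the gem, and `StandardError` is only the default so the sketch runs standalone (pass `error_class: LlmProviders::ProviderError` in real use):

```ruby
# Retry a block on transient failures, doubling the delay each attempt.
def with_retries(attempts: 3, base_delay: 0.5, error_class: StandardError)
  tries = 0
  begin
    yield
  rescue error_class
    tries += 1
    raise if tries >= attempts
    sleep(base_delay * (2**(tries - 1)))
    retry
  end
end

# result = with_retries(error_class: LlmProviders::ProviderError) do
#   provider.chat(messages: messages)
# end
```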

Examples

# Interactive chat
ANTHROPIC_API_KEY=your-key ruby examples/simple_chat.rb

# One-shot
ANTHROPIC_API_KEY=your-key ruby examples/one_shot.rb "Hello!"

# With tools
ANTHROPIC_API_KEY=your-key ruby examples/with_tools.rb

# Other providers
OPENAI_API_KEY=your-key ruby examples/simple_chat.rb openai
GOOGLE_API_KEY=your-key ruby examples/simple_chat.rb google

# OpenRouter (DeepSeek, Llama, Mistral, etc.)
OPENROUTER_API_KEY=your-key ruby examples/simple_chat.rb openrouter deepseek/deepseek-chat
OPENROUTER_API_KEY=your-key ruby examples/with_tools.rb openrouter meta-llama/llama-3.3-70b-instruct

License

MIT