
girb-ruby_llm

Japanese README

RubyLLM provider for girb (AI-powered IRB assistant).

This gem allows you to use multiple LLM providers (OpenAI, Anthropic, Google Gemini, Ollama, and more) through the RubyLLM unified API.

Installation

For Rails Projects

Add to your Gemfile:

group :development do
  gem 'girb-ruby_llm'
end

Then run:

bundle install

Create a .girbrc file in your project root:

# .girbrc
require 'girb-ruby_llm'

Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
end

Now rails console will automatically load girb!

For Non-Rails Projects

Install globally:

gem install girb girb-ruby_llm

Create a .girbrc file in your project directory:

# .girbrc
require 'girb-ruby_llm'

Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
end

Then use the girb command instead of irb.

Configuration

Set your API key or endpoint as an environment variable:

Cloud Providers

| Provider | Environment Variable |
|----------|----------------------|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Google Gemini | GEMINI_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
| Mistral | MISTRAL_API_KEY |
| OpenRouter | OPENROUTER_API_KEY |
| Perplexity | PERPLEXITY_API_KEY |
| xAI | XAI_API_KEY |
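
For example, to use Anthropic's Claude models, export the key before starting girb (the value below is a placeholder):

export ANTHROPIC_API_KEY=your-api-key
girb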

Local Providers

| Provider | Environment Variable |
|----------|----------------------|
| Ollama | OLLAMA_API_BASE |
| GPUStack | GPUSTACK_API_BASE |
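
Local providers are configured with a base URL rather than an API key. For example, for Ollama running on its default local endpoint (adjust the URL to your setup):

export OLLAMA_API_BASE=http://localhost:11434/v1
girb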

Additional Configuration

| Environment Variable | Description |
|----------------------|-------------|
| OPENAI_API_BASE | Custom OpenAI-compatible API endpoint |
| GEMINI_API_BASE | Custom Gemini API endpoint |
| GPUSTACK_API_KEY | GPUStack API key |
| BEDROCK_API_KEY | AWS Bedrock access key |
| BEDROCK_SECRET_KEY | AWS Bedrock secret key |
| BEDROCK_REGION | AWS Bedrock region |
| VERTEXAI_PROJECT_ID | Google Vertex AI project ID |
| VERTEXAI_LOCATION | Google Vertex AI location |
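
For example, AWS Bedrock needs all three of its variables set (the values below are placeholders; use your own credentials and region):

export BEDROCK_API_KEY=your-access-key
export BEDROCK_SECRET_KEY=your-secret-key
export BEDROCK_REGION=us-east-1
girb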

Examples

Using Google Gemini

# .girbrc
require 'girb-ruby_llm'

# Set GEMINI_API_KEY environment variable
Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
end

Using OpenAI

# .girbrc
require 'girb-ruby_llm'

# Set OPENAI_API_KEY environment variable
Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'gpt-5.2-2025-12-11')
end

Using Anthropic Claude

# .girbrc
require 'girb-ruby_llm'

# Set ANTHROPIC_API_KEY environment variable
Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'claude-opus-4-5')
end

Using Ollama (Local)

# .girbrc
require 'girb-ruby_llm'

# Set OLLAMA_API_BASE environment variable (e.g., http://localhost:11434/v1)
Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'llama3.2:latest')
end

Using OpenAI-compatible APIs (e.g., LM Studio, vLLM)

# .girbrc
require 'girb-ruby_llm'

# Set OPENAI_API_BASE and OPENAI_API_KEY environment variables
Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'your-model-name')
end
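
For example, for a local OpenAI-compatible server (the base URL below is LM Studio's usual default; adjust it for your server, and note that some local servers accept any non-empty key):

export OPENAI_API_BASE=http://localhost:1234/v1
export OPENAI_API_KEY=dummy-key
girb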

Advanced Configuration

# .girbrc
require 'girb-ruby_llm'

Girb.configure do |c|
  c.provider = Girb::Providers::RubyLlm.new(model: 'gemini-2.5-flash')
  c.debug = true  # Enable debug output
  c.custom_prompt = <<~PROMPT
    Always confirm before destructive operations.
  PROMPT
end

Note: RubyLLM::Models.refresh! is automatically called for local providers (Ollama, GPUStack).

Alternative: Environment Variable Configuration

For the girb command, you can also configure the provider via environment variables (used when no .girbrc is found):

export GIRB_PROVIDER=girb-ruby_llm
export GIRB_MODEL=gemini-2.5-flash
export GEMINI_API_KEY=your-api-key
girb

Supported Models

See RubyLLM Available Models for the full list of supported models.

License

MIT License