# ActionPrompt
ActionPrompt brings Rails-native conventions to Large Language Model (LLM) integration. Inspired by ActionMailer, it lets you define prompt classes with structured actions, write prompts as ERB templates, and deliver them to any LLM provider through a pluggable adapter system.
```ruby
# app/prompts/article_summarizer_prompt.rb
class ArticleSummarizerPrompt < ActionPrompt::Base
  default model: "gpt-4o-mini", temperature: 0.5

  def summarize(article)
    @article = article
    prompt
  end
end
```

```erb
<%# app/views/action_prompts/article_summarizer_prompt/summarize.text.erb %>
Summarize the following article in three concise bullet points.

Title: <%= @article.title %>

<%= @article.body %>
```

```ruby
# Anywhere in your app
response = ArticleSummarizerPrompt.summarize(article).deliver_now
```

## Table of Contents
- Installation
- Configuration
- Core Concepts
- Quick Start
- Generator
- Writing Templates
- Default Options
- Adapters
  - Writing an OpenAI Adapter
  - Writing a Google Gemini Adapter
- Testing
- Advanced Usage
- Project Structure
- Contributing
- License
## Installation
Add to your Gemfile:
```ruby
gem "action_prompt"
```

Then run:

```shell
bundle install
```

ActionPrompt requires Ruby 3.1+ and Rails 7.0+.
## Configuration
Create an initializer:
```ruby
# config/initializers/action_prompt.rb
ActionPrompt.configure do |config|
  # Set the delivery adapter (required for real LLM calls)
  config.adapter = MyOpenAIAdapter.new(api_key: ENV.fetch("OPENAI_API_KEY"))

  # Global defaults applied to every prompt unless overridden
  config.default_options = {
    model: "gpt-4o",
    temperature: 0.7
  }
end
```

In development and test, the gem ships with two built-in adapters so you can work without live API credentials:
| Adapter | Behaviour |
|---|---|
| `ActionPrompt::Adapters::Test` | Records deliveries in memory (default) |
| `ActionPrompt::Adapters::Null` | Silently discards all prompts, returns `nil` |
## Core Concepts
| Concept | ActionMailer analogy | ActionPrompt equivalent |
|---|---|---|
| Prompt class | Mailer class | `ApplicationPrompt < ActionPrompt::Base` |
| Action method | `welcome_email(user)` | `summarize(article)` |
| Template | `welcome_email.html.erb` | `summarize.text.erb` |
| Message object | `Mail::Message` | `ActionPrompt::Message` |
| Delivery | `deliver_now` | `deliver_now` |
| Adapter | `smtp_settings` | `config.adapter = MyAdapter.new` |
### Flow

```
YourPrompt.action(args)                 # class-level call
  └─► ActionPrompt::Base                # creates instance, calls action method
      └─► #prompt(options)              # sets options, returns Message
          └─► Message                   # lazily renders ERB template
              └─► #deliver_now
                  └─► Adapter#complete(body, options)  # → LLM → response
```
## Quick Start

### 1. Create the prompt class
```ruby
# app/prompts/article_summarizer_prompt.rb
class ArticleSummarizerPrompt < ActionPrompt::Base
  # Class-level defaults — override global config for this class only.
  default model: "gpt-4o-mini", temperature: 0.5

  # Each public method is a "prompt action".
  # Set instance variables here; they become available in the ERB template.
  def summarize(article)
    @article = article
    @word_limit = 150

    # Call `prompt` last. Pass per-action overrides if needed.
    prompt(model: "gpt-4o")
  end
end
```

### 2. Create the ERB template
```erb
<%# app/views/action_prompts/article_summarizer_prompt/summarize.text.erb %>
You are an expert editorial assistant. Summarize the article below in no more
than <%= @word_limit %> words, using three concise bullet points.

## Article

Title: <%= @article.title %>
Author: <%= @article.author.name %>
Published: <%= @article.published_at.strftime("%B %-d, %Y") %>

---

<%= @article.body %>
```

### 3. Deliver
```ruby
# In a controller, background job, or anywhere else:
article = Article.find(params[:id])
response = ArticleSummarizerPrompt.summarize(article).deliver_now

render json: { summary: response }
```

## Generator
The generator creates both the prompt class and the view template(s) in one command:
```shell
rails generate action_prompt NAME [action action ...]
```

### Example

```shell
rails generate action_prompt ArticleSummarizer summarize
```

Generates:
```
create  app/prompts/article_summarizer_prompt.rb
create  app/views/action_prompts/article_summarizer_prompt/summarize.text.erb
```
### Multiple actions

```shell
rails generate action_prompt Moderation classify flag summarize
```

Generates one class with three stubbed action methods and three view templates.
### Namespaced prompts

```shell
rails generate action_prompt Admin::Report generate
```

Generates:

```
create  app/prompts/admin/report_prompt.rb
create  app/views/action_prompts/admin/report_prompt/generate.text.erb
```
## Writing Templates
Templates live in `app/views/action_prompts/` and follow the naming convention:

```
app/views/action_prompts/<prompt_class_underscored>/<action_name>.text.erb
```
All instance variables set in the action method are available in the template. Standard ERB interpolation, conditionals, and partials all work.
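For instance, a house-style preamble shared by several prompts can be extracted into a partial and rendered from any template. This is a sketch: the `shared/_house_style.text.erb` partial and its lookup path are hypothetical, and the exact prefix to pass to `render` depends on how your view paths are configured.

```erb
<%# app/views/action_prompts/shared/_house_style.text.erb %>
Write in plain English. Avoid jargon and marketing language.
```

```erb
<%# Inside any prompt template %>
<%= render "action_prompts/shared/house_style" %>

Summarize the following content:
<%= @article.excerpt %>
```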
```erb
<%# app/views/action_prompts/article_summarizer_prompt/summarize.text.erb %>
You are a helpful assistant specialised in content summarisation.

<% if @article.paywalled? %>
Note: this article is paywalled; focus only on the excerpt provided.
<% end %>

Please summarise the following content:

<%= @article.excerpt %>
```

### System prompts
A common pattern is to split system and user content into separate template "sections" within a single file using a plain delimiter:
```erb
---SYSTEM---
You are an expert Ruby on Rails developer. Answer questions concisely.
---USER---
<%= @question %>
```

Your adapter can then parse the delimiter and map sections to the LLM's system / user message roles.
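A minimal parsing sketch for such an adapter, using the delimiter names from the example above (the `split_sections` helper is hypothetical, not part of the gem):

```ruby
# Splits a rendered prompt on ---ROLE--- delimiter lines and returns
# chat-style message hashes, e.g. for an OpenAI-compatible API.
def split_sections(prompt_text)
  # Splitting on a capture group keeps the role names in the result.
  sections = prompt_text.split(/^---(SYSTEM|USER)---$/)
  sections.shift if sections.first.to_s.strip.empty? # drop leading blank chunk

  sections.each_slice(2).map do |role, content|
    { role: role.downcase, content: content.strip }
  end
end

messages = split_sections(<<~PROMPT)
  ---SYSTEM---
  You are an expert Ruby on Rails developer. Answer questions concisely.
  ---USER---
  How do I add an index in a migration?
PROMPT
# messages => [{ role: "system", content: "You are an expert..." },
#              { role: "user",   content: "How do I add an index..." }]
```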
## Default Options
Options are merged in increasing precedence:

```
global config < class default < per-action option
```
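In plain Ruby terms, this is three successive `Hash#merge` calls with later hashes winning. A standalone sketch of the semantics, not the gem's internals:

```ruby
# Each layer of options, lowest priority first.
global_defaults = { model: "gpt-3.5-turbo", temperature: 0.9 }
class_defaults  = { model: "gpt-4o", max_tokens: 1024 }
action_options  = { temperature: 0.2 }

# Later merges override earlier keys; unset keys pass through.
effective = global_defaults.merge(class_defaults).merge(action_options)
# => { model: "gpt-4o", temperature: 0.2, max_tokens: 1024 }
```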
```ruby
# Global (lowest priority)
ActionPrompt.configure { |c| c.default_options = { model: "gpt-3.5-turbo", temperature: 0.9 } }

class MyPrompt < ActionPrompt::Base
  # Class default (overrides global)
  default model: "gpt-4o", max_tokens: 1024

  def draft(brief)
    @brief = brief

    # Per-action (overrides class default — highest priority)
    prompt(temperature: 0.2)
  end
end

# Effective options for draft():
# { model: "gpt-4o", temperature: 0.2, max_tokens: 1024 }
```

## Adapters
An adapter is any object that inherits from `ActionPrompt::Adapters::Base` and implements the `#complete` method:
```ruby
# @param prompt_text [String] the fully rendered prompt string
# @param options [Hash] merged LLM options (model, temperature, …)
# @return [String] the LLM response text
def complete(prompt_text, options = {})
  raise NotImplementedError
end
```

### Writing an OpenAI Adapter
```ruby
# lib/my_app/adapters/open_ai_adapter.rb
# Gemfile: gem "ruby-openai"
require "openai"

module MyApp
  module Adapters
    class OpenAIAdapter < ActionPrompt::Adapters::Base
      def initialize(api_key:)
        @client = OpenAI::Client.new(access_token: api_key)
      end

      def complete(prompt_text, options = {})
        response = @client.chat(
          parameters: {
            model: options.fetch(:model, "gpt-4o"),
            temperature: options.fetch(:temperature, 0.7),
            max_tokens: options[:max_tokens],
            messages: [{ role: "user", content: prompt_text }]
          }.compact
        )
        response.dig("choices", 0, "message", "content")
      rescue OpenAI::Error => e
        raise ActionPrompt::DeliveryError, "OpenAI error: #{e.message}"
      end
    end
  end
end
```

Register in your initializer:
```ruby
# config/initializers/action_prompt.rb
ActionPrompt.configure do |config|
  config.adapter = MyApp::Adapters::OpenAIAdapter.new(
    api_key: ENV.fetch("OPENAI_API_KEY")
  )
  config.default_options = { model: "gpt-4o", temperature: 0.7 }
end
```

### Writing a Google Gemini Adapter
```ruby
# lib/my_app/adapters/gemini_adapter.rb
# Gemfile: gem "gemini-ai"
require "gemini-ai"

module MyApp
  module Adapters
    class GeminiAdapter < ActionPrompt::Adapters::Base
      def initialize(api_key:, default_model: "gemini-1.5-pro")
        @default_model = default_model
        @api_key = api_key
      end

      def complete(prompt_text, options = {})
        model = options.fetch(:model, @default_model)
        client = Gemini.new(
          credentials: { service: "generative-language-api", api_key: @api_key },
          options: { model: model, server_sent_events: false }
        )

        result = client.generate_content(
          {
            contents: { role: "user", parts: { text: prompt_text } },
            generationConfig: {
              temperature: options[:temperature],
              maxOutputTokens: options[:max_tokens]
            }.compact
          }
        )

        result.dig("candidates", 0, "content", "parts", 0, "text")
      end
    end
  end
end
```

Register in your initializer:
```ruby
# config/initializers/action_prompt.rb
ActionPrompt.configure do |config|
  config.adapter = MyApp::Adapters::GeminiAdapter.new(
    api_key: ENV.fetch("GOOGLE_AI_API_KEY"),
    default_model: "gemini-1.5-flash"
  )
end
```

## Testing
ActionPrompt ships with `ActionPrompt::Adapters::Test`, which captures all deliveries in memory without making real API calls.
### Setup
```ruby
# spec/support/action_prompt.rb
RSpec.configure do |config|
  config.before do
    ActionPrompt.reset_configuration!
    ActionPrompt.configure do |c|
      c.adapter = ActionPrompt::Adapters::Test.new
    end
    ActionPrompt::Adapters::Test.clear_deliveries!
  end
end
```

Require this file from `spec/rails_helper.rb`:

```ruby
require "support/action_prompt"
```

### Writing tests
```ruby
# spec/prompts/article_summarizer_prompt_spec.rb
RSpec.describe ArticleSummarizerPrompt do
  let(:article) do
    double(:article,
      title: "The Future of Rails",
      body: "Rails continues to evolve...",
      author: double(name: "DHH"),
      paywalled?: false,
      published_at: Date.today,
      excerpt: "Rails continues to evolve..."
    )
  end

  describe ".summarize" do
    it "returns an ActionPrompt::Message" do
      message = described_class.summarize(article)
      expect(message).to be_an(ActionPrompt::Message)
    end

    it "uses the configured model" do
      described_class.summarize(article).deliver_now
      options = ActionPrompt::Adapters::Test.deliveries.last[:options]
      expect(options[:model]).to eq("gpt-4o")
    end

    it "includes the article title in the rendered prompt" do
      described_class.summarize(article).deliver_now
      prompt = ActionPrompt::Adapters::Test.deliveries.last[:prompt]
      expect(prompt).to include("The Future of Rails")
    end

    it "includes the word limit in the rendered prompt" do
      described_class.summarize(article).deliver_now
      prompt = ActionPrompt::Adapters::Test.deliveries.last[:prompt]
      expect(prompt).to include("150")
    end
  end
end
```

### Testing the adapter in isolation
```ruby
RSpec.describe ArticleSummarizerPrompt do
  # Assumes the `article` double from the previous example.
  it "calls the adapter with the rendered body" do
    adapter = instance_double(MyApp::Adapters::OpenAIAdapter, complete: "Summary text.")
    ActionPrompt.configure { |c| c.adapter = adapter }

    described_class.summarize(article).deliver_now

    expect(adapter).to have_received(:complete)
      .with(include("The Future of Rails"), hash_including(model: "gpt-4o"))
  end
end
```

## Advanced Usage
### Accessing the raw message body before delivery
```ruby
message = ArticleSummarizerPrompt.summarize(article)

puts message.body     # renders the ERB and returns the string
puts message.options  # { model: "gpt-4o", temperature: 0.5 }

response = message.deliver_now
```

### Extracting a base prompt class for your app
```ruby
# app/prompts/application_prompt.rb
class ApplicationPrompt < ActionPrompt::Base
  default model: "gpt-4o-mini", temperature: 0.7

  private

  # Shared helpers available in all subclass action methods.
  def current_date
    Date.today.strftime("%B %-d, %Y")
  end
end
```

```ruby
# app/prompts/article_summarizer_prompt.rb
class ArticleSummarizerPrompt < ApplicationPrompt
  def summarize(article)
    @article = article
    @date = current_date # available from ApplicationPrompt
    prompt
  end
end
```

### Conditional adapter switching per environment
```ruby
# config/initializers/action_prompt.rb
ActionPrompt.configure do |config|
  config.adapter =
    case Rails.env
    when "production"  then MyApp::Adapters::OpenAIAdapter.new(api_key: ENV.fetch("OPENAI_API_KEY"))
    when "development" then ActionPrompt::Adapters::Null.new # save API credits locally
    when "test"        then ActionPrompt::Adapters::Test.new
    end
end
```

## Project Structure
```
your_rails_app/
├── app/
│   ├── prompts/
│   │   └── article_summarizer_prompt.rb        # Prompt class
│   └── views/
│       └── action_prompts/
│           └── article_summarizer_prompt/
│               └── summarize.text.erb          # ERB template
├── config/
│   └── initializers/
│       └── action_prompt.rb                    # Adapter + global config
└── spec/
    ├── prompts/
    │   └── article_summarizer_prompt_spec.rb
    └── support/
        └── action_prompt.rb                    # Test setup
```
## Contributing
- Fork the repository
- Create a feature branch (`git checkout -b feature/my-feature`)
- Make your changes and add tests
- Ensure all tests pass (`bundle exec rspec`) and the linter is happy (`bundle exec rubocop`)
- Open a Pull Request
Bug reports and feature requests are welcome at https://github.com/soran-me/action_prompt/issues.
## License
The gem is available as open source under the terms of the MIT License.