promptly 0.0

The project is in a healthy, maintained state.
 Dependencies

Development

~> 5.5
>= 7.2, < 9
~> 3.12
~> 1.37

Runtime

>= 7.2, < 9
 Project Readme

Promptly

Opinionated Rails integration for reusable AI prompt templates. Build maintainable, localized, and testable AI prompts using ERB or Liquid templates with Rails conventions.

Features

  • Template rendering: ERB (via ActionView) and optional Liquid support
  • I18n integration: Automatic locale fallback (welcome.es.erb → welcome.en.erb → welcome.erb)
  • Rails conventions: Store prompts in app/prompts/ with organized subdirectories
  • Render & CLI: Test prompts in Rails console or via rake tasks
  • Minimal setup: Auto-loads via Railtie, zero configuration required
  • Prompt caching: Configurable cache store, TTL, and cache-bypass options
  • Schema validation: Ensure all locals passed to templates match a defined schema
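The locale fallback described above can be sketched in plain Ruby. This is a simplified illustration, not Promptly's actual resolution code; `fallback_candidates` and `resolve_template` are hypothetical helpers:

```ruby
require "pathname"

# Fallback order: requested locale, then the default locale,
# then the bare template name.
def fallback_candidates(name, locale:, default: :en)
  ["#{name}.#{locale}.erb", "#{name}.#{default}.erb", "#{name}.erb"]
end

# Return the first candidate that exists on disk, or nil.
def resolve_template(root, name, locale:)
  fallback_candidates(name, locale: locale)
    .map { |candidate| Pathname.new(root).join(candidate) }
    .find(&:exist?)
end
```

So a request for :es falls back to the :en template when no Spanish file exists, and to the unlocalized template as a last resort.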

Documentation

For detailed documentation, please visit the Promptly Wiki.

Install

Add to your Gemfile:

gem "promptly"

For Liquid template support, also add:

gem "liquid"

Then run:

bundle install

Quick Start

# In a controller, service, or anywhere in Rails
prompt = Promptly.render(
  "user_onboarding/welcome_email",
  locale: :es,
  locals: {
    name: "María García",
    app_name: "ProjectHub",
    user_role: "Team Lead",
    features: ["Create projects", "Invite team members", "Track progress", "Generate reports"],
    days_since_signup: 2
  }
)

# Send to your AI service (OpenAI, Anthropic, etc.)
ai_response = openai_client.chat(
  parameters: {
    model: "gpt-4",
    messages: [{role: "user", content: prompt}]
  }
)

puts ai_response.dig("choices", 0, "message", "content")
# => AI-generated personalized welcome email in Spanish
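A call like the one above renders a template file such as app/prompts/user_onboarding/welcome_email.es.erb. To show how locals interpolate into such a file, here is a stdlib-only ERB sketch; the template text is invented for this example, and Promptly actually renders through ActionView rather than bare ERB:

```ruby
require "erb"

# Invented template text, standing in for a file under app/prompts/.
template = <<~ERB
  Write a warm welcome email for <%= name %>, a <%= user_role %> who
  joined <%= app_name %> <%= days_since_signup %> days ago.
  Highlight these features:
  <% features.each do |feature| -%>
  - <%= feature %>
  <% end -%>
ERB

locals = {name: "María García", app_name: "ProjectHub", user_role: "Team Lead",
          features: ["Create projects", "Invite team members"], days_since_signup: 2}

# ERB reads locals from a binding; a Struct instance provides one
# where each local resolves as a method call.
context = Struct.new(*locals.keys).new(*locals.values)
prompt = ERB.new(template, trim_mode: "-").result(context.instance_eval { binding })
puts prompt
```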

Structured Outputs (Response Schema Validation)

Promptly supports OpenAI's structured outputs (guided_json style) by defining .response.json files alongside your templates.

For example, given an output schema app/prompts/user_onboarding/welcome.response.json, you can pass it directly to an AI service:
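For illustration, such a file would typically hold a standard JSON Schema like the sketch below (the fields shown are invented for this example; consult the Wiki for the exact format Promptly expects). Since response_format returns the schema already wrapped in OpenAI's json_schema envelope, the file itself only needs the schema body:

```json
{
  "type": "object",
  "properties": {
    "subject": { "type": "string" },
    "body": { "type": "string" }
  },
  "required": ["subject", "body"],
  "additionalProperties": false
}
```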

# Returns the schema wrapped in expected OpenAI format
response_format = Promptly.response_format("user_onboarding/welcome", strict: true)

ai_response = openai_client.chat(
  parameters: {
    model: "gpt-4o",
    messages: [{role: "user", content: prompt}],
    response_format: response_format
  }
)

You can also natively validate the returned JSON string in Ruby to ensure it conforms exactly to the schema:

raw_json = ai_response.dig("choices", 0, "message", "content")

begin
  # Validates and parses the JSON, or raises Promptly::ValidationError
  parsed_output = Promptly.validate_response!("user_onboarding/welcome", raw_json)
rescue Promptly::ValidationError => e
  # Handle invalid response
end
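Conceptually, this validation step parses the JSON and checks it against the schema. The stdlib-only sketch below illustrates the idea with a hypothetical validate_against! helper; it only checks required keys, whereas a real JSON Schema validator covers types, nesting, and much more:

```ruby
require "json"

# Minimal illustration of schema-checking a model response:
# parse the JSON and verify the schema's required keys are present.
def validate_against!(schema, raw_json)
  parsed = JSON.parse(raw_json)
  missing = schema.fetch("required", []) - parsed.keys
  raise "missing keys: #{missing.join(", ")}" unless missing.empty?
  parsed
end
```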

Development

# Install dependencies
bundle install

# Run tests
bundle exec rspec

# Run linter
bundle exec standardrb

# Build gem
rake build

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request

License

MIT