AIA is a command-line utility that facilitates interaction with AI models through dynamic prompt management. It automates the management of pre-compositional prompts and executes generative AI commands with enhanced features including embedded directives, shell integration, embedded Ruby, history management, interactive chat, and prompt workflows.
AIA leverages the following Ruby gems:
- prompt_manager to manage prompts,
- ruby_llm to access LLM providers,
- ruby_llm-mcp for Model Context Protocol (MCP) support,
- and can use the shared_tools gem which provides a collection of common ready-to-use MCP clients and functions for use with LLMs that support tools.
Wiki: Check out the AIA Wiki
BLOG: Series on AIA
Quick Start
1. Install AIA:
   gem install aia
2. Install dependencies:
   brew install fzf
3. Create your first prompt:
   mkdir -p ~/.prompts
   echo "What is [TOPIC]?" > ~/.prompts/what_is.txt
4. Run your prompt:
   aia what_is
   You'll be prompted to enter a value for [TOPIC], then AIA will send your question to the AI model.
5. Start an interactive chat:
   aia --chat
      ,     ,
      (\____/)  AI Assistant (v0.9.7) is Online
       (_oo_)     gpt-4o-mini
         (O)        using ruby_llm (v1.3.1)
       __||__    \) model db was last refreshed on
     [/______\]  /    2025-06-18
    / \__AI__/ \/     You can share my tools
   /    /__\
  (\   /____\
Table of Contents
- Installation & Prerequisites
- Requirements
- Installation
- Setup Shell Completion
- Basic Usage
- Command Line Interface
- Key Command-Line Options
- Directory Structure
- Configuration
- Essential Configuration Options
- Configuration Precedence
- Configuration Methods
- Complete Configuration Reference
- Advanced Features
- Prompt Directives
- Configuration Directive Examples
- Dynamic Content Examples
- Custom Directive Examples
- Shell Integration
- Embedded Ruby (ERB)
- Prompt Sequences
- Using --next
- Using --pipeline
- Example Workflow
- Roles and System Prompts
- RubyLLM::Tool Support
- Prompt Directives
- Examples & Tips
- Practical Examples
- Code Review Prompt
- Meeting Notes Processor
- Documentation Generator
- Executable Prompts
- Tips from the Author
- The run Prompt
- The Ad Hoc One-shot Prompt
- Recommended Shell Setup
- Prompt Directory Organization
- Practical Examples
- Security Considerations
- Shell Command Execution
- Safe Practices
- Recommended Security Setup
- Troubleshooting
- Common Issues
- Error Messages
- Debug Mode
- Performance Issues
- Development
- Testing
- Building
- Architecture Notes
- Contributing
- Reporting Issues
- Development Setup
- Areas for Improvement
- Roadmap
- License
- Articles on AIA
Installation & Prerequisites
Requirements
- Ruby: >= 3.2.0
- External Tools:
  - fzf - Command-line fuzzy finder
Installation
# Install AIA gem
gem install aia
# Install required external tools (macOS)
brew install fzf
# Install required external tools (Linux)
# Ubuntu/Debian
sudo apt install fzf
# Arch Linux
sudo pacman -S fzf
Setup Shell Completion
Get completion functions for your shell:
# For bash users
aia --completion bash >> ~/.bashrc
# For zsh users
aia --completion zsh >> ~/.zshrc
# For fish users
aia --completion fish >> ~/.config/fish/config.fish
Basic Usage
Command Line Interface
# Basic usage
aia [OPTIONS] PROMPT_ID [CONTEXT_FILES...]
# Interactive chat session
aia --chat [--role ROLE] [--model MODEL]
# Use a specific model
aia --model gpt-4 my_prompt
# Specify output file
aia --out_file result.md my_prompt
# Use a role/system prompt
aia --role expert my_prompt
# Enable fuzzy search for prompts
aia --fuzzy
Key Command-Line Options
| Option | Description | Example |
|---|---|---|
| --chat | Start interactive chat session | aia --chat |
| --model MODEL | Specify AI model to use | aia --model gpt-4 |
| --role ROLE | Use a role/system prompt | aia --role expert |
| --out_file FILE | Specify output file | aia --out_file results.md |
| --fuzzy | Use fuzzy search for prompts | aia --fuzzy |
| --help | Show complete help | aia --help |
Directory Structure
~/.prompts/ # Default prompts directory
├── ask.txt # Simple question prompt
├── code_review.txt # Code review prompt
├── roles/ # Role/system prompts
│ ├── expert.txt # Expert role
│ └── teacher.txt # Teaching role
└── _prompts.log # History log
Configuration
Essential Configuration Options
The most commonly used configuration options:
| Option | Default | Description |
|---|---|---|
| model | gpt-4o-mini | AI model to use |
| prompts_dir | ~/.prompts | Directory containing prompts |
| out_file | temp.md | Default output file |
| temperature | 0.7 | Model creativity (0.0-1.0) |
| chat | false | Start in chat mode |
Configuration Precedence
AIA determines configuration settings using this order (highest to lowest priority):
1. Embedded config directives (in prompt files): //config model = gpt-4
2. Command-line arguments: --model gpt-4
3. Environment variables: export AIA_MODEL=gpt-4
4. Configuration files: ~/.aia/config.yml
5. Default values
Configuration Methods
Environment Variables:
export AIA_MODEL=gpt-4
export AIA_PROMPTS_DIR=~/my-prompts
export AIA_TEMPERATURE=0.8
Configuration File (~/.aia/config.yml):
model: gpt-4
prompts_dir: ~/my-prompts
temperature: 0.8
chat: false
Embedded Directives (in prompt files):
//config model = gpt-4
//config temperature = 0.8
Your prompt content here...
Complete Configuration Reference
| Config Item Name | CLI Options | Default Value | Environment Variable |
|---|---|---|---|
| adapter | --adapter | ruby_llm | AIA_ADAPTER |
| aia_dir | | ~/.aia | AIA_DIR |
| append | -a, --append | false | AIA_APPEND |
| chat | --chat | false | AIA_CHAT |
| clear | --clear | false | AIA_CLEAR |
| config_file | -c, --config_file | ~/.aia/config.yml | AIA_CONFIG_FILE |
| debug | -d, --debug | false | AIA_DEBUG |
| embedding_model | --em, --embedding_model | text-embedding-ada-002 | AIA_EMBEDDING_MODEL |
| erb | | true | AIA_ERB |
| frequency_penalty | --frequency_penalty | 0.0 | AIA_FREQUENCY_PENALTY |
| fuzzy | -f, --fuzzy | false | AIA_FUZZY |
| image_quality | --iq, --image_quality | standard | AIA_IMAGE_QUALITY |
| image_size | --is, --image_size | 1024x1024 | AIA_IMAGE_SIZE |
| image_style | --style, --image_style | vivid | AIA_IMAGE_STYLE |
| log_file | -l, --log_file | ~/.prompts/_prompts.log | AIA_LOG_FILE |
| markdown | --md, --markdown | true | AIA_MARKDOWN |
| max_tokens | --max_tokens | 2048 | AIA_MAX_TOKENS |
| model | -m, --model | gpt-4o-mini | AIA_MODEL |
| next | -n, --next | nil | AIA_NEXT |
| out_file | -o, --out_file | temp.md | AIA_OUT_FILE |
| parameter_regex | --regex | '(?-mix:([[A-Z _\|]+]))' | AIA_PARAMETER_REGEX |
| pipeline | --pipeline | [] | AIA_PIPELINE |
| presence_penalty | --presence_penalty | 0.0 | AIA_PRESENCE_PENALTY |
| prompt_extname | | .txt | AIA_PROMPT_EXTNAME |
| prompts_dir | -p, --prompts_dir | ~/.prompts | AIA_PROMPTS_DIR |
| refresh | --refresh | 7 (days) | AIA_REFRESH |
| require_libs | --rq, --require | [] | AIA_REQUIRE_LIBS |
| role | -r, --role | | AIA_ROLE |
| roles_dir | | ~/.prompts/roles | AIA_ROLES_DIR |
| roles_prefix | --roles_prefix | roles | AIA_ROLES_PREFIX |
| shell | | true | AIA_SHELL |
| speak | --speak | false | AIA_SPEAK |
| speak_command | | afplay | AIA_SPEAK_COMMAND |
| speech_model | --sm, --speech_model | tts-1 | AIA_SPEECH_MODEL |
| system_prompt | --system_prompt | | AIA_SYSTEM_PROMPT |
| temperature | -t, --temperature | 0.7 | AIA_TEMPERATURE |
| terse | --terse | false | AIA_TERSE |
| tool_paths | --tools | [] | AIA_TOOL_PATHS |
| allowed_tools | --at, --allowed_tools | nil | AIA_ALLOWED_TOOLS |
| rejected_tools | --rt, --rejected_tools | nil | AIA_REJECTED_TOOLS |
| top_p | --top_p | 1.0 | AIA_TOP_P |
| transcription_model | --tm, --transcription_model | whisper-1 | AIA_TRANSCRIPTION_MODEL |
| verbose | -v, --verbose | false | AIA_VERBOSE |
| voice | --voice | alloy | AIA_VOICE |
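As an illustration of what parameter_regex does: the default pattern is intended to match [KEYWORD]-style placeholders in prompt text. A quick check in plain Ruby (the escaped form of the regex is reconstructed here, since backslashes may have been lost in the printed default above):

```ruby
# Illustrative only -- not AIA internals. The default parameter_regex
# is meant to pick out uppercase [KEYWORD] placeholders in a prompt.
pattern = /(\[[A-Z _|]+\])/

prompt = "What is [TOPIC]? Answer as [ROLE|PERSONA]."
placeholders = prompt.scan(pattern).flatten
puts placeholders.inspect   # => ["[TOPIC]", "[ROLE|PERSONA]"]
```

Supplying your own pattern via --regex (or AIA_PARAMETER_REGEX) changes what counts as a placeholder.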
Advanced Features
Prompt Directives
Directives are special commands in prompt files that begin with // and provide dynamic functionality:
| Directive | Description | Example |
|---|---|---|
| //config | Set configuration values | //config model = gpt-4 |
| //context | Show context for this conversation | //context |
| //include | Insert file contents | //include path/to/file.txt |
| //shell | Execute shell commands | //shell ls -la |
| //robot | Show the pet robot ASCII art with version info | //robot |
| //ruby | Execute Ruby code | //ruby puts "Hello World" |
| //next | Set next prompt in sequence | //next summary |
| //pipeline | Set prompt workflow | //pipeline analyze,summarize,report |
| //clear | Clear conversation history | //clear |
| //help | Show available directives | //help |
| //available_models | List available models | //available_models |
| //tools | Show a list of available tools and their descriptions | //tools |
| //review | Review current context | //review |
Directives can also be used in interactive chat sessions.
Configuration Directive Examples
# Set model and temperature for this prompt
//config model = gpt-4
//config temperature = 0.9
# Enable chat mode and terse responses
//config chat = true
//config terse = true
Your prompt content here...
Dynamic Content Examples
# Include file contents
//include ~/project/README.md
# Execute shell commands
//shell git log --oneline -10
# Run Ruby code
//ruby require 'json'; puts JSON.pretty_generate({status: "ready"})
Analyze the above information and provide insights.
Custom Directive Examples
You can extend AIA with custom directives by creating Ruby files that define new directive methods:
# examples/directives/ask.rb
module AIA
class DirectiveProcessor
private
desc "A meta-prompt to LLM making its response available as part of the primary prompt"
def ask(args, context_manager=nil)
meta_prompt = args.empty? ? "What is meta-prompting?" : args.join(' ')
AIA.config.client.chat(meta_prompt)
end
end
end
Usage: Use the --tools option to specify a single directive file or a directory of directive files
# Load custom directive
aia --tools examples/directives/ask.rb --chat
# Use the results of the custom directive as input to a prompt
//ask gather the latest closing data for the DOW, NASDAQ, and S&P 500
Shell Integration
AIA automatically processes shell patterns in prompts:
- Environment variables: $HOME, ${USER}
- Command substitution: $(date), $(git branch --show-current)
Examples:
# Dynamic system information
As a system administrator on a $(uname -s) platform, how do I optimize performance?
# Include file contents via shell
Here's my current configuration: $(cat ~/.bashrc | head -20)
# Use environment variables
My home directory is $HOME and I'm user $USER.
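Conceptually, this expansion works like a shell's own substitution. As a rough stand-alone sketch in Ruby (not AIA's actual implementation):

```ruby
# Conceptual sketch only -- not AIA's actual code. Expands $(command)
# via backticks, then $VAR / ${VAR} from the environment.
def expand_shell_patterns(text)
  expanded = text.gsub(/\$\((.+?)\)/) { `#{Regexp.last_match(1)}`.chomp }
  expanded.gsub(/\$\{(\w+)\}|\$(\w+)/) do
    ENV[Regexp.last_match(1) || Regexp.last_match(2)].to_s
  end
end

puts expand_shell_patterns("Today is $(date) and I am $USER")
```

Because arbitrary commands run during expansion, the security note below applies to anything that processes prompts this way.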
Security Note: Be cautious with shell integration. Review prompts before execution as they can run arbitrary commands.
Embedded Ruby (ERB)
AIA supports full ERB processing in prompts for dynamic content generation:
<%# ERB example in prompt file %>
Current time: <%= Time.now %>
Random number: <%= rand(100) %>
<% if ENV['USER'] == 'admin' %>
You have admin privileges.
<% else %>
You have standard user privileges.
<% end %>
<%= AIA.config.model %> is the current model.
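The rendering above is ordinary ERB. As a stand-alone illustration using only the Ruby standard library (AIA applies the same kind of rendering to prompt text before sending it to the model):

```ruby
require 'erb'

# Render an ERB template the way a prompt file would be processed.
prompt = <<~PROMPT
  Current year: <%= Time.now.year %>
  <% if ENV['USER'] %>
  Running as: <%= ENV['USER'] %>
  <% end %>
PROMPT

puts ERB.new(prompt).result(binding)
```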
Prompt Sequences
Chain multiple prompts for complex workflows:
Using --next
# Command line
aia analyze --next summarize --next report
# In prompt files
# analyze.txt contains: //next summarize
# summarize.txt contains: //next report
Using --pipeline
# Command line
aia research --pipeline analyze,summarize,report,present
# In prompt file
//pipeline analyze,summarize,report,present
Example Workflow
research.txt:
//config model = gpt-4
//next analyze
Research the topic: [RESEARCH_TOPIC]
Provide comprehensive background information.
analyze.txt:
//config out_file = analysis.md
//next summarize
Analyze the research data and identify key insights.
summarize.txt:
//config out_file = summary.md
Create a concise summary of the analysis with actionable recommendations.
Roles and System Prompts
Roles define the context and personality for AI responses:
# Use a predefined role
aia --role expert analyze_code.rb
# Roles are stored in ~/.prompts/roles/
# expert.txt might contain:
# "You are a senior software engineer with 15 years of experience..."
Creating Custom Roles:
# Create a code reviewer role
cat > ~/.prompts/roles/code_reviewer.txt << EOF
You are an experienced code reviewer. Focus on:
- Code quality and best practices
- Security vulnerabilities
- Performance optimizations
- Maintainability issues
Provide specific, actionable feedback.
EOF
RubyLLM::Tool Support
AIA supports function calling through RubyLLM tools for extended capabilities:
# Load tools from directory
aia --tools ~/my-tools/ --chat
# Load specific tool files
aia --tools weather.rb,calculator.rb --chat
# Filter tools
aia --tools ~/tools/ --allowed_tools weather,calc
aia --tools ~/tools/ --rejected_tools deprecated
Tool Examples (see the examples/tools/ directory):
- File operations (read, write, list)
- Shell command execution
- API integrations
- Data processing utilities
MCP Client Examples (see the examples/tools/mcp/ directory):
AIA supports Model Context Protocol (MCP) clients for extended functionality:
# GitHub MCP Server (requires: brew install github-mcp-server)
# Set GITHUB_PERSONAL_ACCESS_TOKEN environment variable
aia --tools examples/tools/mcp/github_mcp_server.rb --chat
# iMCP for macOS (requires: brew install --cask loopwork/tap/iMCP)
# Provides access to Notes, Calendar, Contacts, etc.
aia --tools examples/tools/mcp/imcp.rb --chat
These MCP clients require the ruby_llm-mcp gem and provide access to external services and data sources through the Model Context Protocol.
Shared Tools Collection: AIA can use the shared_tools gem which provides a curated collection of commonly-used tools (aka functions) via the --require option.
# Access shared tools automatically (included with AIA)
aia --require shared_tools/ruby_llm --chat
# To access just one specific shared tool
aia --require shared_tools/ruby_llm/edit_file --chat
# Combine with your own local custom RubyLLM-based tools
aia --require shared_tools/ruby_llm --tools ~/my-tools/ --chat
The examples above load shared_tools in an interactive chat session, but the same --require option works in batch prompts as well. You can also require shared_tools from the //ruby directive or with a require statement inside an ERB block.
Examples & Tips
Practical Examples
Code Review Prompt
# ~/.prompts/code_review.txt
//config model = gpt-4o-mini
//config temperature = 0.3
Review this code for:
- Best practices adherence
- Security vulnerabilities
- Performance issues
- Maintainability concerns
Code to review:
Usage: aia code_review mycode.rb
Meeting Notes Processor
# ~/.prompts/meeting_notes.txt
//config model = gpt-4o-mini
//pipeline format,action_items
Raw meeting notes:
//include [NOTES_FILE]
Please clean up and structure these meeting notes.
Documentation Generator
# ~/.prompts/document.txt
//config model = gpt-4o-mini
//shell find [PROJECT_DIR] -name "*.rb" | head -10
Generate documentation for the Ruby project shown above.
Include: API references, usage examples, and setup instructions.
Executable Prompts
The --exec flag is used to create executable prompts. If it is not present on the shebang line, the prompt file is treated like any other context file: its contents are included as context in the prompt, but no dynamic content integration or directives are processed. All other AIA options are, well, optional. All you need is an initial prompt ID and the --exec flag.
In the example below, the --no-out_file option directs the output from the LLM's processing of the prompt to STDOUT. This way executable prompts can be good citizens on the *nix command line, receiving piped input via STDIN and sending output to STDOUT.
Create executable prompts:
weather_report (make executable with chmod +x):
#!/usr/bin/env aia run --no-out_file --exec
# Get current storm activity for the east and south coast of the US
Summarize the tropical storm outlook for the Atlantic, Caribbean Sea and Gulf of America.
//webpage https://www.nhc.noaa.gov/text/refresh/MIATWOAT+shtml/201724_MIATWOAT.shtml
Usage:
./weather_report
./weather_report | glow # Render the markdown with glow
Tips from the Author
The run Prompt
# ~/.prompts/run.txt
# Desc: A configuration only prompt file for use with executable prompts
# Put whatever you want here to setup the configuration desired.
# You could also add a system prompt to preface your intended prompt
Usage: echo "What is the meaning of life?" | aia run
The Ad Hoc One-shot Prompt
# ~/.prompts/ad_hoc.txt
[WHAT_NOW_HUMAN]
Usage: aia ad_hoc
- perfect for any quick one-shot question without cluttering shell history.
Recommended Shell Setup
# ~/.bashrc_aia
export AIA_PROMPTS_DIR=~/.prompts
export AIA_OUT_FILE=./temp.md
export AIA_MODEL=gpt-4o-mini
export AIA_VERBOSE=true # Shows spinner while waiting for LLM response
alias chat='aia --chat --terse'
ask() { echo "$1" | aia run --no-out_file; }
The chat alias and the ask function shown above are two powerful tools for interacting with the AI assistant. The chat alias starts an interactive conversation, while the ask function sends a single question and prints the response. The run prompt ID discussed above is used by the ask function and is also used when making executable prompt files.
Prompt Directory Organization
~/.prompts/
├── daily/ # Daily workflow prompts
├── development/ # Coding and review prompts
├── research/ # Research and analysis
├── roles/ # System prompts
└── workflows/ # Multi-step pipelines
Security Considerations
Shell Command Execution
⚠️ Important Security Warning
AIA executes shell commands and Ruby code embedded in prompts. This provides powerful functionality but requires caution:
- Review prompts before execution, especially from untrusted sources
- Avoid storing sensitive data in prompts (API keys, passwords)
- Use parameterized prompts instead of hardcoding sensitive values
- Limit file permissions on prompt directories if sharing systems
Safe Practices
# ✅ Good: Use parameters for sensitive data
//config api_key = [API_KEY]
# ❌ Bad: Hardcode secrets
//config api_key = sk-1234567890abcdef
# ✅ Good: Validate shell commands
//shell ls -la /safe/directory
# ❌ Bad: Dangerous shell commands
//shell rm -rf / # Never do this!
Recommended Security Setup
# Set restrictive permissions on prompts directory
chmod 700 ~/.prompts
chmod 600 ~/.prompts/*.txt
Troubleshooting
Common Issues
Prompt not found:
# Check prompts directory
ls $AIA_PROMPTS_DIR
# Verify prompt file exists
ls ~/.prompts/my_prompt.txt
# Use fuzzy search
aia --fuzzy
Model errors:
# List available models
aia --available_models
# Check model name spelling
aia --model gpt-4o # Correct
aia --model gpt4 # Incorrect
Shell integration not working:
# Verify shell patterns
echo "Test: $(date)" # Should show current date
echo "Home: $HOME" # Should show home directory
Configuration issues:
# Check current configuration
aia --config
# Debug configuration loading
aia --debug --config
Error Messages
| Error | Cause | Solution |
|---|---|---|
| "Prompt not found" | Missing prompt file | Check file exists and spelling |
| "Model not available" | Invalid model name | Use --available_models to list valid models |
| "Shell command failed" | Invalid shell syntax | Test shell commands separately first |
| "Configuration error" | Invalid config syntax | Check config file YAML syntax |
Debug Mode
Enable debug output for troubleshooting:
# Enable debug mode
aia --debug my_prompt
# Combine with verbose for maximum output
aia --debug --verbose my_prompt
Performance Issues
Slow model responses:
- Try smaller/faster models: --model gpt-4o-mini
- Reduce max_tokens: --max_tokens 1000
- Use a lower temperature for faster responses: --temperature 0.1
Large prompt processing:
- Break work into smaller prompts using --pipeline
- Use //include selectively instead of including large files
- Consider model context limits
Development
Testing
# Run unit tests
rake test
# Run integration tests
rake integration
# Run all tests with coverage
rake all_tests
open coverage/index.html
Building
# Install locally with documentation
just install
# Generate documentation
just gen_doc
# Static code analysis
just flay
Architecture Notes
ShellCommandExecutor Refactor:
The ShellCommandExecutor is now a class (previously a module) with instance variables for cleaner encapsulation. Class-level methods remain for backward compatibility.
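A minimal sketch of that refactor pattern (names and details assumed here, not the actual AIA source):

```ruby
# Instance-based execution, with a class-level method preserved so
# old module-style callers keep working unchanged.
class ShellCommandExecutor
  def initialize(command)
    @command = command
  end

  def execute
    `#{@command}`.chomp
  end

  # Backward-compatible class-level entry point
  def self.execute(command)
    new(command).execute
  end
end

puts ShellCommandExecutor.execute("echo ok")
```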
Prompt Variable Fallback:
Variables are always parsed from the prompt text when no .json history file exists, ensuring parameter prompting works correctly.
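A hypothetical sketch of that fallback rule (the placeholder regex and JSON history shape are assumptions for illustration, not AIA's actual code):

```ruby
require 'json'

# Use the saved .json history when present; otherwise fall back to
# parsing [KEYWORD] placeholders straight out of the prompt text.
def prompt_variables(prompt_text, history_path)
  if File.exist?(history_path)
    JSON.parse(File.read(history_path)).keys
  else
    prompt_text.scan(/\[[A-Z _|]+\]/).uniq
  end
end

puts prompt_variables("Hello [NAME], today is [DAY]. Bye [NAME].",
                      "/no/such/history.json").inspect
```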
Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/MadBomber/aia.
Reporting Issues
When reporting issues, please include:
- AIA version: aia --version
- Ruby version: ruby --version
- Operating system
- Minimal reproduction example
- Error messages and debug output
Development Setup
git clone https://github.com/MadBomber/aia.git
cd aia
bundle install
rake test
Areas for Improvement
- Configuration UI for complex setups
- Better error handling and user feedback
- Performance optimization for large prompt libraries
- Enhanced security controls for shell integration
Roadmap
- Enhanced Search: Restore full-text search within prompt files
- UI Improvements: Better configuration management for fzf and rg tools
- Logging: Enhanced logging using Ruby Logger class; integration with RubyLLM and RubyLLM::MCP logging
License
The gem is available as open source under the terms of the MIT License.