# Net::Llm
A minimal Ruby gem providing interfaces to connect to OpenAI, Ollama, and Anthropic (Claude) LLM APIs.
## Installation
Install the gem and add to the application's Gemfile by executing:

```shell
bundle add net-llm
```

If bundler is not being used to manage dependencies, install the gem by executing:

```shell
gem install net-llm
```

## Usage
### OpenAI

```ruby
require 'net/llm'

client = Net::Llm::OpenAI.new(
  api_key: ENV['OPENAI_API_KEY'],
  model: 'gpt-4o-mini'
)

messages = [
  { role: 'user', content: 'Hello!' }
]

response = client.chat(messages, [])
puts response.dig('choices', 0, 'message', 'content')
```

#### Custom Base URL
```ruby
client = Net::Llm::OpenAI.new(
  api_key: ENV['OPENAI_API_KEY'],
  base_url: 'https://custom-openai-compatible-api.com/v1',
  model: 'gpt-4o-mini'
)
```

#### With Tools
```ruby
tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get current weather',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string' }
        },
        required: ['location']
      }
    }
  }
]

response = client.chat(messages, tools)
```
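When the model decides to call a tool, the reply carries the call in `choices[0].message.tool_calls`, with `arguments` encoded as a JSON string. A minimal sketch of dispatching such a call and building the follow-up `role: 'tool'` message (the sample `response` hash and the weather result are illustrative, not real API output):

```ruby
require 'json'

# Sample hash shaped like OpenAI's tool-call reply (a real one comes
# from client.chat(messages, tools)).
response = {
  'choices' => [{
    'message' => {
      'role' => 'assistant',
      'tool_calls' => [{
        'id' => 'call_1',
        'type' => 'function',
        'function' => { 'name' => 'get_weather', 'arguments' => '{"location":"Paris"}' }
      }]
    }
  }]
}

message = response.dig('choices', 0, 'message')
tool_messages = (message['tool_calls'] || []).map do |call|
  args = JSON.parse(call.dig('function', 'arguments')) # arguments arrive as a JSON string
  result = "Sunny in #{args['location']}"              # stand-in for running the real tool
  { role: 'tool', tool_call_id: call['id'], content: result }
end

puts tool_messages.first[:content] # prints "Sunny in Paris"
```

Appending `message` and each tool message to `messages` and calling `client.chat` again lets the model finish its answer.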
### Ollama

```ruby
require 'net/llm'

client = Net::Llm::Ollama.new(
  host: 'localhost:11434',
  model: 'gpt-oss:latest'
)

messages = [
  { role: 'user', content: 'Hello!' }
]

response = client.chat(messages)
puts response['message']['content']
```

#### With Tools
```ruby
tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get current weather',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string' }
        },
        required: ['location']
      }
    }
  }
]

response = client.chat(messages, tools)
```
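Ollama reports tool calls under `message.tool_calls`; unlike OpenAI, the `arguments` field typically arrives as a JSON object rather than an encoded string. A sketch of turning such a call into a `role: 'tool'` follow-up message (the sample `response` hash is illustrative, not real API output):

```ruby
# Sample hash shaped like Ollama's tool-call reply (a real one comes
# from client.chat(messages, tools)).
response = {
  'message' => {
    'role' => 'assistant',
    'tool_calls' => [{
      'function' => { 'name' => 'get_weather', 'arguments' => { 'location' => 'Paris' } }
    }]
  }
}

tool_messages = (response.dig('message', 'tool_calls') || []).map do |call|
  location = call.dig('function', 'arguments', 'location') # already a Hash, no JSON.parse needed
  { role: 'tool', content: "Sunny in #{location}" }        # stand-in for running the real tool
end

puts tool_messages.first[:content] # prints "Sunny in Paris"
```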
#### Streaming

```ruby
client.chat(messages) do |chunk|
  print chunk.dig('message', 'content')
end
```

#### Generate
```ruby
response = client.generate('Write a haiku')
puts response['response']

client.generate('Write a haiku') do |chunk|
  print chunk['response']
end
```

#### Other Endpoints
```ruby
client.embeddings('Hello world')
client.tags
client.show('llama2')
```

### Anthropic (Claude)
```ruby
require 'net/llm'

client = Net::Llm::Anthropic.new(
  api_key: ENV['ANTHROPIC_API_KEY'],
  model: 'claude-3-5-sonnet-20241022'
)

messages = [
  { role: 'user', content: 'Hello!' }
]

response = client.messages(messages)
puts response.dig('content', 0, 'text')
```

#### With System Prompt
```ruby
response = client.messages(
  messages,
  system: 'You are a helpful assistant'
)
```

#### Streaming
```ruby
client.messages(messages) do |event|
  if event['type'] == 'content_block_delta'
    print event.dig('delta', 'text')
  end
end
```

#### With Tools
```ruby
tools = [
  {
    name: 'get_weather',
    description: 'Get current weather',
    input_schema: {
      type: 'object',
      properties: {
        location: { type: 'string' }
      },
      required: ['location']
    }
  }
]

response = client.messages(messages, tools: tools)
```
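Claude requests a tool by emitting a `tool_use` content block, and the result goes back as a `tool_result` block inside a `role: 'user'` message. A sketch under those assumptions (the sample `response` hash is illustrative, not real API output):

```ruby
# Sample hash shaped like Anthropic's tool_use reply (a real one comes
# from client.messages(messages, tools: tools)).
response = {
  'content' => [{
    'type' => 'tool_use',
    'id' => 'toolu_1',
    'name' => 'get_weather',
    'input' => { 'location' => 'Paris' }
  }]
}

tool_results = response['content']
  .select { |block| block['type'] == 'tool_use' }
  .map do |block|
    result = "Sunny in #{block.dig('input', 'location')}" # stand-in for running the real tool
    { type: 'tool_result', tool_use_id: block['id'], content: result }
  end

# Send the results back as a user message on the next client.messages call:
follow_up = { role: 'user', content: tool_results }
```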
## Error Handling

All non-streaming API methods return error information as a hash when requests fail:
```ruby
response = client.chat(messages, tools)

if response["code"]
  puts "Error #{response["code"]}: #{response["body"]}"
else
  puts response.dig('choices', 0, 'message', 'content')
end
```

Streaming methods still raise exceptions on HTTP errors.
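One way to work with this convention is a small wrapper that raises on the error shape and returns the content otherwise (a sketch, not part of the gem's API):

```ruby
# Hypothetical helper, not provided by net-llm: turn the gem's error hash
# into an exception, and return the message text on success.
def content_or_raise(response)
  raise "HTTP #{response['code']}: #{response['body']}" if response['code']

  response.dig('choices', 0, 'message', 'content')
end

puts content_or_raise({ 'choices' => [{ 'message' => { 'content' => 'Hi!' } }] }) # prints "Hi!"
content_or_raise({ 'code' => 429, 'body' => 'rate limited' }) rescue puts 'request failed'
```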
## API Coverage
### OpenAI

- `/v1/chat/completions` (with tools support)
- `/v1/models`
- `/v1/embeddings`
### Ollama

- `/api/chat` (with streaming and tools)
- `/api/generate` (with streaming)
- `/api/embed`
- `/api/tags`
- `/api/show`
### Anthropic (Claude)

- `/v1/messages` (with streaming and tools)
## Development
After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to rubygems.org.
## Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/xlgmokha/net-llm.
## License
The gem is available as open source under the terms of the MIT License.