Project

llm_hub

The project is in a healthy, maintained state
A Ruby interface for multiple LLM providers. It provides easy access to Completion and Embedding functionalities.
 Dependencies

Runtime

 Project Readme

LlmHub

A Ruby interface for multiple LLM providers with Completion and Embedding support.

Supported Providers

Completion API

  • OpenAI
  • Anthropic
  • DeepSeek
  • Google

Embedding API

  • OpenAI

The gem provides a unified interface to interact with these different providers, making it easy to switch between them or use multiple providers in your application.

Installation

Install the gem and add it to the application's Gemfile by executing:

bundle add llm_hub

If bundler is not being used to manage dependencies, install the gem by executing:

gem install llm_hub

Usage

client = LlmHub::Completion::Client.new(
  api_key: ENV['OPENAI_API_KEY'],
  provider: :openai
)

response = client.ask_single_question(
  system_prompt: 'You are a helpful assistant.',
  content: 'What is the capital of Japan?',
  model_name: 'gpt-4o-mini'
)

puts response
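
Since every Completion provider sits behind the same client, switching providers should come down to passing a different provider symbol and the matching API key. A minimal sketch, assuming Anthropic accepts the same constructor and call shown above (the model name below is illustrative, not taken from this README):

anthropic_client = LlmHub::Completion::Client.new(
  api_key: ENV['ANTHROPIC_API_KEY'],
  provider: :anthropic
)

response = anthropic_client.ask_single_question(
  system_prompt: 'You are a helpful assistant.',
  content: 'What is the capital of Japan?',
  model_name: 'claude-3-5-haiku-latest' # illustrative model name
)

puts response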
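
The Embedding API is listed for OpenAI but not demonstrated above. The sketch below assumes an Embedding client that mirrors the Completion client; the class name, method name, and model name are assumptions, so check the gem's documentation for the exact interface.

# Class and method names below are assumptions; check the gem docs.
embedding_client = LlmHub::Embedding::Client.new(
  api_key: ENV['OPENAI_API_KEY'],
  provider: :openai
)

embedding = embedding_client.post_embedding(
  content: 'What is the capital of Japan?',
  model_name: 'text-embedding-3-small' # illustrative OpenAI embedding model
)

puts embedding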

Development

After checking out the repo, run bin/setup to install dependencies. Then, run bundle exec rake to run the tests and code quality checks (or rake spec for tests only). You can also run bin/console for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and the created tag, and push the .gem file to rubygems.org.

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/akiraNuma/llm_hub.

License

The gem is available as open source under the terms of the MIT License.