llm_lib

The project is in a healthy, maintained state
Gem to invoke API calls to Hugging Face and OpenAI LLMs
Dependencies

Development

~> 5.19.0

Runtime

~> 2.8.1
~> 2.6.3
~> 1.1.2
~> 0.6.0
~> 13.0.6
~> 0.12.2
Project Readme

llm_lib


Gem to invoke API calls to Hugging Face and OpenAI LLMs.

Supports both Hugging Face and OpenAI models.

Install

gem install llm_lib
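Alternatively, if you manage dependencies with Bundler, add the gem to your Gemfile — this is standard Bundler usage, not anything specific to llm_lib:

```ruby
# Gemfile
gem "llm_lib"
```

Then run `bundle install` and `require 'llm_lib'` in your code.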

Usage examples

Set the corresponding environment variable to your actual OpenAI/Hugging Face API key before running the examples.

OpenAI API call

apikey = ENV['OPENAI_API_KEY']
client = LlmLib::OpenAIClient.new(apikey)

prompt = "Once upon a time"
max_tokens = 100
response = client.chat_gpt_call(prompt, max_tokens)
puts response

HuggingFace API call

apikey = ENV['HUGGINGFACE_API_KEY']
client = LlmLib::HuggingfaceApiClient.new(apikey)

query = "Once upon a time"
response = client.hugging_bloom_call(query)

puts response

Supported models

The following models are currently supported; more will be added in the future.

  • gpt-3.5-turbo
  • gpt-4
  • bigscience/bloom
  • tiiuae/falcon-40b-instruct
  • meta-llama/Llama-2-70b-chat-hf
  • databricks/dolly-v2-12b
  • google/flan-t5-xl

Use the following pattern to invoke Hugging Face models that are not yet supported out of the box.

require 'llm_lib/huggingface'

class HuggingfaceApiClient
  def initialize(api_key)
    @api_key = api_key
  end

  def hugging_dolly2_call(query, model = "databricks/dolly-v2-12b")
    # Delegate to the gem's low-level sender with the chosen model.
    HuggingFace.send(@api_key, model, query)
  end
end
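The wiring of such a custom client can be checked without an API key by injecting a stand-in for the sender. In this sketch, `FakeSender` and the hash it returns are invented for illustration — the real Hugging Face API response looks different:

```ruby
# A stand-in for the gem's HuggingFace sender; the response shape here is
# invented for illustration, not what the real API returns.
class FakeSender
  def self.send(api_key, model, query)
    { "model" => model, "echo" => query }
  end
end

# Same structure as the custom client above, but with the sender injectable.
class StubHuggingfaceApiClient
  def initialize(api_key, sender: FakeSender)
    @api_key = api_key
    @sender = sender
  end

  def hugging_dolly2_call(query, model = "databricks/dolly-v2-12b")
    @sender.send(@api_key, model, query)
  end
end

client = StubHuggingfaceApiClient.new("dummy-key")
puts client.hugging_dolly2_call("Once upon a time")
```

Injecting the sender this way keeps the real network call out of tests while exercising the same call path.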

For more details, visit the documentation.

Disclaimer

This is only an API wrapper for OpenAI and Hugging Face models. Usage licenses, pricing, limitations, etc. are specific to each model and provider, so read and follow their guidelines and documentation before use.