
ONNX Runtime Ruby

🔥 ONNX Runtime - the high performance scoring engine for ML models - for Ruby

Check out an example


Installation

Add this line to your application's Gemfile:

gem "onnxruntime"

Getting Started

Load a model and make predictions

model = OnnxRuntime::Model.new("model.onnx")
model.predict({x: [1, 2, 3]})
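Inputs are nested Ruby arrays whose nesting mirrors the input's declared shape. A minimal sketch (not from the README) for a hypothetical input named x with shape [batch_size, 3]:

```ruby
# Hypothetical batch for an input of shape [batch_size, 3]:
# the outer array is the batch dimension, each inner array is one row.
batch = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

# model.predict({x: batch}) would then return one output row per batch row
# (running it requires the onnxruntime gem and a real model file).
puts batch.length        # number of rows in the batch
puts batch.first.length  # features per row
```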

Download pre-trained models from the ONNX Model Zoo

Get inputs

model.inputs

Get outputs

model.outputs

Get metadata

model.metadata

Load a model from a string or other IO object

io = StringIO.new("...")
model = OnnxRuntime::Model.new(io)

Get specific outputs

model.predict({x: [1, 2, 3]}, output_names: ["label"])

Session Options

OnnxRuntime::Model.new(path_or_io, {
  enable_cpu_mem_arena: true,
  enable_mem_pattern: true,
  enable_profiling: false,
  execution_mode: :sequential,    # :sequential or :parallel
  free_dimension_overrides_by_denotation: nil,
  free_dimension_overrides_by_name: nil,
  graph_optimization_level: nil,  # :none, :basic, :extended, or :all
  inter_op_num_threads: nil,
  intra_op_num_threads: nil,
  log_severity_level: 2,
  log_verbosity_level: 0,
  logid: nil,
  optimized_model_filepath: nil,
  profile_file_prefix: "onnxruntime_profile_",
  session_config_entries: nil
})
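You only need to pass the options you want to change; omitted keys keep the defaults shown above. A minimal sketch, using option names from the list above and a placeholder model path:

```ruby
# Pick a subset of session options; unspecified keys keep their defaults.
options = {
  intra_op_num_threads: 1,         # single-threaded execution within each op
  graph_optimization_level: :all   # apply all graph optimizations
}

# model = OnnxRuntime::Model.new("model.onnx", options)  # needs a real model file
puts options[:graph_optimization_level]
```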

Run Options

model.predict(input_feed, {
  output_names: nil,
  log_severity_level: 2,
  log_verbosity_level: 0,
  logid: nil,
  terminate: false,
  output_type: :ruby       # :ruby or :numo
})

Inference Session API

You can also use the Inference Session API, which follows the Python API.

session = OnnxRuntime::InferenceSession.new("model.onnx")
session.run(nil, {x: [1, 2, 3]}) # first argument is output names (nil returns all outputs)

The Python example models are included as well.

OnnxRuntime::Datasets.example("sigmoid.onnx")

GPU Support

To enable GPU support on Linux and Windows, download the appropriate GPU release and set:

OnnxRuntime.ffi_lib = "path/to/lib/libonnxruntime.so" # onnxruntime.dll for Windows

History

View the changelog

Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help:

Report bugs
Fix bugs and submit pull requests
Write, clarify, or fix documentation
Suggest or add new features

To get started with development and testing:

git clone https://github.com/ankane/onnxruntime-ruby.git
cd onnxruntime-ruby
bundle install
bundle exec rake vendor:all
bundle exec rake test