README.md in onnxruntime-0.2.1 vs README.md in onnxruntime-0.2.2
- old
+ new
@@ -48,16 +48,59 @@
```ruby
model.predict({x: [1, 2, 3]}, output_names: ["label"])
```
+
+## Session Options
+
+All session options are optional; the defaults are shown below.
+
+```ruby
+OnnxRuntime::Model.new(path_or_bytes, {
+ enable_cpu_mem_arena: true,
+ enable_mem_pattern: true,
+ enable_profiling: false,
+ execution_mode: :sequential,
+ graph_optimization_level: nil,
+ inter_op_num_threads: nil,
+ intra_op_num_threads: nil,
+ log_severity_level: 2,
+ log_verbosity_level: 0,
+ logid: nil,
+ optimized_model_filepath: nil
+})
+```
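+
+For example, to run single-threaded with full graph optimization and save the optimized graph to disk (a minimal sketch: the file paths are placeholders, and the symbol values are assumed to map to ONNX Runtime's optimization levels and execution modes):
+
+```ruby
+model = OnnxRuntime::Model.new("model.onnx", {
+  execution_mode: :sequential,     # :sequential or :parallel
+  graph_optimization_level: :all,  # :none, :basic, :extended, or :all
+  intra_op_num_threads: 1,
+  optimized_model_filepath: "optimized.onnx"
+})
+```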
+
+## Run Options
+
+```ruby
+model.predict(input_feed, {
+ log_severity_level: 2,
+ log_verbosity_level: 0,
+ logid: nil,
+ terminate: false
+})
+```
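+
+For example, to log only errors and tag a prediction in the log output (a minimal sketch; ONNX Runtime severity levels run from 0, verbose, to 4, fatal):
+
+```ruby
+model.predict({x: [1, 2, 3]}, {
+  log_severity_level: 3,  # errors and above
+  logid: "example-run"    # identifier added to log messages
+})
+```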
+
## Inference Session API

You can also use the Inference Session API, which follows the [Python API](https://microsoft.github.io/onnxruntime/python/api_summary.html).

```ruby
session = OnnxRuntime::InferenceSession.new("model.onnx")
session.run(nil, {x: [1, 2, 3]})
+```
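+
+Sessions also expose input and output metadata, mirroring `get_inputs` and `get_outputs` in the Python API (a sketch assuming the Ruby counterparts are `inputs` and `outputs`):
+
+```ruby
+session.inputs   # name, type, and shape of each input
+session.outputs  # name, type, and shape of each output
+```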
+
+The example models from the Python package are included as well.
+
+```ruby
+OnnxRuntime::Datasets.example("sigmoid.onnx")
+```
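+
+As in the Python package, `example` is expected to return the path to the bundled model, so it can be loaded like any other model (a sketch under that assumption):
+
+```ruby
+path = OnnxRuntime::Datasets.example("sigmoid.onnx")
+model = OnnxRuntime::Model.new(path)
+model.inputs  # inspect the inputs the example model expects
+```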
+
+## GPU Support
+
+To enable GPU support on Linux and Windows, download the appropriate [GPU release](https://github.com/microsoft/onnxruntime/releases) and set:
+
+```ruby
+OnnxRuntime.ffi_lib = "path/to/lib/libonnxruntime.so" # onnxruntime.dll for Windows
```

## History

View the [changelog](https://github.com/ankane/onnxruntime/blob/master/CHANGELOG.md)