README.md in onnxruntime-0.1.2 vs README.md in onnxruntime-0.2.0

- old
+ new

@@ -50,10 +50,10 @@ model.predict({x: [1, 2, 3]}, output_names: ["label"])
 ```
 
 ## Inference Session API
 
-You can also use the Inference Session API, which follows the [Python API](https://microsoft.github.io/onnxruntime/api_summary.html).
+You can also use the Inference Session API, which follows the [Python API](https://microsoft.github.io/onnxruntime/python/api_summary.html).
 
 ```ruby
 session = OnnxRuntime::InferenceSession.new("model.onnx")
 session.run(nil, {x: [1, 2, 3]})
 ```
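
For context, here is a slightly expanded sketch of the Inference Session API snippet above. It assumes the gem's `run(output_names, input_feed)` signature mirroring the Python API, and reuses the `"label"` output name from the `model.predict` example earlier in the README; adjust the names to match your own model.

```ruby
require "onnxruntime"

session = OnnxRuntime::InferenceSession.new("model.onnx")

# Inspect the model's declared inputs and outputs (name, type, shape)
p session.inputs
p session.outputs

# Passing nil for output_names returns all outputs
session.run(nil, {x: [1, 2, 3]})

# Request a specific output by name (assumes the model defines a "label" output)
session.run(["label"], {x: [1, 2, 3]})
```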