Python's onnx: a detailed introduction to the onnx/onnxruntime library, its installation, and usage

Table of contents

Introduction to the onnx/onnxruntime library

Installation of onnx/onnxruntime library

How to use the onnx/onnxruntime library

1. Basic Usage


Introduction to the onnx/onnxruntime library

Open Neural Network Exchange (ONNX) is an open ecosystem that empowers AI developers to choose the right tools as their projects evolve. ONNX provides an open-source format for AI models, covering both deep learning and traditional ML. It defines an extensible computational graph model, along with definitions of built-in operators and standard data types. Currently, the project focuses on the capabilities required for inference (scoring).

ONNX is widely supported and can be found in many frameworks, tools, and hardware. By enabling interoperability between different frameworks and simplifying the path from research to production, it helps increase the velocity of innovation in the AI community. The community is invited to join in and further develop ONNX.

GitHub: GitHub - onnx/onnx: Open standard for machine learning interoperability

Installation of onnx/onnxruntime library


pip install onnx

Or, using the Tsinghua PyPI mirror (faster for users in mainland China):

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple onnx

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple onnxruntime


How to use the onnx/onnxruntime library

1. Basic Usage

import numpy as np
import onnx
import onnxruntime as ort

# Load the ONNX model
model = onnx.load("model.onnx")

# Print a human-readable summary of the model graph
print(onnx.helper.printable_graph(model.graph))

# Create an ONNX Runtime inference session
ort_session = ort.InferenceSession("model.onnx")

# Prepare input data (e.g. a 1x3x224x224 float32 image tensor)
input_data = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference
ort_inputs = {ort_session.get_inputs()[0].name: input_data}
ort_outputs = ort_session.run(None, ort_inputs)

# Print the results
print(ort_outputs)

Origin blog.csdn.net/qq_41185868/article/details/130652656