Converting MXNet models to ONNX

Exporting MXNet models to the ONNX format
The Open Neural Network Exchange (ONNX) provides an open-source format for AI models. It defines an extensible computation graph model, as well as built-in operators and standard data type definitions. ONNX can serve as an intermediary between different AI frameworks. For example, since there is no mature tool for converting Caffe models directly into MXNet models, we can go through ONNX: first convert the Caffe model to ONNX, then convert the ONNX model to MXNet. What is more, the conversion incurs no loss of precision relative to the original model.
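
As a rough illustration of this intermediary role, the sketch below assumes a Caffe model has already been converted into an ONNX file named 'converted_from_caffe.onnx' (a hypothetical filename; the Caffe-to-ONNX step itself is done by a separate converter and is not covered here). The ONNX file is then imported into MXNet with the 'import_model' API from mxnet.contrib.onnx:

# Hypothetical example: 'converted_from_caffe.onnx' is assumed to exist,
# produced by a separate Caffe-to-ONNX conversion step not shown here.
from mxnet.contrib import onnx as onnx_mxnet

# import_model returns the MXNet symbol graph plus the arg and aux parameters
sym, arg_params, aux_params = onnx_mxnet.import_model('converted_from_caffe.onnx')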

In this tutorial, we will show how to save an MXNet model in the ONNX format. The operator coverage and features of MXNet-ONNX are updated regularly; visit the ONNX operator coverage page for the latest information. Here we will learn how to use the MXNet-to-ONNX export tool to export a pre-trained model.

Prerequisites
To run this tutorial, you need the following Python modules installed:

MXNet >= 1.3.0. Note: this tutorial was tested with MXNet installed via pip install mxnet==1.4.0 --user

onnx. Note: this tutorial was tested with ONNX installed via pip install onnx==1.2.1 --user

**Note:** The MXNet-ONNX import and export tools follow version 7 of the ONNX operator set, which comes with ONNX v1.2.1.
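
A quick way to confirm that the installed versions meet these requirements (a small check added here, not part of the original tutorial):

import mxnet
import onnx

# Both packages expose a __version__ string
print(mxnet.__version__)   # expect 1.3.0 or later, e.g. 1.4.0
print(onnx.__version__)    # expect 1.2.1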

import mxnet as mx
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet
import logging
logging.basicConfig(level=logging.INFO)

Download a model from the MXNet Model Zoo
We download a pre-trained ResNet-18 ImageNet model from the MXNet Model Zoo. We will also download the synset file that maps prediction indices to labels.

# Download the pre-trained ResNet model - symbol (json) and params files - by running the following code.
path = 'http://data.mxnet.io/models/imagenet/'
[mx.test_utils.download(path + 'resnet/18-layers/resnet-18-0000.params'),
 mx.test_utils.download(path + 'resnet/18-layers/resnet-18-symbol.json'),
 mx.test_utils.download(path + 'synset.txt')]

Now we have the ResNet-18 symbol, params, and synset files downloaded to disk.
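
Optionally, we can sanity-check the downloaded files by loading them directly with MXNet (a small check added here for illustration; the variable names are our own):

# Load the symbol graph and the parameter arrays to confirm the downloads are intact
sym_check = mx.sym.load('./resnet-18-symbol.json')
params_check = mx.nd.load('./resnet-18-0000.params')
print(sym_check.name)      # name of the output symbol
print(len(params_check))   # number of saved parameter arrays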

The MXNet-to-ONNX exporter API
Let us describe MXNet's 'export_model' API.

help(onnx_mxnet.export_model)

Help on function export_model in module mxnet.contrib.onnx.mx2onnx.export_model:

export_model(sym, params, input_shape, input_type=<type 'numpy.float32'>, onnx_file_path=u'model.onnx', verbose=False)
Exports the MXNet model file, passed as a parameter, into ONNX model.
Accepts both symbol,parameter objects as well as json and params filepaths as input.
Operator support and coverage - https://cwiki.apache.org/confluence/display/MXNET/MXNet-ONNX+Integration

Parameters
----------
sym : str or symbol object
Path to the json file or Symbol object
params : str or symbol object
Path to the params file or params dictionary. (Including both arg_params and aux_params)
input_shape : List of tuple
Input shape of the model e.g [(1,3,224,224)]
input_type : data type
Input data type e.g. np.float32
onnx_file_path : str
Path where to save the generated onnx file
verbose : Boolean
If true will print logs of the model conversion

Returns
-------
onnx_file_path : str
Onnx file path

The 'export_model' API accepts an MXNet model in one of two ways.

MXNet sym and params objects:
This is useful if we are training a model. At the end of training, we just call the 'export_model' function, passing the sym and params objects as input along with the other attributes, to save the model in ONNX format (see the sketch after this list).
MXNet's exported json and params files:
This is useful if we have a pre-trained model and want to convert it to the ONNX format.
Since we have already downloaded the pre-trained model files, we will use the 'export_model' API by passing in the paths of the symbol and params files.
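
As a quick sketch of the first way (an addition here, not part of the original tutorial), assume a checkpoint was saved with prefix 'resnet-18' at epoch 0, which happens to match the files we just downloaded. The in-memory objects can then be exported like this (the output filename 'from_objects.onnx' is hypothetical):

# Load a checkpoint into in-memory symbol and parameter objects
sym_obj, arg_params, aux_params = mx.model.load_checkpoint('resnet-18', 0)

# export_model accepts a single params dictionary containing both arg and aux params
all_params = {}
all_params.update(arg_params)
all_params.update(aux_params)

onnx_mxnet.export_model(sym_obj, all_params, [(1, 3, 224, 224)], np.float32, 'from_objects.onnx')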

How to use the MXNet-to-ONNX exporter API
We will use the downloaded pre-trained model files (sym, params) and define the input variables.

# Downloaded input symbol and params files
sym = './resnet-18-symbol.json'
params = './resnet-18-0000.params'

# Standard Imagenet inputs - 3 channels, 224 * 224
input_shape = (1,3,224,224)

# Path to the output file
onnx_file = './mxnet_exported_resnet50.onnx'

We have defined the input parameters required by the 'export_model' API. Now we are ready to convert the MXNet model into ONNX format.

# Invoke the export model API. It returns the path of the converted onnx model
converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)

This API returns the path of the converted model, which you can later use to import the model into other frameworks.

Validate the ONNX model
Now we can check the validity of the converted ONNX model using the ONNX checker tool. The tool validates the model by checking whether its content is a valid protobuf:

from onnx import checker
import onnx

# Load onnx model
model_proto = onnx.load_model(converted_model_path)

# Check if the converted ONNX protobuf is valid
checker.check_graph(model_proto.graph)

If the converted protobuf does not meet the ONNX proto specification, the checker will throw an error; in this case it passes successfully.
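
As a stricter alternative (an addition here, not part of the original tutorial), onnx.checker also provides check_model, which validates the whole model protobuf, including model-level metadata, rather than only the graph:

# Validate the entire model protobuf, not just the graph
checker.check_model(model_proto)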

This verifies the validity of the exported model's protobuf. Now the model can be imported into other frameworks for inference!
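
As one possible illustration (a sketch added here, with variable names of our own), the exported file can also be imported straight back into MXNet using the companion 'import_model' API and bound into a Module for inference:

# Import the exported ONNX model back into MXNet
sym_imp, arg_imp, aux_imp = onnx_mxnet.import_model(converted_model_path)

# The data input is any symbol input that is not a learned parameter
data_names = [n for n in sym_imp.list_inputs()
              if n not in arg_imp and n not in aux_imp]

# Bind the imported graph into a Module and load its parameters
mod = mx.mod.Module(symbol=sym_imp, data_names=data_names, label_names=None)
mod.bind(for_training=False, data_shapes=[(data_names[0], input_shape)])
mod.set_params(arg_params=arg_imp, aux_params=aux_imp, allow_missing=True)

# Run a forward pass on a random batch; ResNet-18 should produce (1, 1000) class scores
batch = mx.io.DataBatch([mx.nd.random.uniform(shape=input_shape)])
mod.forward(batch, is_train=False)
print(mod.get_outputs()[0].shape)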
