ONNX Support

ReNom2.7 supports exporting a neural network in ONNX format. The exported ONNX file can be loaded by various ONNX-supporting frameworks.

renom.utility.onnx

renom.utility.onnx.export_onnx(name, model, x, path, printtext=False)

This function exports a model as an ONNX file.

Parameters:
  • name ( str ) – The name of the computational graph.
  • model ( Model ) – Neural network model to export.
  • x ( ndarray ) – Dummy input used to build the computational graph.
  • path ( str ) – The path of the ONNX file to which the model will be exported.
  • printtext ( bool ) – If True is given, this function prints str(model).

How to export a neural network model

import numpy as np
import renom as rm
import renom.utility.onnx as onnx

# Define a CNN
cnn = rm.Sequential([
    rm.Conv2d(channel=32, filter=3, padding=1),
    rm.Relu(),
    rm.Conv2d(channel=64, filter=3, padding=1),
    rm.Relu(),
    rm.MaxPool2d(filter=2, stride=2),
    rm.Dropout(0.5),
    rm.Flatten(),
    rm.Dense(128),
    rm.Relu(),
    rm.Dense(10)
])

# Train the CNN on some dataset
# ... CNN Train ...

# Save the trained model in ONNX format.
# Note: This requires ``dummy_input`` to build a computational graph.
dummy_input = np.random.random((1, 1, 28, 28))
onnx.export_onnx("mnist", cnn, dummy_input, "mnist.onnx")

ONNX supported functions

The following ReNom functions can be converted to ONNX format.

Operations

  • __neg__
  • __add__
  • __abs__
  • __sub__
  • __mul__
  • __div__
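
These overloaded operators are converted when they appear in a model's forward pass. The following is a minimal sketch, assuming a custom ``rm.Model`` subclass with a ``forward`` method is accepted by ``export_onnx``; the model and file names are placeholders.

import numpy as np
import renom as rm
import renom.utility.onnx as onnx

class OpsModel(rm.Model):
    def __init__(self):
        super(OpsModel, self).__init__()
        self._dense = rm.Dense(4)

    def forward(self, x):
        h = self._dense(x)
        # __add__, __mul__, __sub__ and __neg__ are recorded in the graph.
        return (h + 1.0) * 2.0 - (-h)

dummy_input = np.random.random((1, 8))
onnx.export_onnx("ops_example", OpsModel(), dummy_input, "ops_example.onnx")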

Activation functions