Quick tip: converting Gluon models to symbolic format
Julien Simon
Posted on January 25, 2019
As discussed earlier, Gluon is an imperative API available on top of the symbolic API implemented by Apache MXNet.
One of the cool features of Gluon is the extensive collection of pre-trained computer vision models available in Gluon CV. Using these models with the Gluon API is extremely easy, but sometimes we’d rather use the symbolic API instead.
One reason could be language support: Gluon is Python only, whereas MXNet supports many other languages, including C++, which you may need to get the best prediction performance possible. Unfortunately, the MXNet model zoo is not synchronized with the Gluon model zoo, so you can't just grab the same models.
One easy solution to this problem is to use the Gluon API to download models, export them to symbolic format and then load them using the MXNet API.
It goes like this.
from gluoncv import model_zoo
import mxnet as mx
import numpy as np

# Download the model from the Gluon model zoo
# You'll find it in ~/.mxnet/models
net = model_zoo.get_model('resnet50_v1', pretrained=True)

# Convert the model to symbolic format
net.hybridize()

# Build a fake image to run a single prediction
# This is required to initialize the model properly
x = np.zeros([1, 3, 224, 224])
x = mx.nd.array(x)

# Predict the fake image
net.forward(x)

# Export the model
net.export('resnet50_v1')
This will export the model weights and the JSON file containing the symbolic definition of the model.
$ ls -1 resnet50*
resnet50_v1-0000.params
resnet50_v1-symbol.json
Now you can load this model as usual with the mx.model.load_checkpoint() API. Just make sure you use a recent version of MXNet, as Gluon models could be incompatible with older ones. YMMV.
import mxnet as mx
sym, arg, aux = mx.model.load_checkpoint("resnet50_v1", 0)
...
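If you'd like to sanity-check the reloaded model, here's a minimal sketch of binding it with the Module API and running a single prediction. It assumes the default input name 'data', a single 224x224 RGB image, and CPU inference; adapt the shapes and context to your setup.

import mxnet as mx

# Load the symbol and parameters exported above
sym, arg_params, aux_params = mx.model.load_checkpoint('resnet50_v1', 0)

# Bind the symbol into a Module for inference only (assumed input name: 'data')
mod = mx.mod.Module(symbol=sym, context=mx.cpu(), label_names=None)
mod.bind(for_training=False, data_shapes=[('data', (1, 3, 224, 224))])
mod.set_params(arg_params, aux_params, allow_missing=True)

# Run a prediction on a random image-shaped tensor
x = mx.nd.random.uniform(shape=(1, 3, 224, 224))
mod.forward(mx.io.DataBatch([x]))
prob = mod.get_outputs()[0].asnumpy()
print(prob.shape)  # e.g. (1, 1000) for an ImageNet classifier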
That’s it. Now you can enjoy all these models with the symbolic API :)