TensorFlow: 2.0 saved_model support #342
Comments
Hi, this is a great program - thanks. I am using a subclassed model and therefore can only save_weights (TensorFlow SavedModel checkpoint). What is the easiest way to convert this into a supported file type for Netron?
I appreciate your efforts on this. Can you clarify the status of this feature? Is it supported in the desktop or browser version of Netron?
I appreciate your efforts on Netron. I really like Netron!!!
I think the easiest temporary workaround is to convert the model to TFLite:

```python
import tensorflow as tf
from tensorflow import keras

# A test model
class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv1 = keras.layers.Conv2D(32, 3, activation='relu')
        self.flatten = keras.layers.Flatten()
        self.d1 = keras.layers.Dense(10, activation='softmax')

    def call(self, x):
        x = self.conv1(x)
        x = self.flatten(x)
        return self.d1(x)

model = MyModel()
model(tf.ones([1, 28, 28, 3]))  # build the model by calling it once
model.save('aa')

# From the in-memory Keras model:
cc = tf.lite.TFLiteConverter.from_keras_model(model)
# From a saved_model directory:
# cc = tf.lite.TFLiteConverter.from_saved_model('aa')
open('aa.tflite', 'wb').write(cc.convert())
```
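To confirm the conversion actually produced a loadable file, the resulting `.tflite` can be opened back with the TFLite interpreter before handing it to Netron. A minimal self-contained sketch (the stand-in model and the file name `tiny.tflite` are illustrative, not from the thread above):

```python
import tensorflow as tf

# Build and convert a tiny stand-in model (names here are illustrative)
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open('tiny.tflite', 'wb') as f:
    f.write(tflite_bytes)

# Load it back with the TFLite interpreter to verify the file parses
interpreter = tf.lite.Interpreter(model_path='tiny.tflite')
interpreter.allocate_tensors()
input_shape = list(interpreter.get_input_details()[0]['shape'])
print(input_shape)
```

If the interpreter allocates tensors without raising, the file should also open in Netron.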
Is the TF 2.0 SavedModel format supported? It looks weird. Converting the TF 2.0 SavedModel to ONNX works well with Netron, though. Thanks
I created a basic TF image classification model as described in the tutorial (using tensorflow==2.5.0).

Script to create and save the test TF model:

To visualize the saved model "mymodel" with TensorBoard:

It looks a bit ugly, but at least it's something: https://www.dropbox.com/s/bhjugtur1gsxk0m/saved_model_serving_default_graph.png?dl=0
The situation improved in TF 2.5.0. Now we can save this loaded "saved model" in h5 format and open it in Netron.
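The SavedModel-to-h5 round trip can be sketched as follows for a non-subclassed model (subclassed models generally cannot be serialized to HDF5; the path names `mymodel` and `mymodel.h5` are illustrative):

```python
import tensorflow as tf

# Save a (non-subclassed) model in SavedModel format, reload it,
# then re-save as HDF5 so Netron can open it (paths are illustrative)
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.save('mymodel')                           # SavedModel directory
loaded = tf.keras.models.load_model('mymodel')  # back as a Keras model
loaded.save('mymodel.h5')                       # HDF5 file for Netron
```

The resulting `mymodel.h5` file opens in Netron with the layer structure intact.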
Examples:
0342.zip