Problem with tensorflow serving #45

Open

dsabarinathan opened this issue Apr 12, 2019 · 2 comments

@dsabarinathan
When I create the TensorFlow Serving model from the frozen PB, I get an empty variables folder. Could you share the un-frozen graph file?

Please help me with this issue.

@victordibia
Owner

victordibia commented Apr 12, 2019

Which model are you referring to (v1 or v2)?
In the readme I have some directions on how to use the model checkpoints (which are available for mobilenetv1) to generate your own frozen graph.

In my experience, reusing a frozen model from a different TensorFlow setup can be challenging; it's better to export your own frozen graph from the checkpoints.
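
For reference, a generic TF 1.x sketch of freezing a graph from a checkpoint is shown below. The checkpoint directory and output node names here are assumptions (typical for an Object Detection API model), and the Object Detection API also ships an export_inference_graph.py script that covers this step:

import tensorflow as tf

CKPT_DIR = 'model-checkpoint/ssdmobilenetv1/'  # assumed checkpoint directory
OUTPUT_PB = 'frozen_inference_graph.pb'
# assumed output node names for an Object Detection API model
OUTPUT_NODES = ['detection_boxes', 'detection_scores',
                'detection_classes', 'num_detections']

checkpoint = tf.train.latest_checkpoint(CKPT_DIR)
saver = tf.train.import_meta_graph(checkpoint + '.meta')

with tf.Session() as sess:
    saver.restore(sess, checkpoint)  # load the trained weights
    # convert all variables to constants so the graph is self-contained
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, tf.get_default_graph().as_graph_def(), OUTPUT_NODES)
    with tf.gfile.GFile(OUTPUT_PB, 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())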

@dsnsabari

Hi Victordibia,

I am using the ssdlitemobilenetv2 model checkpoint. Below is the code I used to convert the checkpoint to a serving model, but the generated variables folder is empty.

import tensorflow as tf

SAVE_PATH = 'D:/handtracking-master/model-checkpoint/ssdlitemobilenetv2/'
MODEL_NAME = 'test'
VERSION = 5
SERVE_PATH = './serve/{}/{}'.format(MODEL_NAME, VERSION)

# locate the latest checkpoint in the directory
checkpoint = tf.train.latest_checkpoint(SAVE_PATH)
print(checkpoint)

tf.reset_default_graph()

with tf.Session() as sess:
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    graph = tf.get_default_graph()
    sess.run(tf.global_variables_initializer())

    # build tensor infos for the detection graph's input and output tensors
    inputs = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('image_tensor:0'))
    detection_boxes = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_boxes:0'))
    detection_scores = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_scores:0'))
    detection_classes = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_classes:0'))
    num_detections = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('num_detections:0'))

    export_path = './savedmodel/3'
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'inputs': inputs},
            outputs={'output1': detection_boxes, 'output2': detection_scores, 'output3': detection_classes},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

    # write the graph and variables as a SavedModel for TensorFlow Serving
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                prediction_signature
        },
    )
    builder.save()
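
One detail that may explain the empty variables folder (an assumption, not verified here): the session only runs tf.global_variables_initializer() and never calls saver.restore, so the checkpoint weights are never loaded before export. A minimal sketch that restores the checkpoint first, reusing the same placeholder paths and tensor names:

import tensorflow as tf

SAVE_PATH = 'D:/handtracking-master/model-checkpoint/ssdlitemobilenetv2/'
EXPORT_PATH = './savedmodel/4'  # SavedModelBuilder requires a directory that does not exist yet

checkpoint = tf.train.latest_checkpoint(SAVE_PATH)
tf.reset_default_graph()

with tf.Session() as sess:
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    saver.restore(sess, checkpoint)  # restore trained weights instead of re-initializing

    graph = tf.get_default_graph()

    def tensor_info(name):
        return tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name(name))

    prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs={'inputs': tensor_info('image_tensor:0')},
        outputs={'detection_boxes': tensor_info('detection_boxes:0'),
                 'detection_scores': tensor_info('detection_scores:0'),
                 'detection_classes': tensor_info('detection_classes:0')},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

    builder = tf.saved_model.builder.SavedModelBuilder(EXPORT_PATH)
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                prediction_signature})
    builder.save()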
