
Order of output heads from converted model keeps changing #8204

Open
Pensarfeo opened this issue Mar 15, 2024 · 4 comments
Assignees
Labels
comp:converter type:bug Something isn't working

Comments

@Pensarfeo

Hello everyone, whenever I convert my model to tfjs, the order of the output heads keeps changing, so every time I have to check which output is which. I have not been able to figure out how to fix this. I am not sure whether this is a bug or a feature, but it seems like very unintuitive behavior.

This is the code I use to convert from python to tfjs

import os
import tensorflow as tf

model.trainable = False  # freeze before exporting so the SavedModel is inference-only

@tf.function()
def module(x):
    return model(x)

# Trace once with the serving input shape so a concrete signature is recorded.
module(tf.zeros([1, 128, 128, 3]))

module_with_signature_path = os.path.join(modelName, 'saved')
tf.saved_model.save(model, module_with_signature_path)


from IPython.display import Markdown, display

tfjs_model_dir=os.path.join(tfModelSavingDir, name)

tfjs_convert_command = f"""tensorflowjs_converter
                 --input_format=tf_saved_model
                 --output_format=tfjs_graph_model
                 --signature_name=serving_default
                 --saved_model_tags=serve
                 --use_structured_outputs_names=True
                 --quantize_uint8
                 "{module_with_signature_path}"
                 "{tfjs_model_dir}"
                 """
tfjs_convert_command = " ".join(tfjs_convert_command.split())
display(Markdown(f"```bash\n{tfjs_convert_command}\n```"))

print("Exporting TensorFlow SavedModel to TensorFlow.js Graph model...")
conversion_result = %sx $tfjs_convert_command
print("\n".join(conversion_result))

Then I execute the network in the usual way:

const outputTfJs = await model.executeAsync(colorCenterPatchersNorm)
const [semantic, dxdywh] = outputTfJs

The order of the outputs keeps changing between conversions.

@Pensarfeo Pensarfeo added the type:bug Something isn't working label Mar 15, 2024
@gaikwadrahul8 gaikwadrahul8 self-assigned this Mar 18, 2024
@gaikwadrahul8
Contributor

Hi, @Pensarfeo

I apologize for the delayed response. I see you're already using the --use_structured_outputs_names option (which changes the output of the graph model to match the structured_outputs format instead of a list; it defaults to False), and that is the recommended approach.

In your Python code, I believe you've assigned meaningful names to your model's output heads. I would suggest verifying those names by printing the model summary with model.summary(), and making sure --use_structured_outputs_names=True is included in your conversion command. With that setup, the converted TFJS model will have named outputs matching the names you assigned in Python, and you can access them by name, like below:

const outputTfJs = await model.executeAsync(colorCenterPatchersNorm);
const semanticOutput = outputTfJs['semantic'];
const dxdywhOutput = outputTfJs['dxdywh'];
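
If you still want a positional array in a stable order (for example, to keep your existing destructuring), you can derive it from the named outputs. A minimal sketch, assuming the structured output names are 'semantic' and 'dxdywh' as in your snippet:

// Build a fixed-order array from the named output map, failing loudly
// if an expected head is missing instead of silently reordering.
function orderOutputs(outputs, names) {
  return names.map((name) => {
    if (!(name in outputs)) {
      throw new Error(`missing output head: ${name}`);
    }
    return outputs[name];
  });
}

// Usage with the structured outputs of executeAsync:
// const [semantic, dxdywh] = orderOutputs(outputTfJs, ['semantic', 'dxdywh']);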

If possible, could you please share a complete code snippet or a GitHub repo, along with your model and the full steps to reproduce this behavior, so we can investigate the issue further on our end?
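
In the meantime, you can also check which output names the converter actually recorded by inspecting the emitted model.json. This is a sketch; the userDefinedMetadata.structuredOutputKeys location is my assumption about where the converter stores these names, so adjust if your file differs:

// Read the structured output names recorded in a converted model.json.
// Returns null if no structured output metadata is present.
function structuredOutputKeys(modelJson) {
  const meta = modelJson.userDefinedMetadata;
  return (meta && meta.structuredOutputKeys) || null;
}

// Usage (Node):
// const modelJson = JSON.parse(require('fs').readFileSync('model.json', 'utf8'));
// console.log(structuredOutputKeys(modelJson));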

Thank you for your understanding and patience.

@Pensarfeo
Author

@gaikwadrahul8

Thanks for your answer!
I'm not currently working on the model, so I'll see what I can provide once I'm back to working on it.

Thanks again and thanks for building this amazing tool :)

@gaikwadrahul8
Contributor

Hi, @Pensarfeo

You're welcome! Feel free to reach out whenever you're back to working on the model, and we'll do our best to help further. In the meantime, thank you so much for the kind words about the tool!

To confirm, would you like to close this issue for now and open a new one when you're ready to resume work on the model?

Thank you for your understanding and patience.

@Pensarfeo
Author

@gaikwadrahul8 I think it's better to keep it open, to avoid having too many duplicate issues.
But keeping it open or closed probably matters more to you, so feel free to close it if that helps you manage the issues in this repo 😉.

Cheers
