
Load and save model issue #8243

Open
Infiniteskys opened this issue Apr 11, 2024 · 5 comments

Comments

@Infiniteskys

Infiniteskys commented Apr 11, 2024

I have a question about using the TFJS package to classify images.

Loading the model from a website URL succeeds. Since I need to run the app offline, I have to save the model and load it from a local path.

The main issue is how to save the model to a local path: when I use the save function, it reports that there is no save handler for it.

After switching to asyncStorageIO, saving seems to work, but loading still fails.

Here is my code:

import * as tf from "@tensorflow/tfjs";
import {asyncStorageIO} from "@tensorflow/tfjs-react-native";

export const loadModel = async () => {
  try {
    await tf.ready();
    const model = await tf.loadGraphModel(
      "https://test_model/model.json"
    );
    console.log("Model loaded.");
    model.summary();

    // const saved = await model.save('file:///path/to/my-model');
    const saved = await model.save(asyncStorageIO('customs'));
    console.log("model saved");
    saved.summary();

    const final = await tf.loadLayersModel(saved);
    console.log(final == model);

    return model;

  } catch (e) {
    console.error("Error:", e);
    return null;
  }
};

Here are the errors I get:

LOG [Error: Cannot proceed with model loading because the IOHandler provided does not have the load method implemented.]
LOG [TypeError: url.match is not a function (it is undefined)] <- loadGraphModel
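For reference, the first error is consistent with passing the result of model.save(...) back into a loader: save resolves to a SaveResult (metadata about the written artifacts), not an IOHandler, so the loader finds no load method on it. A minimal sketch of the intended round trip, reusing the 'customs' key from the code above:

import * as tf from "@tensorflow/tfjs";
import { asyncStorageIO } from "@tensorflow/tfjs-react-native";

// Loaders accept a URL string or an IOHandler. model.save(...) resolves
// to a SaveResult (save metadata), not an IOHandler, so it cannot be
// handed back to a loader; build a fresh handler for the same key instead.
const model = await tf.loadGraphModel("https://test_model/model.json");
await model.save(asyncStorageIO("customs"));
const reloaded = await tf.loadGraphModel(asyncStorageIO("customs"));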

@gaikwadrahul8
Contributor

Hi, @Infiniteskys

Thank you for bringing this issue to our attention. Please give the code snippet below a try and see whether it works as expected. Please also refer to the official documentation of asyncStorageIO for saving and loading a model to/from async storage.

import * as tf from "@tensorflow/tfjs";
import { asyncStorageIO } from "@tensorflow/tfjs-react-native";

export const loadModel = async () => {
  try {
    await tf.ready();
    const model = await tf.loadGraphModel("https://test_model/model.json");
    console.log("Model loaded.");
    model.summary();

    // Save the model to AsyncStorage
    const savedModel = await model.save(asyncStorageIO('customs'));
    console.log("Model saved to AsyncStorage");

    // Load the model from AsyncStorage
    const loadedModel = await tf.loadLayersModel(asyncStorageIO('customs'));
    console.log(loadedModel == model); // This might not be true due to model serialization

    return loadedModel;

  } catch (e) {
    console.error("Error:", e);
    return null;
  }
};

If the issue still persists, please let us know, and if possible, please share your GitHub repo along with the model and the complete steps to run the code, so we can replicate the same behaviour on our end.

Thank you for your cooperation and patience.

@Infiniteskys
Author

Hi @gaikwadrahul8 ,
I tried it, but it still does not work; it shows an error about an improper layer config format.
I believe the JSON file we converted is a graph model. I tried the loadGraphModel function, but I can only load the model from the website URL, not from asyncStorageIO.
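For what it's worth, the model.json written by the TFJS converter usually carries a format field that tells the two loaders apart; below is a small sketch of picking the loader from it (loadAnyModel and the fetch-based check are hypothetical helpers for illustration, not part of the TFJS API):

import * as tf from "@tensorflow/tfjs";

// Hypothetical helper: choose the loader from the `format` field the
// TFJS converter writes into model.json ("graph-model" vs "layers-model").
export const loadAnyModel = async (modelJsonUrl) => {
  const manifest = await (await fetch(modelJsonUrl)).json();
  return manifest.format === "graph-model"
    ? tf.loadGraphModel(modelJsonUrl)   // frozen graph: inference only, no summary()
    : tf.loadLayersModel(modelJsonUrl); // Keras-style layers model
};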

@Infiniteskys
Author

Infiniteskys commented Apr 12, 2024

If I have the function as below, it shows the error: Based on the provided shape, [4,2], the tensor should have 8 values but has 1.

import * as tf from "@tensorflow/tfjs";
import { asyncStorageIO } from "@tensorflow/tfjs-react-native";

export const loadModel = async () => {
  try {
    await tf.ready();
    const model = await tf.loadGraphModel("https://test_model/model.json");
    console.log("Model loaded.");

    // Save the model to AsyncStorage
    const savedModel = await model.save(asyncStorageIO('customs'));
    console.log("Model saved to AsyncStorage");

    // Load the model from AsyncStorage
    const loadedModel = await tf.loadGraphModel(asyncStorageIO('customs'));

    return loadedModel;

  } catch (e) {
    console.error("Error:", e);
    return null;
  }
};
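One way to narrow down where the weights get mangled is to compare a prediction made before saving with one made after reloading; a sketch, assuming a single input of shape [1, 224, 224, 3] and a single output tensor (both placeholders for the real model):

import * as tf from "@tensorflow/tfjs";
import { asyncStorageIO } from "@tensorflow/tfjs-react-native";

// If the weights survive the AsyncStorage round trip, the two
// predictions should be numerically identical.
const probe = tf.zeros([1, 224, 224, 3]); // placeholder input shape

const original = await tf.loadGraphModel("https://test_model/model.json");
const before = original.predict(probe);

await original.save(asyncStorageIO("customs"));
const reloaded = await tf.loadGraphModel(asyncStorageIO("customs"));
const after = reloaded.predict(probe);

before.print(); // any mismatch here points at serialization, not the model
after.print();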

@gaikwadrahul8
Contributor

gaikwadrahul8 commented Apr 15, 2024

Hi, @Infiniteskys

If possible, could you please share your model file in zip format, along with the complete steps to replicate the same behavior on our end, so we can investigate this issue further?

Thank you for your cooperation and patience.

@Infiniteskys
Author

Hi @gaikwadrahul8 ,
Here is the test model, which a friend trained and converted on the server side.
test_model.zip

I also tested using the local path directly:

const loadedModel = await tf.loadLayersModel('/moblientclient/assest/model/test_model/model.json');

But it shows a network request failed error. And with this approach, I have no idea how to save the model when there is an updated version on the server.

Thank you.
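For reference, tfjs-react-native also ships a bundleResourceIO handler for models packaged with the app itself, which avoids the network request entirely; a sketch, with placeholder asset paths and weight-shard name (Metro must be configured to bundle .bin files as assets):

import * as tf from "@tensorflow/tfjs";
import { bundleResourceIO } from "@tensorflow/tfjs-react-native";

// Placeholder paths; require() turns bundled assets into resource ids
// that bundleResourceIO can read without any network access.
const modelJson = require("./assets/test_model/model.json");
const modelWeights = require("./assets/test_model/group1-shard1of1.bin");

const model = await tf.loadGraphModel(bundleResourceIO(modelJson, modelWeights));

Note that bundleResourceIO is read-only; a model updated from the server would still need asyncStorageIO (or a similar writable handler) to be persisted on the device.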
