Load and save model issue #8243
Comments
Hi @Infiniteskys, thank you for bringing this issue to our attention. Please give the code snippet below a try and see whether it works as expected. Please refer to the official documentation of asyncStorageIO for saving and loading a model to/from async storage.
If the issue still persists, please let us know, and if possible share your GitHub repo along with the model and the complete steps to run the code, so we can replicate the same behaviour on our end. Thank you for your cooperation and patience.
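For reference, a minimal sketch of the save-then-load pattern with asyncStorageIO (assumptions: the storage key `'offline-model'` is an arbitrary placeholder, and the `tf` module, the `asyncStorageIO` factory, and the model are passed in as parameters so the helpers stay framework-agnostic):

```javascript
// Minimal sketch, assuming tf (@tensorflow/tfjs) and asyncStorageIO
// (@tensorflow/tfjs-react-native) are supplied by the caller.
// The key 'offline-model' is an arbitrary placeholder.
const MODEL_KEY = 'offline-model';

// Saving: model.save() expects an IOHandler object, not a bare string path.
async function saveModelToAsyncStorage(model, asyncStorageIO) {
  return model.save(asyncStorageIO(MODEL_KEY));
}

// Loading: pass the same handler so its load() method is available.
async function loadModelFromAsyncStorage(tf, asyncStorageIO) {
  await tf.ready();
  return tf.loadGraphModel(asyncStorageIO(MODEL_KEY));
}
```

The key point is that the same `asyncStorageIO(key)` handler is used for both save and load; passing something without a `load()` method to the loader is what produces the "IOHandler provided does not have the load method implemented" error reported later in this thread.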
Hi @gaikwadrahul8,

If I have the function as below, it shows the error: `Based on the provided shape, [4,2], the tensor should have 8 values but has 1.`

```js
import * as tf from "@tensorflow/tfjs";
export const loadModel = async () => {
  // … (rest of the snippet was truncated in the original comment)
  } catch (e) {
  // …
```
Hi @Infiniteskys, if possible could you please share your model file in zip format, along with the complete steps to replicate the same behavior on our end, so we can investigate this issue further? Thank you for your cooperation and patience.
Hi @gaikwadrahul8, I have also tested using the location directly:

```js
const loadedModel = await tf.loadLayersModel('/moblientclient/assest/model/test_model/model.json');
```

But it shows a "network request failed" error. And in that case I don't have any idea how to save the model when there is an updated version on the server. Thank you.
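One way to handle the update concern raised above is an offline-first fallback: try the cached copy in async storage first, and only hit the network (and re-cache) when that fails. A hedged sketch, assuming the model URL and the key `'offline-model'` are placeholders and `tf` / `asyncStorageIO` are injected as parameters:

```javascript
// Offline-first sketch: prefer the locally cached model, fall back to the
// network and cache the fresh copy. tf and asyncStorageIO are injected;
// 'offline-model' and modelUrl are placeholders.
async function loadModelOfflineFirst(tf, asyncStorageIO, modelUrl) {
  await tf.ready();
  const handler = asyncStorageIO('offline-model');
  try {
    // Use the cached copy if one was saved earlier.
    return await tf.loadGraphModel(handler);
  } catch (e) {
    // Nothing cached (or cache unreadable): fetch from the server…
    const model = await tf.loadGraphModel(modelUrl);
    // …and cache it for the next offline run.
    await model.save(handler);
    return model;
  }
}
```

With this shape, a newer model on the server is picked up automatically whenever the cache is cleared, and a version key could be added to force a refresh when the server publishes an update.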
**I have a question about using the TFJS package to classify images.
I tried to load the model from a website URL, and that succeeds. But since I need to run the app offline, I have to save the model and load it from a local path.
The main issue is how to save the model to a local path: when I use the save function, it reports that there is no save handler for it.
After switching to asyncStorageIO, saving seems to work, but loading still fails.
Here is my code:**
```js
import * as tf from "@tensorflow/tfjs";
import {asyncStorageIO} from "@tensorflow/tfjs-react-native";

export const loadModel = async () => {
  try {
    await tf.ready();
    const model = await tf.loadGraphModel(
      "https://test_model/model.json"
    );
    console.log("Model loaded.");
    // Note: summary() exists on LayersModel only; a GraphModel has no summary().
    model.summary();
    return model;
  } catch (e) {
    console.error("Error:", e);
    return null;
  }
};
```
Here is the error I get:

```
LOG [Error: Cannot proceed with model loading because the IOHandler provided does not have the load method implemented.]
LOG [TypeError: url.match is not a function (it is undefined)] <- loadGraphModel
```
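The second log line hints that `loadGraphModel` received an argument that is neither a URL string nor an IOHandler (the loader ends up calling `url.match` on `undefined`). A hypothetical guard like the one below can surface that mistake with a clearer message before the loader is called:

```javascript
// Hypothetical guard (not part of tfjs): validate the argument before
// handing it to a model loader. Loaders accept either a URL string or an
// object exposing a load() method.
function assertLoadable(source) {
  const isUrl = typeof source === 'string';
  const isHandler = source != null && typeof source.load === 'function';
  if (!isUrl && !isHandler) {
    throw new TypeError(
      'Model source must be a URL string or an IOHandler with load()');
  }
  return source;
}
```

For example, `assertLoadable(asyncStorageIO('my-model'))` would pass, while `assertLoadable(undefined)` or an object that only implements `save()` would fail with the descriptive message instead of the opaque `url.match` error.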