
Reading model from internal storage #15

Open
sai-pher opened this issue Apr 28, 2019 · 4 comments

Comments

@sai-pher

Hi there.

I wanted to know whether it is possible to store the trained .tflite model on the Android device's internal storage and read it from there, instead of from the assets folder.

The issue I am having is with the startOffset value in the loadModelFile(AssetManager assetManager, String modelPath) function. Currently, it comes from AssetFileDescriptor fileDescriptor = assetManager.openFd(modelPath).
This file descriptor is used to get the startOffset and declaredLength. Is there another way to read the model from internal storage and still get the startOffset and declaredLength? If not, is there a way to calculate the startOffset and declaredLength of a new model when reading its raw binary from internal storage?
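
For reference, the loader I'm referring to looks roughly like this (a sketch adapted from the standard TensorFlow Lite Android examples):

import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Maps a model packed into the APK's assets. startOffset and declaredLength
// are needed because assets are stored inside one packed file, so the model
// does not begin at byte 0 of the underlying file descriptor.
private MappedByteBuffer loadModelFile(AssetManager assetManager, String modelPath) throws IOException {
    AssetFileDescriptor fileDescriptor = assetManager.openFd(modelPath);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}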

@liyoung1992

I have the same problem.

@susielau

susielau commented Jul 9, 2019

Hi, I have a similar issue. Has anyone figured out how to solve it?

@mehedi-shafi

// Map the model file from internal storage into memory...
File modelFile = new File(this.modelPath);
FileChannel channel = new FileInputStream(modelFile).getChannel();
// ...and hand the mapped buffer to the interpreter. The offset is 0 and the
// length is the whole file, since the model is not packed inside the APK.
interpreterInference = new Interpreter(channel.map(FileChannel.MapMode.READ_ONLY, 0, modelFile.length()), new Interpreter.Options());

This does load the model without any issue, but it raises a new problem:

Caused by: java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: tensorflow/lite/kernels/conv.cc:237 input->dims->size != 4 (1 != 4)
Node number 0 (CONV_2D) failed to prepare.
    at org.tensorflow.lite.NativeInterpreterWrapper.allocateTensors(Native Method)

Well, I guess that's because of the input tensor shape; I suppose I have to reshape the input image to match.
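
For anyone hitting the same error, a rough sketch of what I mean by reshaping: building the [1, height, width, 3] float buffer a typical image model expects. Here inputSize, numClasses, and the source bitmap are placeholders; check your model's actual input and output shapes first.

import android.graphics.Bitmap;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

int inputSize = 224;   // hypothetical; must match the model's expected H and W
int numClasses = 1000; // hypothetical; must match the model's output size

// "bitmap" is the source image; scale it to the model's expected dimensions.
Bitmap scaled = Bitmap.createScaledBitmap(bitmap, inputSize, inputSize, true);

// 1 image * inputSize * inputSize pixels * 3 channels * 4 bytes per float.
ByteBuffer input = ByteBuffer.allocateDirect(inputSize * inputSize * 3 * 4);
input.order(ByteOrder.nativeOrder());

int[] pixels = new int[inputSize * inputSize];
scaled.getPixels(pixels, 0, inputSize, 0, 0, inputSize, inputSize);
for (int pixel : pixels) {
    input.putFloat(((pixel >> 16) & 0xFF) / 255.0f); // R
    input.putFloat(((pixel >> 8) & 0xFF) / 255.0f);  // G
    input.putFloat((pixel & 0xFF) / 255.0f);         // B
}

float[][] output = new float[1][numClasses];
interpreterInference.run(input, output);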

@GyanendroKh

Well, I've been searching all over the place and I've finally figured it out. It's dead simple.
For some reason I thought that AssetFileDescriptor's getStartOffset was related to the actual tflite model, but it isn't. getStartOffset gives the start position of the file within the application's packed assets. When the model is a standalone file on disk, the startOffset should simply be 0, because that's where the file begins; the model is the file's only content.
So, the code should be

File file = new File("path_to_model"); // double quotes: Java string literal
FileInputStream is = new FileInputStream(file);

// Offset 0 and the full file length: the standalone model file starts at byte 0.
return is.getChannel().map(FileChannel.MapMode.READ_ONLY, 0, file.length());
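
And to wire it into the interpreter (a sketch; loadModelFromStorage is just a hypothetical wrapper around the snippet above), checking the expected input shape up front avoids the CONV_2D surprise mentioned earlier:

MappedByteBuffer model = loadModelFromStorage("path_to_model");
Interpreter interpreter = new Interpreter(model, new Interpreter.Options());

// Inspect the model's expected input shape before feeding data,
// e.g. [1, 224, 224, 3] for a typical image classifier.
int[] inputShape = interpreter.getInputTensor(0).shape();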
