Reading model from internal storage #15
Comments
I have the same problem.

Hi, I have a similar issue. Has anyone figured out how to solve it?
File modelFile = new File(this.modelPath);
FileChannel channel = new FileInputStream(modelFile).getChannel();
interpreterInference = new Interpreter(
        channel.map(FileChannel.MapMode.READ_ONLY, 0, modelFile.length()),
        new Interpreter.Options());

This does load the model without any issue, but it raises a new problem. I guess that's because of the input tensor size (shape); I have to reshape the image, I suppose.
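In case it helps anyone hitting that shape mismatch: as a minimal sketch (assuming the model takes a single image input in NHWC layout, [1, height, width, channels]; the helper name prepareInput is hypothetical, not from this thread), you can ask the interpreter for the expected input shape and scale the bitmap to match before filling the input buffer:

import android.graphics.Bitmap;
import org.tensorflow.lite.Interpreter;

// Sketch: query the model's expected input shape and scale the bitmap
// to match. Assumes a single image input in NHWC layout.
Bitmap prepareInput(Interpreter interpreter, Bitmap source) {
    int[] shape = interpreter.getInputTensor(0).shape(); // e.g. [1, 224, 224, 3]
    int height = shape[1];
    int width = shape[2];
    return Bitmap.createScaledBitmap(source, width, height, true);
}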
Well, I've been searching all over the place and finally I've figured it out. It's dead simple:

File file = new File("path_to_model");
FileInputStream is = new FileInputStream(file);
return is.getChannel().map(FileChannel.MapMode.READ_ONLY, 0, file.length());
Hi there.

I wanted to know if it is possible to store and read the trained .tflite model from the Android device's internal storage instead of the assets folder?

The issue I am having is with the startOffset value in the loadModelFile(AssetManager assetManager, String modelPath) function. Currently, it comes from AssetFileDescriptor fileDescriptor = assetManager.openFd(modelPath). This file descriptor is used to get the start offset and declared length. Is there another way to read the model from internal memory instead and still get the start offset and declared length? If not, is there a way to calculate the startOffset of a new model and its declared length when reading the raw binary from internal storage?
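For reference, the assets-based loader being described is presumably the standard one from the TensorFlow Lite Android examples. It needs the offset/length pair because an asset is packed inside the APK, so the model bytes start at a nonzero offset within that file:

import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Standard assets-based loader: the AssetFileDescriptor supplies where
// inside the APK the model bytes begin and how long they are.
private MappedByteBuffer loadModelFile(AssetManager assetManager, String modelPath) throws IOException {
    AssetFileDescriptor fileDescriptor = assetManager.openFd(modelPath);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}

A file in internal storage, by contrast, is an ordinary standalone file: its start offset is simply 0 and its declared length is file.length(), which is why the snippet earlier in the thread works without an AssetFileDescriptor.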