
I want to save the model as tflite, but I get the error that can't generate my tflite file #67811

Closed
Blackoowhite opened this issue May 17, 2024 · 3 comments
Assignees
Labels
comp:lite TF Lite related issues TF 2.16 type:support Support issues

Comments


Blackoowhite commented May 17, 2024

  1. System information
  • My code runs under Windows
  • Python version: 3.10.8
  • TF version: 2.16.1
  2. Issue
    I built a Sequential model and want to convert it to TFLite, but I get the error below.
  3. Code

```python
import os

import tensorflow as tf
from tensorflow.keras import Sequential

folder_path = r"D:\VisualStudioCode\Project\FL_For_Android\android\tflite_convertor\tflite_file"
print(os.path.exists(folder_path))
h5_path = os.path.join(folder_path, "model.h5")
tflite_path = os.path.join(folder_path, "model.tflite")

# Define the head model (which is actually the full model in this case).
model: Sequential = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(32, 32, 3)),
        tf.keras.layers.Conv2D(6, 5, activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
        tf.keras.layers.Conv2D(16, 5, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(units=120, activation="relu"),
        tf.keras.layers.Dense(units=84, activation="relu"),
        tf.keras.layers.Dense(units=10, activation="softmax"),
    ]
)
model.compile(loss="categorical_crossentropy", optimizer="sgd")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open(tflite_path, "wb") as f:
    f.write(tflite_model)
```

  4. Info & logs

2024-05-17 16:21:00.770822: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
2024-05-17 16:21:01.229299: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable TF_ENABLE_ONEDNN_OPTS=0.
True
2024-05-17 16:21:02.362680: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 AVX_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:absl:You are saving your model as an HDF5 file via `model.save()` or `keras.saving.save_model(model)`. This file format is considered legacy. We recommend using instead the native Keras format, e.g. `model.save('my_model.keras')` or `keras.saving.save_model(model, 'my_model.keras')`.

WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
W0000 00:00:1715934062.706921 22212 tf_tfl_flatbuffer_helpers.cc:390] Ignored output_format.
W0000 00:00:1715934062.707266 22212 tf_tfl_flatbuffer_helpers.cc:393] Ignored drop_control_dependency.
2024-05-17 16:21:02.708010: I tensorflow/cc/saved_model/reader.cc:83] Reading SavedModel from: C:\Users\BLACK01\AppData\Local\Temp\tmpxvz0370l
2024-05-17 16:21:02.708775: I tensorflow/cc/saved_model/reader.cc:51] Reading meta graph with tags { serve }
2024-05-17 16:21:02.708865: I tensorflow/cc/saved_model/reader.cc:146] Reading SavedModel debug info (if present) from: C:\Users\BLACK0
1\AppData\Local\Temp\tmpxvz0370l
2024-05-17 16:21:02.713857: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
2024-05-17 16:21:02.715152: I tensorflow/cc/saved_model/loader.cc:234] Restoring SavedModel bundle.
2024-05-17 16:21:02.736379: I tensorflow/cc/saved_model/loader.cc:218] Running initialization op on SavedModel bundle at path: C:\Users\BLACK0~1\AppData\Local\Temp\tmpxvz0370l
2024-05-17 16:21:02.741926: I tensorflow/cc/saved_model/loader.cc:317] SavedModel load for tags { serve }; Status: success: OK. Took 33913 microseconds.
2024-05-17 16:21:02.751512: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var MLIR_CRASH_REPRODUCER_DIRECTORY to enable.
loc(fused["ReadVariableOp:", callsite("sequential_1/conv2d_1/Reshape/ReadVariableOp@__inference_serving_default_183"("d:\VisualStudioCode\Project\FL_For_Android\android\tflite_convertor\test_convert_to_tflite.py":37:1) at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\lite.py":1175:1 at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\lite.py":1129:1 at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\lite.py":1636:1 at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\lite.py":1614:1 at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\convert_phase.py":205:1 at callsite("D:\Python\python3.10.8\lib\site-packages\tensorflow\lite\python\lite.py":1537:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\backend\tensorflow\layer.py":58:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\backend\tensorflow\layer.py":120:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":117:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\layers\layer.py":826:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":117:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\ops\operation.py":48:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":156:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\models\sequential.py":206:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\models\functional.py":199:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\ops\function.py":151:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\models\functional.py":583:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":117:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\layers\layer.py":826:1 at 
callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":117:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\ops\operation.py":48:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\utils\traceback_utils.py":156:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\layers\convolutional\base_conv.py":233:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\ops\numpy.py":4527:1 at callsite("D:\Python\python3.10.8\lib\site-packages\keras\src\backend\tensorflow\numpy.py":1618:1 at "D:\Python\python3.10.8\lib\site-packages\keras\src\backend\tensorflow\core.py":64:1))))))))))))))))))))))))))]): error: missing attribute 'value'
LLVM ERROR: Failed to infer result type(s).

@Blackoowhite Blackoowhite changed the title I want to save the model as tflite, but I get the error that can't generate my tflite I want to save the model as tflite, but I get the error that can't generate my tflite file May 17, 2024
@sushreebarsa sushreebarsa added type:support Support issues comp:lite TF Lite related issues TF 2.16 labels May 17, 2024
sushreebarsa (Contributor) commented:

@Blackoowhite Some TensorFlow operations are not directly supported by TFLite. During conversion, the converter checks for compatibility and raises errors when it encounters unsupported operations.
To expedite troubleshooting, please provide a code snippet that reproduces the issue reported here. You can also find resources on exporting models to the SavedModel format in the TensorFlow documentation here.

Thank you!
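As an illustration of the SavedModel route mentioned above, the sketch below exports a Keras model to a SavedModel directory and converts from disk instead of from the in-memory model. The tiny placeholder model and temporary directory are assumptions for the sake of a self-contained example, not the reporter's actual setup.

```python
import tempfile

import tensorflow as tf

# Tiny stand-in model; substitute your own Sequential model here.
model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ]
)

with tempfile.TemporaryDirectory() as saved_model_dir:
    # Keras 3 exports an inference-only SavedModel via model.export();
    # on older versions, tf.saved_model.save(model, saved_model_dir) works.
    model.export(saved_model_dir)

    # Convert from the on-disk SavedModel rather than the live Keras object.
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()

# A TFLite flatbuffer carries the "TFL3" file identifier at byte offset 4.
assert tflite_model[4:8] == b"TFL3"
```

Converting from a SavedModel is also the path the converter takes internally for Keras models, so it can sidestep in-memory tracing issues.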

@sushreebarsa sushreebarsa added the stat:awaiting response Status - Awaiting response from author label May 17, 2024
Blackoowhite (Author) commented:

Thanks everyone, I have found the issue. I must import Keras as tf.python.keras..... instead of tf.keras......
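A caution for readers trying the fix described above: tensorflow.python.keras is a private, unsupported module path that has been removed from recent TensorFlow releases, so the import swap may not be available everywhere. A defensive sketch of the swap (the module name is taken from the comment, not verified against TF 2.16):

```python
import importlib

# The author reports swapping tf.keras for tf.python.keras. That path is
# private API and absent from newer TensorFlow wheels, so probe before use.
try:
    keras = importlib.import_module("tensorflow.python.keras")
except ImportError:
    import tensorflow as tf

    keras = tf.keras  # fall back to the public, supported API
```

Preferring the public `tf.keras` namespace keeps the code working across TensorFlow versions.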

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label May 18, 2024
