Error while using tf.load on custom tflite model

I own an OpenMV CAM H7

I built and trained a CNN model using Keras and converted it to tflite using the following code:

import tensorflow as tf

model = tf.keras.models.load_model('/content/model1_954.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
tflite_quant_model = converter.convert()

I copied the model to the 32 GB SD card in the camera, but when I load it with the tf.load() function, I get the following error:

OSError: tensorflow/lite/micro/kernels/ Hybrid models are not supported on TFLite Micro.

Please help me resolve this issue. I need this model, which is basically a classification model, to work on this camera module.

As the error says, it’s not supported.

Please use Edge Impulse to train TensorFlow models. They have the complete toolchain set up to make this easy.

TFLite Micro doesn’t support everything desktop TFLite supports. If you look at the code in that file, you may be able to determine which layer is unsupported.

I have found the solution. The code I posted earlier only quantizes the weights, not the input/activation tensors, which is why TFLite Micro rejects the model as hybrid.
The model can be fully quantized instead, but that requires a representative dataset; more details here: Post-training quantization  |  TensorFlow Lite
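For reference, a full-integer quantization pass along these lines should produce a model TFLite Micro accepts. This is a sketch, not the exact code from the post above: the tiny stand-in model and the random calibration data are placeholders, so substitute your own loaded Keras model (e.g. the `model1_954.h5` file) and real input samples in `representative_dataset`.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in classifier; replace with
# tf.keras.models.load_model('/content/model1_954.h5') in practice.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8, 8, 1)),
    tf.keras.layers.Conv2D(4, 3, activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Calibration data lets the converter fix activation ranges; use a few
# hundred real samples matching your model's input shape, not random data.
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 8, 8, 1).astype(np.float32)]

converter.representative_dataset = representative_dataset
# Restrict to int8 ops and force integer I/O so no float ops remain.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_quant_model = converter.convert()

with open('model_int8.tflite', 'wb') as f:
    f.write(tflite_quant_model)
```

Note the attribute is `converter.target_spec.supported_ops` (not `target_ops`), which is one reason the earlier snippet fell back to weight-only quantization.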


Glad to hear it’s working!