March 9, 2023, 8:35pm
I am trying to load/run the MoveNet tensorflow lite model published by google on an OpenMV H7+ board.
As of now, I get a hard system crash / reboot with no error message when it gets to loading the tflite model file. The model is offered in an int8 variant (singlepose lightning int8) on tfhub:
TensorFlow Hub (tfhub.dev)
I did find this OpenMV forum post describing a similar phenomenon. In that case, the error was a datatype error:
Crash without error message when loading tflite model - OpenMV Products - OpenMV Forums
Does anyone have any advice on getting MoveNet to load/run on an OpenMV H7+? I did get it running on a Raspberry Pi 4B.
Thanks in advance.
It probably can’t run on our device. We specifically support networks generated by Edge Impulse. Other models tend to use operators that aren’t included in the TensorFlow library we have compiled.
March 9, 2023, 9:24pm
It would be nice if the OpenMV IDE could somehow throw a labelled error when a tflite model containing an unsupported operation is loaded. I don’t know how to inspect the tflite file for the specific operations it uses.
To be clear, my understanding is that the TFLite operations listed here (the uncommented ones) make up the supported-operation list:
static void libtf_init_op_resolver(tflite::MicroMutableOpResolver<LIBTF_MAX_OPS> &resolver)
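For inspecting which operators a .tflite file actually uses, the easiest routes are the Netron model viewer or, if a desktop TensorFlow install is available, `tf.lite.experimental.Analyzer.analyze(model_path=...)`. As a rough stdlib-only sketch, the operator table can also be read straight out of the flatbuffer; the field ids below come from the public tflite `schema.fbs` (Model.operator_codes is field 1; OperatorCode's deprecated_builtin_code, custom_code, and builtin_code are fields 0, 1, and 3), and the builtin-name table here is deliberately partial:

```python
import struct

# Partial map of BuiltinOperator enum values to names, for illustration only.
BUILTIN_NAMES = {
    0: "ADD", 3: "CONV_2D", 4: "DEPTHWISE_CONV_2D",
    9: "FULLY_CONNECTED", 17: "MAX_POOL_2D", 22: "RESHAPE", 25: "SOFTMAX",
}

def _u16(buf, off): return struct.unpack_from("<H", buf, off)[0]
def _u32(buf, off): return struct.unpack_from("<I", buf, off)[0]
def _i32(buf, off): return struct.unpack_from("<i", buf, off)[0]

def _field(buf, table, field_id):
    """Absolute offset of a flatbuffer table field, or None if absent."""
    vtable = table - _i32(buf, table)          # soffset points back to the vtable
    slot = 4 + 2 * field_id
    if slot >= _u16(buf, vtable):              # vtable too short: field absent
        return None
    rel = _u16(buf, vtable + slot)
    return table + rel if rel else None

def list_op_codes(model_bytes):
    """Return [(builtin_code, custom_name_or_None), ...] for a .tflite blob."""
    root = _u32(model_bytes, 0)                # uoffset to the Model table
    vec_field = _field(model_bytes, root, 1)   # Model.operator_codes
    if vec_field is None:
        return []
    vec = vec_field + _u32(model_bytes, vec_field)
    ops = []
    for i in range(_u32(model_bytes, vec)):    # vector: u32 length, then uoffsets
        elem = vec + 4 + 4 * i
        tbl = elem + _u32(model_bytes, elem)
        f0 = _field(model_bytes, tbl, 0)       # deprecated_builtin_code (int8)
        dep = model_bytes[f0] if f0 is not None else 0
        f3 = _field(model_bytes, tbl, 3)       # builtin_code (int32, newer files)
        new = _i32(model_bytes, f3) if f3 is not None else 0
        f1 = _field(model_bytes, tbl, 1)       # custom_code (string)
        custom = None
        if f1 is not None:
            s = f1 + _u32(model_bytes, f1)
            custom = model_bytes[s + 4:s + 4 + _u32(model_bytes, s)].decode()
        ops.append((max(dep, new), custom))
    return ops

# Usage sketch (hypothetical filename):
# for code, custom in list_op_codes(open("movenet.tflite", "rb").read()):
#     print(custom or BUILTIN_NAMES.get(code, f"BUILTIN_{code}"))
```

Any builtin code the sketch prints as `BUILTIN_<n>` can be looked up in the `BuiltinOperator` enum in `schema.fbs`; comparing that list against the ops registered in `libtf_init_op_resolver` should show which operators are missing.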
It normally would. However, the TensorFlow code from Google is of such low quality that it just segfaults.
I suppose we can do a patch on the library to support this.