Trying to upload a trained tflite model

I'm trying to upload a TensorFlow Lite model, which I trained using Google Colab, to an OpenMV H7 Plus, but I'm facing this error: MemoryError: memory allocation failed, allocating 477184 bytes. I also tried using an 8 GB SD card, but it wasn't really helpful and I still hit the same error. Also, when I tried to point at the model on the SD card using "/sdcard", I got OSError: [Errno 2] ENOENT. What am I supposed to do about this?

Hi, you just need to put the model on your SD card and then pass the filename of the model.
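For example, something along these lines (a rough sketch assuming the older firmware's tf module; the filename is just taken from your post, and the SD card contents show up at the filesystem root):

import sensor
import tf

# The SD card, when inserted, is mounted at the filesystem root,
# so the model can be referenced by its filename alone.
net = tf.load("hsu_model.tflite")

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    # classify() runs the network over the image and returns result objects
    for obj in net.classify(img):
        print(obj.output())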

It would help to see your code and know where the model is stored.

As for the MemoryError: memory allocation failed, allocating 477184 bytes error: which line of code is throwing that? If it's the model allocation, it would be useful to know the size of the model itself and how large you expect its activation buffers to be.
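If it helps, you can print the free heap and the model file size right before the load call, e.g. (filename again taken from your post):

import gc
import os

gc.collect()
print("free heap:", gc.mem_free(), "bytes")
# os.stat() returns a tuple; index 6 is the file size in bytes
print("model size:", os.stat("hsu_model.tflite")[6], "bytes")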

Hi, I tried to load the model from the SD card using net = tf.load('/sd/hsu_model.tflite'), but I get the OSError: [Errno 2] ENOENT error. The MemoryError is thrown on this line: net = tf.load('hsu_model.tflite'). The model size is exactly 477184 bytes.

Try:

tf.load("hsu_model.tflite", load_to_fb=True)

Also, you are on very old firmware if you're still using the tf module. Please upgrade to the latest firmware, which uses the ml module. We increased the system heap sizes so that you don't need to pass load_to_fb anymore.
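On recent firmware the equivalent looks roughly like this (just a sketch; check the current ml module documentation for the exact call signatures and how the outputs map to your model):

import sensor
import ml

# The newer ml module replaces tf; the model is still loaded by filename.
model = ml.Model("hsu_model.tflite")

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    # predict() takes a list of inputs and returns the model's outputs
    print(model.predict([img]))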

It’s fixed now!! thankssss