ESP32 AND OPENMV

Hello!

I am trying to stream live video using an OpenMV3 M7 and an ESP32, so I need some suggestions on which interface would be best for communicating with the OpenMV. If anyone has done the same thing, please let me know.

Use the RPC library. We have an Arduino implementation of the library. Just connect that to an OpenMV Cam with an RPC script running and then you can move data.
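For reference, here is a rough sketch of what the OpenMV-side script can look like, based on the rpc module's remote-device examples that ship with the IDE. The callback name, pins, and clock settings below are illustrative assumptions; check them against the bundled examples and your wiring.

```python
# Minimal OpenMV-side RPC "remote device" sketch (MicroPython).
# Pin and clock settings are examples and must match the master's configuration.
import rpc
import sensor
import struct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# SPI slave interface; for a UART use rpc.rpc_uart_slave(baudrate=...) instead.
interface = rpc.rpc_spi_slave(cs_pin="P3", clk_polarity=1, clk_phase=0)

# Example callback: grab a frame, JPEG-compress it, and report its size.
def jpeg_snapshot(data):
    img = sensor.snapshot().compress(quality=90)
    return struct.pack("<I", img.size())

interface.register_callback(jpeg_snapshot)
interface.loop()  # Service RPC calls from the master forever.
```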

Hello! I have been trying to do a similar thing and am running into an issue setting up the streaming interface. I am hoping you can help. I am using an Adafruit HUZZAH32 - ESP32 Feather and an OpenMV H7 with the newest firmware. It looks like the SPI interface is the fastest available over RPC for connecting these two, so that is the interface I am trying to use. I have attached the test files I am using. Note that your forum does not consider .ino a valid file extension, so I renamed the Arduino file to .ino.txt so I could upload it.

TestStreaming.ino is run on my ESP32.
If I set the variable on line 9: streamTest = false, then TestStreaming.ino works with popular_features_spi_test.py (see attached) on the OpenMV H7. That works correctly, so I know my SPI setup is correct.
If I set the variable on line 9: streamTest = true, then TestStreaming.ino runs against image_transfer_jpg_streaming.py (see attached) on the OpenMV H7. Here is where I run into issues. Both programs run, and I do not get the "Failed to start streaming" debug output from my Arduino program, so it appears that the RPC streaming connection is set up correctly. However, I never get the "Received video bytes: " debug output from my Arduino program, so the callback function for receiving data is never called.

In image_transfer_jpg_streaming.py (see attached), I have tried uncommenting lines 36 and 37 and commenting out line 39 to test with significantly reduced data (since I see you mention that the full streaming rate overwhelms most MCUs). Even when testing that way, I never see the "Received video bytes: " debug output from my Arduino program, so the callback function for receiving data is never called.

I am assuming I have set up something incorrectly for the RPC streaming to work properly, but I do not know what I am missing/doing incorrectly. Could you please help?

Thank you!
TestStreaming.ino.txt (1.4 KB)
popular_features_spi_test.py (10.8 KB)
image_transfer_jpg_streaming.py (2.05 KB)

Stream mode doesn’t really work for SPI since the SPI bus is half duplex. Use a UART if you want to stream data asynchronously.
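If you switch to a UART, the only change on the OpenMV side is the interface constructor; a rough sketch (the baud rate is an example and must match whatever the ESP32 side uses):

```python
# Rough sketch: swapping the OpenMV-side RPC interface from SPI to UART.
import rpc

# interface = rpc.rpc_spi_slave(cs_pin="P3", clk_polarity=1, clk_phase=0)
interface = rpc.rpc_uart_slave(baudrate=115200)  # example baud rate

# register_callback()/loop() usage stays the same as in the SPI scripts.
```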

For SPI mode… there’s the cut-through option, which sits in between stream mode and normal RPC calls: you just send the data raw right after an RPC call completes. The image transfer (non-streaming) scripts show how to do this.
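Roughly, the remote-device side of the cut-through pattern looks like the sketch below, adapted from the non-streaming image transfer example; callback names, pins, and timeouts are illustrative and should be checked against the shipped script. One RPC call returns the JPEG size, and a second call schedules a raw put_bytes() of the image right after that call completes.

```python
# Sketch of cut-through image transfer on the OpenMV side (MicroPython).
# Names, pins, and timeouts are illustrative.
import rpc, sensor, struct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

interface = rpc.rpc_spi_slave(cs_pin="P3", clk_polarity=1, clk_phase=0)

img = None  # Holds the last compressed frame between the two RPC calls.

# RPC call 1: capture and compress a frame, return its size to the master.
def jpeg_image_snapshot(data):
    global img
    img = sensor.snapshot().compress(quality=90)
    return struct.pack("<I", img.size())

# Scheduled to run right after RPC call 2 completes: push the raw JPEG bytes
# in one burst (this is the cut-through part, no RPC framing or CRC).
def jpeg_image_read_cb():
    interface.put_bytes(img.bytearray(), 5000)  # 5 s timeout

# RPC call 2: arrange for the raw transfer to start once this call returns.
def jpeg_image_read(data):
    interface.schedule_callback(jpeg_image_read_cb)
    return bytes()

interface.register_callback(jpeg_image_snapshot)
interface.register_callback(jpeg_image_read)
interface.loop()
```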

Thank you! I followed the non-streaming scripts and it is working now. I misinterpreted the H7 marketing statement “A SPI bus that can run up to 80Mbs allowing you to easily stream image data off the system to either the LCD Shield, the WiFi Shield, or another microcontroller.” to mean that it can use the RPC streaming protocol.

Two follow-up questions:

  1. When you say half-duplex: I read in the STM32H743 datasheet that it supports full-duplex, half-duplex, and simplex SPI modes. Is half-duplex a fixed configuration on the H7 cam? I was just surprised to hear half-duplex, since the wiring seems to be typical for a full-duplex setup.
  2. With the current setup, I am getting approximately 2 FPS. The compressed JPEG images are <10,000 bytes (80,000 bits) each. Given that the SPI bus should be able to move 20 Mb/s or more (roughly two orders of magnitude more than this frame rate needs; see the rough numbers sketched after this list), I am assuming the bottleneck is either the RPC calls or how the images are captured/sent on the H7 camera. Even when trying streaming mode, my debug statements on the camera seemed to suggest that the capture/send was the slower part. Is there a better way for me to capture/send on the H7 camera that would speed things up? I do not need high rates, but I would ideally like to be above 10 FPS.
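For reference, here are the back-of-the-envelope numbers behind my assumption (all values are rough):

```python
# Rough math: what the SPI bus alone would allow (assumed numbers).
frame_bits = 10_000 * 8        # ~10 KB compressed JPEG per frame
bus_rate_bps = 20_000_000      # assume 20 Mb/s of usable SPI bandwidth

transfer_time_s = frame_bits / bus_rate_bps      # ~0.004 s on the wire per frame
bus_limited_fps = 1 / transfer_time_s            # ~250 FPS if the bus were the only limit
bus_busy_fraction_at_2fps = 2 * transfer_time_s  # bus busy <1% of the time at 2 FPS

print(transfer_time_s, bus_limited_fps, bus_busy_fraction_at_2fps)
```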

Thank you!

  1. SPI is full duplex, but it would be exceptionally hard to send data in both directions at the same time since both streams share the same clock. So, TX and RX are not separated. Contrast that with the UART, where RX and TX are independent.

I suppose it wouldn’t be impossible to get full duplex working. But, it would require you to deal with bit and byte framing in software… so, the CPU load to decode the data would be quite high.

Anyway, I don’t plan to support streaming mode for SPI and I2C.

  2. It’s the RPC call overhead plus the snapshot overhead. If you check bus usage, you’ll see the bus is idle most of the time.

Can you use the UART? It’s just easier. If not, then you’ll want to use cut-through mode to move data. This is where an RPC call is used to transfer the file header, and then the data is transferred in one giant SPI write. The image transfer script shows this mode off. There’s no CRC protection on the data, but it is much faster for moving the image. You should instantly see the performance increase.

I’m not sure if my Arduino examples show this mode off, however. You can see it in the OpenMV Cam RPC scripts where the OpenMV Cam plays the master/slave roles.
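For illustration, the controller/master side of the cut-through exchange looks roughly like the sketch below, adapted from the controller-device image transfer scripts; the function names, pins, frequency, and timeouts are assumptions to check against the shipped examples.

```python
# Rough sketch of the master side of a cut-through JPEG transfer (MicroPython,
# OpenMV-as-master style). Names, pins, and timeouts are illustrative.
import rpc, struct

interface = rpc.rpc_spi_master(cs_pin="P3", freq=10000000,
                               clk_polarity=1, clk_phase=0)

# RPC call 1: ask the remote camera for a compressed frame and its size.
result = interface.call("jpeg_image_snapshot")
if result is not None:
    size = struct.unpack("<I", result)[0]
    img = bytearray(size)

    # RPC call 2: tell the remote side to start the raw transfer, then read
    # the whole JPEG in one burst. Note: no CRC protection on this part.
    if interface.call("jpeg_image_read") is not None:
        interface.get_bytes(img, 5000)  # 5 s timeout
```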

The reason cut-through mode is not shown is that when you use it, the camera WILL send the data at whatever clock speed you set with NO break. Unless your MCU uses SPI with DMA, your microcontroller will fall over trying to receive the data. From what I’ve seen of Arduino wrappers for hardware like SPI, DMA is not used. So, I suspect your microcontroller will have trouble handling the data. But, you can try.

(Typically, most low-effort bus wrappers have the processor reading data from the SPI bus one byte at a time. This is not fast enough to handle data coming in at 40 Mb/s.)
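To put a rough number on that: at 40 Mb/s a new byte arrives every 200 ns, which leaves only a few dozen CPU cycles per byte for a read-one-byte-at-a-time loop (the 240 MHz figure below is an assumed ESP32-class clock):

```python
# Rough per-byte timing budget for byte-at-a-time SPI reads (assumed numbers).
spi_rate_bps = 40_000_000   # 40 Mb/s SPI clock
cpu_hz = 240_000_000        # assumed ESP32-class CPU clock

byte_time_s = 8 / spi_rate_bps           # 200 ns between bytes
cycles_per_byte = cpu_hz * byte_time_s   # ~48 CPU cycles to read, store, and loop

print(byte_time_s, cycles_per_byte)
```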