GENX320 async processing

I recently got the GENX320 and the H7 Plus cam, and I want to stream the event data at a very high rate to decode LED blinks (to simulate LiFi).

Through the latest development release, I have access to the x, y, t data. Now I want to stream this raw event data to a server (my laptop) that will decode the events asynchronously.

  1. What's the best way to stream the GENX320 camera data back to my laptop over the USB port?
  2. I noticed the OpenMV N6 was recently announced; does it provide better throughput for this use case?

You'd want the RT1062 over the H7 Plus, as it has high-speed USB: it can do 480 Mb/s back to the PC instead of 12 Mb/s. The N6 will have the same event-streaming performance.

As for the best way to get events back to the PC, you really want to process them onboard the camera. We have a numpy package onboard which you can do quite a bit with. If you plan to send events back to the PC, note that the bandwidth of events from the camera exceeds what the USB link can carry.
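To give a feel for the onboard approach: here's a minimal sketch of reducing raw events on the camera instead of shipping them all to the PC. It assumes the events arrive as x/y arrays; the `event_activity_grid` helper is hypothetical, not an OpenMV API. Onboard you'd use the ulab numpy subset, which the try/except below falls back from for PC testing.

```python
# Sketch: reduce raw events onboard instead of streaming them all.
# On the camera this would use `from ulab import numpy as np` (the numpy
# subset in OpenMV firmware); plain numpy works for testing on a PC.
try:
    from ulab import numpy as np  # available onboard OpenMV cams
except ImportError:
    import numpy as np

def event_activity_grid(xs, ys, sensor_wh=320, cell=32):
    """Count events per cell of a coarse grid -- a cheap onboard
    reduction whose output is tiny compared to the raw event stream."""
    n = sensor_wh // cell
    grid = np.zeros((n, n))
    for i in range(len(xs)):
        grid[int(ys[i]) // cell, int(xs[i]) // cell] += 1
    return grid
```

Sending this 10x10 grid per buffer over print() is orders of magnitude less data than the raw events.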

Anyway, print() is pretty much the way to send data over USB back to the PC. When the OpenMV Cam is not in debug mode, print() just sends data over the link to any receiving program. However, it's not that high bandwidth. Our debug-protocol image-transfer mechanism is the highest-bandwidth interface we offer and can do about 30 MB/s, but you'd need to turn the raw event data into an image and send it to the PC. The current API doesn't really make this easy to do, but it's possible if you edit our firmware.
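On the PC side, reading what the camera print()s is just reading the virtual COM port. Here's a sketch, assuming the camera prints one event per line as `x,y,t` (that line format, and the port name, are assumptions for illustration, not anything OpenMV-specific):

```python
# PC-side sketch: read event lines printed by the camera over the USB VCP.
# Assumes the camera script does something like print(x, y, t, sep=",").

def parse_event_line(line):
    """Parse one b"x,y,t\n" line into an (x, y, t) tuple of ints."""
    x, y, t = line.strip().split(b",")
    return int(x), int(y), int(t)

def read_events(port="/dev/ttyACM0", baudrate=115200):
    """Generator yielding events from the camera's serial port."""
    import serial  # pyserial: `pip install pyserial`
    with serial.Serial(port, baudrate, timeout=1) as ser:
        while True:
            line = ser.readline()
            if line:
                yield parse_event_line(line)
```

Just keep in mind the bandwidth point above: text lines over the VCP will cap out far below the raw event rate.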

Can you try to process the events onboard?

Hey @kwagyeman

Thank you for the feedback; I will try to get my hands on an RT1062 as well.

I believe processing events onboard is too slow, as I need to decode LED blinks faster than 500 Hz. The decoding logic itself could take a few ms, which would cause the LED-detection loop to miss blinks.

My assumption was that if I can decouple the LED blink detection from the decoding logic, I can still see these fast blinks. Let me know if this makes sense; otherwise we can try optimizing the decoding logic.

The RT1062 can return event buffers at up to 2 kHz, with each buffer holding 1K events. It can handle a 500 Hz LED.
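Given one of those buffers, estimating the blink rate from the timestamps alone is cheap. Here's a hedged sketch, assuming timestamps are in microseconds and the LED produces bursts of events separated by quiet gaps; the helper name and the 500 us silence threshold are my own, for illustration:

```python
# Sketch: estimate an LED's blink frequency from one buffer of event
# timestamps by grouping events into bursts and measuring the period.

def blink_frequency_hz(timestamps_us, gap_us=500):
    """Group timestamps into bursts separated by > gap_us of silence,
    then return 1 / mean(burst period)."""
    starts = [timestamps_us[0]]
    for prev, cur in zip(timestamps_us, timestamps_us[1:]):
        if cur - prev > gap_us:
            starts.append(cur)
    if len(starts) < 2:
        return 0.0  # fewer than two bursts: frequency undefined
    period_us = (starts[-1] - starts[0]) / (len(starts) - 1)
    return 1e6 / period_us
```

At 2 kHz buffer returns, a handful of buffers is plenty to resolve a 500 Hz blink this way.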

A customer asked me about doing this for the whole image. I tried creating the script below. It doesn't quite work, but the idea is almost there. You get a result for all pixels at 1-2 FPS.

genx320_event_mode_frequency_histogram.py (4.6 KB)

There’s also this: GitHub - berndpfrommer/frequency_cam: frequency cam: detecting and visualizing frequencies with an event based camera

They have a paper available: [2211.00198] Frequency Cam: Imaging Periodic Signals in Real-Time

However, the way they handle events is very tricky. frequency_cam/include/frequency_cam/frequency_cam.h at master · berndpfrommer/frequency_cam · GitHub

Anyway, doing an FFT per pixel area, or using their method, should work.
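For the FFT-per-pixel-area route, the core step looks something like this: sample each region's event count at a fixed rate, then take the dominant FFT bin as that region's blink frequency. This is a sketch of the general technique, not the attached script; the sample rate and window length are illustrative.

```python
# Sketch of "FFT per pixel area": find the peak (non-DC) frequency in a
# 1-D signal of per-time-slice event counts for one pixel region.
import numpy as np

def dominant_frequency_hz(counts, sample_rate_hz):
    """Return the strongest non-DC FFT frequency of the counts signal."""
    counts = np.asarray(counts, dtype=float)
    counts = counts - counts.mean()      # remove the DC component
    spectrum = np.abs(np.fft.rfft(counts))
    freqs = np.fft.rfftfreq(len(counts), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]
```

Run it per grid cell over a short window and you get a frequency map like the script above aims for.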

If you still want to stream events to the PC, I don't quite know how you'd do this at any reasonable speed with our current setup… we're not optimized to transfer anything other than images at high speed. Note that the latest RT1062 firmware can now move full (320x320) histogram images to the PC at 30 MB/s.

If you 100% want to stream events to the PC, I can show you where you'd need to modify our firmware, but you'd need to be able to create your own PC app and so on. So, you need to be a pro to do this. I'd just give a few pointers on where to change things; otherwise you're on your own.