Optical Flow and Camera Pose for RGB vs GENX320

How is the OpenMV software suite configured for optical flow and camera pose estimation? I have found that the find_displacement() function is a bit inaccurate, but that my event camera running at 100 FPS is more accurate than my RGB camera at 50 FPS. I would have thought the contrast trails formed by accumulated event sensing would cause the function to fail.

find_displacement() works via phase correlation, so it's comparing the FFTs of the two frames. The strong edges in the accumulated event frames actually improve its ability to lock on rather than hurting it.
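
For reference, here's a minimal sketch in the spirit of OpenMV's translation-tracking examples showing how find_displacement() is typically driven. The frame size, the response() threshold, and the extra frame buffer handling are assumptions you'd tune for your own sensor and firmware.

```python
# Minimal sketch based on OpenMV's translation-tracking examples -- frame size,
# threshold, and buffer handling are assumptions to tune for your sensor/firmware.
import sensor
import time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.B64X64)  # phase correlation works best on small frames
sensor.skip_frames(time=2000)

# Keep the previous frame in an extra frame buffer to correlate against.
extra_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)
extra_fb.replace(sensor.snapshot())

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()

    # Phase-correlate the previous frame against the current one.
    displacement = extra_fb.find_displacement(img)
    extra_fb.replace(img)

    # response() is the correlation peak quality -- low values mean the match
    # is unreliable, so gate on it before trusting the translation.
    if displacement.response() > 0.1:
        print("dx: %0.2f dy: %0.2f response: %0.2f fps: %0.1f" % (
            displacement.x_translation(),
            displacement.y_translation(),
            displacement.response(),
            clock.fps()))
```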

Honestly, we don't really have a great framework for camera pose estimation, though. However, you can do pretty much any math you like onboard via ulab using ndarrays: you can convert images into ndarrays, run FFTs on the values, etc. So, any math you want to do onboard is possible.
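
As an illustration (not something from a stock example), here's roughly what the image-to-ndarray-to-FFT path could look like. The bytearray() view, the frame size, and the exact ulab FFT return type depend on your firmware build, so treat those details as assumptions.

```python
# Sketch only: helper names (bytearray(), frombuffer()) and the FFT return type
# depend on the firmware's ulab build -- treat the details as assumptions.
import sensor
from ulab import numpy as np

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.B64X64)  # small, power-of-two sized frame for the FFT
sensor.skip_frames(time=2000)

img = sensor.snapshot()

# View the grayscale frame buffer as a 2D uint8 ndarray.
a = np.frombuffer(img.bytearray(), dtype=np.uint8).reshape((img.height(), img.width()))

# Any ndarray math is now available onboard, e.g. simple statistics...
print("mean:", np.mean(a), "std:", np.std(a))

# ...or an FFT of one image row (ulab's FFT wants power-of-two lengths).
# Depending on how ulab was compiled, np.fft.fft() returns either a complex
# ndarray or a (real, imag) tuple of ndarrays.
row = np.array(a[0], dtype=np.float)
print(np.fft.fft(row))
```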