Hi all, I am trying to use my OpenMV camera to find out where my computer screen is relative to the camera for an eye tracking experiment. The setup I currently have is one OpenMV camera mounted on 3D-printed eyewear, facing forward towards the computer screen, and another camera facing the user’s eyes for infrared eye tracking (not the focus of this topic).
I had originally tried using blob tracking to determine the corners/edges of the screen (the screen is just set to pure white for now to provide strong contrast against the background), but min_corners did not accurately give me the 4 corners, so I’m now using infinite line detection and computing the intersection points to get the corners of the screen (sometimes it’s off by a few pixels, but it’s the most accurate solution I’ve got for now, though I’m open to more suggestions; a rough sketch of that step is below). I’ve also experimented with AprilTag camera pose estimation, which looked promising at first, but with my current setup the tags are too small to be detected at such a distance (roughly more than 50 cm from the screen) and at such a low resolution (limited by the memory buffer), and it would be impractical for me to print massive AprilTags to place around the screen.
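For reference, here’s a sketch of the corner-from-lines step I described, assuming the lines come from img.find_lines() in rho/theta form (theta in degrees); the thresholds and line filtering are omitted:

```python
import math

def intersect(rho1, theta1_deg, rho2, theta2_deg):
    # Each infinite line satisfies x*cos(theta) + y*sin(theta) = rho.
    # Solve the 2x2 linear system for the intersection point.
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    det = math.cos(t1) * math.sin(t2) - math.sin(t1) * math.cos(t2)
    if abs(det) < 1e-6:
        return None  # lines are (nearly) parallel, no usable intersection
    x = (rho1 * math.sin(t2) - rho2 * math.sin(t1)) / det
    y = (rho2 * math.cos(t1) - rho1 * math.cos(t2)) / det
    return (x, y)

# e.g. corner = intersect(top.rho(), top.theta(), left.rho(), left.theta())
```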
Anyways, back to the problem: given that I have the 4 corners of the computer screen in pixel coordinates, and I can get the focal length and field of view of the lens from the store page, how do I find the position and rotation of the camera relative to the screen, knowing the physical size of the screen? It seems clear that all these factors are related, but I just can’t seem to piece it together. Are there any built-in functions for this? Ideally, once I can get the position and rotation of the camera relative to the screen, I hope to combine this with my other OpenMV camera facing the eye and determine which point of the screen the user is looking at. Any help on this matter is greatly appreciated.
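In case it helps clarify what I’m after, here’s roughly what I imagine the solution would look like if I shipped the corner pixels off to a host PC and used OpenCV’s solvePnP (just a sketch; the screen size, FOV, resolution and corner values below are placeholders, not my real numbers):

```python
import numpy as np
import cv2

# Known physical screen size in mm (placeholder values).
SCREEN_W_MM = 527.0
SCREEN_H_MM = 296.0

# 3D coordinates of the screen corners in the screen's own frame
# (origin at the top-left corner, x right, y down, z out of the screen).
object_points = np.array([
    [0.0,         0.0,         0.0],
    [SCREEN_W_MM, 0.0,         0.0],
    [SCREEN_W_MM, SCREEN_H_MM, 0.0],
    [0.0,         SCREEN_H_MM, 0.0],
], dtype=np.float64)

# Corner pixel coordinates from the OpenMV camera, in the same order
# (top-left, top-right, bottom-right, bottom-left). Example numbers only.
image_points = np.array([
    [102.0,  75.0],
    [255.0,  80.0],
    [250.0, 170.0],
    [ 98.0, 165.0],
], dtype=np.float64)

# Camera intrinsics: focal length in pixels can be derived from the FOV as
#   fx = (image_width / 2) / tan(horizontal_FOV / 2)
img_w, img_h = 320, 240           # QVGA, adjust to the actual resolution
fov_h_deg = 70.0                  # placeholder HFOV from the lens spec
fx = (img_w / 2) / np.tan(np.radians(fov_h_deg) / 2)
fy = fx                           # square pixels assumed
camera_matrix = np.array([
    [fx,  0.0, img_w / 2],
    [0.0, fy,  img_h / 2],
    [0.0, 0.0, 1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)         # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_IPPE)  # planar-target solver
if ok:
    R, _ = cv2.Rodrigues(rvec)    # rotation: screen frame -> camera frame
    # Camera position expressed in the screen's coordinate frame:
    cam_pos_in_screen = -R.T @ tvec
    print("camera position (mm):", cam_pos_in_screen.ravel())
```

Is there something equivalent built into the OpenMV firmware, or is running the pose solve off-camera like this the usual approach?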