My application involves inspecting a mostly white object that is placed in a black jig for inspection.
There may or may not be an object in the jig at power up.
I am currently setting the exposure, gain, and white balance but I still see variance.
My test is to check the average brightness of a small white patch that is always in the field of view.
If the object is not present at power up, the field is very dark and the patch ends up at about 150.
When the object is present at power up, the patch ends up at about 120.
What am I missing?
import sensor

sensor.reset()                           # Reset and initialize the sensor.
sensor.set_pixformat(sensor.GRAYSCALE)   # Grayscale is enough for a brightness check.
sensor.set_framesize(sensor.VGA)         # 640x480, to match the windowing below.
sensor.skip_frames(20)                   # Wait for settings to take effect.
sensor.set_auto_gain(False)              # Lock the gain.
sensor.set_auto_exposure(False)          # Lock the exposure.
sensor.set_auto_whitebal(False, value=(50, 50, 50))
sensor.set_windowing((0, 250, 640, 50))  # Only the strip containing the patch.
Turn auto gain and auto white balance off after startup. As for the variance… welcome to color tracking. Settings will always move around, and the camera never produces a perfectly consistent output. It’s best to use gradients (relative differences within the frame) rather than absolute brightness to decide whether something is in the image.
Can you look at the edges of the image first to get an idea of the background brightness and then compare that with the middle? Maybe split the image up into 9 parts (a 3x3 grid of boxes), measure each box, and check whether the middle is consistent with the rest?
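A rough sketch of that 3x3 idea in plain Python (the helper names and the `margin` threshold are made up for illustration; on the camera you would call img.get_statistics() with a roi per box instead of looping over pixels):

```python
def grid_means(img, rows=3, cols=3):
    """Mean brightness of each cell in a rows x cols grid.

    img is a list of pixel rows (grayscale values 0-255).
    """
    h, w = len(img), len(img[0])
    means = []
    for r in range(rows):
        row = []
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            pixels = [img[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(pixels) / len(pixels))
        means.append(row)
    return means

def center_stands_out(img, margin=30):
    """True if the middle cell differs from the average of the eight
    border cells by more than `margin` brightness levels."""
    m = grid_means(img)
    border = [m[r][c] for r in range(3) for c in range(3) if (r, c) != (1, 1)]
    return abs(m[1][1] - sum(border) / len(border)) > margin
```

Because the decision compares the middle of the frame to its own border, it is largely immune to global exposure and gain shifts like the one you are seeing.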
As for checking the image… call img.get_statistics(). This method will average the pixel values in the image (or in a region of interest you pass it) for you. You can find the method documented here: image — machine vision — MicroPython 1.15 documentation.
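For reference, here is a plain-Python equivalent of the mean that get_statistics() reports over a region of interest (the roi tuple below is a placeholder for wherever your white patch actually sits):

```python
def roi_mean(img, roi):
    """Average brightness over roi = (x, y, w, h), like the mean that
    img.get_statistics(roi=...) reports for a grayscale image.

    img is a list of pixel rows (grayscale values 0-255)."""
    x, y, w, h = roi
    total = sum(img[yy][xx] for yy in range(y, y + h) for xx in range(x, x + w))
    return total / (w * h)
```

You can then compare the patch's mean against the mean of a reference region in the same frame instead of against a fixed number.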
Thanks for the quick response.
What is the mechanism at play here? Is it just not possible to get to the necessary registers? The effect I’m seeing isn’t drift or anything like that. I can repeat it with no problems. It seems like some kind of auto setup that doesn’t get overwritten by the values I’m setting.