Image Quality Issue with Sensor.Ioctl

I want to use sensor.ioctl to read out only the top-left QVGA window, with a target resolution of QVGA (no resizing). In other words, I want to capture just the top-left 320x240 pixels of the sensor at very high FPS. I’m using the following code:

import sensor, image
sensor.reset()                      # Reset and initialize the sensor.
sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565.
sensor.set_framesize(sensor.QVGA)   # Set frame size to QVGA (320x240).
sensor.ioctl(sensor.IOCTL_SET_READOUT_WINDOW, (0,0,320,240)) # Read out only the top-left 320x240 pixels.
sensor.skip_frames(time = 2000)     # Wait for the settings to take effect.

while(True):
    img = sensor.snapshot()         # Take a picture.

While the FPS is very high, the image quality is extremely poor. The image is extremely dark and somewhat blurry. See the example sensor snapshots below:
[Attachments: Screen Shot 2020-12-14 at 12.44.31 AM.png, 1.png]
However, when I remove the “sensor.ioctl(sensor.IOCTL_SET_READOUT_WINDOW, (0,0,320,240))” line from my code, the image looks fine. See the example sensor snapshots below:
[Attachments: Screen Shot 2020-12-14 at 12.44.56 AM.png, 2.png]
The images were taken under identical lighting conditions of the same objects (image 1 is of a laptop, image 2 is of scissors). Why does using sensor.ioctl cause the image to be so dark?

Hi, it’s because the sensor just speeds up and doesn’t adjust its exposure.

If you increase the FPS like that, expect the image quality to be terrible.

Just for reference, the exposure is generally around 10 ms. Once you go below that… expect the image quality to degrade dramatically. You need very bright lighting to handle the lower exposure. You will also need to increase the sensor gain.
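
For example, a minimal sketch of forcing a high fixed gain after setting the readout window (the 24 dB value is only a guess, tune it for your lighting):

import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.ioctl(sensor.IOCTL_SET_READOUT_WINDOW, (0, 0, 320, 240))
# Disable auto gain and force a high fixed gain to compensate for the
# short exposure. 24 dB is a placeholder, not a recommended value.
sensor.set_auto_gain(False, gain_db=24)
sensor.skip_frames(time=2000)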

Note that the maximum exposure is based on the frame rate the system runs at. When you set the readout window to a small region like that, the exposure is no longer allowed to be long. The sensor doesn’t really have a target frame rate; it all comes down to how many pixels it has to read out, and the exposure is just a number of pixel clocks it is supposed to expose each frame for. When you make the readout window smaller, the FPS naturally goes way up and the exposure gets capped at the frame readout time.

If you’ve pushed the FPS to 500+ when the normal FPS is around 50, then you’ve reduced the maximum exposure by 10x.
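
As a rough back-of-the-envelope check (approximating the maximum exposure as the frame time, 1/FPS):

# Approximate exposure headroom at different frame rates.
normal_fps = 50
fast_fps = 500
print(1000 / normal_fps)   # ~20 ms available per frame at 50 FPS
print(1000 / fast_fps)     # ~2 ms available per frame at 500 FPS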

Setting the readout window is useful for tracking very bright objects, like an LED, etc.

Would I be able to fix the issue by manually setting the exposure time? I’ve tried the following function but haven’t managed to improve the quality.

sensor.set_auto_exposure(enable[, exposure_us])
enable turns auto exposure control on (True) or off (False). The camera will startup with auto exposure control on.

If enable is False you may set a fixed exposure time in microseconds with exposure_us
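
Roughly what I tried looks like this (the 20000 µs value is just one of the values I experimented with):

import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.ioctl(sensor.IOCTL_SET_READOUT_WINDOW, (0, 0, 320, 240))
# Turn off auto exposure and request a fixed exposure time in microseconds.
sensor.set_auto_exposure(False, exposure_us=20000)
sensor.skip_frames(time=2000)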

No… as mentioned, the exposure is capped based on the frame readout time, which goes down when you reduce the readout resolution.

To be clear: you don’t need to shrink the readout window all the way down to the top-left 320x240… that speeds up the camera readout by about 65x. You may need to play with the readout size and determine what the proper rectangle size should be for you.

Here’s what the camera does internally:

Readout Window → Image Scaler to Frame Size → Output from camera to STM32 → STM32 does cropping.

You can control the readout window, the frame size, and the STM32 cropping. Our driver will force the readout window to be at least as big as the frame size, but you can make the readout window larger than the frame size. This slows the camera data down and increases the exposure. You should try to find the settings that give you the frame rate you need and hit the exposure you need. It’s a tradeoff.
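
For example, something along these lines, where the 640x480 readout window and the crop rectangle are just starting points to experiment with:

import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
# Readout window larger than the frame size: the camera scales the 640x480
# region down to QVGA. The larger readout slows the data rate and leaves
# more room for exposure.
sensor.ioctl(sensor.IOCTL_SET_READOUT_WINDOW, (0, 0, 640, 480))
# Optional: crop the QVGA frame further on the STM32 side.
sensor.set_windowing((80, 60, 160, 120))
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()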