RT1062 sync flash with capturing image

Hi. I have quite a specific problem. I am trying to sync a flash with a snapshot, where the flash should be on for a maximum of 10 to 15 ms. I have been able to do this for resolutions up to HD by triggering the OV5640's FREX function and then calling snapshot() after the END OF FRAME interrupt flag has been set to 1. I could have just used sensor.get_frame_available(), but that only ever returns False when a single frame buffer is in use, which is my case since I'm trying to make this work at WQXGA2. My code is below.

import sensor
import time
from machine import Pin, Timer

EOFADDRESS = 0x3F0C
pin8 = Pin(8, Pin.OUT)
def check_eof_status():
    # Read the EOF status register
    eof_status = sensor.__read_reg(EOFADDRESS)
    eof_bit = (eof_status >> 5) & 1  # Extract bit 5 (the EOF flag)
    return eof_bit
def reset_eof_status():
    eof_status = sensor.__read_reg(EOFADDRESS)
    eof_status |= (1 << 5)  # set bit 5 to clear the EOF flag
    sensor.__write_reg(EOFADDRESS, eof_status)

def led_func(t):
    pin8.value(0)


sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.WQXGA2)
sensor.__write_reg(0x3B07, 0x01) # mode 1
sensor.__write_reg(0x3B00, 0x81)  # FREX enable
sensor.skip_frames(time=2000)  # Wait for settings to take effect.

i = 1
timer = Timer(-1)
while i < 6:  # try taking 5 images with flash
    start = time.ticks_ms()
    print(check_eof_status())
    reset_eof_status()
    sensor.__write_reg(0x3B08, 0x01)  # Trigger FREX request
    pin8.value(1)
    timer.init(period=(10), mode=Timer.ONE_SHOT, callback=led_func)
    
    while sensor.__read_reg(0x3B08) == 1:
        pass
    
    endtrigger = time.ticks_ms()
    while check_eof_status() == 0:
        print("not available")
        time.sleep_ms(1)
    print(check_eof_status())
    img = sensor.snapshot()
    reset_eof_status()
    print("Time to trigger FREX: ", endtrigger-start, "ms")
    img.save("frex" + str(i) + ".jpg")
    print(i)
    i += 1
    time.sleep(1)
   

My problem is that at higher resolutions, snapshot() misses the timing. This is probably because only one frame buffer is available, so when the triggered image is being written to the buffer and I call snapshot(), that frame is skipped instead. I've tried calling snapshot() as soon as possible after triggering the camera, but unfortunately it doesn't help. Do you know of any solution to this?

Any help is appreciated 🙂

Hi, set the number of frame buffers to be larger than 4 to create a frame buffer FIFO.

When you call snapshot(), it accumulates frames coming out of the camera: snapshot() waits to capture the first frame and then keeps accumulating the ones that follow.

The frame you want, the properly exposed one, should be somewhere in the FIFO, which will be emptied per snapshot() call.
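To illustrate the idea, here's a toy pure-Python model of that behaviour (not OpenMV code; `FrameFifo`, `push_frame`, and the buffer count are made up for illustration): with several buffers, the flash-exposed frame stays in the FIFO until snapshot() reaches it, instead of being overwritten as in the single-buffer case.

```python
from collections import deque

class FrameFifo:
    """Toy model of a multi-buffer frame FIFO (illustration only)."""
    def __init__(self, buffers):
        self.fifo = deque(maxlen=buffers)  # oldest frame dropped only when full

    def push_frame(self, frame):
        self.fifo.append(frame)            # camera writes frames continuously

    def snapshot(self):
        return self.fifo.popleft()         # each call drains the oldest frame

fifo = FrameFifo(buffers=5)
for f in ["dark", "dark", "FLASH", "dark"]:   # flash fires on the third frame
    fifo.push_frame(f)

# The flash frame is still queued; repeated snapshot() calls reach it.
frames = [fifo.snapshot() for _ in range(3)]
assert frames[-1] == "FLASH"
```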

You may need to set the disable_full_flush() flag on the sensor driver so that when the FIFO fills up, it doesn’t flush all frames.
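A minimal sketch of that setup, assuming the OpenMV `sensor` module's `set_framebuffers()` and `disable_full_flush()` APIs (the buffer count shown and the largest frame size that actually fits will depend on available RAM):

```python
import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.HD)   # WQXGA2 may not fit with multiple buffers
sensor.set_framebuffers(5)        # more than 4 buffers creates a frame FIFO
sensor.disable_full_flush(True)   # keep queued frames when the FIFO fills up
sensor.skip_frames(time=2000)     # wait for settings to take effect
```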

Thanks for the quick reply. Unfortunately I can't set the frame buffer count higher than one (I also tried setting it higher than 4) because of the resolution.
I get:

RuntimeError: Frame buffer overflow, try reducing the frame size.

I'm wondering if it's worth adjusting the snapshot() code in the sensor.c driver to not skip the frame currently being written in single-buffer mode. I don't really care about frame pacing issues since I just want to take pictures.

Yeah, you’re going to need to edit the sensor.c to do what you want to do.

There's some code for frame exposure already in the driver, which is where you can put the trigger register write.

Basically, you have to start all the DMA hardware, etc., which the sensor driver normally does for you, before triggering the frame capture.

Alright, just one more question: when building, is the target "OPENMV_RT1060" correct for the RT1062?

Yes, as it's part of the RT1060x line of MCUs.