We have an application with an OpenMV camera and a WiFi shield that does vision analysis of a product on the line.
The app also streams video to an HMI screen.
After some time running, the image becomes very dark, as if the auto exposure weren't working properly.
In this situation, the image looks correct if we add an extra powerful light.
The application works in auto exposure mode (the default operation), and in our tests the light range covered is very good.
Can you guess what could be the problem?
I have tried the command sensor.set_auto_exposure(False, … but its behavior is not clear to me.
Do you have any idea about what can be happening?
Wait, you're saying that after a while the image gets dark?
Yes, that's it.
And in that situation, if I shine a powerful torch on the scene, the image recovers.
In tests with a simple program (helloworld), the image can be seen even in almost complete darkness.
It looks as if the working program becomes corrupted after running for some time.
Could there be some memory overlapping?
Hello again. This is a simplified version of the program:

```python
import sensor, image, time, sys, math, utime
import ubinascii, ure
from pyb import Pin
from pyb import I2C
from pyb import LED
import ujson

# SSID = 'MOVISTAR_7EBA'         # Network SSID
# KEY = '8E133E3D6F2E6992C3BE'   # Network key
# HOST = ''                      # Use first available interface
# PORT = 8080                    # Arbitrary non-privileged port

# Reset sensor
sensor.reset()

pin9 = Pin('P9', Pin.IN)
ledR = LED(1)  # red led
ledG = LED(2)  # green led
ledB = LED(3)  # blue led
ledR.off()
ledG.off()
ledB.off()
stledB = False
stledG = False
stledR = False
conta = 0
conta1 = 0
MINL = 0
MAXL = 0
MINA = 0
MAXA = 0
MINB = 0
MAXB = 0

# thresholds = [(19, 100, -24, 10, -18, 70)]  # (29, 100, -36, 4, 13, 39)
with open('Filtros.json') as file:
    data = ujson.load(file)
    # print('data: ', data)
    print('')
    MINL = data['MINL']
    MAXL = data['MAXL']
    MINA = data['MINA']
    MAXA = data['MAXA']
    MINB = data['MINB']
    MAXB = data['MAXB']

thresholds = [(MINL, MAXL, MINA, MAXA, MINB, MAXB)]

i2c = I2C(2, I2C.MASTER)  # The i2c bus must always be 2.

data = bytearray(9)
data[0] = 0x50
data[1] = 0
data[2] = 0
data[3] = 0
data[4] = 0
data[5] = 0
data[6] = 0

# ---------------------------
def EnviaI2C_ana(valor):
    data[0] = int(valor / 256)   # High byte
    data[1] = int(valor & 255)   # Low byte
    # print("Data0: ", data[0], "Data1: ", data[1])
    i2c.send(data, 0x60)
# ---------------------------

# ---------------------------
def EnviaI2C_bv(valor):
    data[0] = int(valor / 256)   # High byte
    data[1] = int(valor & 255)   # Low byte
    # print("Data0: ", data[0], "Data1: ", data[1])
    i2c.send(data, 0x60)
# ---------------------------

EnviaI2C_ana(4095)
utime.sleep_ms(500)
EnviaI2C_bv(930)
ledB.on()
stledB = True
utime.sleep_ms(500)

# -----------------------------------------------------------------------------
# NORMAL START
# Set sensor settings
# sensor.reset()
sensor.set_framesize(sensor.QVGA)
sensor.set_pixformat(sensor.RGB565)
sensor.set_auto_whitebal(False, [62.71649, 60.20714, 63.79094])
sensor.skip_frames(time=1000)
timeIni = utime.time()

while (True):
    conta1 = 0
    while (True):
        frame = sensor.snapshot()
        # frame.draw_rectangle(30, 25, 265, 190, color=(0,255,0), thickness=3, fill=False)
        frame.draw_rectangle(0, 0, 320, 25, color=(60,120,255), fill=True)
        frame.draw_rectangle(0, 215, 320, 25, color=(60,120,255), fill=True)
        frame.draw_rectangle(0, 25, 30, 190, color=(60,120,255), fill=True)
        frame.draw_rectangle(295, 25, 30, 190, color=(60,120,255), fill=True)
        lista_blobs = frame.find_blobs(thresholds, pixels_threshold=1500,
                                       area_threshold=1500, merge=True)
        if (len(lista_blobs) == 0):
            EnviaI2C_ana(int(4095))
        for blob in lista_blobs:
            tamano = math.sqrt(blob.pixels() / math.pi) * 2.4
            if blob.cx() > 130 and blob.cx() < 190 and blob.cy() > 90 and blob.cy() < 150:
                posok = True
            else:
                posok = False
            if tamano > 50 and posok == True:
                circulo = blob.enclosing_circle()
                centrox = blob.cx()
                centroy = blob.cy()
                radio = int(circulo[2] * 3 / 4)
                frame.draw_circle(centrox, centroy, radio, color=(0,255,0), thickness=2)
                frame.draw_keypoints([(blob.cx(), blob.cy(),
                                       int(math.degrees(blob.rotation())))],
                                     color=(255,0,0), size=40, thickness=2)
                # print(blob.area(), " ", "%.0f" % tamano, " ", blob.cx(), " ", blob.cy())
                diam = math.floor(tamano)
                frame.draw_string(5, 0, str(diam), color=(255, 255, 255), scale=3,
                                  mono_space=False, char_rotation=0,
                                  char_hmirror=False, char_vflip=False,
                                  string_rotation=0, string_hmirror=False,
                                  string_vflip=False)
                frame.draw_string(50, 4, "mm", color=(255,255,255), scale=2)
                analog = (tamano - 30) * (4096 / 220)
                if (analog > 4095):
                    analog = 4095
                # --------- dac.write(int(4096 - analog))
                EnviaI2C_ana(int(4096 - analog))
                print(blob.pixels(), " ", 4096 - analog)
            else:
                # --------- dac.write(4095)
                EnviaI2C_ana(int(4095))  # 3722 - 3V
        conta1 = conta1 + 1
        if conta1 == 15:
            ledG.off()
            stledG = False
            ledB.toggle()
            stledB = not stledB
            if stledB == True:
                EnviaI2C_bv(930)
            else:
                EnviaI2C_bv(0)
            conta1 = 0
```
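The EnviaI2C_* helpers split a 12-bit DAC value into a high byte and a low byte, and the blob branch maps the measured diameter onto that 12-bit range (analog = (tamano - 30) * (4096 / 220), clamped, then sent as 4096 - analog). That arithmetic can be checked off-camera; the sketch below is plain Python with hypothetical names (pack_dac_value, diameter_to_dac), not part of the program:

```python
# Pure-Python sketch of the value packing done by the EnviaI2C_* helpers
# and the diameter-to-DAC mapping in the blob loop. No hardware access;
# the function names are illustrative only.

def pack_dac_value(valor):
    """Split a 12-bit value (0..4095) into (high byte, low byte)."""
    return (valor >> 8) & 0xFF, valor & 0xFF

def diameter_to_dac(tamano, dmin=30, drange=220, full_scale=4096):
    """Map a diameter in mm onto the 12-bit range and clamp,
    mirroring: analog = (tamano - 30) * (4096 / 220)."""
    analog = (tamano - dmin) * (full_scale / drange)
    if analog > 4095:
        analog = 4095
    if analog < 0:
        analog = 0
    return int(full_scale - analog)  # the program sends 4096 - analog

print(pack_dac_value(930))    # -> (3, 162)
print(diameter_to_dac(30))    # -> 4096
print(diameter_to_dac(250))   # -> 1
```

Splitting these mappings out of the capture loop makes them testable on a PC, which helps separate arithmetic bugs from camera or bus issues.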
Can you recreate the bug in, like, 10 lines? I can't debug all of that for you.
I'm afraid I can't recreate the failure in 10 lines. In fact, the real program is larger than what I sent.
I'm doing tests with a Plus camera and it seems to work correctly for long periods. I'm trying to confirm this.
Can there be a memory problem?
Could be. But, I need you to narrow the issue down for me if you can.
We have seen that sometimes the camera hangs when taking the snapshot.
If we write:

```python
try:
    frame = sensor.snapshot()
except RuntimeError as e:
    ...
```

we get the message 'Frame capture has timed out', and the camera recovers.
From what we are checking, this does not happen with the Plus camera.
I didn’t receive any answer about my last question…
We are having two different issues with our cameras: one is the program hanging, the other is the apparent auto exposure failure.
We don’t know if they could be related.
On one of the cameras with the ‘hanging’ problem we programmed:
```python
try:
    frame = sensor.snapshot()
except RuntimeError as e:
    ...
```

With this, the camera auto-recovers.
We are not experiencing this problem with the program running on the H7 Plus cameras.
On the other hand, I have not seen try… used around the sensor.snapshot() function in any example.
What do you think?
Hi, did you see this thread? Issue with Upgraded Firmware - #6 by Aditya
Could the error be related?
Also, try turning off triple buffering with sensor.set_framebuffers(1) right before snapshot(). If this works then there's some issue with our DMA subsystem stopping.
I have tried sensor.set_framebuffers(1), and it keeps crashing.
The thread that you mention is for the Plus camera, and we have the problem with the H7 R1 camera.
In fact, in our tests the program seems to run well on the Plus camera.
How can I know if the Plus is needed in our application?
Yeah, the only thing I can think of is that the I2C bus is messing with the hardware timings. I can’t say we’ve tested that.
Can you comment out the I2C code and see if it works? And if it's still crashing, then the DAC and Pin code?
The issue would be that the DMA system we’ve setup is being disturbed by something. Can you try disabling functions that invoke things in the hardware until the issues stop? Then we know what the problem is.
Currently, frame capture on the H7 depends on the DMA2 and MDMA subsystems on the chip. There’s not really any interrupt load on the main CPU. Just one per frame. But, if we’re using DMA2 time for I2C that could cause an issue.
The sensor timeout error means that the CPU was waiting for a frame to complete and it never did. The camera is always producing frames. So, the issue is almost certainly the DMA subsystem dropped some data and got out of sync.
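The recovery pattern described here (catch the timeout RuntimeError, optionally reset, and retry the capture) can be factored into a small wrapper. This is a hedged sketch, not OpenMV API: capture_with_retry and on_timeout are made-up names; on the camera, capture would be sensor.snapshot and the hook could call sensor.reset(). A fake source stands in for the camera so the logic runs anywhere:

```python
# Sketch of the timeout-recovery pattern. capture_with_retry and
# on_timeout are illustrative names, not part of the OpenMV API.

def capture_with_retry(capture, retries=3, on_timeout=None):
    """Call capture(); on a RuntimeError such as 'Frame capture has
    timed out', run the optional recovery hook and try again."""
    last_err = None
    for _ in range(retries):
        try:
            return capture()
        except RuntimeError as e:
            last_err = e
            if on_timeout:
                on_timeout()  # e.g. sensor.reset() on the camera
    raise last_err

# Stand-in for a camera that times out on its first frame:
class FlakySource:
    def __init__(self):
        self.calls = 0
    def snapshot(self):
        self.calls += 1
        if self.calls == 1:
            raise RuntimeError("Frame capture has timed out")
        return "frame-%d" % self.calls

src = FlakySource()
print(capture_with_retry(src.snapshot))  # prints "frame-2"
```

Keeping the retry policy in one helper also makes it easy to log how often timeouts actually happen, which is useful when narrowing down whether the DMA desync correlates with other bus activity.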
We are checking the I2C issue; we don't have a test result yet. We'll let you know when we have one.
Regarding the auto exposure issue (we still don't know if it is related to the other crashes), we are trying to understand how the auto exposure works.
In our application we don’t program anything related to exposure, so we assume the camera works in auto mode, as it states in the documentation.
Anyway, we ran some tests with manual exposure and don't fully understand how it works.
If we write:

```python
sensor.set_auto_exposure(False, exposure_us = 10)
```

it seems that the camera reads the light at startup and then keeps that exposure.
So, if we point at a dark area when starting the program, the exposure becomes excessive when we then point at a bright area.
Conversely, if we point at a bright area when starting, the exposure settles at a point where the darker areas appear black.
Is this normal? Can you provide some info about how this works?
Could we monitor the state of the exposure to autocorrect it if it goes wrong?
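On that last question, one possible approach (a sketch, not an OpenMV API): read the mean brightness of each frame and nudge exposure_us toward a target with a small proportional step. On the camera, the mean could come from img.get_statistics().l_mean() and the new value would go back into sensor.set_auto_exposure(False, exposure_us=...); here the control step is isolated so it can be exercised with plain numbers:

```python
# Sketch of a closed-loop exposure correction step (hypothetical helper,
# not an OpenMV API). mean_l is the frame's mean L channel value (0..100);
# on the camera it could come from img.get_statistics().l_mean().

def adjust_exposure(exposure_us, mean_l, target_l=50,
                    gain=0.02, min_us=50, max_us=30000):
    """Proportional step: lengthen exposure when the frame reads dark,
    shorten it when the frame reads bright, clamped to a sane range."""
    error = target_l - mean_l          # positive -> frame too dark
    new_us = exposure_us * (1.0 + gain * error)
    return round(min(max(new_us, min_us), max_us))

us = 1000
us = adjust_exposure(us, mean_l=20)   # dark frame -> longer exposure
us = adjust_exposure(us, mean_l=80)   # bright frame -> shorter exposure
print(us)
```

Run every few frames, this would keep the image from drifting dark even if the sensor's own auto exposure misbehaves; the gain and target values here are assumptions to tune against the real lighting.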
Did you try sensor.skip_frames() with at least 2000 ms?
In addition, try skip_frames() right after set_pixformat() and before the white balance and exposure settings. Try manually zeroing all the others.
Also, try using sensor.set_framebuffers(1) outside the while() loop and before skip_frames().
```python
sensor.set_framesize(sensor.QVGA)
sensor.set_pixformat(sensor.RGB565)
sensor.set_framebuffers(1)
sensor.skip_frames(time=2000)
sensor.set_auto_whitebal(False, [62.71649, 60.20714, 63.79094])
sensor.set_auto_exposure(False, exposure_us=500)
sensor.set_contrast(0)
sensor.set_saturation(0)
sensor.set_brightness(0)
sensor.set_auto_gain(False, gain_db=0)
```