Frame capture randomly timing out

I believe this topic has been raised many times before, but after reading through the existing forum posts it seems there still isn’t a solution. My particular problem is that frame capture times out after a random period of time, anywhere from a few seconds to a few minutes. All the script does is blob tracking and a few basic math calculations.
[screenshot of the frame capture timeout error]

Now here’s a question of my own: once I save the script to the OpenMV board as main.py and power it without the IDE running (hence no streaming to the IDE), if it runs into a frame capture issue, does it automatically restart, or does it just stay stuck?

I’m using an OpenMV H7 R2.

Hi, unless you handle the exception, it would stop running.

So, you’d wrap a try/except around your capture loop. That said, you generally shouldn’t hit this issue.
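
For a headless deployment, a minimal sketch of that wrapper could look like the following. The grayscale/QVGA settings are placeholders rather than your script’s actual configuration, and the broad except is deliberate since I’m only assuming the timeout surfaces as an ordinary exception:

```python
import sensor

def init_camera():
    # Placeholder settings; use whatever your script already configures.
    sensor.reset()
    sensor.set_pixformat(sensor.GRAYSCALE)
    sensor.set_framesize(sensor.QVGA)
    sensor.skip_frames(time=2000)

init_camera()

while True:
    try:
        img = sensor.snapshot()
        # ... blob tracking and math go here ...
    except Exception as e:
        # A frame capture timeout lands here instead of killing the
        # script; reinitialize the sensor and keep going.
        print("capture failed:", e)
        init_camera()
```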

So, you’re not doing anything with interrupts that would delay the system timing? If not, then the only thing I can think of is that it’s double-buffer mode.

Do:

```python
sensor.set_framebuffers(1)  # or sensor.set_framebuffers(3)
```

after setting your resolution. You may not be able to fit 3 frame buffers on the H7 R2, so you might only be able to use 1.
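
To be concrete, here’s a minimal sketch of where that call would sit; the pixel format and resolution are just example values:

```python
import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)  # example pixel format
sensor.set_framesize(sensor.QVGA)       # example resolution
sensor.set_framebuffers(3)              # or 1 if 3 buffers don't fit in RAM
sensor.skip_frames(time=2000)
```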

Please let me know if this fixes the issue. I have a feeling there’s a weird race condition in double buffer mode where you can be in this loop waiting for a frame:

openmv/src/omv/ports/stm32/sensor.c at master · openmv/openmv (github.com)

Nope, all I’ve done is blob tracking and a bit of math.

I tried setting the frame buffers to 3 and let it run for about 10 minutes. So far it hasn’t timed out, so I’m assuming this works and it won’t time out even if I let it run longer.

Surprisingly, fitting 3 buffers was not an issue; there were no error messages and everything worked like a charm.

I’m not exactly sure what setting the frame buffers does, or what a frame buffer even is, but increasing the count seems to work. Are there any tradeoffs or disadvantages to increasing the number of frame buffers, though?

The system tries to keep 1/2 of RAM available for the default frame buffer layout. So, it won’t choose 3 buffers if that is violated. That said, this is a heuristic that has nothing to do with how much RAM the algorithm you are running needs.

Sounds like you solved the problem.

One more question. Can you do:

```python
print(sensor.get_framebuffers())
```

And let me know the default number of buffers before you forced it? If it’s 2 then I think I need to investigate and fix a bug.

Yep, setting frame buffers to 3 seems to have solved the issue.

Commenting out the line that sets the frame buffers and printing sensor.get_framebuffers() returns a value of 1. I’m not sure whether grayscale in particular has anything to do with causing the issue as well.
[screenshot of the script and the printed buffer count]

Call it after setting the pixformat. It’s only valid after pixformat and framesize have been set.

Note, you should also set the number of frame buffers only after setting the pixformat and frame size; those calls overwrite the value you set when they run.
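
In other words, something like this, assuming those calls reset the buffer count as described:

```python
import sensor

sensor.reset()
sensor.set_framebuffers(3)              # too early: overwritten below
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
print(sensor.get_framebuffers())        # back to the default count

sensor.set_framebuffers(3)              # correct: after pixformat and framesize
print(sensor.get_framebuffers())        # should now report 3 (if it fits in RAM)
```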

That’s strange; it worked when I set the buffers before set_pixformat. Anyway, I printed get_framebuffers() after sensor.set_pixformat and it still returned a value of 1.

Huh, weird.

I expected that to return a value of 2.

I’ll follow up on this myself.

Thanks,

No problem, glad I could help find this issue. Let me know if you need my code to reproduce the error message again (although it is kinda random).

Yeah, I know it exists now. I’ll make a bug tracker.
