any H7 updates?

M7 is a great product and my son and I have done some cool projects with it, EyeDriving with machine vision autonomously following AprilTag - YouTube

Wondering how H7 is going and if we are likely to get more AprilTag recognition distance from it?

We’re still working on getting the FLIR sensor operational, but otherwise it’s ready.

See this script for more AprilTag distance:

https://github.com/openmv/openmv/blob/master/scripts/examples/16-Codes/find_small_apriltags.py

Thanks. I saw your earlier post about that method and I plan to try it. I am still interested to know how the H7 performs without the two-step method.

The FPS doubles. However, ST allocated the RAM on the H7 oddly, so we only gain an extra 128 KB in the frame buffer. At 160x120 grayscale we basically have to store 8 copies of the image in RAM, plus the default image.

So, 160x120*(8+1) = 172800 bytes, and the rest of the memory is used for a large temporary heap. For QVGA, however, we’d need 691200 bytes… and then still space for a heap. So, we can’t give you double the distance, since we only went from 384 KB of frame buffer to 512 KB of frame buffer (even though there’s 1 MB of RAM on the chip, ST did not make it contiguous, for some unknown reason).
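To make the arithmetic above concrete, here’s a quick back-of-the-envelope check in plain Python (the 8-copies figure comes from the post itself; the helper name is just for illustration):

```python
# Grayscale AprilTag detection needs ~8 working copies of the image in RAM,
# plus the live image itself (1 byte per pixel for grayscale).
def apriltag_fb_bytes(w, h, copies=8):
    return w * h * (copies + 1)

print(apriltag_fb_bytes(160, 120))  # 172800 bytes - fits in 512 KB with heap to spare
print(apriltag_fb_bytes(320, 240))  # 691200 bytes - already exceeds the 512 KB frame buffer
```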

That said, you can just test multiple ROIs on the image at the higher speed to deal with farther distances. For example, while you can only operate at 160x120, you can down-sample the full image to 160x120 and test that for tags, and if none are found you can slide a 160x120 window around the full-resolution image (not at every point, though).
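A minimal sketch of the sliding-window idea: generate a coarse grid of 160x120 ROIs over a larger frame (QVGA here), stepping by half a window so you don’t test every point. The helper name is my own; on the camera you would pass each tuple to `img.find_apriltags(roi=roi)`:

```python
# Generate overlapping 160x120 windows over an img_w x img_h frame,
# stepping by half a window in each direction.
def window_rois(img_w, img_h, win_w=160, win_h=120):
    rois = []
    for y in range(0, img_h - win_h + 1, win_h // 2):
        for x in range(0, img_w - win_w + 1, win_w // 2):
            rois.append((x, y, win_w, win_h))
    return rois

for roi in window_rois(320, 240):
    print(roi)  # 9 windows: (0, 0, 160, 120), (80, 0, 160, 120), ...
```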

From the video… was your son holding the tag in the white border? That ruins the detection. You need to keep things out of the white border area. You should be able to see the tag you were looking at from over 8 ft away.

Thanks, and yes, he probably was holding it in the border. How much border do we typically need? I think we are getting around 5 feet with him holding it at the border. We plan to put the tag on a shirt so it doesn’t have to be held. It will be interesting to see how well that works.

A little over a year ago we created a device that allows me to drive the powerchair using my eye gaze system. But the sun is a problem for IR outdoors so we got the idea to follow a caregiver with an AprilTag. It works pretty well, still fine tuning and a little more distance would help.

The border should be the same width as the internal black border.

I can update the tag generator in the IDE to make this more clear. Maybe add another black border so you won’t put your hands in the white one.

In the meantime, embed the images the IDE generates inside another picture so the border is obvious.

Printing it on a shirt is a great idea. That should work really well. And since you get 6DoF info you technically could make the car align itself to always be perfectly behind the caregiver too.

Thanks.

Center X is mainly what we need to send appropriate driving commands to our Arduino system. We are also playing with the Z distance, but we mainly rely on an ultrasonic proximity sensor to make sure I don’t run over the caregiver. Nice to have options. AprilTags are very cool.
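For anyone following along, the center-X-to-steering step can be sketched in a few lines. Everything here is illustrative (the image width, deadband, and command strings are assumptions, not the poster’s actual protocol); on the camera, `cx` would come from `tag.cx()` and the command would go out over the UART to the Arduino:

```python
IMG_W = 160  # assumed capture width

# Turn the tag's horizontal position into a coarse steering command.
def steer_from_cx(cx, deadband=10):
    error = cx - IMG_W // 2   # negative = tag is left of center
    if abs(error) <= deadband:
        return "forward"
    return "left" if error < 0 else "right"

print(steer_from_cx(80))    # forward (tag centered)
print(steer_from_cx(30))    # left
print(steer_from_cx(140))   # right
```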

One hopefully last question: does UART1 work? We use UART3 to the Arduino and it works fine. We initially tried to add UART1 for the proximity sensor, but every time we tried to read UART1 the M7 reset. I am pretty sure the sensor was sending a TTL-level signal. We eventually switched to an analog signal on the pin 6 ADC, which works fine. Mainly just curious on this one.

Hi, if you have an example script that fails with UART1, please post the minimal code necessary to reproduce the issue and create a GitHub issue ticket for it, and we’ll fix it.

I will find the code we were using and try to do that. Thanks.

I didn’t save the UART1 code. As I think about it, there is a possibility that I made a mistake with my use of UART.any(), and I had a watchdog in place, so that might have caused the resets. If I encounter the problem again I will file it as a bug.