Car tracking project

Hello,

I’m currently working on a project for which I’m hoping the OpenMV Cam M7 and IDE can be the answer.

The project is supposed to track the movements of a car in a designated area (120-degree angle, ~30 m radius) and point a servomotor in the direction of the moving car.
I guess the template matching function would be ideal for detecting the car, but I’m having a hard time working it out as I’m completely new to the IDE/Python (I do have a bit of C/Arduino experience). Is it possible to realize this project with the M7 cam?

Currently I’m stuck on getting the template matching example code to work, as it can’t find my template (which is an 8 kB JPG file on an SD card in the M7).

template = image.Image("/template1.pgm", copy_to_fb=True)

Is there something I’m missing here?

Thanks in advance.

Hi, as you can see in the template matching code you posted, it wants to open a PGM file, which is not a JPG file. PGM is an uncompressed grayscale image format. You can generate one with GIMP or Photoshop; GIMP is free.
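
Once you’ve saved the template as a small grayscale PGM, the matching loop is roughly this (based on the template matching example that ships with the IDE; the 0.70 threshold and step=4 are just starting points to tune):

# Minimal template-matching sketch for the OpenMV Cam M7.
# Assumes "/template1.pgm" is a small grayscale PGM on the SD card.
import sensor, image, time
from image import SEARCH_EX

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)   # template matching runs on grayscale images
sensor.set_framesize(sensor.QQVGA)       # keep the frame small so matching stays fast
sensor.skip_frames(time=2000)

template = image.Image("/template1.pgm")

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    # Exhaustive search; returns a bounding box tuple or None.
    r = img.find_template(template, 0.70, step=4, search=SEARCH_EX)
    if r:
        img.draw_rectangle(r)
    print(clock.fps())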

As for template matching to find the car… well, it can work if you have a template per car location. However, it’s going to be really hard unless you control the variables. Template matching is good for process-control-type applications where the camera is in a fixed position relative to the object you are looking at and you are just trying to find the x/y coordinates of the object on a flat plane. It doesn’t handle rotation at all, so you’d need to match against multiple rotated templates.
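
For illustration only, brute-forcing the rotation would look something like this (the file names are hypothetical; each would be the same template rotated and re-saved as a PGM):

# Sketch: match against several pre-rotated copies of the same template.
import sensor, image
from image import SEARCH_EX

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

# Hypothetical file names; one copy of the template per orientation.
templates = [image.Image(p) for p in
             ("/car_0.pgm", "/car_45.pgm", "/car_90.pgm", "/car_135.pgm")]

while True:
    img = sensor.snapshot()
    for t in templates:
        r = img.find_template(t, 0.65, step=4, search=SEARCH_EX)
        if r:
            img.draw_rectangle(r)
            break  # stop at the first orientation that matches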

Far easier is to just use our AprilTag feature and put one on the car. This solves your problem with an example script and you can get the rotation from the AprilTag trivially.
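
For reference, the tag tracking loop is along these lines (a trimmed-down sketch of the built-in AprilTag example; tag.rotation() gives the orientation in radians):

# Sketch: AprilTag tracking on the OpenMV Cam.
import sensor, image, math

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)      # AprilTags need a small resolution on the M7
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    for tag in img.find_apriltags():    # defaults to the TAG36H11 family
        img.draw_rectangle(tag.rect())
        img.draw_cross(tag.cx(), tag.cy())
        print("id %d, rotation %.1f deg" % (tag.id(), math.degrees(tag.rotation())))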

That said, the scale you are looking at is huge. 30 m is a very large area. The camera can’t see that much area with its resolution. Can you provide some pictures of what you want to do? You will most likely need to buy some rather expensive cameras, compute power, and more to do what you want given that operational area size.

Hello Kwagyeman, thanks for the feedback.

I made a quick slide show to demonstrate what I’m trying to accomplish. For various reasons I can’t give you the background information, except for the basic concept.

All images are shown in top-down view.
I figured the distance might become a problem, although it’s worth noting that all the cars are the same model and basically truck/van sized. Also, 30 meters is more of an indication; if the detection range is 20-25 m it will be adequate. The device will be placed on a flat surface with the Cam M7 standing on a 1-meter-high tripod, so there won’t be any tilt movement. It should ‘just’ be able to follow a car driving in a straight line from point A to B. I don’t think the AprilTag feature is going to help; manually placing tags on passing cars is exactly the kind of work we’re trying to get rid of.

That being said, is template matching still not a viable option? Or does the Cam M7 have features that are more suitable for this task?

Thank you in advance.

Ah, okay, so the system can do what you want, but you need an algorithm I haven’t implemented yet. Basically, you want to use the CAMShift algorithm:

https://docs.opencv.org/3.1.0/db/df8/tutorial_py_meanshift.html

I haven’t coded this algorithm for the system yet, however. It comes with OpenCV if you want to use an SBC Linux computer. If you’re good with C code and want to write the method, it’s quite easy to implement.
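
For what it’s worth, the OpenCV version from that tutorial boils down to roughly this on a Linux SBC (a sketch only; the initial window around the car is hard-coded below and would really come from a detection step or a manual selection in the first frame):

# Sketch of CAMShift tracking with OpenCV on a Linux SBC, following the linked tutorial.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()

# Hypothetical initial window around the car: x, y, width, height.
x, y, w, h = 300, 200, 100, 50
track_window = (x, y, w, h)

# Build a hue histogram of the region of interest to track.
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    dst = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # CAMShift returns a rotated rectangle plus the updated search window.
    ret, track_window = cv2.CamShift(dst, track_window, term_crit)
    pts = cv2.boxPoints(ret).astype(np.int32)
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow("camshift", frame)
    if cv2.waitKey(30) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()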

Anyway, given the features we have onboard right now… only find_blobs() is going to work the way you need. Is the background a particular color? If so, you can have find_blobs() track the opposite of that background color to track all objects that aren’t the background.
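
A rough sketch of that idea (the LAB threshold below is a placeholder; you’d tune it with the Threshold Editor in the IDE so it covers the background):

# Sketch: track everything that is NOT the background color with find_blobs().
import sensor, image

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)
sensor.set_auto_gain(False)       # color tracking works better with auto gain off
sensor.set_auto_whitebal(False)   # and with auto white balance off

# Placeholder LAB range covering the background color; tune this in the IDE.
BACKGROUND = [(30, 70, -20, 20, -20, 20)]

while True:
    img = sensor.snapshot()
    # invert=True keeps pixels OUTSIDE the threshold, i.e. everything but the background.
    for blob in img.find_blobs(BACKGROUND, invert=True,
                               pixels_threshold=200, area_threshold=200, merge=True):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())

The blob centroid (cx/cy) is then what you’d map to the servo angle to point at the car.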