Python script to output MAVLink

Is the new MAVLink message from the ZED work going to help with any of this?

Please note that I will resume testing once I receive a 2-axis gimbal, and I am building a new Q330 for indoor testing. Should report back in a couple of days.

Here we go…Another rainy weekend in the ThunderDrone

So I finally had a little time to play with OpenMV, and I added MAVLink output to the color tracking script.

Pretty basic actually; here is the mod:

img = sensor.snapshot()
for blob in img.find_blobs([thresholds[threshold_index]], pixels_threshold=100, area_threshold=20, merge=True):
    send_landing_target_packet(blob, img.width(), img.height())
    print("TRACK %f %f" % (blob.cx(), blob.cy()))

and modified the MAVLink message:

def send_landing_target_packet(blob, w, h):
    global packet_sequence
    temp = struct.pack("<qfffffbb",
                       (((blob.cx() / w) - 0.5) * h_fov) / 2.25,
                       (((blob.cy() / h) - 0.5) * v_fov) / 1.65,
                       0, # int(z_to_mm(tag.z_translation(), tag_size) / -10),
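The quoted code is truncated, so here is a hedged, self-contained sketch of what the full "<qfffffbb" payload might look like. The field order follows the MAVLink LANDING_TARGET message; the frame constant and the zeroed fields are my assumptions, not Patrick's actual values:

```python
import struct

# Sketch of a complete LANDING_TARGET payload pack (the forum quote above
# is cut off).  Field layout follows MAVLink LANDING_TARGET:
#   time_usec (q), angle_x (f), angle_y (f), distance (f),
#   size_x (f), size_y (f), target_num (b), frame (b)
def pack_landing_target(angle_x, angle_y, distance=0.0):
    return struct.pack("<qfffffbb",
                       0,          # time_usec: int64, left 0 in this sketch
                       angle_x,    # float: radians off centre, X
                       angle_y,    # float: radians off centre, Y
                       distance,   # float: metres to target (0 = unknown)
                       0.0, 0.0,   # size_x, size_y: unused here
                       0,          # target_num
                       8)          # frame: assumed MAV_FRAME_BODY_NED

payload = pack_landing_target(0.1, -0.05)
```

The payload is always 30 bytes (8 + 5×4 + 2×1), which is the fixed LANDING_TARGET payload size.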


Actually, this behaves very correctly; it's like a little dog following the red ball :slight_smile:
The tracking is smooth and it just stops at the last tracked position if the signal is lost.
We need to implement this behavior with the tag, because the tag tracking signal shoots an extreme vector when it loses the signal (occlusion, out of window, too far, etc.), making the quad zoom like crazy.
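The "stop at the last tracked position" behaviour can be sketched as a tiny wrapper around the blob detection. All names here are illustrative (mine, not from Patrick's script):

```python
# Sketch of the "hold last good target" behaviour: when no blob is
# detected, keep reporting the last good position instead of emitting
# a wild vector.  Names are illustrative, not from the actual script.
last_target = None

def update_target(blobs):
    """Return the blob to track, falling back to the last good one."""
    global last_target
    if blobs:                 # signal present: track the current blob
        last_target = blobs[0]
    return last_target        # signal lost: hold the old position
```

Usage: a detection followed by an empty frame keeps returning the old target, which is exactly the behaviour the tag script would need to avoid the "zooming".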

This is what the signals and environment look like:

Great work Patrick! Any idea what the main difference is in the code between the color blob tracking and the AprilTag tracking that causes the "zooming"?

What is that graph on the left? What did you plot? Was it distance from the target?
What tool did you use to scan the MAVLink and display them on the screen?


Thanks cglusky, I will test the tag tracking when I receive the gimbal.

ash27, the graph is the MAVLink landing target x-y signals generated by OpenMV and read from QGroundControl → Widget → Analyze; this is a great tool.
Distance is from a LIDAR-Lite v3 rangefinder. Bear in mind that I am flying indoors using PX4Flow (optical flow).

Please note that this post is now on DIY Drones thanks to Chris Anderson.

Can you explain to me what you are doing in the code?
I am using an Intel Aero, which has a built-in camera, and I am trying to use that capability.

What are these math operations doing?

(((blob.cx() / w) - 0.5) * h_fov) / 2.25,
(((blob.cy() / h) - 0.5) * v_fov) / 1.65,
0, # int(z_to_mm(tag.z_translation(), tag_size) / -10),

What is this value, "<qfffffbb"? I am using the dronekit-mavlink command, so I don't understand this.

temp = struct.pack("<qfffffbb",


… Intel Aero… Why does Intel have a drone…?

Anyway, how does your question relate to OpenMV? If the camera is already built into the drone you have no need for MAVLink.

To answer your question:

The first part is computing the movement vector, which has to be returned in radians (or something similarly odd) for the MAVLink protocol.
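A minimal sketch of that pixel-to-angle conversion, runnable without camera hardware. The FOV values here are illustrative assumptions, not the script's actual constants:

```python
import math

# Sketch of the pixel -> angle conversion: LANDING_TARGET wants the
# target offset as angles (radians) from the camera centreline.
H_FOV = math.radians(70.8)   # assumed horizontal field of view
V_FOV = math.radians(55.6)   # assumed vertical field of view

def pixels_to_angles(cx, cy, w, h):
    # (cx / w) - 0.5 maps the pixel into [-0.5, +0.5] around the image
    # centre; scaling by the FOV turns that fraction into an angle.
    angle_x = ((cx / w) - 0.5) * H_FOV
    angle_y = ((cy / h) - 0.5) * V_FOV
    return angle_x, angle_y
```

A blob at the image centre maps to (0, 0); a blob at the far right edge maps to half the horizontal FOV. The 2.25 and 1.65 divisors in Patrick's version are additional empirical gain tweaks on top of this.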

The second part is packing a struct using Python. Please read up on how to serialize bytes/words/longs in Python using the struct module.
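To unpack the earlier question about "<qfffffbb": it is a struct format string, read character by character. A small demonstration with dummy values:

```python
import struct

# What "<qfffffbb" means, piece by piece:
#   <  little-endian, no padding
#   q  one signed 64-bit integer  (8 bytes)      -> time_usec
#   f  five 32-bit floats         (4 bytes each) -> angles, distance, sizes
#   b  two signed 8-bit integers  (1 byte each)  -> target_num, frame
fmt = "<qfffffbb"
size = struct.calcsize(fmt)   # 8 + 5*4 + 2*1 = 30 bytes
packed = struct.pack(fmt, 123, 1.0, 2.0, 3.0, 4.0, 5.0, 6, 7)
values = struct.unpack(fmt, packed)
```

So struct.pack turns the Python values into the exact 30-byte binary payload the MAVLink message expects, and struct.unpack reverses it.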

First off, thanks very much to @iabdalkader for writing the script to output MAVLink!

I’m one of the ArduPilot developers and I would like to help integrate the OpenMV camera for use as an optical flow sensor (and later as a precision landing camera), so I have written a new AP Optical Flow driver which accepts the OPTICAL_FLOW messages that this example script sends.

I’ve found a couple of problems with the script which I’ve corrected in this modified version of the script:

  • the y-axis output seems to be reversed (assuming the camera is meant to be orientated so that it’s facing downwards with the camera lens towards the front of the vehicle)
  • the output values seem to be too low for both the x and y axes. I scaled up the X output 3.5 times and the Y output 5.3 times

Even with these changes though I’m finding that my quadcopter is not holding position well. I have not narrowed down the exact cause yet but I think it could be:

  • the OPTICAL_FLOW messages may not be sent if the “displacement.response < 0.1”. It would be best to send the flow rates as quickly as possible and as regularly as possible even if the camera does not see any movement
  • I’m occasionally seeing large spikes in the flow data. This seems to happen when the attitude of the vehicle is far off from level, but of course the camera doesn’t know its orientation, so I’m unsure what’s happening

Anyway, thanks very much and I hope we can get this working. I’m keen to help

EDIT: here is a graph of the vehicle’s gyro values (i.e. rotation rates) and the output from the flow sensor which shows the large spikes mentioned above.

Thanks for getting back to us on this. Can you send a PR to replace the script on our github?

Note that the camera needs features below it to output a reading. Adding some filtering to its output might help, e.g. a smoothing filter that also removes values that are obviously bad.
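That filtering idea can be sketched in a few lines: reject a sample that jumps implausibly far from the previous one, and smooth the rest with a short moving average. The thresholds here are illustrative, not tuned values:

```python
from collections import deque

# Sketch of the suggested filter: reject obvious outliers, then smooth
# with a short moving average.  max_jump and window are guesses.
class FlowFilter:
    def __init__(self, max_jump=2.0, window=4):
        self.max_jump = max_jump            # reject jumps bigger than this
        self.history = deque(maxlen=window) # recent accepted samples

    def update(self, value):
        if self.history and abs(value - self.history[-1]) > self.max_jump:
            # obvious spike: drop it, return the current smoothed value
            return sum(self.history) / len(self.history)
        self.history.append(value)
        return sum(self.history) / len(self.history)
```

The trade-off kwagyeman and Randy discuss below applies here too: any smoothing window adds lag, which matters for a position controller.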

@kwagyeman, thanks for responding!

Sure, I’ll raise a PR to update the script to fix the known issues (the reversal and the scaling).

I think the spikes are probably the reason the vehicle isn’t holding position well (but I should investigate more to be sure). I could filter out the spikes with a “mode” filter but that will add some lag so I’d rather avoid that. The spikes seem to happen mostly when the flow begins to slow down. I don’t suppose there’s a chance of an issue in the flow calculation? I guess there are no known issues with the calculation that can lead to spikes?

When the calculation confidence is low or there’s a lack of features on the ground the code fails. I’m not sure how to make the output super stable. Right now, the algorithm can’t handle rotation/scale, so when those happen the output is likely bad. The confidence value is supposed to help with this. Otherwise, the calculation is correct.

@kwagyeman, thanks again for the feedback.

I will see if perhaps there’s some way to use the confidence value to remove some spikes. From a quick look at the ArduPilot logs it appears that sometimes it helps but often the glitch appears in the sample after the bad quality is reported.

So there’s one reading with a low quality, then the next reading has a high quality but shows a large spike. I suppose in the OpenMV script or on the AP side we could throw away the next one or two samples if one arrives with a low quality value.
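The "throw away the next one or two samples after a low-quality reading" idea sketches out like this. The quality threshold and skip count are guesses, not tuned values:

```python
# Sketch of a quality gate: distrust the next few samples after any
# low-quality reading, since the spike often arrives one sample late.
class QualityGate:
    def __init__(self, min_quality=50, skip_after_bad=2):
        self.min_quality = min_quality      # below this, reading is "bad"
        self.skip_after_bad = skip_after_bad
        self.skip = 0                       # samples still to discard

    def accept(self, flow, quality):
        """Return flow if trustworthy, else None."""
        if quality < self.min_quality:
            self.skip = self.skip_after_bad # distrust the next few too
            return None
        if self.skip > 0:
            self.skip -= 1                  # good quality, but too soon
            return None
        return flow
```

This could live either in the OpenMV script (don't send the message) or on the AP side (ignore the message), as Randy suggests.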

Maybe update the message format to accept a quality value for the Kalman filter in ArduPilot? Since the output of the camera is a noisy sensor, it technically has to be filtered too.

OK thanks. Actually we’ve just remembered that we have a limitation on the update rate that we accept (20 Hz) but the sensor updates at about 42 Hz (on my camera anyway), so we’re going to increase the maximum rate our EKF can handle.

So I’ve got it basically working now and have done a test flight on my regular quadcopter with an OpenMV camera mounted on the bottom providing flow values up to the flight controller. No GPS is being used in this flight.

Here is the YouTube video of it in action. Basically it works, but it drifts around a bit, especially when rotating. We probably shouldn’t be surprised by this because kwagyeman said earlier that the current flow algorithm doesn’t handle rotation well.

Anyway, decent progress, thanks to those who have helped get us this far!

Hi, so, I had to do a lot of library fixes for random things in the last release. However, I will be fixing some things the algorithm requires to enable rotation/up/down correction, and once that is done it should be quite robust.

Please send the PR to replace the script in the main repo. Can you also update the AprilTag script if its MAVLink output format is incorrect?

@kwagyeman, the algo improvements sound great. I’ll be happy to test them once they’re available in whatever form.

I definitely won’t forget to PR the changes to the script. I’m away this weekend but I should be able to PR it by next Tuesday.

I’m not familiar with the AprilTags script but I was planning to eventually test the OpenMV for use as a precision landing sensor. Let me look at my schedule a bit before making a promise on when I’ll do that.