Sending Lepton Image over UART

Hello all!

I’m fairly new to coding, and I have an OpenMV H7 with the Lepton attachment (using Lepton 2.5). My goal is to take a snapshot with the temperature data of each pixel and then send all of this data over UART as fast as I can so I can potentially produce live video on the receiver device.

  1. On the transmitting side, do I just compress the image snapshot and then use the UART write() function? Equally important, how would I send the temperature data of each pixel (e.g. a 4800-element list with the temperature of each pixel) over UART?
  2. On the receiving side (which uses Python 3.5), how could I receive the snapshot data and visualize it? Assume there is a “uart” object with a read function that receives the data, e.g.
 data_received = uart.read()
  3. Bonus: Drawing a bounding box around areas of a certain temperature (using the received temperature data) would also be cool. I think I might know how to do this, but any guidance/advice would definitely be appreciated.

Thank you in advance! Please let me know if any more information is needed.

Hi, we will have an interface library out for the Arduino shortly that will make this very easy.

Until then… can the receiving device handle the data rate? The OpenMV Cam has a high-spec processor and can easily overwhelm other devices.
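As a rough sanity check (assuming a 921600 baud link at 8N1, i.e. about 92 KB/s of payload): an uncompressed 80x60 Lepton frame is 4800 bytes, so even uncompressed you could move roughly 19 frames per second, and a jpeg-compressed frame is usually only a few KB. The link itself should not be the bottleneck; whether the receiver can parse and display at that rate is another question.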

To send the jpeg image, you just write() the image after calling compress() on it. That's one line of code.
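Something like this on the OpenMV side (untested sketch; I'm assuming UART 3 and 921600 baud here, adjust for your wiring):

import sensor
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)  # Lepton frames come through as grayscale
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

uart = UART(3, 921600)

img = sensor.snapshot()
img.compress(quality=50)  # jpeg-compress the image in place
uart.write(img)           # image objects act as byte buffers, so write() just works

In practice you will probably also want to send the jpeg size (img.size()) ahead of the frame so the receiver knows how many bytes to read.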

As for visualizing the image: it’s a jpeg, so that part is up to you. Lots of software can display jpeg images.
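For example, with Pillow (assuming data_received holds the bytes read from the UART, as in your question, and frame.jpg is just a scratch filename):

with open("frame.jpg", "wb") as f:
    f.write(data_received)  # dump the received jpeg bytes to disk

from PIL import Image
Image.open("frame.jpg").show()  # any jpeg viewer would do the same job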

Finally, see our examples in the IDE for drawing bounding boxes.

Thank you for the response! I think the receiving device would be able to handle the data rate.

How do I get a list with the temperature of each pixel? Building off the “lepton_get_object_temp.py” example, I set the grayscale min/max values to 0–255 and the temperature range min/max to -10 and 140 degrees Celsius. I also set the pixels_threshold and area_threshold of img.find_blobs() to zero, and set the merge argument to False. However, my blob_stats list doesn’t contain the temperature of each pixel; instead it holds a single averaged temperature for one blob (which I assume is the average over the entire image). What am I doing wrong?
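For reference, here is roughly my setup (names follow the lepton_get_object_temp.py example; the temperature limits are my values):

import sensor

min_temp_in_celsius = -10.0
max_temp_in_celsius = 140.0

sensor.reset()
sensor.ioctl(sensor.IOCTL_LEPTON_SET_MEASUREMENT_MODE, True)
sensor.ioctl(sensor.IOCTL_LEPTON_SET_MEASUREMENT_RANGE, min_temp_in_celsius, max_temp_in_celsius)
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

img = sensor.snapshot()
blob_stats = []
# a (0, 255) threshold matches every pixel, so this finds one image-sized blob
for blob in img.find_blobs([(0, 255)], pixels_threshold=0, area_threshold=0, merge=False):
    stats = img.get_statistics(thresholds=[(0, 255)], roi=blob.rect())
    # map the 0-255 mean back to a temperature -- one averaged value per blob, not per pixel
    blob_stats.append((stats.mean() * (max_temp_in_celsius - min_temp_in_celsius) / 255.0) + min_temp_in_celsius)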

If you want the temperature of each pixel, just send the jpeg-compressed image out the serial port along with the min and max temperature values used for the scaling. On the receiving device, uncompress the image, then use the min and max values to undo the scaling and recover the temperature of each pixel.
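The unscaling is just the inverse of a linear map; a sketch (assuming 8-bit grayscale pixels and that min_temp/max_temp were sent alongside the frame):

def pixel_to_celsius(pixel, min_temp, max_temp):
    # pixels were produced by mapping [min_temp, max_temp] onto [0, 255],
    # so invert that map to recover degrees Celsius
    return min_temp + (pixel / 255.0) * (max_temp - min_temp)

print(pixel_to_celsius(128, -10.0, 140.0))  # ~65.3 C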

Thanks for the help! I’m really stumped on how to decompress the image. The receiver I have gives me a list of ASCII characters (the same length as the compressed jpeg image sent from the camera), which I convert to bytes using the raw_unicode_escape (or latin1) encoding.

Do you have any Python modules you would recommend? I’ve been scouring the internet, trying imageio.imread and io.BytesIO, but with no success (they return a solid black picture for some reason). I’ll keep looking through different forum threads, but I thought I would ask anyway.
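For reference, this is the kind of thing I’ve been trying, with Pillow instead of imageio (received_chars being the list of characters described above):

import io
from PIL import Image

def show_frame(received_chars):
    # received_chars: list of single-character strings from the receiver
    jpeg_bytes = bytes(''.join(received_chars), 'latin1')  # 1:1 char-to-byte
    Image.open(io.BytesIO(jpeg_bytes)).show()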

Just as a note: I am using the Tinkerforge RS232 bricklet’s UART port as the receiver.
https://www.tinkerforge.com/en/doc/Hardware/Bricklets/RS232_V2.html

Thanks! However, the pygame script from that link still gives me a fully black picture (which I assume corresponds to an array of zeros).

I was thinking it might have to do with my receiver. The baud rates match (921600), but it could be something else…

EDIT1: Here’s a little more on what I’m trying to do: I’m trying to send only one sensor snapshot over UART rather than a constant stream of snapshots for live video. On my receiver I get a list of characters (called pic_data), which I convert to a bytes object via:

pic_data_string = bytes(''.join(pic_data), 'raw_unicode_escape')

EDIT2: Interesting. I turned the image snapshot into a byte array using the bytearray() function, and all of the bytes appear to be \x00, i.e. zero. I wonder why that is?

EDIT3: Figured it out! For some reason, I need to take a couple of snapshots before I actually capture a proper image; the first few snapshots are just a black screen. Not sure why, but this is fine for my project purposes. I can also send live video over UART and display it with Pygame. Thanks for the help!
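For anyone who runs into the same thing, the sending side ended up looking roughly like this (UART 3 at 921600 baud is my wiring; the warm-up count of 10 is just what worked for me):

import sensor
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

uart = UART(3, 921600)

# throw away the first few frames -- they come back as all \x00 otherwise
for _ in range(10):
    sensor.snapshot()

img = sensor.snapshot()
img.compress(quality=50)
uart.write(img)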

Can you post your code? I am having trouble decompressing the image as well, though I am using the H7 camera.