How to send the raw data of the Lepton camera over WiFi instead of just sending a JPG image

Hi,

I use an OpenMV4 with a Lepton 3.5.

I'd like to send the raw Lepton data to another PC over MQTT. I can think of two ways:

1. Use sensor.snapshot() to get the image, then read it out with `get_pixel` or `get_statistics`.
But I get the error `MemoryError: memory allocation failed, allocating 16824 bytes`. I think the heap may not be large enough. Can I use the stack to map a large array, and if so, how?

256KB .DATA/.BSS/Heap/Stack
512KB Frame Buffer/Stack
256 KB DMA Buffers
(1MB Total)

2. Alternatively, I can send the JPEG-compressed image, decompress it on the PC, and use the min and max temperatures to recover the raw data.

I referred to this demo:

    client.publish("openmv/test", cframe)

but I get an error that the image has no `len`. How can I convert an image to a format that MQTT can forward, such as bytes, a string, or JSON?

Thanks a lot.

Hi, we had a customer in another thread dig deep into the accuracy of the JPEG image, and he found it's fine.

Just send the grayscale JPEG image in measurement mode to the PC, along with the min and max temperatures. With those you can undo the scaling on the image and get the temperature per pixel.
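The undo step on the PC side is just the inverse of the linear scaling. A minimal sketch (the function name is mine, and it assumes each 8-bit pixel is the temperature scaled linearly into 0..255 between the frame's min and max):

```python
def pixel_to_celsius(pixel, t_min, t_max):
    """Map an 8-bit grayscale pixel value back to a temperature.

    Assumes the camera scaled temperatures linearly so that
    t_min -> 0 and t_max -> 255 when the frame was encoded.
    """
    return t_min + (pixel / 255.0) * (t_max - t_min)
```

Apply it to every pixel of the decoded JPEG to rebuild the per-pixel temperature map; the accuracy is bounded by the 8-bit quantization, i.e. (t_max - t_min) / 255 per step.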

See the FLIR grayscale measurement mode examples.

As for sending the image, JPEG data is a byte stream. Use .bytearray() on the image to get the byte stream.
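One way to ship the min/max temperatures alongside the JPEG bytes in a single MQTT payload is to prepend a small binary header. A sketch (the header layout and function names are my own convention, not part of any OpenMV API):

```python
import struct

def pack_frame(t_min, t_max, jpeg_bytes):
    # 8-byte header: two little-endian 32-bit floats (min and max
    # temperature), followed by the raw JPEG byte stream.
    return struct.pack("<ff", t_min, t_max) + bytes(jpeg_bytes)

def unpack_frame(payload):
    # Split the payload back into (t_min, t_max, jpeg_bytes).
    t_min, t_max = struct.unpack_from("<ff", payload, 0)
    return t_min, t_max, payload[8:]
```

On the camera you would publish something like `client.publish("openmv/test", pack_frame(t_min, t_max, img.bytearray()))`; on the PC, `unpack_frame()` recovers the temperatures and the JPEG bytes to decode.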

Thanks for the reply.

I tried bytearray, but I get the error `AttributeError: 'Image' object has no attribute 'bytearray'`.
My firmware is 3.4.2.

    while True:
        clock.tick()  # Track elapsed milliseconds between snapshots.
        img = sensor.snapshot()
        img.to_rainbow(color_palette=sensor.PALETTE_IRONBOW)  # Apply the Ironbow color palette.
        print("FPS %f - Lepton Temp: %f C" % (clock.fps(), sensor.ioctl(sensor.IOCTL_LEPTON_GET_FPA_TEMPERATURE)))

Is there a problem there?

That’s the right code. Are you on the latest firmware?

It needs firmware 3.6.7. Thanks!

It works now, thanks!