Wifi Streaming of RGB565 Images

For my project, I need to stream RGB565 images (160x120, QQVGA) using the WiFi shield. I’ve looked at the mjpegstreamer and rtsp_video_server examples, as well as the mqtt and http examples. However, my project requires uncompressed RGB565 images.

I’ve written client/socket code for the OpenMV. It works fine when I set the sensor format to JPEG. Here are the relevant snippets.

OpenMV Client

...
sensor.set_pixformat(sensor.JPEG)
... 
s = usocket.socket(usocket.AF_INET, usocket.SOCK_STREAM)
s.settimeout(2.0)   # bound connect/read/write instead of blocking forever
s.connect((HOST, PORT))
...
frame = sensor.snapshot()
frame = frame.bytearray()   # raw frame data (JPEG here; RGB565 later)
sent = s.write(frame)
...

PC Server

import io
from PIL import Image, ImageFile
...
img = b""                      # accumulated frame bytes
msglen = 0
while msglen < imLen:          # imLen = expected image size in bytes
    data = conn.recv(4000)
    if not data:
        break
    msglen += len(data)
    img += data
image = Image.open(io.BytesIO(img))
...

The above code streams well when I set the format to JPEG. However, when I change the format to RGB565 using sensor.set_pixformat(sensor.RGB565), the bytearray can no longer be decoded.

OSError: cannot identify image file <_io.BytesIO object at 0x10ce322f0>

Any help is appreciated! If there is a better way to stream RGB565 images over wifi, please let me know!

The Python struct module is probably a more efficient way of packing and unpacking the byte data. However, my main problem is decoding the RGB565 image after it is unpacked. I’m unable to convert it back to an RGB image on my PC.
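For example, one way to frame each image with struct is a fixed-size length prefix, so the receiver always knows exactly how many bytes belong to the current frame. This is only a sketch; `pack_frame`, `read_frame`, and `_recv_exact` are hypothetical helpers, not OpenMV APIs (though `struct` itself exists in MicroPython too):

```python
import struct

def pack_frame(frame_bytes):
    # Prefix the raw frame with a 4-byte big-endian length header.
    return struct.pack(">I", len(frame_bytes)) + frame_bytes

def _recv_exact(sock, n):
    # Keep calling recv() until exactly n bytes have arrived.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(min(4000, n - len(buf)))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def read_frame(sock):
    # Read the 4-byte header, then read exactly that many payload bytes.
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)
```

On the camera side you would send `s.write(pack_frame(frame))`; on the PC, `read_frame(conn)` returns one complete image buffer per call, which also removes the need for a hard-coded `imLen`.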

Um, I would just send a JPG at really high quality and then convert it back into an RGB565 image. There’s no issue sending an RGB565 image, but it’s more data than normal. Any reason you need it raw?

Yeah, um, I don’t think PC libraries can easily decode RGB565. JPG transfer is kinda the only universal way.

The project is contactless vital signs monitoring inside a hospital. The current literature provides methods for calculating heart rate, respiratory rate, etc., but they require uncompressed RGB pixels over 10 seconds to make accurate calculations. If possible, the stream needs to carry full, uncompressed RGB images.

On the subject of JPEG compression, is it possible to have JPEG compression at the highest quality of 100? In the helloworld example, I can only set the quality to 98.

sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565 (or GRAYSCALE)
#sensor.set_quality(98)

At 99 and 100 the images glitch and the camera disconnects.

That’s probably a bug if you can’t set 100. That said, you won’t be able to see a difference above 90.

To note, all that code does is set this register: https://github.com/openmv/openmv/blob/master/src/omv/ov5640.c#L740

Does this help? It just uses PIL and NumPy. You could use it on the consumer side to decode the frame for visualization.
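Something along these lines should work as the PIL/NumPy decode step. This is a sketch: `rgb565_to_image` is a hypothetical helper, and the big-endian byte order is an assumption about how the camera packs the frame (swap the dtype to `"<u2"` if the colors come out wrong). The reason `Image.open` fails on the raw stream is that it expects a container format with a header (JPEG, PNG, …), whereas RGB565 is headerless pixel data, so the 16-bit pixels have to be unpacked manually:

```python
import numpy as np
from PIL import Image

def rgb565_to_image(buf, width=160, height=120):
    # Interpret the buffer as 16-bit pixels (big-endian assumed here).
    arr = np.frombuffer(buf, dtype=">u2").reshape(height, width)
    # Split the 5-6-5 bit fields.
    r = ((arr >> 11) & 0x1F).astype(np.uint8)
    g = ((arr >> 5) & 0x3F).astype(np.uint8)
    b = (arr & 0x1F).astype(np.uint8)
    # Expand the 5/6-bit channels to full 8-bit range by replicating
    # the high bits into the low bits (so 0x1F -> 0xFF, not 0xF8).
    rgb = np.dstack(((r << 3) | (r >> 2),
                     (g << 2) | (g >> 4),
                     (b << 3) | (b >> 2)))
    return Image.fromarray(rgb, "RGB")
```

With the `img` buffer from the server loop, `rgb565_to_image(img).show()` would display the frame, and `np.asarray(...)` gives you the uncompressed RGB pixels the vital-signs calculations need.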