For my project, I need to stream RGB565 images (160x120, QQVGA) over the WiFi shield. I’ve looked at the mjpegstreamer and rtsp_video_server examples, as well as the mqtt and http examples, but my project requires uncompressed RGB565 images.
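Each uncompressed frame should therefore be 160 x 120 x 2 bytes = 38,400 bytes.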
I’ve written client/socket code for the OpenMV. It works fine when I set the sensor format to JPEG. Here are the relevant snippets.
OpenMV Client
...
sensor.set_pixformat(sensor.JPEG)
...
s = usocket.socket(usocket.AF_INET, usocket.SOCK_STREAM)
s.connect((HOST, PORT))
s.settimeout(2.0)
...
frame = sensor.snapshot()
frame = frame.bytearray()   # raw bytes of the frame buffer (a JPEG bitstream with this pixformat)
sent = s.write(frame)       # number of bytes written
...
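One thing I’m not sure about is whether s.write() always pushes the whole buffer out in a single call once the frames grow to 38,400 bytes. In case it doesn’t, this is the kind of send loop I had in mind (just a sketch; send_all is my own name):

def send_all(sock, buf):
    # Keep calling send() until the whole buffer has been handed to the socket.
    mv = memoryview(buf)
    total = 0
    while total < len(mv):
        total += sock.send(mv[total:])

# usage: send_all(s, frame.bytearray())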
PC Server
import io
from PIL import Image, ImageFile
...
# img (accumulated bytes), msglen, and imLen (expected frame size in bytes) are set up above
while True:
    data = conn.recv(4000)
    if not data:
        break
    if msglen < imLen:
        msglen += len(data)
        img += data
    if msglen >= imLen:
        break
image = Image.open(io.BytesIO(img))
...
The above code streams fine when the format is JPEG. However, when I change the format to RGB565 using sensor.set_pixformat(sensor.RGB565), the received bytes can no longer be decoded by Image.open():
OSError: cannot identify image file <_io.BytesIO object at 0x10ce322f0>
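My understanding is that Image.open() only recognizes self-describing file formats like JPEG, while RGB565 is just raw pixel data, so I probably have to unpack the pixels myself. Something like the sketch below is what I was expecting to need on the PC side (rgb565_to_image is my own name, and I’m not sure whether the OpenMV sends the 16-bit pixels big- or little-endian):

import numpy as np
from PIL import Image

WIDTH, HEIGHT = 160, 120  # QQVGA

def rgb565_to_image(buf, big_endian=True):
    # Interpret the 38,400-byte buffer as 16-bit RGB565 pixels.
    dtype = ">u2" if big_endian else "<u2"
    pix = np.frombuffer(buf, dtype=dtype).reshape(HEIGHT, WIDTH)
    # Expand the 5/6/5-bit channels to 8 bits each.
    r = (((pix >> 11) & 0x1F) << 3).astype(np.uint8)
    g = (((pix >> 5) & 0x3F) << 2).astype(np.uint8)
    b = ((pix & 0x1F) << 3).astype(np.uint8)
    return Image.fromarray(np.dstack((r, g, b)), "RGB")

# usage: image = rgb565_to_image(img)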
Any help is appreciated! If there is a better way to stream RGB565 images over WiFi, please let me know!