For my project, I need to stream RGB565 images (160x120, QQVGA) using the WiFi shield. I’ve looked at the mjpegstreamer, rtsp_video_server, mqtt, and http examples, but my project requires uncompressed RGB565 images.
I’ve written client/socket code for the OpenMV. It works fine when I set the sensor format to JPEG. Here are the relevant snippets.
import io
from PIL import Image, ImageFile
...
img = bytearray()  # accumulated frame bytes (initialized before the loop)
msglen = 0         # bytes received so far; imLen is the expected frame size
while True:
    data = conn.recv(4000)
    if not data:
        break
    if msglen < imLen:
        msglen += len(data)
        img += data
    if msglen >= imLen:
        break
image = Image.open(io.BytesIO(img))
...
The above code streams well when I set the format to JPEG. However, when I change the format to RGB565 using sensor.set_pixformat(sensor.RGB565), the bytearray can no longer be decoded.
OSError: cannot identify image file <_io.BytesIO object at 0x10ce322f0>
Any help is appreciated! If there is a better way to stream RGB565 images over wifi, please let me know!
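As an aside on the receive loop above: TCP has no message boundaries, so a fixed `recv(4000)` loop can split or merge frames. One common pattern is a length prefix on every frame. A minimal sketch for the PC side (the `recv_exact`/`recv_frame` helpers and the 4-byte big-endian header are illustrative assumptions, not from the original code):

```python
import socket
import struct

def recv_exact(conn, n):
    """Read exactly n bytes from a socket (recv may return short reads)."""
    chunks = []
    while n:
        data = conn.recv(min(n, 4096))
        if not data:
            raise ConnectionError("socket closed mid-frame")
        chunks.append(data)
        n -= len(data)
    return b"".join(chunks)

def recv_frame(conn):
    """Read one frame: a 4-byte big-endian length header, then the payload."""
    (length,) = struct.unpack(">I", recv_exact(conn, 4))
    return recv_exact(conn, length)
```

The OpenMV side would then prepend `struct.pack(">I", len(buf))` before each `send`, so the receiver always knows where one frame ends and the next begins.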
The Python struct module is probably a more efficient way of packing and unpacking the byte data. However, my main problem is decoding the RGB565 image after it is unpacked. I’m unable to convert it back to an RGB image on my PC.
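For what it's worth, `Image.open()` only recognizes encoded file formats (JPEG, PNG, ...) by their headers, which is why it raises `OSError` on a raw framebuffer. Since each RGB565 pixel is just two packed bytes (5 red, 6 green, 5 blue bits), the buffer can be unpacked with NumPy instead. A minimal sketch, assuming the 160x120 frame size from the post; the byte order is an assumption, so flip `big_endian` if the colors come out swapped:

```python
import numpy as np

def rgb565_to_rgb888(buf, width=160, height=120, big_endian=True):
    """Unpack a raw RGB565 framebuffer into an (H, W, 3) uint8 RGB array."""
    dtype = ">u2" if big_endian else "<u2"
    pix = np.frombuffer(buf, dtype=dtype).reshape(height, width)
    rgb = np.empty((height, width, 3), dtype=np.uint8)
    rgb[..., 0] = ((pix >> 11) & 0x1F) << 3  # 5 red bits   -> 8 bits
    rgb[..., 1] = ((pix >> 5) & 0x3F) << 2   # 6 green bits -> 8 bits
    rgb[..., 2] = (pix & 0x1F) << 3          # 5 blue bits  -> 8 bits
    return rgb
```

`Image.fromarray(rgb565_to_rgb888(img), "RGB")` then gives a normal PIL image without any compression in the pipeline.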
I would just send a JPEG at really high quality and then convert it back into an RGB565 image. There's no issue sending an RGB565 image, but it's more data than normal. Any reason you need it raw?
The project is contactless vital signs monitoring inside a hospital. The current literature provides methods for calculating heart rate, respiratory rate, etc., but they require uncompressed RGB pixels over a 10-second window to make accurate calculations. If possible, the stream needs to carry full RGB images.
On the subject of JPEG compression, is it possible to have JPEG compression at the highest quality of 100? In the helloworld example, I can only set the quality to 98.
sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565 (or GRAYSCALE)
#sensor.set_quality(98)
At 99 and 100 the images glitch and the camera disconnects.