Low latency gstreamer rtsp mjpeg

Hi all,
I’ve been working on getting RTSP video from my new RT1062 and I’m running into some issues getting it to work the way I want.
Contrary to what is mentioned in the RTSP example, VLC on my Ubuntu Jammy PC does not want to play my stream, but I was able to get ffmpeg/ffplay to play it with low latency, similar to the video in the IDE. For those interested, this is the command I used:

ffplay rtsp://192.168.0.184:554 -fflags nobuffer -flags low_delay -probesize 32 -vf setpts=0

However, I need to integrate the rtsp stream into a gstreamer pipeline and there seem to be some issues. I’ve tried this pipeline with the jpegdec:

gst-launch-1.0 -e rtspsrc location=rtsp://192.168.0.184:554 ! rtpjpegdepay ! jpegdec ! autovideosink sync=false

This opens a window, but gives the following error:

../ext/jpeg/gstjpegdec.c(1569): gst_jpeg_dec_handle_frame (): /GstPipeline:pipeline0/GstJpegDec:jpegdec0:
Decode error #63: Invalid JPEG file structure: two SOI markers

This may be caused by the decoder expecting I420 while OpenMV provides yuvj422p (according to ffplay), but it seems a bit strange.
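One way to see what jpegdec is actually objecting to is to dump a depayloaded frame to a file (for example with a filesink after rtpjpegdepay) and scan the buffer for SOI markers; two 0xFF 0xD8 markers in one buffer would confirm that two frames are being glued together. A minimal sketch (the example bytes below are synthetic, not captured from the actual stream):

```python
def find_soi_markers(buf: bytes):
    """Return the byte offsets of every JPEG SOI marker (0xFF 0xD8) in buf."""
    offsets = []
    i = buf.find(b"\xff\xd8")
    while i != -1:
        offsets.append(i)
        i = buf.find(b"\xff\xd8", i + 1)
    return offsets

# Synthetic example: two SOI markers back to back, as in the decoder error.
frame = b"\xff\xd8\xff\xd8\xff\xe0" + b"\x00" * 8 + b"\xff\xd9"
print(find_soi_markers(frame))  # -> [0, 2]
```

If the buffers really do contain duplicate SOI markers, inserting GStreamer’s jpegparse element between rtpjpegdepay and jpegdec may help re-frame them, though I haven’t confirmed that against this particular stream.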
Another option I tried is the avdec_mjpeg plugin from ffmpeg/libav, which is supposed to be able to process the format (same as ffplay), with this command:

gst-launch-1.0 --gst-debug=1 rtspsrc location=rtsp://192.168.0.184:554 ! rtpjpegdepay ! image/jpeg ! avdec_mjpeg max-errors=-1 ! video/x-raw ! videoconvert ! autovideosink

but this also fails, with the following errors:

0:00:00.288413816 17564 0x7f0ca001a920 ERROR                  libav :0:: overread 8
0:00:00.288451992 17564 0x7f0ca001a920 ERROR                  libav :0:: get_buffer() failed

Any help or insight is greatly appreciated!

Hi, how are you generating the jpeg files you are sending?

The RTSP code compresses them using YUV422 by default: openmv/scripts/libraries/rtsp.py at master · openmv/openmv

This should output a YUV422 JPEG image. This is confirmed working with VLC, as are already-compressed images produced by sensor.set_pixformat(sensor.JPEG).

Hi,
I am generating the MJPEG stream from a FLIR Lepton using the RTSP server as in the LAN RTSP example. I’ve also tried the default camera that comes with the RT1062. Neither RTSP stream works with my default VLC, perhaps due to a missing livemedia-utils dependency, which is not available for Ubuntu 22.04.
Anyway, I’m more interested in a way to process the streamed images with gstreamer. Your comment suggests that the easiest way would be to change the subsampling in the rtsp library to JPEG_SUBSAMPLING_420; that’s a format the jpegdec plugin seems to support.
Thanks for the input, I will give this a try.

Yeah, if you change it to that, it will force the software JPEG compressor to output 4:2:0 images. You can check that it does by looking at the traffic in Wireshark; typically you can see the whole JPEG packet and all its headers.
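To go with the Wireshark suggestion: once you’ve captured a complete JPEG frame, you can read the subsampling straight out of its SOF0 header instead of eyeballing hex. A sketch that walks the marker segments and reports the luma sampling factors; it only handles baseline SOF0 (marker 0xFFC0), and the example header below is hand-built for illustration, not captured from the camera:

```python
def jpeg_subsampling(buf: bytes):
    """Walk the JPEG marker segments and report the chroma subsampling
    from the first component's sampling factors in the SOF0 header.

    Returns '4:2:2', '4:2:0', '4:4:4', or None if no SOF0 is found.
    """
    i = 2  # skip the SOI marker
    while i + 4 <= len(buf):
        if buf[i] != 0xFF:
            return None  # lost marker sync
        marker = buf[i + 1]
        length = int.from_bytes(buf[i + 2:i + 4], "big")
        if marker == 0xC0:  # SOF0, baseline DCT
            # Layout: len(2) prec(1) height(2) width(2) ncomp(1), then
            # per component: id(1), sampling H<<4|V (1), qtable(1).
            h = buf[i + 11] >> 4
            v = buf[i + 11] & 0x0F
            if (h, v) == (2, 1):
                return "4:2:2"
            if (h, v) == (2, 2):
                return "4:2:0"
            if (h, v) == (1, 1):
                return "4:4:4"
            return f"{h}x{v}"
        i += 2 + length
    return None

# Hand-built SOI + SOF0 header for a 64x64 4:2:2 image (3 components).
sof0 = bytes([0xFF, 0xD8,                    # SOI
              0xFF, 0xC0, 0x00, 0x11,        # SOF0, segment length 17
              0x08, 0x00, 0x40, 0x00, 0x40,  # 8-bit precision, 64x64
              0x03,                          # 3 components
              0x01, 0x21, 0x00,              # Y: 2x1 sampling
              0x02, 0x11, 0x01,              # Cb: 1x1
              0x03, 0x11, 0x01])             # Cr: 1x1
print(jpeg_subsampling(sof0))  # -> 4:2:2
```

Running this on a frame extracted from the capture should print 4:2:2 before the change and 4:2:0 after it.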

perhaps due to a missing livemedia-utils dependency which is not available for Ubuntu 22.04.

Yes, that would cause VLC to have issues.