Hello,
I am trying to stream video with the Wi-Fi shield and the “rtsp_video_server_1.py” example program.
The program apparently runs on the OpenMV Cam. Its output is:
Running in Station mode…
IP Address: Port 192.168.0.139:554
Running…
But I can’t watch the video. I have tried with VLC and with the Chrome browser on a PC on the same network.
What can be wrong?
NOTE: I have already updated the firmware using the “fw_update.py” example.
Thanks in advance.
Are you using the OpenMV Cam H7 Plus with an OV5640? And did you point the RTSP URL in VLC to rtsp://192.168.0.139:554?
It should just work.
Hello,
I managed to watch the stream using another Wi-Fi network. With a low resolution (QVGA) the frame rate is adequate.
Now I have an issue:
I need to run a vision program and, at the same time, stream what the camera is seeing.
The vision program requires the pixformat to be set to RGB565, while the streaming program requires JPEG.
I can’t get either of them to work with the other’s setting.
Is there a solution to this?
Thanks!
Just do .compress() on the image to send it.
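For example (a sketch only; it assumes the image_callback and server setup from the rtsp_video_server_1.py example, and the quality value is arbitrary):

```python
def image_callback(pathname, session):
    img = sensor.snapshot()          # frame stays RGB565 for the vision code
    # ... run the vision processing on img here ...
    return img.compress(quality=70)  # JPEG-compress the same frame for streaming
```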
Hello again,
What I need is to execute a vision program that measures an object and, at the same time, stream the video.
Both functions work separately, but I can’t get them to work simultaneously.
If I do something like:
…
server.stream(image_callback)
…
while(True):
…
(vision program)
…
The loop never executes.
I understand that stream() does not return and blocks the program. The function also blocks if inserted into the loop.
I considered creating a thread, but I understand that is not possible on the OpenMV Cam.
What should I do?
Thanks in advance
Hi, you just want to do the processing in the image_callback. Assuming someone connects to the stream, you process data while they are connected; the second they disconnect, you stop processing data.
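Structurally, that looks something like the following (a sketch based on the rtsp_video_server_1.py example, assuming OpenMV’s standard sensor and rtsp modules; find_blobs() and its thresholds are only placeholders for your measurement code):

```python
import sensor, rtsp

sensor.reset()
sensor.set_pixformat(sensor.RGB565)  # the vision code needs RGB565
sensor.set_framesize(sensor.QVGA)

# ... Wi-Fi setup and server = rtsp.rtsp_server(network_if) as in the example ...

def image_callback(pathname, session):
    img = sensor.snapshot()
    # The vision code that was in while(True) goes here instead -- this
    # callback is invoked repeatedly while a client is connected.
    for blob in img.find_blobs([(30, 100)], pixels_threshold=100):  # placeholder
        img.draw_rectangle(blob.rect())
    return img

server.stream(image_callback)  # blocks, but image_callback runs every frame
```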
Also, this just got added:
https://docs.openmv.io/library/uasyncio.html
You can modify the RTSP streaming code to use uasyncio to run multiple co-routines at once.
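A minimal sketch of that co-routine structure (the uasyncio import falls back to CPython’s asyncio so the skeleton can be tried on a PC; grabber() and vision() are illustrative names, with grabber() standing in for sensor.snapshot() and the RTSP send):

```python
try:
    import uasyncio as asyncio   # MicroPython
except ImportError:
    import asyncio               # CPython fallback for trying out the structure

latest = {"frame": None}   # most recent frame, shared between the tasks
processed = []             # results produced by the vision task

async def grabber():
    # On the camera this would call sensor.snapshot() and feed the stream.
    for i in range(5):
        latest["frame"] = i
        await asyncio.sleep(0)   # yield so the vision task can run

async def vision():
    # On the camera this would run the measurement algorithm.
    for _ in range(5):
        if latest["frame"] is not None:
            processed.append(latest["frame"] * 2)
        await asyncio.sleep(0)

async def main():
    await asyncio.gather(grabber(), vision())

asyncio.run(main())
```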
Hello,
I put the vision code in the callback function as you suggested, and it works.
Now I have another issue to solve:
Since I need the RGB565 pixformat for the vision algorithm, at the end I compress the image before sending it over the stream, but the stream is intermittent, not good.
If, at the end of the callback function, I instead switch the pixformat to JPEG and take a snapshot to send to the stream, the stream is good. But then I can’t draw on the image to show the result of the vision algorithm…
Is there any solution to this?
Thank you for your patience…
Yes, can you give me a month? I have two firmware updates coming that should fix your issue.
- We will be adding triple buffering to the firmware. This will double your FPS.
- I will be updating the JPEG code to include the missing JPEG JFIF header, along with making JPEG compression faster.
These two fixes should make streaming work smoothly.
Ok, thanks!
I’ll wait for your update.