Real Time Streaming and Speed

OpenMV related project discussion.
riche
Posts: 3
Joined: Tue May 16, 2017 7:19 pm

Real Time Streaming and Speed

Postby riche » Tue May 16, 2017 8:13 pm

Hello, I'm working on streaming camera data from a Raspberry Pi to an Ubuntu machine once motion is detected. Speed is of the essence in getting image data to the Ubuntu machine for processing. I have everything working for the most part, and it's pretty fast since I bypass read/write IO while streaming, but I'm running into challenges with activating motion recognition, running continuous motion detection, and feeding an MJPEG stream via PiCamera (the built-in Raspberry Pi camera), all on the Raspberry Pi's 4 cores with Python.

According to earlier forum messages, the Cam M7 can detect motion in approximately 16 ms. However, my stream is not quite there yet: 640x480 grayscale at 15 FPS at best, which is understandable for what it is.

What I'm wondering is whether I could use the Cam M7 for motion detection and simply have it send a signal when motion appears or disappears, and how fast that signal would get transmitted through the pins, or over USB?

The Cam M7 could then run on its own without really using the Pi's CPU/GPU as well, correct?
kwagyeman
Posts: 1033
Joined: Sun May 24, 2015 2:10 pm

Re: Real Time Streaming and Speed

Postby kwagyeman » Tue May 16, 2017 9:11 pm

Hi,

The M7 with the current firmware should be able to run motion detection using frame differencing in grayscale at 30 FPS at 320x240. I don't know what the speed is for 640x480. Unless you have a specific requirement to detect differences of around 4 pixels, doing anything at 640x480 offers no benefit and takes more time, since real-world noise will force you to toss out detection areas that are too small anyway.

In general, I'd run the detection algorithm on 160x120 grayscale. With our current firmware you'll be able to detect motion at 30 FPS and then determine all areas of difference using the find_blobs function.
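
A minimal sketch of that approach, assuming the file-based frame differencing pattern from the examples (a saved background frame at "temp/bg.bmp", so a uSD card or free flash is needed); the grayscale thresholds and blob-size filters below are made-up values you would tune for your scene:

Code: Select all

import sensor, os, time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)      # 160x120
sensor.skip_frames(time = 2000)
sensor.set_auto_gain(False)             # keep gain/exposure stable for differencing
sensor.set_auto_whitebal(False)
clock = time.clock()

# Save a background frame to difference against.
if not "temp" in os.listdir(): os.mkdir("temp")
sensor.snapshot().save("temp/bg.bmp")

while(True):
    clock.tick()
    img = sensor.snapshot()
    img.difference("temp/bg.bmp")       # per-pixel absolute difference vs. background
    # Bright areas in the difference image are change; thresholds/filters are guesses.
    blobs = img.find_blobs([(25, 255)], pixels_threshold=20, area_threshold=20, merge=True)
    for b in blobs:
        img.draw_rectangle(b.rect())
    print("motion" if blobs else "still", clock.fps())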

...

So, we're about to release a new firmware image with a higher clock frequency to the camera sensor, which boosts the sensor from 60 FPS (where we currently grab every other image, i.e. about 30 FPS) to 120 FPS (still grabbing every other image). This should put you in 60 FPS territory. A smaller resolution might put you even higher, up to 90 FPS or so.

...

As for transmitting the detection to the PC... the camera appears as a VCP (virtual COM) port when plugged in over USB and can just send some serial bytes. However, most PC OSes, like Ubuntu, are not real-time, so the OS might impose a delay on the order of 20 ms before it gets around to reading the serial data. There's not much you can do about this without tossing standard desktop software, or you can hack the kernel to get it to service the serial port and hand the data to user space in real time.
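
On the Ubuntu side, a minimal sketch of listening to that VCP port with pyserial, assuming the camera prints a text line per frame as in the sketch above; the device path (/dev/ttyACM0) is just whatever your machine assigns:

Code: Select all

import serial

# The OpenMV Cam enumerates as a USB CDC serial device; adjust the path for your machine.
ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

while True:
    line = ser.readline().decode("ascii", "ignore").strip()
    if not line:
        continue                 # timeout with no data
    if line.startswith("motion"):
        print("motion -> tell the Pi to start streaming")
    elif line.startswith("still"):
        print("no motion -> tell the Pi to stop streaming")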

Anyway, the OpenMV Cam can run by itself. I see getting the desktop OS to start capturing images on demand as the hard part.
Nyamekye,
riche
Posts: 3
Joined: Tue May 16, 2017 7:19 pm

Re: Real Time Streaming and Speed

Postby riche » Wed May 17, 2017 1:36 pm

Thank you for the fast reply.

Ordered one and looking forward to using it.
riche
Posts: 3
Joined: Tue May 16, 2017 7:19 pm

Re: Real Time Streaming and Speed

Postby riche » Mon May 22, 2017 9:12 pm

kwagyeman wrote:
Tue May 16, 2017 9:11 pm
The M7 with the current firmware should be able to run motion detection using frame differencing in grayscale at 30 FPS at 320x240. [...]

OK, so I got my Cam M7 today; really excited to work with it. The IDE software is pretty nice.

I'm going through the tutorials, and they don't seem to be made for more, um, newbie kind of folks with sample code, etc.

The programming setup seems to be more script-like than OOP. So I don't understand why this didn't work when I ran it just to test out the serial port:

Code: Select all

from pyb import UART

uart.init(9600, bits=8, parity=None, stop=1)
uart = UART(3, 9600)
uart.write('hello')
uart.read(5)
Also, there is nothing on frame differencing other than a few method calls, which I thought was one of the more accessible features of the system. http://docs.openmv.io/library/omv.image ... difference

Generally (or specifically), how would I go about setting up a frame differencing system and then, upon motion, writing the camera stream to the UART methods for passing it through to the VCP port?
kwagyeman
Posts: 1033
Joined: Sun May 24, 2015 2:10 pm

Re: Real Time Streaming and Speed

Postby kwagyeman » Tue May 23, 2017 1:26 am

>> I'm going through the tutorials, and they don't seem to be made for more, um, newbie kind of folks with sample code, etc.

You're an early adopter! :) More tutorial pages will be written as my free time allows. Feature expansion has been the priority so far.

Code: Select all

from pyb import UART

uart.init(9600, bits=8, parity=None, stop=1)
uart = UART(3, 9600)
uart.write('hello')
uart.read(5)
Should be:

Code: Select all

from pyb import UART

# Construct and configure in one call: UART 3, 9600 baud, 8N1.
uart = UART(3, 9600, bits=8, parity=None, stop=1)
uart.write('hello')
uart.read(5)   # returns None if nothing has been received before the timeout
Not sure what you mean by OOP. You program in Python; you can make classes if you like. However, there's not really a huge need unless you plan to do more with data structures.

>> Generally (or specifically), how would I go about setting up a frame differencing system and then, upon motion, writing the camera stream to the UART methods for passing it through to the VCP port?

There's an example script called Advanced Frame Differencing. Please see the IDE's Examples folder. There are 60+ examples in there, which basically serve as our documentation on how to do things currently. See the Pixy emulation scripts for examples of long programs that do computer vision and then write formatted binary strings to a UART, and see the advanced frame differencing script for how to diff frames. The frame differencing scripts, I believe, are under the filters examples folder.
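
To give a rough picture of how the pieces fit together (a sketch, not the actual example script): frame differencing as above, plus a one-byte UART message when motion appears or disappears. The 'M'/'S' bytes and thresholds are made up, and note that the hardware UART on the pins is separate from the USB VCP; for the USB route you would print() or use pyb.USB_VCP instead.

Code: Select all

import sensor, os
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
sensor.set_auto_gain(False)
sensor.set_auto_whitebal(False)

uart = UART(3, 9600, bits=8, parity=None, stop=1)

if not "temp" in os.listdir(): os.mkdir("temp")
sensor.snapshot().save("temp/bg.bmp")

motion = False
while(True):
    img = sensor.snapshot()
    img.difference("temp/bg.bmp")
    blobs = img.find_blobs([(25, 255)], pixels_threshold=20, area_threshold=20, merge=True)
    if blobs and not motion:
        motion = True
        uart.write('M')              # motion appeared
    elif not blobs and motion:
        motion = False
        uart.write('S')              # motion disappeared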

...

Also, just ask questions. Response time is pretty fast.
Nyamekye,
mjs513
Posts: 71
Joined: Sun Apr 30, 2017 12:52 pm

Re: Real Time Streaming and Speed

Postby mjs513 » Tue May 23, 2017 7:36 am

Hi Riche

I've been playing around with the advanced frame differencing script a bit for motion detection (or, in my case, detecting no motion when a bot gets stuck). Anyway, I used the advanced script but changed the image to grayscale instead of color, and right after the img.difference statement I added these lines:

Code: Select all

    # A nearly uniform difference image (low stdev) means the frames barely changed.
    if img.get_statistics().stdev() < 5:
        print("no motion detected")
    else:
        print("motion detected")
I'm planning on adding appropriate logic for bot action when no motion is detected. I'm in the process of putting a test bot together now.
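
For reference, a stripped-down, stand-alone sketch along the lines of what Mike describes (grayscale, an in-memory background kept in an extra frame buffer, and his stdev check). This is not the actual advanced example; sensor.alloc_extra_fb assumes reasonably recent firmware, and the refresh interval and threshold of 5 are values to tune:

Code: Select all

import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
sensor.set_auto_gain(False)
sensor.set_auto_whitebal(False)

# Keep the background frame in RAM instead of on the uSD card.
extra_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)
extra_fb.replace(sensor.snapshot())

frame_count = 0
while(True):
    img = sensor.snapshot()
    frame_count += 1
    if frame_count % 100 == 0:
        extra_fb.replace(img)             # refresh the background now and then
        continue
    img.difference(extra_fb)              # in-place absolute difference vs. background
    if img.get_statistics().stdev() < 5:  # nearly uniform difference -> no change
        print("no motion detected")       # e.g. the bot may be stuck
    else:
        print("motion detected")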

Mike
slow_one
Posts: 66
Joined: Fri Jun 02, 2017 11:25 am

Re: Real Time Streaming and Speed

Postby slow_one » Mon Jun 05, 2017 5:00 pm

Can you explain the "streaming" part of this?

I think I understand how to do the machine vision side of things... at least I've done it using Python and a webcam.

But I don't quite "get" the streaming idea. I'm assuming we'd have to push the data over a serial port and listen to that port... somehow.
Can you point me in the right direction?
kwagyeman
Posts: 1033
Joined: Sun May 24, 2015 2:10 pm

Re: Real Time Streaming and Speed

Postby kwagyeman » Mon Jun 05, 2017 10:40 pm

The camera is sending out its detection result over a serial port each frame. Thus a stream of serial data is coming out of the camera.
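
Concretely, on the camera side that can be as simple as printing one line per frame; when the camera is not attached to the IDE, print() output goes out over the USB VCP. The message format here is just an illustration:

Code: Select all

import sensor, time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
clock = time.clock()

while(True):
    clock.tick()
    img = sensor.snapshot()
    detected = False   # stand-in: replace with a real check, e.g. on find_blobs() results
    # One line of serial output per frame is the "stream" of detection results.
    print("%d,%.1f" % (1 if detected else 0, clock.fps()))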
Nyamekye,
slow_one
Posts: 66
Joined: Fri Jun 02, 2017 11:25 am

Re: Real Time Streaming and Speed

Postby slow_one » Tue Jun 06, 2017 3:40 pm

kwagyeman wrote:
Mon Jun 05, 2017 10:40 pm
The camera is sending out its detection result over a serial port each frame. Thus a stream of serial data is coming out of the camera.
I get that part. I'm asking how to read that stream of serial data, or for a pointer toward something that describes how to do that.
kwagyeman
Posts: 1033
Joined: Sun May 24, 2015 2:10 pm

Re: Real Time Streaming and Speed

Postby kwagyeman » Tue Jun 06, 2017 9:20 pm

Look at the Pixy serial emulation script in the examples folder. It shows how to do this with a state machine that reads a byte at a time.
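
On the receiving side, the same byte-at-a-time state machine idea sketched with pyserial and a made-up two-byte packet (a 0xAA sync byte followed by a status byte). The real Pixy emulation protocol is more involved, so treat the framing here as illustrative only:

Code: Select all

import serial

SYNC = 0xAA                     # hypothetical sync byte marking the start of a packet

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

state = "WAIT_SYNC"
while True:
    b = ser.read(1)             # read one byte at a time
    if not b:
        continue                # timeout, nothing received yet
    if state == "WAIT_SYNC":
        if b[0] == SYNC:
            state = "READ_STATUS"
    elif state == "READ_STATUS":
        print("motion" if b[0] == 1 else "no motion")
        state = "WAIT_SYNC"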
Nyamekye,
