Image Stream

Hey guys,

I'm currently working on streaming images from the OpenMV cam to a PC using LabVIEW. What I'm trying to achieve is importing the snapshots from the cam into LabVIEW as PNG images. I noticed that the Image class does not support PNG, so I am transferring a JPEG stream instead. I am able to receive that stream in LabVIEW and write it to a jpg file, and that works perfectly. But I have two questions:

  1. I'm currently using this code on the OpenMV module:
import time, pyb, sensor

# Init (elided in my original post; minimal setup shown so the snippet runs as-is):
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
usb = pyb.USB_VCP()

while usb.isconnected():
    img = sensor.snapshot()
    img.save("test.jpg")                       # write the frame to flash as JPEG...
    with open("test.jpg", mode='rb') as file:  # ...just to read the bytes back
        fileContent = file.read()
        usb.send(fileContent)
    time.sleep_ms(200)                         # 200 ms; time.sleep(200) would wait 200 s

So as you can see, I'm creating a jpg file on the OpenMV module just to read it back out and stream the data to the PC. Is there a solution to convert the data stored in img to JPEG (or even PNG) without writing a jpg file?

  2. Does anybody know the algorithm to convert JPEG data to PNG? I mean the binary data. Otherwise I have to receive the JPEG stream on the PC, create a jpg file and then convert that file to PNG… (a sketch of what I have in mind follows at the end of this post)

The best case scenario would be to get a PNG stream into my LabVIEW application without creating any file: just take a snapshot, convert the data to PNG and transfer the PNG data to the PC. Or take a snapshot, convert the data to JPEG, transfer the JPEG data to the PC and convert it to PNG there.
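If the conversion happens on the PC, I imagine it could at least stay in memory. A rough sketch of what I have in mind, using Python's Pillow library (whether I can call something equivalent from LabVIEW is another question):

import io
from PIL import Image

def jpeg_to_png(jpeg_bytes):
    # Decode the JPEG stream and re-encode it as PNG, entirely in memory.
    img = Image.open(io.BytesIO(jpeg_bytes))
    out = io.BytesIO()
    img.save(out, format="PNG")
    return out.getvalue()

As far as I understand there is no direct byte-level translation between the two formats anyway: JPEG stores lossy DCT coefficients while PNG stores losslessly compressed pixels, so a full decode and re-encode seems unavoidable.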

Thank you in advance!

Alright, so I found a great DLL to convert the JPEG stream into an image in LabVIEW. Awesome. Taking a snapshot, sending it to the PC and showing it on my GUI takes about 25 ms now. That would be fast enough for my application. But I'm still curious whether there's a way to generate that JPEG string without creating a jpg file in the OpenMV memory.

Hi, you can get JPEG images via the USB debug protocol. Would this work for you? It's a lot of C code to use it, though… Not sure LabVIEW can handle that.

Thanks for your answer. I have no idea what you mean by "USB debug protocol" and was not able to find it searching the web.
And you're saying C and LabVIEW: does that mean I would send the sensor.snapshot() data to the PC and convert it to JPEG there?
The best solution would be to convert the object sensor.snapshot() returns to a JPEG string on the OpenMV controller. Isn't that possible without writing a jpg file and reading it back as binary to get that string?
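Something like this is what I'm hoping for on the camera side; a minimal sketch, assuming the firmware offers an in-memory Image.compress() method (I'm guessing at that name):

import sensor, pyb, ustruct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
usb = pyb.USB_VCP()

while usb.isconnected():
    # JPEG-encode the frame buffer in place; no file on flash involved.
    img = sensor.snapshot().compress(quality=90)
    usb.send(ustruct.pack("<L", img.size()))  # length prefix so the PC knows how many bytes follow
    usb.send(img)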

The USB debug protocol is what OpenMV IDE uses to pull images off the OpenMV Cam at near the FPS the camera is running at. It’s not documented except in code.

We have two files which you can use to pull the JPEG frame buffer. There's an openmv.py script in the usr directory on our GitHub that implements the protocol and is still valid. Please see that script and make frame buffer calls like it does in LabVIEW, using a serial port to pull the JPEG-compressed frame buffer at near native speed.
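Roughly, the two calls look like this in Python with pyserial (opcode values quoted from memory out of usbdbg.h, so double-check them against openmv.py):

import struct, serial

USBDBG_CMD        = 0x30  # command marker byte (verify against usbdbg.h)
USBDBG_FRAME_SIZE = 0x81  # "report frame header" opcode (verify)
USBDBG_FRAME_DUMP = 0x82  # "dump compressed frame" opcode (verify)
FB_HDR_SIZE       = 12    # header is three little-endian uint32s

port = serial.Serial("COM3", baudrate=921600, timeout=1.0)  # port name is just an example

# 1) Request the frame header: width, height, and the JPEG size in bytes.
port.write(struct.pack("<BBI", USBDBG_CMD, USBDBG_FRAME_SIZE, FB_HDR_SIZE))
width, height, size = struct.unpack("<III", port.read(FB_HDR_SIZE))

# 2) Request the JPEG-compressed frame buffer itself.
port.write(struct.pack("<BBI", USBDBG_CMD, USBDBG_FRAME_DUMP, size))
jpeg = port.read(size)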

Aaaah, I tried to find out how the IDE gets the image data using a USB sniffer, but couldn't manage it.
Thank you so much for the suggestion. That sounds really great, and like a much better solution than my ugly workaround. I'll definitely check it out tomorrow.

All right.

  1. Using an equivalent of the fb_size() function, I get an array [320, 240, ~3000]. Is ~3000 a plausible value for the frame buffer size? With my working alternative, the jpg files are about 4000 bytes.

  2. When I use a function that replicates fb_dump() from the openmv.py script, I get binary data (see attached screenshot) which cannot be interpreted as JPEG. In the original fb_dump() function there's one line that decodes the buffer data:
    buff = np.asarray(Image.frombuffer("RGB", size[0:2], buff, "jpeg", "RGB", ""))
    But I don't think I can recreate that line as a LabVIEW function, so the protocol does not seem to work for me, unless I call a Python script from LabVIEW that contains that line, and I guess that would slow down the application too much. Thank you for your help anyway!

Mmm, well, look at the source of OpenMV IDE and you'll be able to see the protocol in action in C. There's a lot of code, though…

Thanks, but due to a deadline I'll stick with my working code. I can get my head around a smarter alternative if there's time at the end of the project.

Hi, the Python script is outdated. The camera only sends JPEG images now, which is what you need, right?

Update:

Although the script is outdated, I think it still works; it just assumes there are other formats the camera could send.

This line decodes JPEG to RGB888:

buff = np.asarray(Image.frombuffer("RGB", size[0:2], buff, "jpeg", "RGB", ""))
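That frombuffer call relies on an old PIL decoder path; with a modern Pillow the same decode step is usually written something like this:

import io
import numpy as np
from PIL import Image

# Decode the JPEG byte string straight into an RGB888 numpy array.
buff = np.asarray(Image.open(io.BytesIO(buff)).convert("RGB"))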

I don't really need RGB888. I just need the JPEG data, so sending the __USBDBG_FRAME_DUMP command should get me that data, right? If you take a look at the screenshot I attached in an earlier reply, you can see the data I get back when sending that command. But that data is not in JPEG format: when I log that binary string to a jpg file, I cannot open it.

What script is running?

The hello world script. Of course it's running as main.py on the camera. As mentioned, the commands seem to work: the __USBDBG_FRAME_SIZE command returns data that looks legit, but the binary data returned by the __USBDBG_FRAME_DUMP command does not.

This is how it looks in LabVIEW. Pretty easy to read, I guess.

Here's the actual code OpenMV IDE uses:

The IO layer is the transaction layer and the serial port layer is another thread which executes commands and returns the response.

Note that the IDE does… a lot more… to be compatible on a lot of systems. You don’t need to handle quite all the stuff it does.

Um, so, the protocol that the openmv.py script uses hasn't changed. So I'm wondering if there's some byte-scrambling problem on your end.

Um, can you verify that the first command, the one that gets the frame size, works correctly? As in, does it return a sane frame size?

The first command returns 12 bytes. You interpret these 12 bytes as three integers of 4 bytes each:

__serial.write(struct.pack("<BBI", __USBDBG_CMD, __USBDBG_FRAME_SIZE, __FB_HDR_SIZE))
struct.unpack("III", __serial.read(12))

I do the same and get the integers 320, 240 and something around 3000. 320x240 is the size of the image, so that third integer should be the frame size. Now you take that third integer and use it in fb_dump() like this:

num_bytes = size[2]
__serial.write(struct.pack("<BBI", __USBDBG_CMD, __USBDBG_FRAME_DUMP, num_bytes))
buff = __serial.read(num_bytes)

That's what I am doing, too. 'buff' in your script corresponds to 'frame buf read' in my application. But that is not JPEG data: it contains the same line over and over again.

Can LabVIEW handle the data rate? It's 12 MB/s. We've seen issues with this kind of thing before; older Macs require the frame buffer to be dumped in MTU-sized chunks.
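If chunking is the issue, a first thing to try on the PC side is draining the reply in smaller pieces rather than one big read; continuing the earlier sketch (the 4 KB chunk size is a guess, and the true MTU-chunked dump the IDE does issues several smaller dump requests instead, see its source):

# Issue the dump command once, then drain the reply in small chunks so the
# OS serial buffer never overflows while the reader is busy.
port.write(struct.pack("<BBI", USBDBG_CMD, USBDBG_FRAME_DUMP, size))
buff = bytearray()
while len(buff) < size:
    chunk = port.read(min(4096, size - len(buff)))
    if not chunk:
        break  # timeout: give up instead of spinning forever
    buff.extend(chunk)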

That is not a LabVIEW issue. I also looked at the data using a USB sniffer, and it looks the same…