I'm currently working on streaming images from an OpenMV module to a PC using LabVIEW. I'm trying to import the snapshots from the cam into LabVIEW as png images. I noticed that the Image Class does not support png, so I am transferring a jpg stream instead. I am able to receive that stream in LabVIEW and write it to a jpg file. That's working perfectly. But I have two questions:
I'm currently using this code on the OpenMV module:
import time, pyb, sensor

# some init code (usb is presumably pyb.USB_VCP())

while usb.isconnected():
    img = sensor.snapshot()
    img.save("test.jpg")
    with open("test.jpg", mode='rb') as file:
        fileContent = file.read()
    usb.send(data=fileContent)
    pyb.delay(200)  # note: time.sleep(200) would pause 200 s; pyb.delay() takes ms
So as you can see, I'm creating a jpg file on the OpenMV module just to read it back and stream the data to the PC. Is there a way to convert the data stored in img to jpg (or even png) without writing a jpg file?
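For what it's worth, the OpenMV image API does offer in-memory JPEG compression via compress(), so the file round-trip can probably be skipped. A sketch only, assuming a pyb.USB_VCP() init (standing in for the elided "#some init code") and the compress()/size() methods from the OpenMV image docs:

```python
import sensor, pyb, ustruct

# Sketch -- assumes OpenMV firmware whose image API offers compress(),
# which JPEG-encodes the frame buffer in RAM, no file involved.
usb = pyb.USB_VCP()   # assumed init, standing in for "#some init code"

while usb.isconnected():
    img = sensor.snapshot().compress(quality=90)  # JPEG bytes in RAM
    usb.send(ustruct.pack("<L", img.size()))      # length prefix so the PC
    usb.send(img)                                 # knows how many bytes follow
    pyb.delay(200)                                # ~200 ms between frames
```

The length prefix is an assumption on my part; the PC side then knows exactly how many bytes to read per frame instead of guessing where one jpg ends and the next begins.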
Does anybody know an algorithm to convert jpg data to png directly? I mean the binary data. Otherwise I'd receive the jpg stream on the PC, create a jpg file and have to convert that file to png…
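As far as I know there is no direct binary transcoding between the two formats: jpg and png are entirely different compression schemes, so the jpg stream has to be decoded to raw pixels first, and those pixels then png-encoded. The png-encoding half is simple enough to do from scratch; a minimal sketch in Python (stdlib only, 8-bit RGB, no filtering or interlacing):

```python
import struct, zlib

def encode_png(width, height, rgb_bytes):
    """Encode raw RGB888 pixel data as a minimal PNG (stdlib only)."""
    def chunk(tag, data):
        # Each PNG chunk: big-endian length, 4-byte tag, data, CRC over tag+data.
        return (struct.pack(">I", len(data)) + tag + data
                + struct.pack(">I", zlib.crc32(tag + data) & 0xFFFFFFFF))
    # Every scanline is prefixed with filter type 0 (None).
    raw = b"".join(
        b"\x00" + rgb_bytes[y * width * 3:(y + 1) * width * 3]
        for y in range(height)
    )
    # IHDR: width, height, bit depth 8, color type 2 (truecolor), rest zero.
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)
    return (b"\x89PNG\r\n\x1a\n"
            + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raw))
            + chunk(b"IEND", b""))
```

The jpg-decoding half is the hard part and is best left to a library (or, on the LabVIEW side, a DLL) rather than written by hand.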
The best-case scenario would be to get a png stream into my LabVIEW application without creating any file: just take a snapshot, convert the data to png and transfer the png data to the PC. Or take a snapshot, convert the data to jpg, transfer the jpg data to the PC and convert it to png there.
Alright, so I found a great DLL to convert the JPEG stream into an image in LabVIEW. Awesome. Taking a snapshot, sending it to the PC and showing it on my GUI takes about 25 ms now. That would be fast enough for my application. But I'm still curious whether there's a way to generate that JPEG string without creating a jpg file in the OpenMV memory.
Thanks for your answer. I have no idea what you mean by the USB debug protocol and was not able to find it searching the web.
And you're saying C and LabVIEW - does that mean I would send the sensor.snapshot() data to the PC and convert it to jpg there?
The best solution would be to convert the object sensor.snapshot() returns to a jpg string on the OpenMV controller. Isn't that possible without writing a jpg file and reading it back as binary to get that string?
The USB debug protocol is what OpenMV IDE uses to pull images off the OpenMV Cam at near the FPS the camera is running at. It’s not documented except in code.
We have two files which you can use to pull the jpeg frame buffer. There's an openmv.py script in the usr directory on our GitHub that implements the protocol and is still valid. Please see that script and make the same frame buffer calls from LabVIEW over a serial port to pull the jpeg-compressed frame buffer at near native speed.
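For reference, the request framing used by that script is just a struct-packed prefix/command/length triple, and the frame-size reply is three little-endian uint32s. A rough Python sketch of the PC side; the constant values below are placeholders, copy the real ones from openmv.py:

```python
import struct

# Placeholder values -- copy the real constants from openmv.py.
USBDBG_CMD = 0x30
USBDBG_FRAME_SIZE = 0x81

def make_request(cmd, xfer_len):
    # <prefix byte, command byte, little-endian uint32 transfer length>
    return struct.pack("<BBI", USBDBG_CMD, cmd, xfer_len)

def parse_frame_size(reply):
    # fb_size() reply: width, height, frame byte count (3 x little-endian uint32)
    return struct.unpack("<III", reply)
```

Reproducing the same byte layouts in LabVIEW (a flatten-to-string with little-endian byte order) should be enough to talk the protocol without Python in the loop.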
Aaaah, I tried to find out how the IDE gets the image data using a USB sniffer, but couldn't manage it.
Thank you so much for the suggestion. That sounds really great and like a much better solution than my ugly workaround. I'll definitely check that out tomorrow.
Using an equivalent of the fb_size() function results in an array [320, 240, ~3000]. Is ~3000 a plausible value for the frame buffer size? With my working alternative the jpg files are about 4000 bytes.
When I use a function that replicates the fb_dump() of the openmv.py script I get binary data (see attached screenshot) which cannot be interpreted as jpg. In the original fb_dump() function there's one line of code to decode the buffer data:
buff = np.asarray(Image.frombuffer("RGB", size[0:2], buff, "jpeg", "RGB", ""))
But I don't think I can create a LabVIEW function to replace that line. So that protocol does not seem to work for me, unless I call a Python script from LabVIEW that contains that line. But I guess that would slow down the application too much. Thank you for your help anyway!
I don't really need RGB8888. I just need the jpg data. So sending the __USBDBG_FRAME_DUMP command should get me that data, right? If you take a look at the screenshot I attached in an earlier reply, you'll see the data I get back when sending that command. But that data is not in jpg format. When I save that binary string as a jpg file I cannot open it.
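A quick sanity check on the received buffer, before writing it to disk, is to look for the standard JPEG start and end markers. A tiny Python sketch (the helper name is made up; the marker values are standard JPEG):

```python
def looks_like_jpeg(buf):
    # A JPEG stream starts with the SOI marker FF D8 and ends with EOI FF D9.
    return len(buf) >= 4 and buf[:2] == b"\xff\xd8" and buf[-2:] == b"\xff\xd9"
```

If the first two bytes aren't FF D8, the buffer likely still carries a protocol header, or the read started mid-frame.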
The hello world script. Of course it's running as main.py on the camera. As mentioned, the commands seem to work. The __USBDBG_FRAME_SIZE command returns data that looks legit, but the binary data returned by the __USBDBG_FRAME_DUMP command does not.
I do the same and get the integers 320, 240 and something around 3000. 320x240 is the size of the image. The third integer should be the frame size. Now you take that third integer and use it in fb_dump() like this:
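As far as I can tell, the fb_dump() flow in openmv.py boils down to: request the dump with the byte count from fb_size(), then read exactly that many bytes back. A rough Python sketch; the command constants are placeholders (take the real values from openmv.py) and the port is whatever serial object your stack provides:

```python
import struct

USBDBG_CMD = 0x30          # placeholder -- real values are in openmv.py
USBDBG_FRAME_DUMP = 0x82   # placeholder

def fb_dump(port, width, height, nbytes):
    # Request the frame buffer dump, then read nbytes back; the returned
    # buffer should be the jpeg-compressed frame.
    port.write(struct.pack("<BBI", USBDBG_CMD, USBDBG_FRAME_DUMP, nbytes))
    return width, height, port.read(nbytes)
```

The returned buffer should then start with the jpeg SOI marker FF D8; if it doesn't, the transfer is off by a header or the read started mid-frame.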
That's what I am doing, too. 'buff' in your script equals 'frame buf read' in my application. But that is not jpg data. It contains the same line over and over again.
Can labview handle the data rate? It’s at 12 MBps. We’ve seen issues with this kinda thing before. Older Macs require the frame buffer to be dumped in MTU sized chunks.
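If the transport does choke on large transfers, reading the buffer in MTU-sized chunks on the PC side is straightforward. A generic sketch (the chunk size is a guess; the port is whatever your serial layer provides):

```python
def read_exact(port, total, chunk_size=64):
    # Pull `total` bytes in chunk_size pieces so no single transfer exceeds
    # what the host USB stack tolerates (MTU-sized chunks).
    data = bytearray()
    while len(data) < total:
        part = port.read(min(chunk_size, total - len(data)))
        if not part:
            raise IOError("short read: %d of %d bytes" % (len(data), total))
        data.extend(part)
    return bytes(data)
```

The same loop translates directly into a LabVIEW while loop around a VISA Read with a fixed byte count.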