Image Stream

Can you attach the data you get?

Hi, what I mean by MTU-sized chunks is that on older Macs we’ve seen the OS drop bytes. Literally, between receiving a USB packet and handing it to the user application, the OS drops bytes because it doesn’t expect that much data to go through a serial port at USB speeds.

Oh, okay, so you see the same data with a USB sniffer… Um, question, are all the bytes between 0x80 and 0xBF?

Even if the driver is dropping bytes, the data won’t be the same. I’m interested in a hex dump; maybe it will give us a clue about what the cam is sending.

Update:
Or maybe you’re getting the header (which should be the same). Anyway, a hex dump will help a lot.

My guess is that I’ve sent the wrong commands. The constant 12 that I send with the USB_FRAME_Size command is a string, meaning the two characters ‘1’ and ‘2’ are sent and not a 4-byte integer. I do the same with the USB_FRAME_DUMB command: I just cast the frame size to a string, which results in sending a string like “3459” instead of the integer 3459. I will change that in my code on Monday, try again, and forward the results to you guys.
Thanks for the help so far. I really appreciate it.
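The fix described above can be sketched like this; the byte order (little-endian) is my assumption, not something the thread specifies:

```python
import struct

frame_size = 3459

# Wrong: casting to a string sends the ASCII characters '3', '4', '5', '9'
# (and a constant like 12 would only be the two bytes '1' and '2').
as_string = str(frame_size).encode()

# Right: pack the value as a fixed-width 4-byte integer.
# "<I" = little-endian unsigned 32-bit; the byte order is an assumption here.
as_int = struct.pack("<I", frame_size)

print(as_string)  # b'3459'
print(as_int)     # b'\x83\r\x00\x00'  (3459 == 0x00000D83)
```

On the receiving side, `struct.unpack("<I", as_int)[0]` recovers the integer 3459.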

All right. I’ll attach some files with the binary data I got back from the cam. I tried a couple of different settings; that’s why the content differs. I noticed one thing: when I use a baud rate of 115200, I often get back the FPS instead of the frame buffer. Or let’s say I believe it’s the FPS (values of about 28.xxx).

There was an error uploading the files. Here they are.
frame buf 5.txt (2.51 KB)
frame buf 6.txt (2.75 KB)
frame buf 3.txt (696 Bytes)
frame buf 4.txt (2.76 KB)
frame buf 2.txt (1.77 KB)
frame buf.txt (1.77 KB)

Guys, what about sending the data sensor.snapshot() returns to the PC? Is there a way to convert that data to a JPG or PNG or whatever? For example, by using the PIL module Python provides. I can start Python scripts from LabVIEW.

Yeah, you can do that directly. Um, just do:

img = sensor.snapshot()
print(img.compressed_for_ide())

This will come out of whatever serial port the thing is attached to. Here’s how to decode the data stream:

Alternatively, for a raw JPEG file, create a serial USB VCP object (I think you did this before) and just write:

serial_vcp.write(sensor.snapshot().compressed())

The compressed() method compresses the image and returns a JPEG version of it. The print() statement won’t print the data normally unless it’s been compressed for the IDE, which will send the compressed image and then update the frame buffer. To just send raw compressed JPEG data, you have to send it out through the UART/VCP/SPI with write().
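On the PC side, the raw JPEG bytes written by `serial_vcp.write(...)` can be decoded with PIL, as asked earlier. A minimal sketch: the port name is an assumption, and the scan for the JPEG start/end markers is a simple heuristic, not robust framing:

```python
import io

def extract_jpeg(buf):
    """Return the first complete JPEG in buf, or None if not yet complete."""
    soi = buf.find(b"\xff\xd8")               # JPEG start-of-image marker
    eoi = buf.find(b"\xff\xd9", soi + 2)      # JPEG end-of-image marker
    if soi >= 0 and eoi >= 0:
        return bytes(buf[soi:eoi + 2])
    return None

def grab_frame(port_name="/dev/ttyACM0"):
    # Not called here -- needs pyserial, PIL, and a connected camera.
    # The port name is an assumption; adjust for your system.
    import serial
    from PIL import Image
    buf = bytearray()
    with serial.Serial(port_name, timeout=1) as port:
        while True:
            buf += port.read(4096)
            jpeg = extract_jpeg(buf)
            if jpeg:
                break
    Image.open(io.BytesIO(jpeg)).save("frame.png")  # PNG for LabVIEW, etc.
```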

Remember the camera supports REPL. You can command it via a script from the PC if you want. MicroPython probably has a way to send the camera a file to execute… look around in their repo… and get the serial responses.
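For reference, MicroPython’s “raw REPL” is one such mechanism: sending Ctrl-A, then the script text, then Ctrl-D makes the board compile and run the script. A sketch under that assumption (the port name is hypothetical):

```python
def raw_repl_frame(script):
    """Build the byte sequence that runs `script` via MicroPython's raw REPL."""
    return b"\x01" + script.encode() + b"\x04"   # Ctrl-A ... script ... Ctrl-D

def run_on_camera(script, port_name="/dev/ttyACM0"):
    # Not called here -- needs pyserial and a connected board.
    import serial
    with serial.Serial(port_name, timeout=2) as port:
        port.write(b"\x03")                  # Ctrl-C: stop any running script
        port.write(raw_repl_frame(script))
        return port.read(4096)               # collect the serial responses
```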

Thank you so much. Jesus, that ‘.compressed()’ is the thing I was looking for the whole time.

Always remember to search the docs. :slight_smile:

Haha, you’re absolutely right. I did, and I saw the function, but somehow I misunderstood its functionality. If I had just tried it. But no worries. I’m just glad that everything works fine now. Even though the image stream is slower than I expected: I get an image every 80 ms. In my opinion that’s way too slow, isn’t it?

...                              # init stuff
while True:
    try:
        while not usb.isconnected():
            streamFlg = False
            time.sleep_ms(100)   # was time.sleep(100): that sleeps 100 seconds
        while streamFlg and usb.isconnected():
            usb.send(sensor.snapshot().compressed())
            usb.send(10)         # termination character (a single newline byte)
            usb.send("END")
            usb.send(10)
    except KeyboardInterrupt:
        streamFlg = not streamFlg

So… We use a really low compression quality for the GUI. Try lowering the compression quality, and if you can, please log where the time is spent. I have a feeling it’s spending more time compressing. compressed() has a quality argument; lower it.
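A sketch of that timing experiment for the camera-side script (MicroPython on the cam, so it can’t run on a desktop; quality=50 is just an example value to try lowering):

```python
def timed_compressed_frame(quality=50):
    # Runs on the OpenMV cam under MicroPython; sensor/ticks_ms are device APIs.
    import sensor, time
    t0 = time.ticks_ms()
    img = sensor.snapshot()                  # capture
    t1 = time.ticks_ms()
    jpeg = img.compressed(quality=quality)   # lower quality -> faster, smaller
    t2 = time.ticks_ms()
    print("snapshot:", time.ticks_diff(t1, t0), "ms,",
          "compress:", time.ticks_diff(t2, t1), "ms")
    return jpeg
```

Logging the two deltas separately shows whether the 80 ms goes to capture or to compression.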

Here’s a hex dump of “frame buf.txt”. It looks like you’re getting the JPEG header over and over again.

00000000  20 20 20 20 f0 20 20 20  16 07 20 20 6a 70 65 ea  |    .   ..  jpe.|
00000010  24 1d 99 5d b3 6d f0 39  aa 7c 82 08 2d 01 c7 7f  |$..].m.9.|..-...|
00000020  8e 77 bd c5 5e d1 6f 62  ff 09 5b d9 08 bd bc c4  |.w..^.ob..[.....|
00000030  32 20 61 e0 21 71 ac 0d  a6 83 76 95 bd 14 9d e6  |2 a.!q....v.....|

00000040  20 20 20 20 f0 20 20 20  16 07 20 20 6a 70 65 ea  |    .   ..  jpe.|
00000050  24 1d 99 5d b3 6d f0 39  aa 7c 82 08 2d 01 c7 7f  |$..].m.9.|..-...|
00000060  8e 77 bd c5 5e d1 6f 62  ff 09 5b d9 08 bd bc c4  |.w..^.ob..[.....|
00000070  32 20 61 e0 21 71 ac 0d  a6 83 76 95 bd 14 9d e6  |2 a.!q....v.....|

00000080  20 20 20 20 f0 20 20 20  16 07 20 20 6a 70 65 ea  |    .   ..  jpe.|
00000090  24 1d 99 5d b3 6d f0 39  aa 7c 82 08 2d 01 c7 7f  |$..].m.9.|..-...|
000000a0  8e 77 bd c5 5e d1 6f 62  ff 09 5b d9 08 bd bc c4  |.w..^.ob..[.....|
000000b0  32 20 61 e0 21 71 ac 0d  a6 83 76 95 bd 14 9d e6  |2 a.!q....v.....|
000000c0  20 20 20 20 f0 20 20 20                           |    .   |

Thank you very much. Do you have any idea why this is happening? Sorry for the late reply. I’ve been working on something else for quite a while now.

I’m sorry, I don’t. We’ve demonstrated that our code works well for this. I don’t know how NI’s stuff operates. Is it possible for LabVIEW to call a system process that invokes our Python script to save a file with the image data?

All right. Then I will stick to my current version. Thank you both for the kind support :smiley:

When time allows, I will create some Python scripts for interfacing with the OpenMV Cam using the OpenMV IDE’s robust comm stack. This should allow others to get stuff working nicely.

Hi, I think you’re not reading the whole frame, just the first packet. Note that in the hex dump you’re reading exactly 64-byte packets (the max size of the USB FS endpoint). I’m not sure how your code works, but you should request the whole frame size from the cam; then the cam will keep sending 64-byte packets plus a remainder/last packet of 64 bytes or fewer.
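The reassembly described above might look like this on the PC side. The step that requests the frame size is omitted because the actual command protocol isn’t shown in the thread; `port` stands for any object with a pyserial-style read(n):

```python
def packets_needed(frame_size, packet_size=64):
    """Number of full/partial packets a frame of frame_size bytes spans."""
    return (frame_size + packet_size - 1) // packet_size

def read_frame(port, frame_size, packet_size=64):
    """Keep reading packet-sized chunks until the whole frame has arrived."""
    buf = bytearray()
    while len(buf) < frame_size:
        chunk = port.read(min(packet_size, frame_size - len(buf)))
        if not chunk:
            raise TimeoutError("stream stalled at %d/%d bytes"
                               % (len(buf), frame_size))
        buf += chunk
    return bytes(buf)
```

For a 150-byte frame this reads two full 64-byte packets plus a 22-byte remainder, instead of stopping after the first packet.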