WiFi shield transfer

Thank you for your support

Following your advice, I ran the two files on Ubuntu Linux and was able to display the camera image.

I will stop using the serial stream, since it turned out that Pygame does not work with C#.

So could you please give me advice on how to transfer data using the WiFi shield instead?

Hi, see the mjpeg_streamer.py example.

I have been playing with this. The pygame tool is better in that you can save frames, but it's not as straightforward with the mjpeg_streamer.py tool; you can only see the stream.

If you run the OpenMV like a standard UVC webcam, then you can use mjpg-streamer (MJPG-streamer download | SourceForge.net), which allows both streaming and snapping frames for download. I came across a video (UVC Webcam Support for your OpenMV Cam - YouTube) showing how to set up UVC on the OpenMV, but I couldn't get it to work and couldn't find any more documentation.

UVC comes standard with JeVois and with a Raspberry Pi camera, so it would be great to figure this out with the OpenMV. But when you run it this way you are not harnessing its image processing tools, and it just becomes an expensive webcam. Ideally you would have a convenient way to download pictures and image processing data, but that is what I find perplexing about this tool, since it's pretty cryptic to use. You can also do image processing offline with a standard webcam using tools like Python/OpenCV.

I just replied to you in your other thread: you can enable UVC by uploading uvc.bin. You'll find it here (until it's released with the IDE):

https://github.com/openmv/openmv/tree/master/firmware/OPENMV3

When using UVC we assume you have a powerful host to do the image processing (i.e. an RPi); it doesn't make sense to do any processing on the OpenMV then.

This feature is implemented specifically to allow folks to use their expensive FLIRs with other boards; otherwise it's kinda pointless, since you could just use any webcam.


EDIT: To revert to the default firmware, just upload firmware.bin; however, you need to connect the cam after clicking on Run Bootloader->Run.

Thank you for your support

Regarding mjpeg_streamer.py: it works with the OpenMV IDE for Windows, but when I tried to run it on Linux I got the error "ImportError: no module named 'usocket'".

I do not know how to install modules, so please advise.

Thank you

The modules are built-in (inside the firmware), so it shouldn't matter whether you run the script from Windows or Linux. Can you post a screenshot of the error?

Thank you for your support

I have pasted the screenshot below.

It is Linux Ubuntu.

Thank you


[ubuntu.png]

This module and the scripts only run on the OpenMV camera. Do you have an OpenMV camera? If yes, please download the IDE from here:

And start with the example scripts.

Thank you for your support

mjpeg_streamer.py worked in the OpenMV IDE (on Linux and Windows). WiFi works, but there is a problem: the image cannot be displayed.

In addition, I tried displaying the image with CPython on Linux (there WiFi does not work); the error "ImportError: no module named 'usocket'" occurred.

To solve both problems, please tell me how to build my own application using pygame on Windows or Linux.

As a product image, I want a WiFi-equipped camera several meters away, with an application that displays its images on a PC.

Thank you

Hi, our scripts don't run on CPython on Linux. MicroPython, which runs on our board, parses and compiles Python code; however, this doesn't mean the modules/libraries etc. are the same.

Anyway, is your goal to use the OpenMV Cam as a webcam? This is not its design purpose. We built it to process images onboard and not really send them anywhere. While we have some examples showing this off, and while it is possible, the performance for this type of application isn't really great. Anyway, if you really need to send image data, you won't want to use WiFi. Pretty much all the microcontroller WiFi solutions don't have the buffers onboard to handle image data and offer poor results. If you want to send image data, the OpenMV Cam's VCP USB interface will offer the best results.
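For reference, a camera-side sketch along the lines of the stock usb_vcp example (hedged: the b"snap" command and the 4-byte length header are just that example's convention) would be:

```python
# Camera side: wait for a 'snap' command on the USB VCP port,
# then send one JPEG frame prefixed with its length.
import sensor, ustruct, pyb

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

usb = pyb.USB_VCP()
while True:
    cmd = usb.recv(4, timeout=5000)
    if cmd == b"snap":
        img = sensor.snapshot().compress()        # JPEG-compress in place
        usb.send(ustruct.pack("<L", img.size()))  # 4-byte little-endian size
        usb.send(img)                             # then the JPEG bytes
```

On the PC side, a pyserial script would write b"snap", read the 4-byte size, then read that many JPEG bytes and save or display them.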

Thank you for your support

As a way to use this product, I thought about transferring data written to the SD card over WiFi.

  1. I was able to run Snapshot.py and write data to the SD card.

Please tell me how to read the SD card data and transfer it over WiFi.

As a product image, I want a WiFi-equipped camera several meters away, with an application that displays its images on a PC.

Thank you

Hi, you can open the file like you would any file in Python, and then use the WiFi shield to open a socket and transfer the image over TCP. This is more or less straightforward Python code.
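Something along these lines (untested; the SSID/key, host IP, port, and file name are all placeholders) would be the idea:

```python
# Hedged sketch: connect the WiFi shield, read a saved image from the
# SD card, and send it to a PC over TCP.
import network, usocket

SSID = "your-ssid"      # placeholder
KEY = "your-key"        # placeholder
HOST = "192.168.1.100"  # placeholder: IP of the PC listening for the image
PORT = 8080             # placeholder

wlan = network.WINC()   # WiFi shield driver
wlan.connect(SSID, key=KEY, security=wlan.WPA_PSK)

with open("/example.jpg", "rb") as f:  # file previously saved to the SD card
    data = f.read()

client = usocket.socket(usocket.AF_INET, usocket.SOCK_STREAM)
client.connect((HOST, PORT))
client.send(data)   # push the raw JPEG bytes
client.close()      # the PC side writes what it received to a .jpg file
```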

Um, I can't actually write the code to do all of this; I don't really have the time anymore, and there are several steps. One big issue you'll find is that the latency for this type of thing will be very high; the OpenMV Cam is not a WiFi camera. Is your goal to live-stream images? If so, we have an MJPEG example, but that's about the best you're going to get.

What are your high level goals?

Thank you for your support

My overall goal is the following:

[video]

It is an HO gauge version.

Thank you

Hmm, so, do you want video quality as high as that camera in the video? Our product really isn't designed around streaming frames. We can JPEG-compress stills and send them to the WiFi shield over SPI, but the WiFi shield's internal MCU can't really buffer large data packets, which makes it slow at streaming live video. It's really just meant for MQTT-like data transfer.

Thank you for your support

I think the images do not have to be high quality.

As for the image transfer speed, ideally I think about 10-20 fps would be no problem.

Whether or not the customer agrees with that goal is another matter.

Thank you

Okay, um, so, WiFi is definitely the fastest way to move data. However, you're going to want to send UDP packets, because TCP causes a lot of issues.

I don't really have a template for how to do this data transfer; however, luckily, we have some infrastructure set up for you. So, the first thing to do would be to get UDP packets sent from your camera to an application. You may use our WiFi shield with Python UDP sockets, or you can use an ESP32. Whatever the case, we've got all the code in place for JPEG-compressing images fast and giving you that byte stream over serial or SPI.
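As a rough starting point (untested; the addresses are placeholders, and reassembling the chunks into frames on the PC side is up to you), the camera loop could look like:

```python
# Hedged sketch: JPEG-compress frames and push them to a PC as UDP
# datagrams. Assumes the WiFi shield is connected as in the TCP sketch above.
import sensor, usocket

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

HOST = "192.168.1.100"  # placeholder: IP of the PC application
PORT = 5005             # placeholder
CHUNK = 1024            # keep each datagram well under the typical MTU

udp = usocket.socket(usocket.AF_INET, usocket.SOCK_DGRAM)

while True:
    img = sensor.snapshot().compress(quality=50)  # JPEG-compress in place
    buf = bytes(img)  # the image object exposes its bytes via the buffer protocol
    for i in range(0, len(buf), CHUNK):
        udp.sendto(buf[i:i+CHUNK], (HOST, PORT))
```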

As for a protocol: we have a method called compressed_for_ide(), which JPEG-compresses an image, reformats the binary data so you can deal with byte loss, and adds leading and trailing byte flags to the image so you know when the data has been fully received. This method allows you to just transfer the image with no sync information on the data channel, and if all the bytes get through you can display the image. Our IDE technically has support for viewing this through our Open Terminal feature too. However, I haven't tested if any of this stuff works.
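To illustrate (again hedged and untested), the streaming loop above could swap in compressed_for_ide() so the receiver can re-sync on the frame flags:

```python
# Variant of the UDP loop: compressed_for_ide() adds the leading/trailing
# byte flags, so the PC side can find frame boundaries even after byte loss.
while True:
    data = bytes(sensor.snapshot().compressed_for_ide())
    for i in range(0, len(data), CHUNK):
        udp.sendto(data[i:i+CHUNK], (HOST, PORT))
```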