Interface Library Discussion

Hi all,

We want to make the OpenMV Cam friendlier to interface with other processors, so we will now support interface libraries for connecting the camera to the Arduino and Raspberry Pi. The plan is to offer interface libraries for:

The OpenMV Cam as a slave processor over async serial (UART, RX, TX) to the Arduino
The OpenMV Cam as an I2C slave processor (SCL, SDA) to the Arduino
The OpenMV Cam as an SPI slave processor (SS, SCK, MISO, MOSI) to the Arduino
The OpenMV Cam as a slave processor over VCP serial (USB) to the Raspberry Pi (or any Linux processor)

I have looked at porting the Firmata library, but it's a rather large amount of code and functionality to emulate. For this library I'm trying a light touch.

The interface library modules will be written in Python and then baked into the OpenMV Cam firmware as frozen modules. We want to do it this way so that you have easy access to the Python code if you want to modify our library… but, otherwise, if you are using it unmodified, it's already onboard.

We will also supply the other half of the code running on the Arduino/Pi that will talk to the camera.

The plan is to make these 4 libraries just about control of the camera and not about running any particular functionality. This is so that others can build off of them.

Anyway, the point of this forum thread is to gather feedback on what features I should put in these interface libraries. Thoughts are welcome.

Here's my plan for the library setup:

First, there's an init method which will create the library and the underlying hardware channel for you.

Second, you will register callbacks on the OpenMV Cam that can be run on receiving a command from the other device. Callbacks will execute on request from the host processor and return a response.

And that's it. There will be no while(True): loop in your code anymore. Instead, you just create the interface library device, register some callbacks with particular IDs that your host program knows, execute a method to enter an event loop, and then, when the host program executes a command with the right ID, the callback runs and you get data back. This gives you freedom to do whatever you want. The interface libraries will primarily deal with moving bytes across a link.
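
To make this concrete, here's a minimal sketch of what the slave-side flow could look like in MicroPython. The module and method names (rpc, rpc_uart_slave, register_callback, loop) are placeholders for illustration, not a final API:

    import rpc

    # 1. Init: create the interface object and the underlying hardware channel.
    interface = rpc.rpc_uart_slave(baudrate=115200)

    # 2. Register a callback the host can trigger by ID/name. It receives the
    # command payload as bytes and returns a bytes-like response.
    def hello_world(data):
        return b"Hello from the OpenMV Cam!"

    interface.register_callback(hello_world)

    # 3. Enter the event loop - no while(True): needed in user code.
    interface.loop()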

…

Next, for example purposes, I will have examples showing off the client (OpenMV Cam) and host side of the code for:

Face Detection
Person Detection
QR Code Detection
AprilTag Detection
DataMatrix Detection
Barcode Detection
Color Tracking
Image Capture

As an example, on the Arduino the data structure being moved will be described in C, and on the OpenMV Cam the data structure will be serialized using the struct module to match the Arduino structure. On the Pi, the struct will just be unpacked in Python.
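
For instance, here's a sketch of that serialization for a made-up result holding a signed 16-bit x/y position and an unsigned 16-bit width/height; the matching Arduino struct would be typedef struct { int16_t x, y; uint16_t w, h; } result_t;

    import struct

    # OpenMV Cam side: pack the result into bytes. "<" forces little-endian
    # with no padding so the layout matches the Arduino struct exactly.
    payload = struct.pack("<hhHH", 10, 20, 64, 48)

    # Pi side: unpack the same bytes straight back into Python values.
    x, y, w, h = struct.unpack("<hhHH", payload)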

I feel that setting things up this way is very generic: it allows folks to do anything they need and makes the I/O part easy. That said, I understand it requires some understanding of data structures. However, I think having 8 examples should give folks enough tools to copy from to do what they need.

Note that this library code is designed to make the OpenMV Cam a slave device on purpose, since that's the best way for other processors to handle the data rate from the camera.

Thoughts?

… Regarding the VCP stuff: as time allows, I will work on features in the IDE so it remains functional when the VCP port is in use. Also, MicroPython technically enables having two VCP ports in the current firmware releases, so it may be possible to have another VCP port that the IDE doesn't have to share.

This is awesome!! Thank you for working on this!

Questions:

  1. Will the libraries also have APIs for the FLIR Lepton IR camera?
  2. Will you also support the Jetson Nano?


    Cheers!

Just so I don't go off on a tangent and I understand correctly:

  1. The OpenMV Cam is a 'slave' in this use case. It's basically doing its own thing, but sends something to the host when queried? I say 'slave' because it's not really designed to act as a slave; this sounds more like a co-processor scenario.
  2. The host is basically sending a command to the OpenMV Cam, it does its operation, and returns the result to the host.

What 'result' is being transferred? Is it strictly numbers like the "Grove AI Hat for Edge Computing" or are we talking full-blown images like a webcam?

If I understand the meaning, it's much like the WiFiNINA scenario with the Arduino and ESP32. The NINA core has a bunch of pre-programmed commands and an interpreter built in to act as a co-processor. It does its own thing and shuttles data back and forth to the host. From the host's perspective, you just ask for a webpage. From the co-processor's perspective, it gets the request for a webpage, handles all of the internet-related things and communication, along with any other processing it needs to do, and returns the data as a result to the host. The host gets an object, either readable data or byte streams, and the user is left to do with it what they will. It's not really a slave, more of a co-processor.

If this is the case, I'm curious what an example would be, since I'd assume on the host side you'd send a command to the OpenMV Cam to, say, check for an AprilTag. On the Cam side, you'd have the interface library set up a callback in a loop, doing something that tells the user code there to return something when the interrupt is triggered.

Or am I completely misunderstanding, and the goal here is simply to create a 'standardized' communication protocol between the OpenMV Cam and the host that will, in essence, just transmit commands to engage the functions already in OpenMV?

Another item: does the UART, SPI, and I2C support strictly need to be Arduino, or can it be done that way via the RPi as well, specifically for Python? Something akin to what the Grove AI Hat for Edge Computing should have been but isn't.

A further question from here comes down to threading. Since the OpenMV Cam doesn't support threading, how do you envision the callback or interrupt taking place on the 'slave' side? If the camera isn't using an async communication protocol like USB or UART and is busy processing something when it receives a request, won't this cause an issue without a FIFO buffer to fall back on? I'm not talking about the host needing to be programmed correctly to do a handshake and wait for data, but about the Cam, acting as a 'slave', missing that initial handshake unless it is sitting in a blocked state waiting for the host.

Or have I completely misunderstood.

The goal is to make the OpenMV Cam into a co-processor that just does something when requested and then returns a result. Here's a simple example:

Say that on the Arduino I want to read a QR code string. I'll be able to attach an OpenMV Cam to the Arduino and use the interface library to do this.

  1. On the Arduino I call a method to activate a callback on the OpenMV Cam with some ID value that both the OpenMV Cam and the Arduino have set the same in code.
  2. The camera receives the command and uses the ID value to lookup a callback to run. That callback runs, does whatever it needs to do taking as much time as it needs, and then returns some byte array result.
  3. The Arduino receives the byte array result and then writes that into a C struct so that you can use the result in your code.

So, effectively, the Arduino is able to ask the OpenMV Cam to do image processing for it.

The interface library will block the Arduino while the OpenMV Cam is working. Similarly, the OpenMV Cam will not do any work until it gets a command from the Arduino.
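
As a sketch of what the camera-side callback for this could look like (the sensor/image calls are the real OpenMV API; the rpc module and method names are placeholders until the library exists):

    import rpc, sensor

    sensor.reset()
    sensor.set_pixformat(sensor.GRAYSCALE)
    sensor.set_framesize(sensor.QVGA)

    interface = rpc.rpc_uart_slave(baudrate=115200)

    # Runs when the Arduino sends the matching ID; takes as long as it needs,
    # then returns the QR code string as bytes (empty if nothing was found).
    def qrcode_detection(data):
        img = sensor.snapshot()
        codes = img.find_qrcodes()
        return codes[0].payload().encode() if codes else b""

    interface.register_callback(qrcode_detection)
    interface.loop()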

Given that you can transfer generic byte structs, this means you can transfer images, objects, whatever. Also, you can edit the code running on the OpenMV Cam to do more than just return the result of find_blobs(), etc. You can do all the filtering on the OpenMV Cam and then just return the final result you think is good.

…

Regarding the Arduino/Pi, I can make the I/O library for both.

As for how I plan to handle SPI and I2C… well, it's tricky. It's not impossible to get slave mode working with SPI and I2C on the OpenMV Cam, it's just hard. But I will write the required code to make this work so you don't need to think about it. Essentially, the master device will query the OpenMV Cam trying to get a response, and it will keep trying until the OpenMV Cam has gotten to a point where it's able to transmit data.

For example, with I2C, the master device will get NAKs on the I2C bus until the OpenMV Cam executes the I2C send method and starts waiting for a master device to clock data out of it. So, the solution is to make the library poll the camera on the I2C bus until the camera responds. Similarly, you have to do the same thing for SPI. The value-add of the library is that I write all this code and handshaking stuff for you and get it working nicely.
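
In MicroPython pseudo-code, the master-side polling boils down to something like this, where try_read is a hypothetical low-level read that returns None when the bus reply is a NAK or empty (the shipped library will hide this loop from you):

    import time

    def poll_for_response(try_read, timeout_ms=1000):
        start = time.ticks_ms()
        while time.ticks_diff(time.ticks_ms(), start) < timeout_ms:
            response = try_read()  # e.g. an I2C read that the slave may NAK
            if response is not None:
                return response
            time.sleep_ms(1)  # back off briefly, then retry until ready
        return None  # the slave never became ready in time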


Hi Jcp13,

  1. There's no API for anything specific; if you want to use the FLIR Lepton you can just use the interface library with it.
  2. I would need community support to support the Jetson. I'm just one person.

This will be great to have. Writing a robust and reliable communication interface always seemed pretty daunting to me, so I appreciate the effort that will go into it.

It would be really nice if one of the libraries could allow a single Arduino to accommodate multiple OpenMVs as slave processors. Earlier in this forum, I inquired about daisy-chaining multiple OpenMVs over UART and about an I2C mux to allow multiple OpenMVs on the I2C bus. I'd be happy with whatever makes the most sense between SPI, UART, and I2C.

Most of my work is with finding circles, so if you could throw in an example using that, it would be awesome.

Thanks!

I'll allow you to set the I2C bus address of the OpenMV Cam for this. As for a finding-circles example… if I have time.
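
As a guess at what that settable I2C address could look like (the constructor name and parameter are assumptions, not a final API), a unique address per camera would let several OpenMV Cams share one bus:

    import rpc

    # Hypothetical: pick a unique 7-bit address for each camera on the bus.
    interface = rpc.rpc_i2c_slave(slave_addr=0x12)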

Excellent. I'm sure your time is limited, so I completely understand.

I have no sense of how long all this would take. Would the I2C portion of it be done by the time the H7 Plus is in stock? My project involves multiple OpenMVs at a higher resolution, so I'm on hold until those are available. If the I2C interface is available by then as well, that would be sweet.

Thanks!

Yeah, maybe. I will try. I have a bunch of driver firmware development to finish right now, followed by clearing out bugs on GitHub, an IDE release, and then this new library on my to-do list.

Seems ambitious, but this whole project/business is pretty ambitious so I guess it comes with the territory. Thanks!

Hi kwagyeman,

I can give it a shot to build something for the Jetson Nano. I will need some guidance from you if you don't mind. I may be able to start in early summer (I have a number of projects on my plate at this time). Is it ok to reach out when it gets closer to summer?

Hi, I'm almost at the point of working on this. I have some other things involving drivers to finish up first, but I expect to begin in the last two weeks of March.

Hi all,

I'm almost done with the interface library:

https://github.com/kwagyeman/openmv/blob/kwabena/interface_library/scripts/examples/34-Remote-Control/as_the_remote_device.py
https://github.com/kwagyeman/openmv/blob/kwabena/interface_library/scripts/examples/34-Remote-Control/as_the_controller_device.py

The above example scripts, together with the rpc.py and mutex.py modules, form the library. The RPC module and mutex need to be copied to your OpenMV Cam disk; they will be compiled into the firmware in the next release. The as_the_remote_device.py script should run on your OpenMV Cam, and the controller script should run on a MicroPython board controlling the OpenMV Cam.
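
For a quick feel of the controller side, here's a condensed sketch based on the linked as_the_controller_device.py example (see that file for the exact method signatures and timeouts):

    import rpc

    interface = rpc.rpc_uart_master(baudrate=115200)

    # Ask the remote camera to run its "qrcode_detection" callback and wait
    # for the byte-array result (None on failure).
    result = interface.call("qrcode_detection")
    if result is not None:
        print(bytes(result))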

So far, I've debugged UART control and that works perfectly. The protocol is self-healing and can handle being disconnected and reconnected in the middle of operation without anything crashing. All transfers are CRC16-checked, too.

…

For Arduino support, we have someone working on an Arduino interface library, which will be released to the Arduino package repo for everyone, along with an RPi Python library. These libraries will appear soon after we've released the library for the OpenMV Cam.

Anyway, you can demo the library with UART control between two OpenMV Cams right now. If you want to do that, save mutex.py and rpc.py on both OpenMV Cams, then save the as_the_remote_device.py script as main.py on the OpenMV Cam you want to remote control. Connect the UARTs (P4->P5, P5->P4, and GND) between the two cameras. Then, finally, connect to the master camera and run the as_the_controller_device.py script.

For debugging both scripts if you decide to edit things, I recommend opening both example scripts in OpenMV IDE, connecting to the slave device in the main window, and then using the Open Terminal feature to connect to the REPL of the master device and send it the master script. This is a little advanced to set up but works nicely.

NOTE: Only the UART works right now; I will debug and fix any issues with SPI, I2C, CAN, and USB in the coming days.
