Thanks for getting back to me. I have to admit, I’m a little lost, but I have some time today to dig into things a little more.
First of all, in the Stream Mode documentation, it says this:
Please see the Arduino Stream Master and Arduino Stream Slave sketches for how to use the RPC library in stream mode. Note that we do not supply examples for how to use the RPC library with the OpenMV Cam in stream mode as the OpenMV Cam will trivially overrun the data buffers on all but the most advanced Arduinos.
The streaming examples that I see stream from a camera to a “computer”, so it’s unclear that streaming mode is viable here. I’m only streaming a few bytes (AprilTag data), so I’m guessing that it’s not really a problem. Can you clarify? Is there a camera (as secondary) to uC (as master) streaming example to start with?
–
More fundamentally, I don’t understand all of the synchronization business. In my mind, I2C transactions should act more like, say, an IMU:
The uC (as master) makes a write call to the camera (as a secondary) to tell it to start searching for AprilTags (this isn’t even strictly necessary, since that’s all the camera is doing)
The uC, “at its leisure”, makes a read call to the camera to ask for the latest data
The camera returns the number of tags, followed by tag data
Upon getting a 0, the uC drops the connection
Upon getting 1 or more tags, the uC reads them
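To make the transaction concrete, here is a runnable sketch of the payload I have in mind: one count byte, then a fixed-size record per tag. The record layout (tag id plus an x/y position, all 16-bit) is my own assumption for illustration, not anything defined by the RPC library:

```python
import struct

# Hypothetical per-tag record: id, cx, cy as little-endian 16-bit values.
# This layout is an assumption for the sketch, not the RPC library's format.
TAG_FMT = "<Hhh"
TAG_SIZE = struct.calcsize(TAG_FMT)  # 6 bytes per tag

def pack_tags(tags):
    """Camera side: one count byte, then one fixed-size record per tag."""
    payload = struct.pack("<B", len(tags))
    for tag_id, cx, cy in tags:
        payload += struct.pack(TAG_FMT, tag_id, cx, cy)
    return payload

def unpack_tags(payload):
    """uC side: read the count byte, then slice out that many records."""
    (count,) = struct.unpack_from("<B", payload, 0)
    return [struct.unpack_from(TAG_FMT, payload, 1 + i * TAG_SIZE)
            for i in range(count)]

# A zero count byte means "no tags": the master can stop reading there.
assert unpack_tags(pack_tags([])) == []
assert unpack_tags(pack_tags([(3, -10, 42)])) == [(3, -10, 42)]
```

With fixed-size records the master can read the count byte first and then know exactly how many more bytes to clock out, which fits the IMU-style read transaction described above.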
All of this is dependent on an I2C class on the camera that properly responds to I2C calls. In the Arduino framework, for example, to make an “Arduino” a secondary device, you give it an address and define requestEvent(). An interrupt on the Arduino is used to call that function in response to queries. That doesn’t seem to be an option with the OpenMV. In fact, putting an oscilloscope on the I2C bus, it appears that most of the calls get NACKed, indicating that the camera is not even picking up the line when requested. This is because the I2C bus is torn down after each call to get/put_bytes.
Somewhere, I take it, things get synchronized, so that the camera is listening for requests from time to time, but I don’t understand at a low level how I2C calls are being serviced. Everything seems to boil down to I2C.send on the camera side, but I can’t seem to find where I2C.send is defined in pyb to understand what is going on.
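For what it’s worth, here is my current mental model of those listening windows, sketched with a stand-in for the I2C object so the loop logic is runnable anywhere. The timeout-and-retry behaviour is my assumption about how a slave-mode send call works, not something verified against the firmware:

```python
# Sketch of a camera-side service loop: the slave only "listens" while it is
# blocked inside a send() with a timeout. FakeI2C stands in for a slave-mode
# I2C object; the assumption is that a real one raises OSError on timeout.
class FakeI2C:
    def __init__(self, succeed_on_attempt):
        self.attempt = 0
        self.succeed_on_attempt = succeed_on_attempt
        self.sent = None

    def send(self, data, timeout=1000):
        self.attempt += 1
        if self.attempt < self.succeed_on_attempt:
            raise OSError("timeout: no master request in this window")
        self.sent = bytes(data)

def service_loop(i2c, latest_payload, max_windows=10):
    """Refresh the payload, then wait one window for the master to read it."""
    for _ in range(max_windows):
        try:
            i2c.send(latest_payload(), timeout=1000)
            return True          # a master actually clocked the data out
        except OSError:
            continue             # window expired; loop and listen again
    return False

bus = FakeI2C(succeed_on_attempt=3)
assert service_loop(bus, lambda: b"\x00") is True  # succeeds on 3rd window
```

If that model is right, it would explain the NACKs on the scope: any master request that lands outside one of these windows simply finds nobody holding the bus.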
–
Ultimately, I could get the behaviour I want by just writing to a UART. For pedagogical reasons, however, I wanted to explore using I2C. With the UART, it was fairly trivial to just set the camera searching for AprilTags and have it spit them out over the UART when found (outside of the RPC library). The uC just listens and accumulates data in a buffer until it can process it. I could, in principle, do something similar with I2C by making the camera the master, but then I have the extra headaches of a multi-master system, since the uC also needs to talk to the IMU and a controller elsewhere. There is the I2C example in the OpenMV library, but it is peppered with warnings about losing connections. The apriltags_pixy_i2c_emulation.py example appears to have the functionality I’m looking for as well, but it, too, will be prone to errors because the bus only appears to be able to send in limited windows.