No output on P7 marked as P6 for analog servo use

I've been using the examples and trying to follow the conversation, but now I can't get this going at all. Is this feature even available at this point? I see that you're selling a shield for the I2C servos. Does the analog approach work at all? Can you say how?
Thanks,
Stimpy

More info:
There is no electrical activity whatsoever on the pin; I've pinned it down to a few microvolts.
There are also discrepancies between the examples and what MicroPython will allow. Too much that I don't understand is going wrong. Can you help?
Stimpy

Please post more information on the issue: how to reproduce it, which board you're using, your code, images, how the board is powered, etc.

Hi;
I am trying to get your own example code working, with the goal of using the camera as an image/object-tracking sensor.
The first step is to operate with the examples, some of which work while others don't.
Anyway, this is the older R2 camera, with the upgrade arriving tomorrow. Meanwhile, since the pulse-proportional servo output seems non-functional, I'm moving everything off the camera except the image processing. To remedy that, I'll simply move the servo control to the attending Arduino. Now the trick will be to get the I2C bus coupled up between the Arduino and the MV cam; then the Arduino can handle all the logistics around the image data.
So I started working with the example in your file named "arduino_i2c_slave_1". I am able to connect to my MEGA, but I'm getting gobbledygook characters, with no errors or drops, so the actual bus seems good. I think it's a big-endian/little-endian problem. Any suggestions?
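One way a byte-order mismatch like this shows up: the camera packs multi-byte values one way and the Arduino sketch unpacks them the other. A minimal sketch (the blob fields and the 16-bit layout here are my assumptions, not from the actual example file) using Python's `struct` module with explicit endianness markers:

```python
import struct

# Hypothetical blob report the camera might send: center x/y, width, height.
cx, cy, w, h = 160, 120, 40, 30

# Pack explicitly as little-endian 16-bit unsigned ints ("<4H") before
# sending over I2C, so both ends agree on byte order regardless of CPU.
payload = struct.pack("<4H", cx, cy, w, h)

# If the receiver unpacks the same bytes as big-endian, it sees garbage:
wrong = struct.unpack(">4H", payload)
# Unpacking with the matching little-endian ("<") format recovers the values:
right = struct.unpack("<4H", payload)
```

On the Arduino side the equivalent fix is to assemble the received bytes in the same order (low byte first for little-endian).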
Stimpy

Please use the RPC library: GitHub - openmv/openmv-arduino-rpc: Remote Procedure/Python Call Library for Arduino

I still want to use the camera à la Pixy, but my way.
I now have the I2C ticking over nicely, so my blob finding is now being used to generate the servo parameters I'm sending over to the Arduino. I'm going to try adding circle finding to further cement target verification in addition to color. Of course, I now have the Arduino master pounding on the I2C, but it seems to be a fairly tight connection; happy with that. Once I can combine all three image-processing techniques, I think we'll really have something. As I've read through the documentation and code little by little, I've seen useful tricks scattered about. I feel like this will work after a fashion.
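The "blob to servo parameters" step above could be sketched like this (the function name, frame width, and pulse range are my assumptions, not from the thread): linearly scale a blob's center x across the frame to a hobby-servo pulse width, which the Arduino then writes to the servo.

```python
# Hypothetical helper: map a blob's center x (0 .. img_w-1) to a standard
# hobby-servo pulse width in microseconds (1000 .. 2000 us by default).
def blob_x_to_pulse(cx, img_w=320, min_us=1000, max_us=2000):
    cx = max(0, min(img_w - 1, cx))              # clamp to the frame
    span = max_us - min_us
    return min_us + (cx * span) // (img_w - 1)   # integer linear scale
```

The resulting integer is what would go over the I2C link for the Arduino to output.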

Hi Kwagyeman;
Now that I'm sending the camera data through the I2C pipe, it all feels like it works much better, with no sporadic crashes either, so all good there. I have a small problem you could perhaps advise on. I'm basing my color finding on the single-color example in your documentation, "single_color_rgb565_blob_tracking", using the very same approach. That file contains a table of color signatures to supply to the blob finder. In my case the red filter at index 0 seems to work very well, but the blue one fails to find anything. Any clues? How do I tweak that for a particular primary?
Thanks for your time,
Stimpy

Hi Kwagyeman;
Forgot to ask: do you have an example of using the cam as an I2C master to request instructions from the Arduino, with the Arduino as the slave? A simple few commands would be good. Maybe a discrete digital input would work, I suppose? Examples?
Stimpy

There's no example, but the library supports this; it should be straightforward to write. There are master/slave examples for the OpenMV Cam and master/slave examples for the Arduino.

Hi Kwagyeman;
Still very poor Blue response in general. Anything I can do?
Stimpy

Hi, you really need to post your code and then ask a specific question.

Well, you've been somewhat helpful. Honestly, the example code has helped a lot to see what works and what doesn't. Again, your blue filter is not really working straight out of your example. There's no code beyond what you've sent. Any clue? Since you haven't answered, I guess not.

Yeah, you need to modify the thresholds. See the threshold editor; it's under Tools → Machine Vision → Threshold Editor.
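For context, the thresholds in question are LAB 6-tuples in the `(L_min, L_max, A_min, A_max, B_min, B_max)` form that `find_blobs()` takes. A small sketch of the format and a membership check (the specific numbers below are illustrative guesses, not measured values; use the Threshold Editor to get real ones):

```python
# LAB thresholds in the (L_min, L_max, A_min, A_max, B_min, B_max) tuple
# form that find_blobs() expects. Values here are illustrative guesses;
# measure your own with the Threshold Editor.
red_threshold  = (30, 100, 15, 127, 15, 127)     # strong positive A -> red
blue_threshold = (0, 100, -128, 0, -128, -15)    # strong negative B -> blue

def in_threshold(lab_pixel, thr):
    # True if an (L, A, B) pixel lies inside a 6-tuple threshold.
    l, a, b = lab_pixel
    return (thr[0] <= l <= thr[1] and
            thr[2] <= a <= thr[3] and
            thr[4] <= b <= thr[5])
```

A blue filter that "finds nothing" is usually a B range that doesn't reach negative enough, or an L range that excludes the target's actual lightness.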

Here's what I found out. I can use the camera's RGB spectrum scanner to pull out the RGB value peaks, feed those into an online RGB-to-LAB conversion using the 'D55' lighting value for my situation, and then use the converted LAB numbers to tweak the filters in this code. That worked well; I can now lock right onto the blue target.
Also, your device can't really manage an entire image-processing cycle for both blobs and find_circles; it runs out of memory fast, so you have to whittle down the image. First I do a blob search on these color keys, obtain the ROI tuple from that, then pass that down to the circle finder to identify my target with higher probability. This seems to work if I search for just one color blob per compute cycle. Just. I don't think object classification will also fit in this algorithm; it's likely too much work. We'll see if I ever get it.
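The whittling-down step described above can be sketched as a small helper (pure Python; the name and padding are mine, not from the thread): pad the blob's bounding rect slightly and clamp it to the image bounds, then hand the result to the second-stage circle search as its `roi=` argument.

```python
def padded_roi(blob_rect, img_w, img_h, pad=8):
    # Expand a blob's (x, y, w, h) rect by `pad` pixels on each side and
    # clamp to the image bounds, so it is safe to pass to a second-stage
    # search (e.g. img.find_circles(roi=...)) instead of the whole frame.
    x, y, w, h = blob_rect
    x0 = max(0, x - pad)
    y0 = max(0, y - pad)
    x1 = min(img_w, x + w + pad)
    y1 = min(img_h, y + h + pad)
    return (x0, y0, x1 - x0, y1 - y0)
```

Searching only this small window is what keeps the circle pass within memory.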

Hi kwagyeman;
A question on MicroPython. I see that Python favors prefix operation and I'm trying my best to work with that, but I've come to a point where I absolutely need a parent-scope reference. Simply put, I'm executing in the main while loop but need to update a state variable external to it, for the purpose of filtering the data stream as the main loop executes. I don't see any good examples. The problem is I keep getting out-of-scope errors for variables declared at the highest scope. In full Python this is allowed. Any clues?
Stimpy

Typically, if you want to modify a global variable inside a function, you do:

var = 0

def function():
    global var
    var += 1

Hi Kwagyeman;
I found a problem with your program emulating the Pixy output.
You call a round() operation to scale the screen position to the x or y output.
I was getting the same x value out of the cam even though a print statement in the main loop reported that it was producing correct values. After this round operation, however, they don't change at all, or only very little. If I don't use round(), the program rightly complains about float-to-int conversion. Have you fixed this one?

You need to set the servo limits in the code. The default ones aren’t very aggressive.
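To illustrate the point (the function and parameter names below are mine, not the actual script's): scale the tracking error to a pulse with an explicit float-to-int conversion, then clamp to configurable servo limits. With narrow limits, the output saturates and barely changes even when the input moves.

```python
# err is a normalized tracking error in [-1.0, 1.0].
def error_to_pulse(err, center_us=1500, span_us=500, min_us=1400, max_us=1600):
    pulse = int(round(center_us + err * span_us))  # explicit float -> int
    return max(min_us, min(max_us, pulse))         # clamp to servo limits
```

With the narrow 1400-1600 us defaults above, any error beyond about +/-0.2 pins the output at a limit; widening min_us/max_us toward 1000-2000 us makes the response far more aggressive.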

I'm progressing, but now I'm interested in the IR capability. What IR wavelength is the cam sensitive to? 850 nm?

Forgot to ask: where are the Arduino master & slave examples stored?