I want to track whether objects in view are green enough to likely be a plant (on a fairly uniform non-green background, but in changing light conditions).
The thing is that the colour green doesn’t really seem to be picked up that well. In the attached image, made with the auto colour detection example code (automatic RGB565 colour detection), the blob marked 3 is the fresh leaves of a dandelion. They are as green as leaves get. They appear mostly white in the image, however, and when I train the code on those leaves, basically anything white that is in shade is matched.
In the image you see a notebook being matched (white), as well as the white walls of the room. There isn’t even anything green in the room that could reflect onto the walls.
I tried playing with sensor.set_contrast() and sensor.set_saturation(), but neither seemed to make much of a difference. Any suggestions? How can I get green to be picked up? Do you see any other way to detect whether something is likely to be a plant?
Hi, you have to turn off white balance immediately. White balance adjusts the colour gains of everything in the image so that it averages out to gray. If there’s a lot of green in the image, the green will be shifted to another tone.
Hi and thanks for getting back so quickly.
I assume you are talking about this setting?
sensor.set_auto_whitebal(False) # must be turned off for color tracking
It was set to False and left as it was in the example code. The image I uploaded was made with that setting. auto_gain is also set to False. Anything else I can do?
Btw, on a side note: I messed around with the output of find_blobs() and played with it in a Python editor (I use Jupyter notebook). I kept getting syntax errors from the output, because the last value in the output dictionary is not a name-value pair, while all the others are. Please see the attachment. It looks like a blunder, since all the other values are name-value pairs. I thought I’d let you know in case it actually is a mistake.
You have to turn off white balance in less than 2 seconds, i.e. remove the skip_frames() call. You have to turn it off immediately.
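For reference, a minimal init sketch of the ordering being described, assuming the standard OpenMV sensor module (the pixel format and frame size below are placeholders, not a recommendation):

```python
import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)

# Turn both off right away, BEFORE any skip_frames() settle time,
# so auto white balance never gets a chance to shift the colour gains.
sensor.set_auto_gain(False)
sensor.set_auto_whitebal(False)
```

The point is the ordering: in the stock example, a skip_frames(time=2000) settle period runs first, during which the auto algorithms adjust the gains; moving the two disable calls ahead of (or removing) that wait prevents it.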
As for that bug, yes, that’s a bug. Can you open a GitHub issue on that? Thanks for finding it.
Ah, ok. Thanks. If that is the case, perhaps you would want to consider changing the example code provided in the OpenMV IDE. The 2-second wait in my code was there in the original example; I didn’t mess with that.
As for the bug: I messed around with the output of find_blobs(). I wanted to get all the values of cx and cy (the blob’s centre position). I assumed I could use either index numbers or the get() method, so if the output is in blobs, I would use either blobs[n] or blobs.get("cx"). Here’s one example of what I tried:
for i in range(len(blobs)):
    blobs_plants_y.append(blobs[i].get("cy"))
When I use get(), I get the error “'blob' object has no attribute 'get'”.
If I use keys, I get the message “blob indices must be integers, not str”.
Both work in Jupyter notebook, however.
Edit: …and then it dawned on me that I could just index the values, since they are in the same place each time, so this is mostly FYI that the normal methods don’t seem to work as I would expect.
Notice that my variable is called blobs (in plural). The blob object is something else.
As for GitHub: yes, I could, but I would need a few more pointers on what to do. I have never used GitHub before. Perhaps it would be faster if you did it yourself.
You just call blob.cx() to get cx. All the properties are method functions.
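To illustrate the accessor pattern: the real blob objects come back from find_blobs() in the OpenMV firmware, so the Blob class below is only a hypothetical stand-in with the same method-style interface:

```python
class Blob:
    """Hypothetical stand-in for an OpenMV blob object:
    properties are accessor methods, not dict keys."""

    def __init__(self, cx, cy):
        self._cx = cx
        self._cy = cy

    def cx(self):
        return self._cx

    def cy(self):
        return self._cy


blobs = [Blob(10, 20), Blob(30, 40)]

# Call the method accessors instead of blob["cy"] or blob.get("cy"):
ys = [b.cy() for b in blobs]
print(ys)  # [20, 40]
```

So instead of blobs[i].get("cy"), the working form is blobs[i].cy().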