(small) Lane following

I’m interested in navigating between rows of crops about 15 cm apart. The plants are green; in between is brown earth.

Is there any starter code anywhere? I’m aware of the line following stuff, could that be adapted? Or would it be starting from scratch?

I’d have some capacitive sensors too, to call a halt if we actually touch the plants.

Just spotted this lane-following thread: Line Following - Blob Spotting/Decoding robot - #31 by kwagyeman. @zlite, is that still the state of the art? Or has anyone worked on it since then?

Similar projects:
github.com/pengxbin/Crop-Rows-Images-Using-OpenCV
github.com/HenriqueToledo/Semantic-segmentation-applied-to-crop-rows

Dataset:
http://www.etfos.unios.hr/r3dvgroup/index.php?id=crd_dataset



Hi, what you want to do is generally challenging and is a rather large project…

The OpenMV Cam isn’t going to do particularly well at this unless you know what you are doing. You’re essentially asking for a multi-line segmenter running on a camera that’s rather high up off the ground.


I may have described it poorly. This is a tiny skid-steer robot, about 15 cm × 20 cm: basically 4 × 18650 cells, small BLDC motors, an ESP32, and an OpenMV. It has rows of plants on either side of it and needs to stay in the brown strip in between. It doesn’t have to move fast, and if it goes wrong and touches a plant, the capacitive sensor stops it.

Yes, this type of problem is hard for the camera, since it’s going to see plants on either side. I’d just use linear 1D point distance sensors, angled out to either side of the robot.
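The two-angled-sensors idea boils down to steering toward whichever side has more clearance. A minimal sketch in plain Python (the function names, gain, and sign convention are illustrative assumptions, not from any specific library; on the real robot the distances would come from the ToF sensors and the outputs would drive the motor controllers):

```python
def steer_from_distances(left_mm, right_mm, gain=0.004, max_cmd=1.0):
    """Return a steering command in [-1, 1] from two side-looking
    range readings. Positive means more clearance on the right,
    i.e. steer right to re-center between the rows."""
    error = right_mm - left_mm
    return max(-max_cmd, min(max_cmd, gain * error))

def skid_steer_mix(base, steer):
    """Mix a base speed and a steering command into (left, right)
    track speeds for a skid-steer robot, clamped to [0, 1]."""
    left = max(0.0, min(1.0, base * (1.0 + steer)))
    right = max(0.0, min(1.0, base * (1.0 - steer)))
    return left, right
```

For example, `steer_from_distances(200, 300)` (more room on the right) returns a positive command, and `skid_steer_mix(0.5, 0.0)` drives both tracks equally. The gain would need tuning to the sensor mounting angle and row spacing.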


I am in the process of creating a similar robot platform. We are making a simple line follower that will drop seeds in a furrow. To keep the robot on track, we are going to lay a simple black hose or black poly pipe for it to follow as it drops seeds into the furrow. While it’s simple, it should get the job done.

This is the code and basic idea I have been using:
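The snippet referenced just above isn’t included here, so as a rough sketch of the described idea only: threshold a grayscale scanline for the dark hose and steer toward its centroid. On an OpenMV cam the pixel values would come from `img.find_blobs()` or per-row reads; this is plain Python showing just the geometry, with an illustrative darkness threshold:

```python
def hose_offset(scanline, dark_thresh=60):
    """Return the normalized offset of the dark hose from the image
    center, in [-1, 1] (negative = hose is left of center), or None
    if no pixel is darker than dark_thresh."""
    dark = [x for x, v in enumerate(scanline) if v < dark_thresh]
    if not dark:
        return None
    center = (len(scanline) - 1) / 2
    cx = sum(dark) / len(dark)          # centroid of the dark pixels
    return (cx - center) / center
```

Returning `None` when nothing dark is found gives the main loop a clean "line lost" signal, which may help with the robot-goes-in-circles failure mode.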

I have also brainstormed watering the furrow first, so that there is a dark line running through its center and we don’t have to muck about with the hose. Currently I am stalled out learning more Python, and right now my robot just randomly goes in circles every which way; I have much more to learn. Another idea I had the other day was to use the Edge Impulse platform and create a dataset showing the hose right of center, left of center, and centered.

Then we can label the images accordingly in order to send the servo the relevant direction information. This is a journey, for certain. I have used the Donkey Car platform and have all the parts printed and the servos and camera hooked up for the differential drive, and it’s all working, just not how I want it to. Admittedly I have taken a break to work on other things, but it is good to see someone else on here looking to do a similar thing. I am planning to get back to this in the next few weeks or so.
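The classifier-to-servo step above could be as simple as mapping the winning label to a steering offset. A hedged sketch, assuming three labels ("left", "center", "right"), a 90° neutral servo, and an illustrative confidence cutoff; none of these values come from Edge Impulse itself:

```python
def label_to_servo(scores, neutral=90, step=20, min_conf=0.6):
    """scores: dict mapping class label -> confidence (0..1).
    Returns a servo angle in degrees. Falls back to neutral
    (drive straight) when the classifier isn't confident."""
    label = max(scores, key=scores.get)
    if scores[label] < min_conf:
        return neutral
    # Hose seen left of center -> steer left to get back over it.
    return neutral + {"left": -step, "center": 0, "right": step}[label]
```

So a confident "left" prediction yields 70° and a low-confidence frame just holds 90°; `step` and `min_conf` would be tuned on the robot.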

Cheers and best of luck to you! Keep going and don’t give up! This is more advice for myself than for you of course :stuck_out_tongue:


Nice. Wouldn’t a white/blue hose give better contrast, though? (My soil is quite dark.)

On kwagyeman’s suggestion I’m investigating running a couple of ToF sensors: GitHub - samuk/TFMini-Plus-I2C: Arduino library for the Benewake TFMini-Plus LiDAR distance sensor in I2C communication mode

I did run into this the other day, which may well be of interest if pursuing a vision approach: GitHub - PRBonn/visual-crop-row-navigation: Visual-servoing based navigation for monitoring row-crop fields.

Do you have a picture of the environment?
We did something similar in agriculture crops. You can do it the hard way obviously, but we simply set a blob detection to both sides of the screen scanning for the green plant collor. Then we got the position of all the green plants, combining the cx values gives you the center of this row of plants. Doing the same on the other side gives you the center between there. If you write a code that moves the skidsteer with camera attached to roughly the middle point between those coordinates gives you a simple functioning piece of equipment. The treshold is tough in this way of using the camera, but we made some sliders to simply adjust the treshold while driving on gps.
Works slow, because you just look directly straight from above, but worked keeping our weeder from touching the plants at roughly 2km/h.
I could try to find some videos, but that will be later.
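The geometry described above can be sketched in a few lines. On an OpenMV cam the cx lists would come from `img.find_blobs(green_threshold, roi=...)` run once per half of the frame; this plain-Python version (names and the 320 px width are illustrative) shows only the midpoint and error calculation:

```python
def row_midpoint(left_cxs, right_cxs):
    """Average the blob cx values on each side, then return the pixel
    x-coordinate midway between the two plant rows, or None if either
    side saw no green blobs."""
    if not left_cxs or not right_cxs:
        return None
    left_c = sum(left_cxs) / len(left_cxs)
    right_c = sum(right_cxs) / len(right_cxs)
    return (left_c + right_c) / 2

def steering_error(midpoint, img_width=320):
    """Normalized error in [-1, 1]: positive means the rows' center is
    right of the image center, so the robot should steer right."""
    half = (img_width - 1) / 2
    return (midpoint - half) / half
```

Feeding `steering_error` into a simple proportional skid-steer mix is enough for a slow, straight-down-looking setup like the one described.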


This sounds really interesting, would love to see some documentation.