Connecting an "OpenMV Cam H7" to an "Arduino Nano"

I have a trained neural network that is integrated via the “OpenMV IDE”. This neural network finds certain objects in the image from the “OpenMV Cam H7” camera and circles them (the OpenMV script below). I also have an Arduino Nano and a 28BYJ-48 stepper motor with a ULN2003 driver; the Arduino script below makes the motor run. Is it possible to make the motor stop when the neural network detects an object?

The Arduino script:

#include <Stepper.h>
const int stepsPerRevolution = 200;  // change this to fit the number of steps per revolution
// for your motor
// initialize the stepper library on pins 8 through 11:
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);
int stepCount = 0;         // number of steps the motor has taken
void setup() {
  // initialize the serial port:
  Serial.begin(9600);
}
void loop() {
  // step one step:
  myStepper.step(1);
  Serial.print("steps:");
  Serial.println(stepCount);
  stepCount++;
  delay(500);
}

The OpenMV script:

# Edge Impulse - OpenMV Object Detection Example

import sensor, image, time, os, tf, math, uos, gc

sensor.reset()                         # Reset and initialize the sensor.
sensor.set_pixformat(sensor.RGB565)    # Set pixel format to RGB565 (or GRAYSCALE)
sensor.set_framesize(sensor.QVGA)      # Set frame size to QVGA (320x240)
sensor.set_windowing((240, 240))       # Set 240x240 window.
sensor.skip_frames(time=2000)          # Let the camera adjust.

net = None
labels = None
min_confidence = 0.5

try:
    # load the model, alloc the model file on the heap if we have at least 64K free after loading
    net = tf.load("trained.tflite", load_to_fb=uos.stat('trained.tflite')[6] > (gc.mem_free() - (64*1024)))
except Exception as e:
    raise Exception('Failed to load "trained.tflite", did you copy the .tflite and labels.txt file onto the mass-storage device? (' + str(e) + ')')

try:
    labels = [line.rstrip('\n') for line in open("labels.txt")]
except Exception as e:
    raise Exception('Failed to load "labels.txt", did you copy the .tflite and labels.txt file onto the mass-storage device? (' + str(e) + ')')

colors = [ # Add more colors if you are detecting more than 7 types of classes at once.
    (255,   0,   0),
    (  0, 255,   0),
    (255, 255,   0),
    (  0,   0, 255),
    (255,   0, 255),
    (  0, 255, 255),
    (255, 255, 255),
]

clock = time.clock()

while(True):
    clock.tick()

    img = sensor.snapshot()

    # detect() returns all objects found in the image (split out per class already)
    # we skip class index 0, as that is the background, and then draw circles of the center
    # of our objects

    for i, detection_list in enumerate(net.detect(img, thresholds=[(math.ceil(min_confidence * 255), 255)])):
        if (i == 0): continue # background class
        if (len(detection_list) == 0): continue # no detections for this class?

        print("********** %s **********" % labels[i])
        for d in detection_list:
            [x, y, w, h] = d.rect()
            center_x = math.floor(x + (w / 2))
            center_y = math.floor(y + (h / 2))
            print('x %d\ty %d' % (center_x, center_y))
            img.draw_circle((center_x, center_y, 12), color=colors[i], thickness=2)

    print(clock.fps(), "fps", end="\n\n")

Sure. Since you are only transmitting one bit of information, just use the machine module to drive an I/O pin high or low on the OpenMV Cam, and then read that pin's state with the Arduino.
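For reference, a minimal sketch of the OpenMV side in MicroPython: drive one I/O pin high while the network reports a detection and low otherwise. The pin name "P0" and the HIGH-means-detected convention are assumptions — use whatever pin and polarity match your wiring. A small stub for machine.Pin is included only so the logic can be read and exercised off-device:

```python
# Sketch: signal "object detected" to the Arduino over a single I/O pin.
# Assumptions: pin "P0" on the OpenMV Cam, HIGH (1) = object detected.
try:
    from machine import Pin          # available on the OpenMV Cam (MicroPython)
except ImportError:
    # Minimal stub so the logic can be exercised off-device.
    class Pin:
        OUT = 0
        def __init__(self, name, mode):
            self._level = 0
        def value(self, v=None):
            if v is None:
                return self._level
            self._level = v

stop_pin = Pin("P0", Pin.OUT)

def signal_detections(detection_lists):
    """Set the pin HIGH if any non-background class has detections.

    detection_lists mirrors what net.detect() returns in the script above:
    one list per class, where index 0 is the background class.
    """
    found = any(len(dl) > 0 for dl in detection_lists[1:])
    stop_pin.value(1 if found else 0)
    return found
```

In the OpenMV script you would call signal_detections(net.detect(img, ...)) right after taking the snapshot; the Arduino then only has to read that pin.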

I’m bad at programming. Please help me with the script. To begin with, as I understand it, I need to connect the camera and the board with wires using UART, and then what?
Thank you in advance.

Hi, I cannot just give you a complete script; if I did that, I would end up doing everyone’s work.

I can provide help and support, but you have to do the bulk of the work.

You don’t even need a UART. Just one I/O pin. This is well documented online in our docs and very easy to do.
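On the Arduino side, the sketch only needs to poll that pin and skip the myStepper.step() call while it reads HIGH. A hedged sketch of the decision logic follows; the pin number (2) and the HIGH-means-stop convention are assumptions, so adjust them to your wiring. The stepping decision is isolated in a plain function so it can be read off-device, with the on-board glue shown in comments:

```cpp
const int STOP_PIN = 2;  // assumption: the OpenMV I/O pin is wired to Arduino pin 2

// HIGH (1) from the OpenMV Cam means "object detected" -> hold the motor.
bool motor_may_step(int stop_pin_level) {
    return stop_pin_level == 0;  // LOW -> nothing detected -> keep stepping
}

// On the Arduino itself, the glue added to the stepper sketch above is just:
//
//   void setup() {
//     pinMode(STOP_PIN, INPUT);
//   }
//   void loop() {
//     if (motor_may_step(digitalRead(STOP_PIN))) {
//       myStepper.step(1);
//     }
//   }
```

If the wire can be disconnected at runtime, an external pull-down resistor (or INPUT_PULLUP with inverted polarity) keeps the pin from floating.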