Trouble Establishing Connection between Arduino Uno and OpenMV


Hello everyone,

I recently purchased an OpenMV H7 camera and wanted to establish a UART connection between the camera and my Arduino Uno. I followed the steps in abdalkader’s GitHub example https://github.com/openmv/openmv/blob/master/scripts/examples/00-Arduino/arduino_uart.py, but I am not seeing any output on the serial side of either microcontroller.

This is the code I am using on the Arduino:

void setup() {
  // put your setup code here, to run once:
  Serial.begin(19200);
}

void loop() {
  // put your main code here, to run repeatedly:
  if (Serial.available()) {
    // Read the most recent byte
    byte byteRead = Serial.read();
    // ECHO the value that was read
    Serial.write(byteRead);
    Serial.println(byteRead);
  }
}

This is the code I am running in the OpenMV IDE:

import time
from pyb import UART

# UART 3, and baudrate.
uart = UART(3, 19200)

while(True):
    uart.write("Hello World!\n")
    if (uart.any()):
        print(uart.read())
    time.sleep_ms(1000)  # wait one second between writes

With the following connections:
OpenMV Cam Ground Pin ----> Arduino Ground
OpenMV Cam UART3_TX(P4) ----> Arduino Uno UART_RX(0)
OpenMV Cam UART3_RX(P5) ----> Arduino Uno UART_TX(1)


Attached is an image of my wiring


I would really appreciate some help!

Hi, you’re in luck! Our interface library is almost done! It’s not quite ready yet, but you can use it now for serial comms.

The docs aren’t completed yet, but there’s an Arduino example for comms over UART, SPI, and I2C that works with a matching example on your OpenMV Cam.

Give it a shot.

Also, the reason you are having issues is that you are trying to use the same serial port the Arduino uses to send debug info to the PC to also talk to the OpenMV Cam. That will not work at all.
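If you just want to keep a simple echo test going, one option is to leave "Serial" for the PC and talk to the camera over a software serial port instead. A minimal sketch along those lines (assuming pins 2 and 3 and 19200 baud, to match your OpenMV UART 3 settings) might look like this:

#include <SoftwareSerial.h>

// Assumed wiring for this sketch: OpenMV P4 (UART3 TX) -> Arduino pin 2 (RX),
// OpenMV P5 (UART3 RX) -> Arduino pin 3 (TX), grounds connected.
SoftwareSerial camSerial(2, 3); // RX, TX

void setup() {
    Serial.begin(19200);    // USB debug link to the PC stays free
    camSerial.begin(19200); // link to the OpenMV Cam
}

void loop() {
    // Forward anything received from the camera to the PC serial monitor.
    if (camSerial.available()) {
        Serial.write(camSerial.read());
    }
    // And forward anything typed into the serial monitor to the camera.
    if (Serial.available()) {
        camSerial.write(Serial.read());
    }
}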

Hi kwagyeman,

Thanks for such a rapid response! I’ve read through the GitHub repository, but I am having some trouble figuring out which files to have in my directory. You mentioned that one of the examples is currently working; I think you are referring to face_detection. I’m still quite new to this field, my apologies.

The project is intended for a beginner’s course on electronics and computing. The idea I currently have for my project is to have the OpenMV camera placed on a wall. If a face is detected, a signal should be sent to my Arduino Uno indicating a face detection. In turn, my Arduino would execute a number of commands.

I tried creating a folder which contained the files:

popular_features_as_the_controller_device_example.ino
openmvrpc.cpp
openmvrpc.h

But, my Arduino spits out the error message:
openmvrpc.h: No such file or directory

Perhaps I am approaching this incorrectly. I’d appreciate if you could provide some guidance, thanks again!

I should perhaps add that my Arduino’s circuit will also contain a number of switches and LED lights, a basic character LCD, and a piezo buzzer.

Hi, I’m guessing you don’t know how to install an Arduino Library.

Download the repository as a zip file, unzip it, and put it in your Arduino libraries folder (e.g. home/documents/Arduino/libraries). Restart the Arduino IDE and the library will then show up for imports, along with the example.

Open the example and follow its comments for how to connect things. On the OpenMV Cam side, use OpenMV IDE to open the example under Examples->Remote Control->Popular Features as the Remote Device, uncomment the interface you want on the OpenMV Cam, and run it.

I designed the interface library to be very robust and fault tolerant. Once both boards are running code with the right interfaces selected and the correct I/O pins are hooked up, things will just work.

The example script covers how to make the camera do face detection, color tracking, barcode decoding, AprilTags, QR codes, and data matrices. Everything works.
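For your face-detection idea specifically, the controller side can stay very small. A rough sketch (assuming the software serial interface on pins 2 and 3 at 19200 baud, and a hypothetical LED on pin 7 standing in for whatever commands your Arduino should run) could look something like this:

#include <openmvrpc.h>

// Scratch buffer for RPC messages, as in the library example.
uint8_t scratch_buffer[256 + 4];

// Software serial master on pins 2 (RX) and 3 (TX) at 19200 baud (assumed setup).
openmv::rpc_software_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 2, 3, 19200);

const int LED_PIN = 7; // hypothetical pin; replace with your own outputs

void setup() {
    Serial.begin(115200); // stays free for debug prints
    pinMode(LED_PIN, OUTPUT);
}

void loop() {
    // Matches the <HHHH> struct packed by the face_detection call back on the OpenMV Cam.
    struct { uint16_t x, y, w, h; } face;
    // As in the example, a false return is treated as "no face detected".
    if (interface.call_no_args(F("face_detection"), &face, sizeof(face))) {
        digitalWrite(LED_PIN, HIGH); // face seen: run your commands here
        Serial.print(F("Face at x="));
        Serial.print(face.x);
        Serial.print(F(", y="));
        Serial.println(face.y);
    } else {
        digitalWrite(LED_PIN, LOW);
    }
}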

I made a small change in the code. I realized that the files within the src folder

openmv-rpc-cpp-master/src/

made use of CAN.h, which was not in the directory. So I commented out the lines that use CAN in both files within the src folder, and that got rid of the error (of not being able to find ‘CAN.h’) when I tried to build the Arduino script.

Was this a proper approach?

I believe I managed to get everything hooked up. I am no longer getting any errors, and I am able to successfully upload both programs to their corresponding boards.

There is a slight issue I am encountering, however. When executing the program on my H7 camera, the script runs for approximately one second before the FPS drops to 0 and both the camera preview and the RGB color display freeze. The camera, however, seems to remain connected. Do you happen to know what the issue may be?

CAN.h is a library. If you install the library through the Arduino Library Manager, it will download that dependency for you.

The issue you are encountering is that the comms aren’t working. The OpenMV Cam will run for about a second and then wait for the Arduino to control it.

What interface do you have on the OpenMV Cam and what interface do you have on the Arduino? They need to match.
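For example, if the Arduino side declares the hardware UART master at 115200 baud, the OpenMV script has to select the matching UART slave at the same baud rate (these two lines are taken from the two example scripts):

// Arduino side (controller):
openmv::rpc_hardware_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 115200);

// OpenMV Cam side (remote device), in the MicroPython script:
//     interface = rpc.rpc_uart_slave(baudrate=115200)

The same applies to the SPI, I2C, and CAN variants: pick one interface, uncomment it on both ends, and keep the settings identical.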

Hi, my apologies for the delayed response.

I believe I somewhat managed to obtain the desired result, with a minor issue. I am currently using a software UART interface with the following code:

Arduino:

// Remote Control - As The Controller Device
//
// This script configures your Arduino to remotely control an OpenMV Cam using the RPC
// library.
//
// This script is designed to pair with "popular_features_as_the_remote_device.py" running
// on the OpenMV Cam.

#include <openmvrpc.h>

// The RPC library above provides multiple classes for controlling an OpenMV Cam over
// CAN, I2C, SPI, or Serial (UART).

// We need to define a scratch buffer for holding messages. The maximum amount of data
// you may pass in any one direction is limited to the size of this buffer minus 4.

uint8_t scratch_buffer[256 + 4];

///////////////////////////////////////////////////////////////
// Choose the interface you wish to control an OpenMV Cam over.
///////////////////////////////////////////////////////////////

// Uncomment the below line to setup your Arduino for controlling over CAN.
//
// * message_id - CAN message to use for data transport on the can bus (11-bit).
// * bit_rate - CAN bit rate.
//
// NOTE: Master and slave message ids and can bit rates must match. Connect master can high to slave
//       can high and master can low to slave can low. The can bus must be terminated with 120 ohms.
//
// openmv::rpc_can_master interface(message_id=0x7FF, bit_rate=250000, sampling_point=75);

// Uncomment the below line to setup your Arduino for controlling over I2C.
//
// * slave_addr - I2C address.
// * rate - I2C Bus Clock Frequency.
//
// NOTE: Master and slave addresses must match. Connect master scl to slave scl and master sda
//       to slave sda. You must use external pull ups. Finally, both devices must share a ground.
//
//openmv::rpc_i2c_master interface(scratch_buffer, sizeof(scratch_buffer), 0x12, 100000);

// Uncomment the below line to setup your Arduino for controlling over SPI.
//
// * cs_pin - Slave Select Pin.
// * freq - SPI Bus Clock Frequency.
// * spi_mode - See (https://www.arduino.cc/en/reference/SPI)
//
// NOTE: Master and slave settings must match. Connect CS, SCLK, MOSI, MISO to CS, SCLK, MOSI, MISO.
//       Finally, both devices must share a common ground.
//
// openmv::rpc_spi_master interface(scratch_buffer, sizeof(scratch_buffer), 10, 1000000, SPI_MODE2);

// Uncomment the below line to setup your Arduino for controlling over a hardware UART.
//
// * baudrate - Serial Baudrate.
//
// NOTE: Master and slave baud rates must match. Connect master tx to slave rx and master rx to
//       slave tx. Finally, both devices must share a common ground.
//
// WARNING: The program and debug port for your Arduino may be "Serial". If so, you cannot use
//          "Serial" to connect to an OpenMV Cam without blocking your Arduino's ability to
//          be programmed and use print/println.
//
// openmv::rpc_hardware_serial_uart_master -> Serial
// openmv::rpc_hardware_serial1_uart_master -> Serial1
// openmv::rpc_hardware_serial2_uart_master -> Serial2
// openmv::rpc_hardware_serial3_uart_master -> Serial3
//
 openmv::rpc_hardware_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 115200);

// Uncomment the below line to setup your Arduino for controlling over a software UART.
//
// * rx_pin - RX Pin (See the reference guide about what pins can be used)
// * tx_pin - TX Pin (see the reference guide about what pins can be used)
// * baudrate - Serial Baudrate (See the reference guide https://www.arduino.cc/en/Reference/SoftwareSerial)
//
// NOTE: Master and slave baud rates must match. Connect master tx to slave rx and master rx to
//       slave tx. Finally, both devices must share a common ground.
//
 //openmv::rpc_software_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 2, 3, 19200);

void setup() {
    Serial.begin(115200);
}

//////////////////////////////////////////////////////////////
// Call Back Handlers
//////////////////////////////////////////////////////////////

void exe_face_detection()
{
    struct { uint16_t x, y, w, h; } face_detection_result;
    if (interface.call_no_args(F("face_detection"), &face_detection_result, sizeof(face_detection_result))) {
        Serial.print(F("Largest Face Detected [x="));
        Serial.print(face_detection_result.x);
        Serial.print(F(", y="));
        Serial.print(face_detection_result.y);
        Serial.print(F(", w="));
        Serial.print(face_detection_result.w);
        Serial.print(F(", h="));
        Serial.print(face_detection_result.h);
        Serial.println(F("]"));
    }
}

void exe_person_detection()
{
    char buff[32 + 1] = {}; // null terminator
    if (interface.call_no_args(F("person_detection"), buff, sizeof(buff) - 1)) {
        Serial.println(buff);
    }
}

void exe_qrcode_detection()
{
    char buff[128 + 1] = {}; // null terminator
    if (interface.call_no_args(F("qrcode_detection"), buff, sizeof(buff) - 1)) {
        Serial.println(buff);
    }
}

void exe_apriltag_detection()
{
    struct { uint16_t cx, cy, id, rot; } apriltag_detection_result;
    if (interface.call_no_args(F("apriltag_detection"), &apriltag_detection_result, sizeof(apriltag_detection_result))) {
        Serial.print(F("Largest Tag Detected [cx="));
        Serial.print(apriltag_detection_result.cx);
        Serial.print(F(", cy="));
        Serial.print(apriltag_detection_result.cy);
        Serial.print(F(", id="));
        Serial.print(apriltag_detection_result.id);
        Serial.print(F(", rot="));
        Serial.print(apriltag_detection_result.rot);
        Serial.println(F("]"));
    }
}

void exe_datamatrix_detection()
{
    char buff[128 + 1] = {}; // null terminator
    if (interface.call_no_args(F("datamatrix_detection"), buff, sizeof(buff) - 1)) {
        Serial.println(buff);
    }
}

void exe_barcode_detection()
{
    char buff[128 + 1] = {}; // null terminator
    if (interface.call_no_args(F("barcode_detection"), buff, sizeof(buff) - 1)) {
        Serial.println(buff);
    }
}

void exe_color_detection()
{
    int8_t color_thresholds[6] = {30, 100, 15, 127, 15, 127}; // generic_red_thresholds
    // int8_t color_thresholds[6] = {30, 100, -64, -8, -32, 32}; // generic_green_thresholds
    // int8_t color_thresholds[6] = {0, 30, 0, 64, -128, 0}; // generic_blue_thresholds
    struct { uint16_t cx, cy; } color_detection_result;
    if (interface.call(F("color_detection"), color_thresholds, sizeof(color_thresholds), &color_detection_result, sizeof(color_detection_result))) {
        Serial.print(F("Largest Color Detected [cx="));
        Serial.print(color_detection_result.cx);
        Serial.print(F(", cy="));
        Serial.print(color_detection_result.cy);
        Serial.println(F("]"));
    }
}

// Execute remote functions in a loop. Please choose and uncomment one remote function below.
// Executing multiple at a time may run slowly if the camera needs to change camera modes
// per execution.

void loop() {
    exe_face_detection(); // Face should be about 2ft away.
     //exe_person_detection();
    // exe_qrcode_detection(); // Place the QRCode about 2ft away.
    // exe_apriltag_detection();
    // exe_datamatrix_detection(); // Place the Datamatrix about 2ft away.
    // exe_barcode_detection(); // Place the Barcode about 2ft away.
    // exe_color_detection();
}

OpenMV:

# Remote Control - As The Remote Device
#
# This script configures your OpenMV Cam as a co-processor that can be remotely controlled by
# another microcontroller or computer such as an Arduino, ESP8266/ESP32, RaspberryPi, and
# even another OpenMV Cam.
#
# This script is designed to pair with "popular_features_as_the_controller_device.py".

import image, network, math, rpc, sensor, struct, tf

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)

# The RPC library above is installed on your OpenMV Cam and provides multiple classes for
# allowing your OpenMV Cam to be controlled over CAN, I2C, SPI, UART, USB VCP, or WiFi.

################################################################
# Choose the interface you wish to control your OpenMV Cam over.
################################################################

# Uncomment the below line to setup your OpenMV Cam for control over CAN.
#
# * message_id - CAN message to use for data transport on the can bus (11-bit).
# * bit_rate - CAN bit rate.
# * sampling_point - Tseg1/Tseg2 ratio. Typically 75%. (50.0, 62.5, 75, 87.5, etc.)
#
# NOTE: Master and slave message ids and can bit rates must match. Connect master can high to slave
#       can high and master can low to slave can low. The can bus must be terminated with 120 ohms.
#
# interface = rpc.rpc_can_slave(message_id=0x7FF, bit_rate=250000, sampling_point=75)

# Uncomment the below line to setup your OpenMV Cam for control over I2C.
#
# * slave_addr - I2C address.
#
# NOTE: Master and slave addresses must match. Connect master scl to slave scl and master sda
#       to slave sda. You must use external pull ups. Finally, both devices must share a ground.
#
# interface = rpc.rpc_i2c_slave(slave_addr=0x12)

# Uncomment the below line to setup your OpenMV Cam for control over SPI.
#
# * cs_pin - Slave Select Pin.
# * clk_polarity - Idle clock level (0 or 1).
# * clk_phase - Sample data on the first (0) or second edge (1) of the clock.
#
# NOTE: Master and slave settings must match. Connect CS, SCLK, MOSI, MISO to CS, SCLK, MOSI, MISO.
#       Finally, both devices must share a common ground.
#
# interface = rpc.rpc_spi_slave(cs_pin="P3", clk_polarity=1, clk_phase=0)

# Uncomment the below line to setup your OpenMV Cam for control over UART.
#
# * baudrate - Serial Baudrate.
#
# NOTE: Master and slave baud rates must match. Connect master tx to slave rx and master rx to
#       slave tx. Finally, both devices must share a common ground.
#
interface = rpc.rpc_uart_slave(baudrate=115200)

# Uncomment the below line to setup your OpenMV Cam for control over a USB VCP.
#
# interface = rpc.rpc_usb_vcp_slave()

# Uncomment the below line to setup your OpenMV Cam for control over WiFi.
#
# * ssid - WiFi network to connect to.
# * ssid_key - WiFi network password.
# * ssid_security - WiFi security.
# * port - Port to route traffic to.
# * mode - Regular or access-point mode.
# * static_ip - If not None then a tuple of the (IP Address, Subnet Mask, Gateway, DNS Address)
#
# interface = rpc.rpc_wifi_slave(ssid="",
#                                ssid_key="",
#                                ssid_security=network.WINC.WPA_PSK,
#                                port=0x1DBA,
#                                mode=network.WINC.MODE_STA,
#                                static_ip=None)

################################################################
# Call Backs
################################################################

# Helper methods used by the call backs below.

def draw_detections(img, dects):
    for d in dects:
        c = d.corners()
        l = len(c)
        for i in range(l): img.draw_line(c[(i+0)%l] + c[(i+1)%l], color = (0, 255, 0))
        img.draw_rectangle(d.rect(), color = (255, 0, 0))

# Remote control works via call back methods that the controller
# device calls via the rpc module on this device. Call backs
# are functions which take a bytes() object as their argument
# and return a bytes() object as their result. The rpc module
# takes care of moving the bytes() objects across the link.
# bytes() may be the micropython int max in size.

# When called returns x, y, w, and h of the largest face within view.
#
# data is unused
def face_detection(data):
    sensor.set_pixformat(sensor.GRAYSCALE)
    sensor.set_framesize(sensor.QVGA)
    faces = sensor.snapshot().gamma_corr(contrast=1.5).find_features(image.HaarCascade("frontalface"))
    if not faces: return bytes() # No detections.
    for f in faces: sensor.get_fb().draw_rectangle(f, color = (255, 255, 255))
    out_face = max(faces, key = lambda f: f[2] * f[3])
    return struct.pack("<HHHH", out_face[0], out_face[1], out_face[2], out_face[3])

# When called returns if there's a "person" or "no_person" within view.
#
# data is unused
def person_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    scores = tf.classify("person_detection", sensor.snapshot())[0].output()
    return ['unsure', 'person', 'no_person'][scores.index(max(scores))].encode()

# When called returns the payload string for the largest qrcode
# within the OpenMV Cam's field-of-view.
#
# data is unused
def qrcode_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((320, 240))
    codes = sensor.snapshot().find_qrcodes()
    if not codes: return bytes() # No detections.
    draw_detections(sensor.get_fb(), codes)
    return max(codes, key = lambda c: c.w() * c.h()).payload().encode()

# When called returns a json list of json qrcode objects for all qrcodes in view.
#
# data is unused
def all_qrcode_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((320, 240))
    codes = sensor.snapshot().find_qrcodes()
    if not codes: return bytes() # No detections.
    draw_detections(sensor.get_fb(), codes)
    return str(codes).encode()

# When called returns the x/y centroid, id number, and rotation of the largest
# AprilTag within the OpenMV Cam's field-of-view.
#
# data is unused
def apriltag_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QQVGA)
    tags = sensor.snapshot().find_apriltags()
    if not tags: return bytes() # No detections.
    draw_detections(sensor.get_fb(), tags)
    output_tag = max(tags, key = lambda t: t.w() * t.h())
    return struct.pack("<HHHH", output_tag.cx(), output_tag.cy(), output_tag.id(),
                       int(math.degrees(output_tag.rotation())))

# When called returns a json list of json apriltag objects for all apriltags in view.
#
# data is unused
def all_apriltag_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QQVGA)
    tags = sensor.snapshot().find_apriltags()
    if not tags: return bytes() # No detections.
    draw_detections(sensor.get_fb(), tags)
    return str(tags).encode()

# When called returns the payload string for the largest datamatrix
# within the OpenMV Cam's field-of-view.
#
# data is unused
def datamatrix_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((320, 240))
    codes = sensor.snapshot().find_datamatrices()
    if not codes: return bytes() # No detections.
    draw_detections(sensor.get_fb(), codes)
    return max(codes, key = lambda c: c.w() * c.h()).payload().encode()

# When called returns a json list of json datamatrix objects for all datamatrices in view.
#
# data is unused
def all_datamatrix_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((320, 240))
    codes = sensor.snapshot().find_datamatrices()
    if not codes: return bytes() # No detections.
    draw_detections(sensor.get_fb(), codes)
    return str(codes).encode()

# When called returns the payload string for the largest barcode
# within the OpenMV Cam's field-of-view.
#
# data is unused
def barcode_detection(data):
    sensor.set_pixformat(sensor.GRAYSCALE)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((sensor.width(), sensor.height()//8))
    codes = sensor.snapshot().find_barcodes()
    if not codes: return bytes() # No detections.
    return max(codes, key = lambda c: c.w() * c.h()).payload().encode()

# When called returns a json list of json barcode objects for all barcodes in view.
#
# data is unused
def all_barcode_detection(data):
    sensor.set_pixformat(sensor.GRAYSCALE)
    sensor.set_framesize(sensor.VGA)
    sensor.set_windowing((sensor.width(), sensor.height()//8))
    codes = sensor.snapshot().find_barcodes()
    if not codes: return bytes() # No detections.
    return str(codes).encode()

# When called returns the x/y centroid of the largest blob
# within the OpenMV Cam's field-of-view.
#
# data is the 6-byte color tracking threshold tuple of L_MIN, L_MAX, A_MIN, A_MAX, B_MIN, B_MAX.
def color_detection(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    thresholds = struct.unpack("<bbbbbb", data)
    blobs = sensor.snapshot().find_blobs([thresholds],
                                         pixels_threshold=500,
                                         area_threshold=500,
                                         merge=True,
                                         margin=20)
    if not blobs: return bytes() # No detections.
    for b in blobs:
        sensor.get_fb().draw_rectangle(b.rect(), color = (255, 0, 0))
        sensor.get_fb().draw_cross(b.cx(), b.cy(), color = (0, 255, 0))
    out_blob = max(blobs, key = lambda b: b.density())
    return struct.pack("<HH", out_blob.cx(), out_blob.cy())

# When called returns a jpeg compressed image from the OpenMV
# Cam in one RPC call.
#
# data is unused
def jpeg_snapshot(data):
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    return sensor.snapshot().compress(quality=90).bytearray()

# Register call backs.

interface.register_callback(face_detection)
interface.register_callback(person_detection)
interface.register_callback(qrcode_detection)
interface.register_callback(all_qrcode_detection)
interface.register_callback(apriltag_detection)
interface.register_callback(all_apriltag_detection)
interface.register_callback(datamatrix_detection)
interface.register_callback(all_datamatrix_detection)
interface.register_callback(barcode_detection)
interface.register_callback(all_barcode_detection)
interface.register_callback(color_detection)
interface.register_callback(jpeg_snapshot)

# Once all call backs have been registered we can start
# processing remote events. interface.loop() does not return.

interface.loop()

I am currently using the face detection method, which seems to accurately find a face. The last issue I am encountering is a feed of incomprehensible information on the serial monitor.


The program seems to constantly print a series of characters that I cannot make sense of. Once a face is detected, the proper message is displayed, but the garbled characters soon reappear. Is this meant to happen?

Thanks in advance!

I realize the initial image may be illegible. Below is a more readable output.

It’s because you are using the hardware UART ("Serial" on pins 0/1), which is the same port the debug/programming connection uses, to communicate with the camera.

The code literally warns you about this. There’s a comment right above where you uncommented the hardware UART line.

What you are seeing is the RPC communication to the camera. If you don’t want to see that, use the software serial UART on pins 2 and 3 instead.
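Concretely, in the sketch you pasted that means swapping which interface line is uncommented, roughly like this:

// Comment out the hardware serial master, which shares pins 0/1 with the debug port:
// openmv::rpc_hardware_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 115200);

// ...and uncomment the software serial master on pins 2 (RX) and 3 (TX) instead:
openmv::rpc_software_serial_uart_master interface(scratch_buffer, sizeof(scratch_buffer), 2, 3, 19200);

The baud rates then need to match again, so the OpenMV script would change to rpc.rpc_uart_slave(baudrate=19200), and the wires move from Arduino pins 0/1 to pins 2 and 3 (OpenMV P4/TX to pin 2, P5/RX to pin 3).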

Got it to work, thank you for your patience.