Serial communication on RT1062

Camera Module: RT1062
Env: OpenMV v4.5.9; MicroPython v1.23.0-r19; OpenMV IMXRT1060-MIMXRT1062DVJ6A

Hardware Setup:

  • STM32-L432KC – RT1062 (USART1)
  • STM32-L432KC – RF module (USART2)

Intent: I am trying to use the STM32 to send a capture command to the camera. The camera then sends chunks of data over UART to the STM32. The STM32 adds some metadata and sends it over RF. Another module running on a host PC takes the binary image data from the RF link and reconstructs the image.

Problem

  1. Ideally I would like to use SPI. What is the easiest way to use SPI on the RT1062? As I understand it, it cannot use the pyb library (it is not an STM32-based module).

  2. With my basic UART test, does my camera script look reasonable? I cannot seem to get the UART data to play nicely with the STM32 in my setup. Is it possible I am not accessing the recently captured image from the SD card as expected? Has the camera not written it from internal memory to the SD card by the time I am accessing it?


Camera script:

import sensor, image, time, os
from machine import UART

# ==== Commands on hardware UART(1); debug prints go to the VCP serial terminal (UART(3)) ====
# Change baudrate/timeouts to match your PC environment.
uart = UART(1, baudrate=9600, timeout=10000) # start slow for testing

# Keep track of the last snapped image filename
LAST_FILENAME = None
def capture_image():
    """
    Perform a camera snapshot and save it to a file named with the current
    millisecond timestamp. Then reply 'SNAP OK' or 'SNAP FAIL'.
    """
    global LAST_FILENAME
    print("[Debug] capture_image() called: starting snapshot...")
    try:
        sensor.reset()
        sensor.set_pixformat(sensor.GRAYSCALE)
        sensor.set_framesize(sensor.QVGA)
        sensor.set_quality(50)
        #sensor.set_auto_gain(False)
        #sensor.set_auto_exposure(False)
        #sensor.set_auto_whitebal(False)
        #sensor.set_lens_correction(True, 50, 80)
        sensor.skip_frames(time=1000)

        # Snap
        img = sensor.snapshot()

        # Generate a filename
        filename = "snapshot_{}.jpg".format(time.ticks_ms())
        img.save(filename)
        LAST_FILENAME = filename

        # Confirm success
        print("[Debug] SNAP OK - Image saved as:", filename)
        uart.write(b"SNAP OK\r\n")
    except Exception as e:
        print("[Debug] SNAP FAIL:", e)
        uart.write(b"SNAP FAIL\r\n")

def chunk_image():
    """
    Send back the size of the last snapped image, then stream the raw bytes.
    Finally, send a DONE line. If no image has been snapped, SIZE=0 + DONE.
    """
    global LAST_FILENAME
    print("[Debug] chunk_image() called...")
    if not LAST_FILENAME:
        print("[Debug] No snapped image yet; sending SIZE 0 + DONE.")
        uart.write(b"SIZE 0\r\n")
        uart.write(b"DONE\r\n")
        return

    try:
        filesize = os.stat(LAST_FILENAME)[6]
        # Build the size line
        size_line = "SIZE {}\r\n".format(filesize)

        # Print exactly what we're sending for debugging
        print(f"[Debug] chunk_image() sending line: {repr(size_line)}")

        # Send "SIZE NNN\r\n"
        uart.write(size_line.encode())
        time.sleep_ms(50)  # optional short delay before sending the file data

        print("[Debug] Sending file:", LAST_FILENAME, "Size:", filesize)

        # --- Added debug chunk counter here ---
        chunk_count = 0

        with open(LAST_FILENAME, 'rb') as f:
            while True:
                chunk = f.read(1024)
                if not chunk:
                    break
                chunk_count += 1
                # Debug line to show each chunk's number and length
                print(f"[Debug] chunk #{chunk_count}, len={len(chunk)}")

                uart.write(chunk)

        uart.write(b"DONE\r\n")
        print(f"[Debug] Done sending file data. Total chunks sent: {chunk_count}")
    except Exception as e:
        print("[Debug] CHUNK error:", e)
        # Fallback
        uart.write(b"SIZE 0\r\n")
        uart.write(b"DONE\r\n")

def main_loop():
    """
    Continuously listen for commands:
    SNAP  -> capture_image()
    CHUNK -> chunk_image()
    otherwise -> UNKNOWN COMMAND
    """
    print("[Debug] main_loop started: listening for commands on UART(1)...")
    cmd_buffer = b""
    while True:
        if uart.any():
            data = uart.read(uart.any())
            if data:
                cmd_buffer += data
                if b"\n" in cmd_buffer:
                    lines = cmd_buffer.split(b"\n")
                    # Process all complete lines
                    for line in lines[:-1]:
                        line = line.strip()
                        if line == b"SNAP":
                            print("[Debug] Received 'SNAP' command.")
                            capture_image()
                        elif line == b"CHUNK":
                            print("[Debug] Received 'CHUNK' command.")
                            chunk_image()
                        else:
                            print("[Debug] Unknown command:", line)
                            uart.write(b"UNKNOWN COMMAND\r\n")
                    # Keep any partial leftover
                    cmd_buffer = lines[-1]
        time.sleep_ms(100)  # slight delay for CPU relief

if __name__ == "__main__":
    main_loop()
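
For reference, here is a sketch of how the receiving end could parse the SIZE / raw-bytes / DONE framing that chunk_image() produces. It covers only the host-PC side (it assumes pyserial and a placeholder port name), not the STM32 firmware:

# Host-side sketch of the SIZE/DONE framing (assumes pyserial and a
# placeholder serial port name; adjust port and baudrate to your setup).
import serial

ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=15)  # hypothetical port

ser.write(b"SNAP\n")
print(ser.readline())                  # expect b"SNAP OK\r\n"

ser.write(b"CHUNK\n")
size_line = ser.readline()             # expect b"SIZE NNN\r\n"
size = int(size_line.split()[1])

payload = b""
while len(payload) < size:
    chunk = ser.read(size - len(payload))
    if not chunk:
        break                          # timed out
    payload += chunk

print(ser.readline())                  # expect b"DONE\r\n"

with open("received.jpg", "wb") as f:
    f.write(payload)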


Camera Serial Terminal Debug Output (UART(3))

[Debug] main_loop started: listening for commands on UART(1)...
[Debug] Received 'SNAP' command.
[Debug] capture_image() called: starting snapshot...
[Debug] SNAP OK - Image saved as: snapshot_511462.jpg
[Debug] Received 'CHUNK' command.
[Debug] chunk_image() called...
[Debug] chunk_image() sending line: 'SIZE 2501\r\n'
[Debug] Sending file: snapshot_511462.jpg Size: 2501
[Debug] chunk #1, len=1024
[Debug] chunk #2, len=1024
[Debug] chunk #3, len=453
[Debug] Done sending file data. Total chunks sent: 3


Output on the RF data link


Hi @aqsnyder - I’m happy to help you. But, it’s not possible for me to debug a huge program for you with various components. You need to ask a specific question about something that’s going wrong or not working the way you expect.

I can answer these:

  1. Ideally I would like to use SPI. What is the easiest way to use SPI on the RT1062? As I understand it, it cannot use the pyb library (it is not an STM32-based module).

You can use the machine module to act as a SPI master. Then you can make the STM32 a SPI slave. This will give you the best image transfer speed.
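
For example, a minimal master-side sketch with the machine module could look like this (the SPI bus id and the "P3" chip-select pin are assumptions; check the pinout of your board):

# Minimal SPI-master sketch for the RT1062 using the machine module.
# The bus id (1) and chip-select pin ("P3") are assumptions - check your
# board's pinout before wiring.
from machine import SPI, Pin

cs = Pin("P3", Pin.OUT, value=1)             # manual chip select, idle high
spi = SPI(1, baudrate=10_000_000, polarity=0, phase=0)

def spi_send(buf):
    cs.value(0)                              # assert CS
    spi.write(buf)                           # clock the data out
    cs.value(1)                              # release CS

spi_send(b"hello stm32")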

  2. With my basic UART test, does my camera script look reasonable? I cannot seem to get the UART data to play nicely with the STM32 in my setup. Is it possible I am not accessing the recently captured image from the SD card as expected? Has the camera not written it from internal memory to the SD card by the time I am accessing it?

I can’t answer this question. In general, unless you need the STM32, I would just have the RT1062 do everything if possible. If it can directly drive the RF link, then having it do so would make your life easier.

Note that the RT1062 can just keep the image in RAM. There’s no need for you to save to the SD card and then read it back again.
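
For what it's worth, the in-RAM path could look roughly like this (a sketch only; it assumes Image.compress(), Image.size(), and Image.bytearray() behave as in current OpenMV firmware, and reuses the 9600-baud UART(1) from the script above):

# Sketch of the in-RAM approach: compress the frame buffer and stream it
# straight out over the UART without touching the SD card.
import sensor
from machine import UART

uart = UART(1, baudrate=9600, timeout=10000)

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=1000)

img = sensor.snapshot()
img.compress(quality=50)                     # JPEG-compress in place, stays in RAM

uart.write("SIZE {}\r\n".format(img.size()))
uart.write(img.bytearray())                  # raw JPEG bytes from RAM
uart.write(b"DONE\r\n")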

I am not expecting you to do anything more; I was just trying to give sufficient context without posting too much material. Thank you for addressing the two questions I posted.

  1. I need the RT1062 to act as a SPI slave, not a master. Is this possible?
  2. I need the STM32 in the loop for other features. I plan on using three RT1062’s in the system so I figured I would have the STM32 as master controlling the three cameras and my other sensors.

  1. I need the RT1062 to act as a SPI slave, not a master. Is this possible?

Not yet. MicroPython needs to add a machine SPI slave setup. If you hack the firmware it’s quite possible, but not from MicroPython.

  2. I need the STM32 in the loop for other features. I plan on using three RT1062’s in the system so I figured I would have the STM32 as master controlling the three cameras and my other sensors.

Okay, I would recommend using the RPC library. It’s very good for doing UART comms between microcontrollers.

openmv/openmv-arduino-rpc: Remote Procedure/Python Call Library for Arduino

The camera has it built-in already. You just have to make the Arduino version work on your STM32 processor.
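
On the camera side, a UART RPC slave looks roughly like this (loosely adapted from OpenMV's image-transfer RPC examples; the callback names, baudrate, and quality value are illustrative only):

# Rough camera-side RPC sketch, loosely adapted from OpenMV's image-transfer
# RPC examples. Callback names, baudrate, and quality are illustrative only.
import rpc, sensor, struct

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=1000)

interface = rpc.rpc_uart_slave(baudrate=115200)

def jpeg_snapshot(data):
    # Compress in place and return the size so the master knows how many
    # bytes to expect on the follow-up read.
    img = sensor.snapshot().compress(quality=50)
    return struct.pack("<I", img.size())

def jpeg_read_cb():
    interface.put_bytes(sensor.get_fb().bytearray(), 5000)  # 5 s timeout

def jpeg_read(data):
    # Schedule the bulk transfer to run after this RPC call returns, as in
    # the OpenMV image-transfer examples.
    interface.schedule_callback(jpeg_read_cb)
    return bytes()

interface.register_callback(jpeg_snapshot)
interface.register_callback(jpeg_read)
interface.loop()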

I need to save to the SD card and then read it back at a later time because the RF link is a bottleneck in data transmission. The camera’s first priority is to be ready to take pictures. When the camera is not being used, it is allowed to start piping image data over RF. Therefore I cannot rely on reading the image out of RAM at all times.
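
One way to structure that (just a sketch; send_file_over_uart is a hypothetical stand-in for the chunk_image() logic above, and it reuses the sensor/UART setup from the camera script) is to queue filenames at capture time and only drain the backlog while the camera is idle:

# Sketch of the SD-card-as-buffer idea: queue filenames at capture time and
# drain the backlog only while no new command is pending.
import os, time, sensor

pending = []                             # filenames waiting for the RF link

def queue_capture():
    filename = "snapshot_{}.jpg".format(time.ticks_ms())
    sensor.snapshot().save(filename)     # buffer the frame on the SD card
    pending.append(filename)

def drain_one_if_idle(uart):
    # Only ship backlog when nothing is waiting on the command UART, so
    # taking pictures always has priority over the slow RF link.
    if pending and not uart.any():
        name = pending.pop(0)
        send_file_over_uart(name)        # hypothetical: same chunking as chunk_image()
        os.remove(name)                  # reclaim SD space once sent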