RPC module for Raspberry Pi

Hi! I’m working on a project which needs images transferred from the camera to a Raspberry Pi. Is the RPC module suited for a Raspberry Pi? If so, where can I find documentation for it?

Also, how could I send a signal from my Pi to start the camera’s script and have it begin sending pictures?

Thanks in advance!

The RPC module runs on Linux, and thus on the Pi.

I still need to implement Linux SPI/I2C support; that part is missing. Everything else is there.

You can use RPC to do everything.
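Under the hood, the RPC layer just moves byte strings between the two devices; arguments are packed with `struct` on the caller and unpacked in the remote callback. A minimal, hardware-free sketch of that round trip (the values here are illustrative, not part of the library):

```python
import struct

# Controller side: pack two 32-bit little-endian integers, as the
# image-transfer examples do for (pixformat, framesize).
payload = struct.pack("<II", 1, 4)

# Remote side: the callback receives the same bytes and unpacks them.
pixformat, framesize = struct.unpack("<II", payload)
print(pixformat, framesize)  # -> 1 4
```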

Hi! I got both files for the slave and master, but I can’t seem to be able to run them. Is there something missing in them, or am I doing something else wrong?

Note: this is one error that it gives me: “Could not open ‘/media/pi/4621-0000/rpc.py’ for reading. Either the file does not exist or you don’t have access to it.”

I am trying to run the file from the terminal on the Pi.

Master:

# Image Transfer - As The Controller Device
#
# This script is made to pair with another OpenMV Cam running "image_transfer_raw_as_the_remote_device.py"
#
# This script shows off how to transfer the frame buffer from one OpenMV Cam to another.

import image, network, omv, rpc, sensor, struct, time

# The RPC library above is installed on your OpenMV Cam and provides multiple classes for
# allowing your OpenMV Cam to be controlled over CAN, I2C, SPI, UART, or WIFI.

##############################################################
# Choose the interface you wish to control an OpenMV Cam over.
##############################################################

interface = rpc.rpc_usb_vcp_master()

##############################################################
# Call Back Handlers
##############################################################

def get_frame_buffer_call_back(pixformat, framesize, cutthrough, silent):
    if not silent: print("Getting Remote Frame...")

    result = interface.call("raw_image_snapshot", struct.pack("<II", pixformat, framesize))
    if result is not None:

        w, h, pixformat, size = struct.unpack("<IIII", result)
        img = image.Image(w, h, pixformat, copy_to_fb=True) # Alloc cleared frame buffer.

        if cutthrough:
            # Fast cutthrough data transfer with no error checking.

            # Before starting the cut through data transfer we need to sync both the master and the
            # slave device. On return both devices are in sync.
            result = interface.call("raw_image_read")
            if result is not None:

                # GET BYTES NEEDS TO EXECUTE NEXT IMMEDIATELY WITH LITTLE DELAY NEXT.

                # Read all the image data in one very large transfer.
                interface.get_bytes(img.bytearray(), 5000) # timeout

        else:
            # Slower data transfer with error checking.

            # Transfer 32/8 KB chunks.
            chunk_size = (1 << 15) if omv.board_type() == "H7" else (1 << 13)

            if not silent: print("Reading %d bytes..." % size)
            for i in range(0, size, chunk_size):
                ok = False
                for j in range(3): # Try up to 3 times.
                    result = interface.call("raw_image_read", struct.pack("<II", i, chunk_size))
                    if result is not None:
                        img.bytearray()[i:i+chunk_size] = result # Write the image data.
                        if not silent: print("%.2f%%" % ((i * 100) / size))
                        ok = True
                        break
                    if not silent: print("Retrying... %d/2" % (j + 1))
                if not ok:
                    if not silent: print("Error!")
                    return None

        return img

    else:
        if not silent: print("Failed to get Remote Frame!")

    return None

clock = time.clock()
while(True):
    clock.tick()

    # You may change the pixformat and the framesize of the image transferred from the remote device
    # by modifying the below arguments.
    #
    # When cutthrough is False the image will be transferred through the RPC library with CRC and
    # retry protection on all data moved. For faster data transfer set cutthrough to True so that
    # get_bytes() and put_bytes() are called after an RPC call completes to transfer data
    # more quickly from one image buffer to another. Note: This works because once an RPC call
    # completes successfully both the master and slave devices are synchronized completely.
    #
    img = get_frame_buffer_call_back(sensor.RGB565, sensor.QQVGA, cutthrough=True, silent=True)
    if img is not None:
        pass # You can process the image here.

    print(clock.fps())
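The chunked branch in the script above is just a retry loop over fixed-size reads. The same logic, factored into a pure function and exercised against a fake transport (a hardware-free sketch; `flaky_read` is made up for illustration):

```python
def read_in_chunks(size, chunk_size, call, retries=3):
    """Assemble `size` bytes by calling `call(offset, chunk_size)`,
    retrying each chunk up to `retries` times. Returns None on failure."""
    buf = bytearray(size)
    for i in range(0, size, chunk_size):
        for _ in range(retries):
            result = call(i, chunk_size)
            if result is not None:
                buf[i:i + chunk_size] = result
                break
        else:
            return None  # all retries exhausted for this chunk
    return buf

# Fake transport: fails on the first attempt of every chunk.
attempts = {}
def flaky_read(offset, n):
    attempts[offset] = attempts.get(offset, 0) + 1
    if attempts[offset] == 1:
        return None
    return bytes([offset & 0xFF] * n)

data = read_in_chunks(16, 4, flaky_read)
print(len(data))  # -> 16
```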

Slave:

# Image Transfer - As The Remote Device
#
# This script is meant to talk to the "image_transfer_jpg_as_the_controller_device.py" on your computer.
#
# This script shows off how to transfer the frame buffer to your computer as a jpeg image.

import image, network, omv, rpc, sensor, struct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)

# Turn off the frame buffer connection to the IDE from the OpenMV Cam side.
#
# This needs to be done when manually compressing jpeg images at higher quality
# so that the OpenMV Cam does not try to stream them to the IDE using a fall back
# mechanism if the JPEG image is too large to fit in the IDE JPEG frame buffer on the OpenMV Cam.

omv.disable_fb(True)

# The RPC library above is installed on your OpenMV Cam and provides multiple classes for
# allowing your OpenMV Cam to be controlled over USB or WIFI.

################################################################
# Choose the interface you wish to control your OpenMV Cam over.
################################################################

# Uncomment the below line to setup your OpenMV Cam for control over a USB VCP.
#
interface = rpc.rpc_usb_vcp_slave()


################################################################
# Call Backs
################################################################

# When called, sets the pixformat and framesize, takes a snapshot,
# and then returns the compressed jpeg size so the controller can allocate a buffer for it.
#
# data is a pixformat string and framesize string.
def jpeg_image_snapshot(data):
    pixformat, framesize = bytes(data).decode().split(",")
    sensor.set_pixformat(eval(pixformat))
    sensor.set_framesize(eval(framesize))
    img = sensor.snapshot().compress(quality=90)
    return struct.pack("<I", img.size())

def jpeg_image_read_cb():
    interface.put_bytes(sensor.get_fb().bytearray(), 5000) # timeout

# Read data from the frame buffer given an offset and size.
# If data is empty then a transfer is scheduled after the RPC call finishes.
#
# data is a 4-byte offset and a 4-byte size.
def jpeg_image_read(data):
    if not len(data):
        interface.schedule_callback(jpeg_image_read_cb)
        return bytes()
    else:
        offset, size = struct.unpack("<II", data)
        return memoryview(sensor.get_fb().bytearray())[offset:offset+size]

# Register call backs.

interface.register_callback(jpeg_image_snapshot)
interface.register_callback(jpeg_image_read)

# Once all call backs have been registered we can start
# processing remote events. interface.loop() does not return.



try:
    interface.loop()

except Exception as e:
    print("Error:", e)

Hi, I’m going to release a new version of the IDE which fixes this error: “Could not open ‘/media/pi/4621-0000/rpc.py’ for reading. Either the file does not exist or you don’t have access to it.”

This happens because the IDE thinks the RPC module is external and not built into the firmware.

You should have received a second error message which was the real error.

Hi! Awesome news!

The error it gives me is:

File "call_test.py", line 7, in
    import image, network, omv, rpc, sensor, struct, time
ImportError: No module named image

Basically, it doesn’t import the libraries from the terminal. When I test it in the IDE, it just says that I can’t run the USB VCP while the IDE is connected, but nothing else.

I’ll also leave my updated code, which just tests if the call and callback were successful.

I changed the raw image arguments to jpeg.

Master:

import image
import network
import omv
import rpc
import sensor
import struct
import time


interface = rpc.rpc_usb_vcp_master()


def get_frame_buffer_call_back(pixformat, framesize, cutthrough, silent):
    if not silent:
        print("Getting Remote Frame...")

    result = interface.call("jpeg_image_snapshot",
                            struct.pack("<II", pixformat, framesize))
    if result is not None:

        w, h, pixformat, size = struct.unpack("<IIII", result)
        # Alloc cleared frame buffer.
        img = image.Image(w, h, pixformat, copy_to_fb=True)

        if cutthrough:
            # Fast cutthrough data transfer with no error checking.

            # Before starting the cut through data transfer we need to sync both the master and the
            # slave device. On return both devices are in sync.
            result = interface.call("jpeg_image_read")
            if result is not None:

                # GET BYTES NEEDS TO EXECUTE NEXT IMMEDIATELY WITH LITTLE DELAY NEXT.

                # Read all the image data in one very large transfer.
                interface.get_bytes(img.bytearray(), 5000)  # timeout

        else:
            # Slower data transfer with error checking.

            # Transfer 32/8 KB chunks.
            chunk_size = (1 << 15) if omv.board_type() == "H7" else (1 << 13)

            if not silent:
                print("Reading %d bytes..." % size)
            for i in range(0, size, chunk_size):
                ok = False
                for j in range(3):  # Try up to 3 times.
                    result = interface.call(
                        "jpeg_image_read", struct.pack("<II", i, chunk_size))
                    if result is not None:
                        # Write the image data.
                        img.bytearray()[i:i+chunk_size] = result
                        if not silent:
                            print("%.2f%%" % ((i * 100) / size))
                        ok = True
                        break
                    if not silent:
                        print("Retrying... %d/2" % (j + 1))
                if not ok:
                    if not silent:
                        print("Error!")
                    return None

        return img

    else:
        if not silent:
            print("Failed to get Remote Frame!")

    return None


clock = time.clock()
while(True):
    clock.tick()

    img = get_frame_buffer_call_back(
        sensor.RGB565, sensor.QQVGA, cutthrough=True, silent=True)
    if img is not None:
        print("Sent and Received")  # You can process the image here.

    print(clock.fps())

Slave:

# Image Transfer - As The Remote Device
#
# This script is meant to talk to the "image_transfer_jpg_as_the_controller_device.py" on your computer.
#
# This script shows off how to transfer the frame buffer to your computer as a jpeg image.

import image
import network
import omv
import rpc
import sensor
import struct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# Turn off the frame buffer connection to the IDE from the OpenMV Cam side.
#
# This needs to be done when manually compressing jpeg images at higher quality
# so that the OpenMV Cam does not try to stream them to the IDE using a fall back
# mechanism if the JPEG image is too large to fit in the IDE JPEG frame buffer on the OpenMV Cam.

omv.disable_fb(True)


interface = rpc.rpc_usb_vcp_slave(baudrate=7500000)


def jpeg_image_snapshot(data):
    pixformat, framesize = bytes(data).decode().split(",")
    sensor.set_pixformat(eval(pixformat))
    sensor.set_framesize(eval(framesize))
    img = sensor.snapshot().compress(quality=90)
    return struct.pack("<I", img.size())


def jpeg_image_read_cb():
    interface.put_bytes(sensor.get_fb().bytearray(), 5000)  # timeout


def jpeg_image_read(data):
    if not len(data):
        interface.schedule_callback(jpeg_image_read_cb)
        return bytes()
    else:
        offset, size = struct.unpack("<II", data)
        return memoryview(sensor.get_fb().bytearray())[offset:offset+size]

# Register call backs.


interface.register_callback(jpeg_image_snapshot)
interface.register_callback(jpeg_image_read)

# Once all call backs have been registered we can start
# processing remote events. interface.loop() does not return.


try:
    interface.loop()

except Exception as e:
    print("Error:", e)

There’s no `image` module in Python on the desktop… because this is not the OpenMV Cam.

You’ve asked two questions now that are obvious Python issues. Please check which scripts you are running on which system.

I’m sorry, I’m not that fluent in python. I’m running the “master” script on my Raspberry Pi, and I have my “slave” script on my OpenMV camera. Could you give me a small guide on what I need to do in order to transmit the images from my camera to the Pi?

You just have to copy the slave script to the camera and run the master script on the Pi. It should work without any issues then. There’s really not much more to it. Please install pygame to see the image output.
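If you would rather not use pygame, the bytes the controller receives are a complete JPEG, so you can simply write them to a file on the Pi and open it with any image viewer (a hypothetical sketch; `img` stands for the bytearray the call-back returns):

```python
import os
import tempfile

# Placeholder for the received JPEG bytes (real data comes from the camera).
img = bytearray(b"\xff\xd8fake jpeg data\xff\xd9")

path = os.path.join(tempfile.gettempdir(), "frame.jpg")
with open(path, "wb") as f:
    f.write(img)

print(os.path.getsize(path) == len(img))  # -> True
```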

Thanks!

I have everything in place, but when can I expect the firmware update so I can run the scripts?

Also, if I’m running the scripts from the terminal, are the imports affected?

You should be able to run the scripts right now.

Please ask an exact question about an issue you may encounter.

Hi. I tried running the scripts on both my Pi and my laptop without the IDE at first. The results were the same error:


Exception has occurred: ModuleNotFoundError
No module named 'image'
File "C:\Users\gabri\Documents\DRACO-Payload\calltest.py", line 7, in
    import image, network, omv, rpc, sensor, struct, time

When I used the IDE, this was the error:

Could not open "D:\rcp.py" for reading. Either the file doesn't exist, or you do not have access to it.

Was this solved on the latest update? Or am I missing something? Because I can’t run the files while the IDE is connected because I’m using a USB VCP connection.

You are trying to run the script meant for the camera on the Pi. Of course this will not work. Please put the OpenMV Cam Python script on the OpenMV Cam.

Which scripts from the examples are you putting where?

I think I found the error. I haven’t imported the rpc library and the others onto my device (image, network, omv, rpc, sensor, struct, time).

I’m using windows, what commands should I run on my command prompt?

This script needs to be on the Pi:

And this library:

Then on the OpenMV Cam you name this file as main.py on the disk that appears:

Then the example should work…

Hi! I added the files and installed pygame and serial through pip, but pygame and serial do not seem to appear as installed on the IDE.

When I try to run the controller script on the IDE, I get the error saying that there is no module named ‘serial’, but when I run it on the terminal, it says: “Import Error: no module named tools”

Note: I installed everything again just to be sure that they were on the latest versions, but it didn’t seem to help.

Any idea on how can I solve this?

Thanks, and sorry for my newbie question!

Hi, the RPC library is not meant to be run inside of the IDE. It’s a Python script you have to execute on the command line of the Pi. You have to pip install any dependencies to run the script on the Pi.

This is all explained in the README here:

Thanks for clearing that up for me!
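Before running the controller script on the Pi’s command line, it can help to verify that the pip-installed dependencies are actually importable by the same interpreter (a sketch; adjust the module names to whatever the README lists):

```python
from importlib.util import find_spec

def missing(modules):
    """Return the subset of module names the current interpreter cannot find."""
    return [m for m in modules if find_spec(m) is None]

# 'serial' is provided by the pyserial package, not a package named 'serial'.
print(missing(["struct", "sys"]))     # stdlib modules -> []
print(missing(["serial", "pygame"]))  # empty if the pip installs worked
```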

Hi! I installed pyserial, but the IDE doesn’t recognize it yet, and it still gives me the error that it does not exist.

The module is stored in /usr/local/lib/python2.7/dist-packages

Any idea on what the problem might be? Thanks in advance.

Again, using the RPC library over USB doesn’t involve openmv ide. It’s all command line.

Oh gosh, I’m sorry, my head has been everywhere. The script ran, but I had to debug it because the functions being called on the camera are not returning anything.
I put the same ports on both interfaces, and still, nothing happened.

I’ll leave both scripts here so you could kindly check whether something is wrong.

Camera: (main.py)

# Image Transfer - As The Remote Device
#
# This script is meant to talk to the "image_transfer_jpg_as_the_controller_device.py" on your computer.
#
# This script shows off how to transfer the frame buffer to your computer as a jpeg image.

import image, network, omv, rpc, sensor, struct

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)

# Turn off the frame buffer connection to the IDE from the OpenMV Cam side.
#
# This needs to be done when manually compressing jpeg images at higher quality
# so that the OpenMV Cam does not try to stream them to the IDE using a fall back
# mechanism if the JPEG image is too large to fit in the IDE JPEG frame buffer on the OpenMV Cam.

omv.disable_fb(True)

# The RPC library above is installed on your OpenMV Cam and provides multiple classes for
# allowing your OpenMV Cam to be controlled over USB or WIFI.

################################################################
# Choose the interface you wish to control your OpenMV Cam over.
################################################################

# Uncomment the below line to setup your OpenMV Cam for control over a USB VCP.
#
interface = rpc.rpc_usb_vcp_slave(port=/dev/ttyACM0)

# Uncomment the below line to setup your OpenMV Cam for control over WiFi.
#
# * ssid - WiFi network to connect to.
# * ssid_key - WiFi network password.
# * ssid_security - WiFi security.
# * port - Port to route traffic to.
# * mode - Regular or access-point mode.
# * static_ip - If not None then a tuple of the (IP Address, Subnet Mask, Gateway, DNS Address)
#
# interface = rpc.rpc_wifi_slave(ssid="",
#                                ssid_key="",
#                                ssid_security=network.WINC.WPA_PSK,
#                                port=0x1DBA,
#                                mode=network.WINC.MODE_STA,
#                                static_ip=None)

################################################################
# Call Backs
################################################################

# When called sets the pixformat and framesize, takes a snapshot
# and then returns the frame buffer jpg size to store the image in.
#
# data is a pixformat string and framesize string.
def jpeg_image_snapshot(data):
    pixformat, framesize = bytes(data).decode().split(",")
    sensor.set_pixformat(eval(pixformat))
    sensor.set_framesize(eval(framesize))
    img = sensor.snapshot().compress(quality=90)
    return struct.pack("<I", img.size())

def jpeg_image_read_cb():
    interface.put_bytes(sensor.get_fb().bytearray(), 5000) # timeout

# Read data from the frame buffer given an offset and size.
# If data is empty then a transfer is scheduled after the RPC call finishes.
#
# data is a 4-byte offset and a 4-byte size.
def jpeg_image_read(data):
    if not len(data):
        interface.schedule_callback(jpeg_image_read_cb)
        return bytes()
    else:
        offset, size = struct.unpack("<II", data)
        return memoryview(sensor.get_fb().bytearray())[offset:offset+size]

# Register call backs.

interface.register_callback(jpeg_image_snapshot)
interface.register_callback(jpeg_image_read)

# Once all call backs have been registered we can start
# processing remote events. interface.loop() does not return.

interface.loop()

Master:

# Image Transfer - As The Controller Device
#
# This script is meant to talk to the "image_transfer_jpg_as_the_remote_device_for_your_computer.py" on the OpenMV Cam.
#
# This script shows off how to transfer the frame buffer to your computer as a jpeg image.

import io, rpc, serial, socket, struct, sys
from serial.tools import list_ports

# Fix Python 2.x.
try: input = raw_input
except NameError: pass

# The RPC library above is installed on your OpenMV Cam and provides multiple classes for
# allowing your OpenMV Cam to be controlled over USB or WIFI.

##############################################################
# Choose the interface you wish to control an OpenMV Cam over.
##############################################################

# Uncomment the below lines to setup your OpenMV Cam for controlling over a USB VCP.
#
# * port - Serial Port Name.
#
print("\nAvailable Ports:\n")
for port, desc, hwid in serial.tools.list_ports.comports():
    print("port: {} : {} [{}]".format(port, desc, hwid))
sys.stdout.write("\nPlease enter a port name: ")
sys.stdout.flush()
interface = rpc.rpc_usb_vcp_master(port=input())
print("")
sys.stdout.flush()

# Uncomment the below line to setup your OpenMV Cam for controlling over WiFi.
#
# * slave_ip - IP address to connect to.
# * my_ip - IP address to bind to ("" to bind to all interfaces...)
# * port - Port to route traffic to.
#
# interface = rpc.rpc_wifi_or_ethernet_master(slave_ip="xxx.xxx.xxx.xxx", my_ip="", port=0x1DBA)

##############################################################
# Call Back Handlers
##############################################################

def get_frame_buffer_call_back(pixformat_str, framesize_str, cutthrough, silent):
    if not silent: print("Getting Remote Frame...")

    result = interface.call("jpeg_image_snapshot", "%s,%s" % (pixformat_str, framesize_str))
    if result is not None:

        size = struct.unpack("<I", result)[0]
        img = bytearray(size)

        print(1)

        if cutthrough:
            # Fast cutthrough data transfer with no error checking.

            # Before starting the cut through data transfer we need to sync both the master and the
            # slave device. On return both devices are in sync.
            result = interface.call("jpeg_image_read")
            if result is not None:

                # GET BYTES NEEDS TO EXECUTE NEXT IMMEDIATELY WITH LITTLE DELAY NEXT.

                # Read all the image data in one very large transfer.
                interface.get_bytes(img, 5000) # timeout

        else:
            # Slower data transfer with error checking.

            # Transfer 32 KB chunks.
            chunk_size = (1 << 15)

            if not silent: print("Reading %d bytes..." % size)
            for i in range(0, size, chunk_size):
                ok = False
                for j in range(3): # Try up to 3 times.
                    result = interface.call("jpeg_image_read", struct.pack("<II", i, chunk_size))
                    if result is not None:
                        img[i:i+chunk_size] = result # Write the image data.
                        if not silent: print("%.2f%%" % ((i * 100) / size))
                        ok = True
                        break
                    if not silent: print("Retrying... %d/2" % (j + 1))
                if not ok:
                    if not silent: print("Error!")
                    return None

        return img

    else:
        if not silent: print("Failed to get Remote Frame!")

    return None

while(True):
    sys.stdout.flush()

    # You may change the pixformat and the framesize of the image transferred from the remote device
    # by modifying the below arguments.
    #
    # When cutthrough is False the image will be transferred through the RPC library with CRC and
    # retry protection on all data moved. For faster data transfer set cutthrough to True so that
    # get_bytes() and put_bytes() are called after an RPC call completes to transfer data
    # more quickly from one image buffer to another. Note: This works because once an RPC call
    # completes successfully both the master and slave devices are synchronized completely.
    #
    img = get_frame_buffer_call_back("sensor.RGB565", "sensor.QQVGA", cutthrough=True, silent=False)
    print(2)

    if img is not None:
        print("sent-")

    #print(clock.get_fps())