Store an image on the Raspberry Pi to reuse it for person recognition

I have tried the code in the RPC library for streaming the image, but I also want to save it, not just stream it to the frame buffer.

Add code to save the image. We provide example scripts for this.

Thanks, kwagyeman.
I have tried this script and it works, but I want to save the image on the Raspberry Pi, not on the OpenMV Cam.

# Assumes the usual sensor/clock setup (sensor.reset(), pixformat, framesize,
# clock = time.clock()) has already run.
while(True):
    clock.tick()
    sensor.snapshot().save("example.jpg")  # Saves onto the OpenMV cam's SD card, not the Pi.
    print(clock.fps())

Hi, did you see the RPC image streaming example scripts in our openmv github repo? These show how to stream images from the camera. You need to put the code to save images on the Pi.
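For example, once a JPEG frame has arrived on the Pi, saving it is just a binary write. A minimal sketch of that part (save_frame and jpeg_bytes are illustrative names, not something from the example scripts):

import os
from datetime import datetime

def save_frame(jpeg_bytes, directory="captures"):
    # jpeg_bytes: one compressed frame received from the camera.
    os.makedirs(directory, exist_ok=True)
    # Timestamped name so repeated snapshots don't overwrite each other.
    name = datetime.now().strftime("%Y%m%d_%H%M%S_%f") + ".jpg"
    path = os.path.join(directory, name)
    with open(path, "wb") as f:  # "wb" because JPEG data is binary.
        f.write(jpeg_bytes)
    return path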

Can you send the full script?

I read everything you said but got no result.
I am using this script:

# This script should NOT be run from the IDE or command line, it should be saved as main.py
# Note the following commented script shows how to receive the image from the host side.
#
# #!/usr/bin/env python2.7
# import sys, serial, struct
# port = '/dev/ttyACM0'
# sp = serial.Serial(port, baudrate=115200, bytesize=serial.EIGHTBITS, parity=serial.PARITY_NONE,
#             xonxoff=False, rtscts=False, stopbits=serial.STOPBITS_ONE, timeout=None, dsrdtr=True)
# sp.setDTR(True) # dsrdtr is ignored on Windows.
# sp.write("snap")
# sp.flush()
# size = struct.unpack('<L', sp.read(4))[0]
# img = sp.read(size)
# sp.close()
#
# with open("img.jpg", "w") as f:
#     f.write(img)

import sensor, image, time, ustruct
from pyb import USB_VCP

usb = USB_VCP()
sensor.reset()                      # Reset and initialize the sensor.
sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565 (or GRAYSCALE)
sensor.set_framesize(sensor.QVGA)   # Set frame size to QVGA (320x240)
sensor.skip_frames(time = 2000)     # Wait for settings to take effect.

while(True):
    cmd = usb.recv(4, timeout=5000)
    if (cmd == b'snap'):
        img = sensor.snapshot().compress()
        usb.send(ustruct.pack("<L", img.size()))
        usb.send(img)

but Thonny shows me an error about importing pyb.
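(For reference, the commented host-side receiver in that script is written for Python 2.7. Under Python 3 on the Pi it needs byte strings and binary file mode. A sketch of an equivalent, assuming the camera enumerates as /dev/ttyACM0 and is running the main.py above; this part runs on the Pi, not on the camera:)

#!/usr/bin/env python3
import serial, struct

sp = serial.Serial('/dev/ttyACM0', baudrate=115200, bytesize=serial.EIGHTBITS,
                   parity=serial.PARITY_NONE, xonxoff=False, rtscts=False,
                   stopbits=serial.STOPBITS_ONE, timeout=None, dsrdtr=True)
sp.dtr = True          # Same effect as setDTR(True); dsrdtr is ignored on Windows.
sp.write(b"snap")      # Python 3: serial data must be bytes, not str.
sp.flush()
size = struct.unpack('<L', sp.read(4))[0]
img = sp.read(size)
sp.close()

with open("img.jpg", "wb") as f:   # "wb": JPEG data is binary.
    f.write(img)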

What camera are you using?

OpenMV M7

Hi kwagyeman, could you modify this part of the code?

import sys, serial, struct

port = '/dev/ttyACM0'
sp = serial.Serial(port, baudrate=115200, bytesize=serial.EIGHTBITS, parity=serial.PARITY_NONE,
                   xonxoff=False, rtscts=False, stopbits=serial.STOPBITS_ONE, timeout=None, dsrdtr=True)
sp.setDTR(True) # dsrdtr is ignored on Windows.
sp.write("snap")
sp.flush()
size = struct.unpack('<L', sp.read(4))[0]
img = sp.read(size)
sp.close()

with open("img.jpg", "w") as f:
    f.write(img)

Please use the RPC examples. The example script above is very brittle and not likely to work reliably.

Can you please recommend a script that I should use to send the image and save it on my Raspberry Pi?

See this. You’ll need to modify the pygame script code to save an image, but you should be able to stream images:
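A rough sketch of what that modification might look like, assuming the controller-side image transfer example's frame callback (get_frame_buffer_call_back in the repo script) returns the JPEG frame as bytes or None; check the names and arguments against the current version of the script:

# Replace the pygame display loop at the bottom of the RPC controller
# example with a loop that writes each received frame to disk on the Pi.
import time

frame_count = 0
while True:
    # Assumed to return one JPEG-compressed frame as bytes (None on failure),
    # as in the image-transfer controller example.
    img = get_frame_buffer_call_back("sensor.RGB565", "sensor.QVGA",
                                     cutthrough=False, silent=True)
    if img is not None:
        with open("frame_%05d.jpg" % frame_count, "wb") as f:
            f.write(img)
        frame_count += 1
    time.sleep(0.1)  # Simple throttle; adjust or remove as needed.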

I need a script for that.