Hello
I’m trying to make a time-lapse with the RT1062, with every picture saved to my FTP server. The problem is that all the transferred files (the photos taken) are corrupted: Windows can open each file, but half of the photo is covered in artifacts.
I have to say that my code was generated by ChatGPT; I suspect the transfer function isn’t working correctly.
def send_ftp(filename):
    try:
        print("🔌 Connecting to FTP...")
        addr = usocket.getaddrinfo(FTP_SERVER, 21)[0][-1]
        sock = usocket.socket()
        sock.connect(addr)
        sock.recv(1024)  # read the welcome banner

        def send(cmd):
            sock.send(cmd.encode() + b"\r\n")
            resp = sock.recv(1024)
            print("> " + cmd)
            print("< " + str(resp))
            return resp

        send("USER " + FTP_USER)
        send("PASS " + FTP_PASS)
        send("TYPE I")  # binary mode -- mandatory for JPEG data
        resp = send("PASV")
        if b"(" not in resp or b")" not in resp:
            print("❌ Bad PASV response")
            sock.close()
            return
        pasv = resp.decode().split("(")[1].split(")")[0].split(",")
        ip = ".".join(pasv[:4])
        port = (int(pasv[4]) << 8) + int(pasv[5])
        send("CWD home")  # change directory before opening the data connection
        data_sock = usocket.socket()
        data_sock.connect((ip, port))
        send("STOR " + filename)
        with open(filename, "rb") as f:
            while True:
                buf = f.read(1024)
                if not buf:
                    break
                # send() may transmit fewer bytes than requested; loop
                # until the whole chunk has gone out, otherwise the
                # uploaded file ends up truncated/corrupted.
                while buf:
                    n = data_sock.send(buf)
                    buf = buf[n:]
        data_sock.close()
        sock.recv(1024)  # read the final "226 Transfer complete" reply
        send("QUIT")
        sock.close()
        print("✅ FTP done:", filename)
        os.remove(filename)
        print("🗑️ Deleted:", filename)
    except Exception as e:
        print("❌ FTP error:", e)
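As a sanity check, the PASV parsing step from the function above can be tested in isolation on a desktop Python install (the 227 reply below is a made-up example):

```python
def parse_pasv(resp):
    # A 227 reply looks like: b'227 Entering Passive Mode (h1,h2,h3,h4,p1,p2).'
    # The data-connection address is h1.h2.h3.h4 and the port is p1 * 256 + p2.
    fields = resp.decode().split("(")[1].split(")")[0].split(",")
    ip = ".".join(fields[:4])
    port = (int(fields[4]) << 8) + int(fields[5])
    return ip, port

ip, port = parse_pasv(b"227 Entering Passive Mode (192,168,1,10,195,80).")
print(ip, port)  # 192.168.1.10 50000
```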
Hi, I would avoid saving the images to disk. When you call snapshot() you have a bytearray of the image that you can just send.
data = snapshot().to_jpeg().bytearray()
Then ask ChatGPT how to send a JPEG byte array via FTP using Python. It should generate better code.
Well, I tried, but I’m not really confident with software programming; I’m a hardware person.
def capture_to_memory():
    led.on()
    img = sensor.snapshot()
    jpeg = img.compress(quality=85)
    data = jpeg.bytearray()
    return data  # hand the in-memory JPEG bytes to the upload code
Does anyone have an example of a working time-lapse?
Here’s a MicroPython library that works via FTP:

from ftplib import FTP
from io import BytesIO

# Your FTP credentials and server details
ftp_host = 'ftp.example.com'
ftp_user = 'your_username'
ftp_pass = 'your_password'
remote_filename = 'image.jpg'  # The name you want the image to have on the server

# Your image as a bytearray (already in memory)
jpeg_data = bytearray(...)  # your actual bytearray here

# Create a BytesIO stream from the bytearray
image_stream = BytesIO(jpeg_data)

# Connect and login to the FTP server
with FTP(ftp_host) as ftp:
    ftp.login(user=ftp_user, passwd=ftp_pass)
    # Change to the desired directory if needed
    # ftp.cwd('/path/on/server')
    # Upload the image
    ftp.storbinary(f'STOR {remote_filename}', image_stream)
    print("Upload successful!")
OpenMV doesn’t like it:
ImportError: no module named 'ftplib'
Thanks, I put the file in the flash, but all pictures on my FTP server are still corrupted. I suspect this FTP function, because everything works well if I save the pictures to the flash instead.
Hi, the next step to debug this kind of thing is typically to use Wireshark to look at the packets and see what’s happening:
All of the OpenMV Cam’s API functions work correctly here. We can definitely stream MJPEG video over HTTP and RTSP connections.
Maybe start here… try sending a simple bytearray with some text data in it using the above code and see if that works. Then make the amount of data larger, up to about 10 KB. Verify that’s fine; then you’ll know the FTP transfer works. After that, it’s just an issue with sending the JPEG image.
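To follow that suggestion, the test payloads can be built in plain Python first; a sketch (the exact contents and sizes are arbitrary):

```python
# Small payload first: easy to eyeball in the uploaded file.
small = b"HELLO FTP TEST 0123456789\r\n"

# Then a larger one, about 10 KB, made of a repeating pattern so any
# missing or duplicated chunk is obvious in a hex/text editor.
big = b"0123456789ABCDEF" * 640  # 16 bytes * 640 = 10240 bytes
assert len(big) == 10240

# Upload `small`, check it byte-for-byte on the server, then `big`.
# If both arrive intact, the transport is fine and the problem is in
# how the JPEG data is produced or written.
```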
Hello,
I tried many times, but I can’t find why my files are not correct when transmitted via FTP.
Is there another way to transfer the photos over Wi-Fi, like SMB3 or something similar?
Thanks
I mean, there are infinite ways. You just need to pick a server protocol to use. Typically, file upload is done via HTTP multipart transfers. The requests library onboard the camera has support for this.
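For reference, a multipart/form-data body can be assembled by hand around the raw JPEG bytes and then POSTed from the camera. This is only a sketch: the boundary string, field name, filename, and server URL are made up, and the exact POST call depends on which requests module your firmware ships.

```python
def build_multipart(field, filename, payload, boundary="openmv-boundary"):
    # Wrap the raw JPEG bytes in a multipart/form-data body.
    head = (
        "--" + boundary + "\r\n"
        'Content-Disposition: form-data; name="' + field + '"; '
        'filename="' + filename + '"\r\n'
        "Content-Type: image/jpeg\r\n\r\n"
    ).encode()
    tail = ("\r\n--" + boundary + "--\r\n").encode()
    body = head + payload + tail
    content_type = "multipart/form-data; boundary=" + boundary
    return body, content_type

body, ctype = build_multipart("file", "frame0001.jpg",
                              b"\xff\xd8...jpeg bytes...\xff\xd9")

# On the camera it would then be sent roughly like this (URL is hypothetical):
# import requests
# requests.post("http://192.168.1.50:8000/upload",
#               data=body, headers={"Content-Type": ctype})
```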
What’s odd is that you are able to send the files… but they end up corrupted, right? Is the number of bytes being sent correct? Maybe there’s a bug in the MicroPython FTP library I linked to, as it’s freeware.
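One thing worth checking on the byte count: socket.send() can transmit fewer bytes than you pass it and returns the count actually sent. If that return value is ignored, data is silently dropped mid-transfer, which matches the half-corrupted photos. A small helper, exercised here with a fake socket that only accepts 7 bytes per call (the real code would pass the FTP data socket instead):

```python
def send_all(sock, data):
    # Keep calling send() until every byte has gone out.
    view = memoryview(data)
    sent = 0
    while sent < len(data):
        n = sock.send(view[sent:])
        sent += n
    return sent

class SlowSocket:
    # Test double: accepts at most 7 bytes per send() call,
    # mimicking a socket doing partial sends.
    def __init__(self):
        self.received = b""
    def send(self, buf):
        chunk = bytes(buf)[:7]
        self.received += chunk
        return len(chunk)

s = SlowSocket()
send_all(s, b"0123456789" * 100)
print(len(s.received))  # 1000
```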