UART to NodeMCU

Hello, I want to send an image from OpenMV to NodeMCU through UART. Can you provide me with code to do so, as I am unfamiliar with OpenMV?
Also, is UART the best choice, or would SPI be better?

UART is definitely what you want. Do something like this:

# Untitled - By: kwagyeman - Tue Jun 19 2018

import sensor, image, time, struct
from pyb import UART

# Always pass UART 3 for the UART number for your OpenMV Cam.
# The second argument is the UART baud rate. For a more advanced UART control
# example see the BLE-Shield driver.
uart = UART(3, 115200, timeout_char=1000)

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

clock = time.clock()

while(True):
    clock.tick()
    # Compress first so the 4-byte size header matches the JPEG data that follows.
    img = sensor.snapshot().compress(quality=50)
    uart.write(struct.pack("<l", img.size()))
    uart.write(img)
    print(clock.fps())

Basically, the first 4 bytes will be the JPEG image size, followed by the actual data bytes of the image. Note that the serial stream in this case has no back pressure, so it may be hard to receive. You'll need some type of flow control to make this more reliable. Otherwise, you'll need a DMA buffer able to handle about 10 KB or so of image data that gets sent in one go.
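On the receiving side, the framing is simple to parse. Here's a minimal sketch in plain Python (`parse_frame` is a hypothetical helper, and the real NodeMCU code would read from its UART instead of a byte buffer):

```python
import struct

def parse_frame(stream):
    """Parse one length-prefixed JPEG frame from a byte buffer.

    The camera writes a 4-byte little-endian size header followed by
    exactly that many bytes of JPEG data.
    """
    (size,) = struct.unpack("<l", stream[:4])  # little-endian 32-bit size
    jpeg = stream[4:4 + size]
    return size, jpeg

# Example: a 5-byte stand-in "JPEG" payload framed the same way the camera frames it
payload = b"\xff\xd8abc"
frame = struct.pack("<l", len(payload)) + payload
size, jpeg = parse_frame(frame)
```

The receiver should first read exactly 4 bytes, decode the size, then keep reading until that many data bytes have arrived.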

I don't know what you mean by "no back pressure" and "flow control", and I also don't know how to receive such big data with the NodeMCU and send it through WiFi :confused:

Yeah, so, you need some method to flow control how much data the camera sends per second. Um, what wires are you using to connect things?

I use jumper wires, like these: https://www.amazon.com/dp/B01EV70C78?aaxitk=hM1UgLesW25RYhmSPgnKHw&pd_rd_i=B01EV70C78&pf_rd_m=ATVPDKIKX0DER&pf_rd_p=3930100107420870094&pf_rd_s=desktop-sx-top-slot&pf_rd_t=301&pf_rd_i=jumper+wires&hsa_cr_id=4414025220401

Sorry, I meant: which I/O pins, etc.?

Note, since I keep getting questions about how to connect the OpenMV Cam to other devices I will write a tutorial about how to do this in the documentation for the next release.

I connect P4 and P5 on the OpenMV to D9 and D10 on the NodeMCU.

Thank you so much!

Great, and both devices share a common ground wire right?

yes, sure

Okay, so, what part isn’t working then?

Can you run this script and confirm you receive 4 bytes that specify the image size?

# Untitled - By: kwagyeman - Tue Jun 19 2018

import sensor, image, time, struct
from pyb import UART

# Always pass UART 3 for the UART number for your OpenMV Cam.
# The second argument is the UART baud rate. For a more advanced UART control
# example see the BLE-Shield driver.
uart = UART(3, 115200, timeout_char=1000)

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)

clock = time.clock()

while(True):
    clock.tick()
    img = sensor.snapshot()
    # Send only the 4-byte size header (the uncompressed frame size).
    uart.write(struct.pack("<l", img.size()))
    print(clock.fps())

No image will be sent but you should have the image size in bytes. Please try to get this basic part working first with NodeMCU.

I am done with this; it prints 0, 150, 0, 0.

Great, so you are able to receive the size in bytes of an uncompressed image. Those four bytes form a little-endian 32-bit integer: 38400, which is what you received.
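You can verify that arithmetic yourself in plain Python; the second byte contributes 150 × 256, and 38400 is exactly a QQVGA RGB565 frame (160 × 120 pixels × 2 bytes per pixel):

```python
import struct

# The four bytes printed by the NodeMCU, in the order they arrived
(size,) = struct.unpack("<l", bytes([0, 150, 0, 0]))
print(size)           # 0 + 150*256 + 0*65536 + 0*16777216 -> 38400
print(160 * 120 * 2)  # QQVGA RGB565: width * height * bytes per pixel -> 38400
```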

Now, we can do the image transfer part. I can give you the OpenMV Cam code for this. However, I need to know how much space you have on the NodeMCU device to buffer an image. Images can be several kilobytes in size. The way you should make the image transfer work is for the OpenMV Cam to know ahead of time how much space you have; I can then have it limit the JPEG compression to a certain image size.

So, can you determine how much buffer space you are able to allocate on the NodeMCU? The more bytes you can allocate, the better.

The NodeMCU is an ESP8266 chip, and I believe it only has about 36 kB of usable RAM. I played with it a bit and it is great as a node for IoT applications with smaller RAM needs. I changed to its bigger brother, the ESP32, because it has a much bigger 520 kB of SRAM, and there are also dev boards with an external 4 MB or 8 MB of PSRAM on top.

I searched for a long time but found different answers! So I tried it myself, and I managed to send a maximum of 1930 chars and receive them with no problem.

I wonder if I can divide each frame, send it part by part, and then reconstruct the image again?

If not, can you give me code to send an image at the lowest quality? I will test it and then try to increase the quality gradually.
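Dividing the frame into parts does work: send chunks no larger than your tested limit and concatenate them in order on the receiver. A minimal host-side sketch of the idea (the 1024-byte chunk size is an assumption, chosen to stay below your tested ~1930-char limit):

```python
def iter_chunks(data, chunk_size=1024):
    """Yield successive chunks of at most chunk_size bytes."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

# The receiver reassembles the frame by concatenating chunks in arrival order
frame = bytes(range(256)) * 10                 # 2560 bytes of stand-in image data
chunks = list(iter_chunks(frame, 1024))        # 1024 + 1024 + 512 bytes
reassembled = b"".join(chunks)
```

On the real link you would pause (or wait for an acknowledgement byte) between chunks so the receiver has time to drain its UART buffer.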

Replace the while loop with this:

while(True):
    clock.tick()
    # Compress first so the 4-byte size header matches the JPEG data that follows.
    img = sensor.snapshot().compress(quality=30)
    uart.write(struct.pack("<l", img.size()))
    uart.write(img)
    print(clock.fps())

This will send the image size in bytes as the first 4 bytes, followed by the image data. There will be no breaks between bytes, so be prepared for a lot of data. You should have time to malloc the necessary bytes on the ESP device after reading the size header. However, I doubt you can handle the data rate, so I recommend you have the camera only generate frames when it receives some type of serial command.
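That command-triggered flow control can be sketched like this. This is a plain-Python simulation of the protocol logic; `SNAP_CMD` and the one-byte command are assumptions, and on the real camera you would block on `uart.read(1)` and reply with `uart.write()` instead:

```python
import struct

SNAP_CMD = b"S"  # hypothetical single-byte "send one frame" request from the ESP

class FlowControlledCamera:
    """Simulated camera that only emits a frame when asked.

    Stand-in for the OpenMV loop: the camera idles until it receives
    SNAP_CMD, then replies with one length-prefixed JPEG frame. Because
    the receiver decides when to ask, it is never flooded with data.
    """
    def __init__(self, frame):
        self.frame = frame

    def handle(self, cmd):
        if cmd == SNAP_CMD:
            # Same framing as before: 4-byte little-endian size, then data.
            return struct.pack("<l", len(self.frame)) + self.frame
        return b""  # unknown command: send nothing

cam = FlowControlledCamera(b"\xff\xd8 fake jpeg")
reply = cam.handle(SNAP_CMD)
```

The receiver loop then becomes: send `SNAP_CMD`, read 4 bytes, read that many data bytes, forward them over WiFi, and only then request the next frame.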