LCD type

I have heaps of little TFT screens already and would like to have a go at hooking one of my screens up to the OpenMV Cam in a way that is compatible with the built-in LCD module.

I have an ST7735 with a resolution of 160 x 128 that runs 16-bit colour, so it would be perfect for RGB565 in QQVGA. Will it be compatible with the LCD module? Also, if it is, what's the pinout? The docs just say P0, P2, P3, P6, P7, and P8 without defining MOSI, MISO, CS, D0, BL, and RST.

Probably. Also, if you want to use another LCD screen, please help us with the coding! Submit a PR to the LCD module:

https://github.com/openmv/openmv/blob/master/src/omv/py/py_lcd.c

No worries, I am happy to help. I have been playing with the little TFTs a bit and have worked out how to use them well.

My history is that I only started to learn to program 18 months ago and have only used BASIC and Python so far, but it is about time that I learnt C. I will stare at your GitHub and work it out.

Yes I think that’s the same LCD driver. Re the pinout, you’ll find it in the IDE (Help->About OpenMV Cam…)

Thanks, I am going to have a play with hooking up a couple of different screens and getting them to work with OpenMV.

What is the max SPI2 speed on the OpenMV Cam?

54 MHz. Maybe higher. It works via a clock divider (prescaler), so you'll only get discrete divisions of the internal bus clock, which is derived from the 216 MHz system clock.
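Roughly, the idea is something like this (a sketch only; the 108 MHz peripheral clock below is an assumption for illustration, the exact kernel clock feeding SPI2 may differ):

# Sketch only: STM32 SPI baud rates come from dividing the peripheral bus
# clock by a power-of-two prescaler (2, 4, ... 256), so only a few discrete
# rates are reachable. The 108 MHz value here is an assumption.
PCLK = 108000000  # assumed peripheral clock feeding SPI2

def achievable_baudrates(pclk=PCLK):
    return [pclk // (2 ** n) for n in range(1, 9)]  # prescalers 2..256

def closest_at_or_below(target, pclk=PCLK):
    rates = achievable_baudrates(pclk)
    return max([r for r in rates if r <= target] or [rates[-1]])

print(achievable_baudrates())         # 54 MHz, 27 MHz, 13.5 MHz, ...
print(closest_at_or_below(20000000))  # asking for 20 MHz actually gives 13.5 MHz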

I am going to make up a little PCB to mount my screen on so that it will plug into the OpenMV Cam.

I want to wire my screen the same as the official LCD so that it will work not only with the drivers that I write but also with the built-in OpenMV LCD module.

Pins used by the LCD module:

MOSI: P0
MISO: P1
SCK: P2
CS: P3
DC: unknown. I need to know the data/command pin used by the LCD module, as it is vital to the operation of the screen.
Reset: unknown, but I will just tie it to 3.3 V so that the screen can't be hard reset.
BL: unknown, but I will just tie it to 3.3 V so that the backlight is always on.

The only pin that I don't know, and need in order to be compatible with the built-in LCD module, is the DC pin.
This is the line of code that defines it in the LCD module code; you have named it RS instead of DC:

#define RS_PIN GPIO_PIN_13

Which physical pin on the board is GPIO 13?

PD13 → P8 on the camera board
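So, wiring it like the built-in module, a minimal pin setup sketch (assuming MOSI = P0, MISO = P1, SCK = P2, CS = P3 and DC/RS on P8, as listed above) would look something like:

from machine import Pin, SPI

# Sketch assuming the wiring listed above: SPI2 on P0/P1/P2, chip select
# on P3, and data/command (the driver's "RS") on P8 (PD13). Reset and
# backlight are tied to 3.3 V in hardware, so no pins are needed for them.
spi = SPI(2, baudrate=54000000)
cs = Pin('P3', Pin.OUT, value=1)   # idle high; pull low to select the screen
dc = Pin('P8', Pin.OUT)            # low = command byte, high = data/parameters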

I started playing today with a larger 2.4" screen with a resolution of 320 x 240, and I have some small problems with QVGA compared to QQVGA.

I can make a 160 x 120 window and display QQVGA on the bigger screen, but if I try to use QVGA and display to the whole screen, it seems to only stream a small part of the image. I am not sure if maybe the streaming to the IDE is filling the DMA FIFO, so that the FIFO doesn't have room to fit the whole image a second time for the SPI.

Is there a way to disable the streaming to the IDE so that I can test this??

see my video https://youtu.be/V5CYFImRQeY

From the data sheet of the STM32F765:

DMA controller (DMA)
The devices feature two general-purpose dual-port DMAs (DMA1 and DMA2) with 8 streams each. They are able to manage memory-to-memory, peripheral-to-memory and memory-to-peripheral transfers. They feature dedicated FIFOs for APB/AHB peripherals, support burst transfer and are designed to provide the maximum peripheral bandwidth (AHB/APB).

Hi, the processor has interrupts disabled while it is sending data to the display via SPI. Um, anyway, there is literally a Disable Frame Buffer button in the top right-hand corner of the IDE.

OK, I tried that, and I even tried saving it as main.py on the cam and then powering it from an external power source, and I still get the same effect.

Here is a video of just taking one shot and then displaying it, instead of a video stream, so it is easier to see what's happening - YouTube

It seems that with QVGA it can only send a smaller amount of data at a time. How can I break up the fb into several smaller parts to send in a row?

I don’t know what that screen expects, so it’s hard for me to help you. However, you can manually disable interrupts if you feel they are the problem. See the pyb module; the methods to disable and re-enable interrupts are right there. Just call those methods before and after sending the data. That said, keep in mind that doing this may stop USB from working due to how long you are not handling USB interrupts. So, the SPI data rate needs to be high.
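For example, something along these lines (just a sketch wrapped around your existing send call):

import pyb

# Sketch: mask interrupts only around the SPI transfer and keep the
# critical section short, otherwise USB (and the IDE link) will drop out.
state = pyb.disable_irq()        # returns the previous IRQ state
try:
    screen.send_spi(img, True)   # your existing transfer
finally:
    pyb.enable_irq(state)        # always restore interrupts afterwards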

If that doesn’t work… Please give me a data sheet for that module.

I don’t know what that screen expects so it’s hard for me to help you

The screen expects a 16-bit 320 x 240 image, so I basically just want to transfer what's in the RGB565 QVGA fb to the SPI.

It all works very well in 160 x 120 RGB565, but when using 320 x 240 RGB565 only the first 1/6th of the image is sent and the remaining 5/6ths of the fb is cropped.

I think what might be happening is that when you send data to the SPI in MicroPython, MicroPython makes a copy of the data and then sends the copy to the SPI. So when MicroPython makes this copy at 160 x 120 there is enough RAM to make a copy, but when using 320 x 240 it runs out of RAM after the first 1/6th of the image.

I want to be able to send the whole 320 x 240 fb to the SPI but will need to do it in chunks. How can I send the fb to the SPI in, say, 10 chunks?
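Something like this is what I have in mind (a sketch; it assumes the image object supports the buffer protocol, so memoryview can slice it without copying):

# Sketch: take a zero-copy view of the frame buffer and push it to the
# screen in roughly ten pieces instead of one big write.
mv = memoryview(img)            # no copy of the frame buffer is made
chunk = (len(mv) + 9) // 10     # ~1/10th of the buffer per write
for i in range(0, len(mv), chunk):
    screen.send_spi(mv[i:i + chunk], True)   # each slice is still zero-copy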

MicroPython doesn’t make a copy of the data. The buffer operation quite literally reads directly from the frame buffer data and sends that out of the SPI bus. I can verify exactly what happens with this tonight if you post your complete code.

If you have an SPI probe I'd verify that the LCD screen is not the issue. In particular, using a logic probe you should be able to see whether the whole frame was sent versus 1/8th of the frame. I don't think the issue is on the OpenMV Cam.

Do you have a Saleae logic probe or something you can decode SPI packets with? I will test tonight if you post code and show you a trace of what happens on the SPI lines.

Do you have a Saleae logic probe or something you can decode SPI packets with?

Unfortunately no, as they are not cheap.

I have made a video to explain the problem, see https://youtu.be/Cqv-jPv-Yz0

I first load a 320 x 240 BMP file to the screen to show that the screen works properly in full screen. I then stream QQVGA 160 x 120 to a window on the screen to show that streaming the fb works fine. I then show what happens if I try to stream QVGA 320 x 240 to the screen.

Here are both bits of my code:

import sensor, image, time
from MP_ili9341 import ili9341

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)

clock = time.clock()

screen = ili9341()
screen.set_window(0,0,320,240)
#screen.load_image('bots16_320x240.bmp')


while(True):
    clock.tick()
    img = sensor.snapshot()
    screen.send_spi(img, True)
    print(clock.fps())

and

from machine import Pin, SPI
import time
import ustruct


class ili9341():

  def send_spi(self,data, is_data):
    self.dc.value(is_data) #set data/command pin
    self.cs.value(0)
    self.hspi.write(data)
    self.cs.value(1)

  def __init__(self, cs='P3', dc='P9'):
    self.hspi = SPI(2, baudrate=54000000)
    self.cs = Pin(cs, Pin.OUT)
    self.dc = Pin(dc, Pin.OUT)

    for command, data in (
      (0xef, b'\x03\x80\x02'),
      (0xcf, b'\x00\xc1\x30'),
      (0xed, b'\x64\x03\x12\x81'),
      (0xe8, b'\x85\x00\x78'),
      (0xcb, b'\x39\x2c\x00\x34\x02'),
      (0xf7, b'\x20'),
      (0xea, b'\x00\x00'),
      (0xc0, b'\x23'),  # Power Control 1, VRH[5:0]
      (0xc1, b'\x10'),  # Power Control 2, SAP[2:0], BT[3:0]
      (0xc5, b'\x3e\x28'),  # VCM Control 1
      (0xc7, b'\x86'),  # VCM Control 2
      (0x36, b'\xF8'),  # Memory Access Control
      (0x3a, b'\x55'),  # Pixel Format
      (0xb1, b'\x00\x18'),  # FRMCTR1
      (0xb6, b'\x08\x82\x27'),  # Display Function Control
      (0xf2, b'\x00'),  # 3Gamma Function Disable
      (0x26, b'\x01'),  # Gamma Curve Selected
      (0xe0, b'\x0f\x31\x2b\x0c\x0e\x08\x4e\xf1\x37\x07\x10\x03\x0e\x09\x00'), # Set Gamma
      (0xe1, b'\x00\x0e\x14\x03\x11\x07\x31\xc1\x48\x08\x0f\x0c\x31\x36\x0f')):  # Set Gamma
      self.send_spi(bytearray([command]), False)
      self.send_spi(data, True)
    self.send_spi(bytearray([0x11]), False)  # Sleep Out
    #time.sleep(10)
    self.send_spi(bytearray([0x29]), False)  # Display On


  def set_window(self, x0=0, y0=0, width=320, height=240):
    x1=x0+width-1
    y1=y0+height-1
    self.send_spi(bytearray([0x2A]),False)            # set Column addr command
    self.send_spi(ustruct.pack(">HH", x0, x1), True)  # x_end
    self.send_spi(bytearray([0x2B]),False)            # set Row addr command
    self.send_spi(ustruct.pack(">HH", y0, y1), True)  # y_end
    self.send_spi(bytearray([0x2C]),False)            # set to write to RAM

  #chunk size can be increased for faster writing to the screen at the cost of RAM
  def load_image(self, image_file, chunk_size=1024):
    BMP_file = open(image_file , "rb")
    data = BMP_file.read(54)              #seek position past header
    data = BMP_file.read(chunk_size)
    while len(data)>0 :                   #read data from file to SPI
      self.send_spi(data, True)
      data = BMP_file.read(chunk_size)
    BMP_file.close()

I found the issue. In the ST HAL code (from ST Microelectronics) they do this:

HAL_StatusTypeDef HAL_SPI_Transmit_DMA(SPI_HandleTypeDef *hspi, uint8_t *pData, uint16_t Size)

And MicroPython calls it with this:

HAL_SPI_Transmit_DMA(self->spi, (uint8_t*)src, len)

Where len is a size_t (32 bits).

So, the top 16 bits of the length are getting chopped off by the 32-bit to 16-bit conversion.

This requires a firmware fix. I can patch the firmware really quickly, but this is also an MP bug, so it will take longer to upstream the fix.

I seem to be good at finding bugs, not sure if developers love me or hate me :slight_smile:

How come it affects QVGA but doesn't affect QQVGA?

I did find a way to make it work, although at a super slow frame rate: I write each frame to a file, then open the file and write it to the SPI. See - YouTube

using this code

import sensor, image
from MP_ili9341 import ili9341


sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)

screen = ili9341()
screen.set_window(0,0,320,240)

chunk_size = 1024

while True:
  img_writer = image.ImageWriter("/stream.bin")
  img = sensor.snapshot()
  img_writer.add_frame(img)
  img_writer.close()

  img_file = open("/stream.bin" , "rb")
  data = img_file.read(32)      #this must be the size of the header
  data = img_file.read(chunk_size)
  while len(data)>0 :                   
    screen.send_spi(data, True)
    data = img_file.read(chunk_size)
  img_file.close()

QQVGA is 160 x 120 x 2 = 38400 bytes… which is less than 65535.
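For comparison, the QVGA numbers work out like this (just the arithmetic behind the truncation):

# The 32-bit length is truncated to 16 bits before it reaches the HAL call:
qqvga = 160 * 120 * 2      # 38400 bytes, fits in a uint16, sent in full
qvga = 320 * 240 * 2       # 153600 bytes, does not fit
sent = qvga & 0xFFFF       # 22528 bytes actually transmitted
print(sent / qvga)         # ~0.147, i.e. only the first ~1/7th of the frame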

I’ll have a firmware binary fix in a bit.

And no, you’re great.

Fixed. Run your original code with this.
firmware.zip (1.75 MB)

It works perfectly now :slight_smile:

I can stream 320x240 images to both the screen and IDE at 12fps and if I turn off the IDE then I get 16fps.

I will clean up my code, make a nice module with the code for both the 160 x 128 screen and the 320 x 240 screen, make it public under MIT, and also make a nice video too :slight_smile: