Best way to measure serial communication latency | how to collect latencies in a program


What would be the best way to measure the latency of any given step during the execution of a program?

For example, what I am currently doing is adding timestamps before and after the procedure I want to measure; the difference between the timestamps is the latency I am looking for. The timestamp values come from the pyb.micros() function. So, in the code below:

t_0 = pyb.micros()
img = sensor.snapshot()
t_1 = pyb.micros()
diff = t_1 - t_0

show_process_timing(t_0, t_1, "Init Snapshot")

"Init Snapshot" is the ID of the process I want to measure. Here is the implementation of the show_process_timing() function:

def show_process_timing(t_x, t_y, id_pos):
    print("\n{}: {} us\n".format(id_pos, t_y - t_x))
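One caveat worth noting: pyb.micros() returns a MicroPython small int that wraps after 2**30 microseconds (roughly 17.9 minutes), so a plain t_1 - t_0 can go wrong if the counter wraps between the two samples. On a board you can just use MicroPython's time.ticks_us() with time.ticks_diff(), which handle this; the sketch below mimics that wraparound-safe difference in plain Python (ticks_diff_us and _TICKS_PERIOD are names invented for this sketch):

```python
# Wraparound-safe tick difference, mimicking MicroPython's time.ticks_diff().
# _TICKS_PERIOD matches MicroPython's 2**30 tick period (an assumption of this
# sketch; on a real board prefer time.ticks_us()/time.ticks_diff() directly).
_TICKS_PERIOD = 1 << 30
_TICKS_MAX = _TICKS_PERIOD // 2 - 1

def ticks_diff_us(t_end, t_start):
    # Map the raw difference into [-period/2, period/2) so a timer that
    # wrapped between the two samples still yields the true elapsed time.
    diff = (t_end - t_start) % _TICKS_PERIOD
    if diff > _TICKS_MAX:
        diff -= _TICKS_PERIOD
    return diff

# Example: t_start sampled just before the wrap, t_end just after it.
print(ticks_diff_us(5, _TICKS_PERIOD - 5))  # -> 10, not a huge negative number
```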

Should I rely on the values returned by the pyb.micros() function?

Also, how would you evaluate the delay taken by an SPI, I2C, or UART transaction?

Thank you!


Yeah, this is pretty much valid.

As for UART/SPI: the main thread blocks until the data is sent, so it's fine to put a timer around any call.
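Since the write call blocks, the measured time should be close to the theoretical time on the wire, which makes a handy sanity check. For UART with the common 8N1 framing, each byte costs about 10 bits (1 start + 8 data + 1 stop), so a rough lower bound is nbytes * 10 / baud. A small sketch (uart_tx_time_us is a name made up for this example, not a library call):

```python
# Rough lower bound for a blocking UART write: bits on the wire / baud rate.
# bits_per_frame=10 assumes 8N1 framing (1 start + 8 data + 1 stop bit);
# adjust it for parity bits or extra stop bits.
def uart_tx_time_us(nbytes, baud, bits_per_frame=10):
    return nbytes * bits_per_frame * 1_000_000 / baud

# e.g. 100 bytes at 115200 baud should take at least ~8.7 ms on the wire;
# if the measured latency is far above this, the overhead is software-side.
print(round(uart_tx_time_us(100, 115200)))  # -> 8681
```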

Thank you.

I have a concern: is it normal for a UART transaction to be faster than an I2C transaction?

If yes, how come?

Hi, microcontrollers usually communicate with each other via UARTs; SPI and I2C are typically for talking to simpler hardware chips.

In terms of coding… when using the OpenMV Cam you have a camera sensor that's doing work. If you write your code to always wait for I2C bus access, then no other work can be done. The UART, however, allows bytes to be received asynchronously. So you can process bytes in a loop while doing vision work too, and then send out results.
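The polling pattern described above can be sketched off-board like this. MockUART and main_loop are hypothetical names for this sketch; MockUART just stands in for pyb.UART so the loop runs under plain Python, while exposing the same any()/read() calls you would use on the board:

```python
# Sketch of the "poll the UART between vision work" loop described above.
# MockUART is a stand-in for pyb.UART so this runs off-board; on the OpenMV
# Cam you would construct pyb.UART(...) and call the same any()/read() API.
class MockUART:
    def __init__(self, incoming):
        self._buf = bytearray(incoming)

    def any(self):               # number of bytes waiting, like pyb.UART.any()
        return len(self._buf)

    def read(self, n):           # consume up to n buffered bytes
        chunk, self._buf = bytes(self._buf[:n]), self._buf[n:]
        return chunk

def main_loop(uart, frames):
    received = bytearray()
    for _ in range(frames):
        # ... vision work would go here (sensor.snapshot(), etc.) ...
        while uart.any():        # drain any bytes that arrived meanwhile
            received += uart.read(16)
        # ... send results back out over the UART here ...
    return bytes(received)

print(main_loop(MockUART(b"hello"), frames=3))  # -> b'hello'
```

The key point is that uart.any() never blocks, so each pass through the loop does vision work and only spends time on the UART when bytes have actually arrived.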