Hi,
What would be the best way to measure the latency of any step during the execution of a program?
For example, what I am currently doing is adding timestamps before and after the procedure that I want to verify. The difference between the timestamps is then the latency I am looking for. The timestamp values come from the pyb.micros() function. So, in the code below:
t_0 = pyb.micros()
img = sensor.snapshot()
t_1 = pyb.micros()
show_process_timing(t_0, t_1, "Init Snapshot")
"Init Snapshot" is the ID of the step that I want to verify. Here is the implementation of the show_process_timing() function:
def show_process_timing(t_x, t_y, id_pos):
    print("\n{}".format(id_pos))
    print("\nDifference: {} us\n".format(t_y - t_x))
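One thing I am unsure about is wraparound: I believe pyb.micros() is a 32-bit microsecond counter that wraps roughly every 71 minutes, so a plain subtraction can go negative if the counter rolls over between my two timestamps. My understanding is that MicroPython's utime.ticks_diff() handles this with modular arithmetic; here is a sketch of that logic in plain Python (the 32-bit period is my assumption, not something I have confirmed for my board):

```python
TICKS_PERIOD = 1 << 32        # assumed 32-bit microsecond counter
TICKS_HALF = TICKS_PERIOD // 2

def ticks_diff_32(t_new, t_old):
    """Wraparound-safe difference between two wrapping tick values,
    mirroring (as I understand it) utime.ticks_diff() semantics."""
    return ((t_new - t_old + TICKS_HALF) % TICKS_PERIOD) - TICKS_HALF

# Example: t_old taken just before the 32-bit counter wrapped.
t_old = TICKS_PERIOD - 10     # 10 us before wraparound
t_new = 5                     # 5 us after wraparound
print(ticks_diff_32(t_new, t_old))  # -> 15, the true elapsed time
```

Is this the right way to think about it, i.e. should I be using utime.ticks_us() / utime.ticks_diff() instead of raw subtraction of pyb.micros() values?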
Should I rely on the values returned by the pyb.micros() function?
Also, how would you evaluate the delay taken by an SPI, I2C, or UART transaction?
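For the bus transactions, I would presumably apply the same timestamp-before-and-after pattern around the send/receive call. Here is the generic wrapper I have in mind, written for desktop Python with time.perf_counter_ns(); on the board I would swap in pyb.micros() or utime.ticks_us(), and fake_spi_send() is just a stand-in for a real driver call like spi.send():

```python
import time

def time_transaction(label, fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and print the elapsed wall-clock time.
    On a board, time.perf_counter_ns() would be replaced by a tick
    source such as utime.ticks_us()."""
    t_0 = time.perf_counter_ns()
    result = fn(*args, **kwargs)
    t_1 = time.perf_counter_ns()
    print("{}: {:.1f} us".format(label, (t_1 - t_0) / 1000))
    return result

# Stand-in for e.g. spi.send(buf) or uart.write(buf); purely illustrative.
def fake_spi_send(buf):
    return len(buf)

sent = time_transaction("SPI send", fake_spi_send, b"\x01\x02\x03")
```

Would this approach give me a meaningful number, or does the driver buffer the data so that the call returns before the transaction actually completes on the wire?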
Thank you!
Benjamin.