I am attempting to measure the close-to-real-time speed of a motor using an encoder attached to the motor gearbox. I have set up an external interrupt to fire whenever there is a rising edge from the sensor. It seems to measure the motor speed okay, but roughly every 5-10 readings, one value deviates sharply from the rest. It gets worse as the motor speed is increased.
I set it up to calculate the time between interrupt calls (sensor rising edges) in microseconds. I average the 4 most recent periods between rising edges and then multiply the reciprocal of this average by a constant to get the speed. Why is my measurement not accurate? To verify the output of the optical sensor, I checked it with an oscilloscope: the signal is clean and constant. When the motor is at a medium-low speed, the period of the actual signal from the sensor is 2.3 ms. Any help would be greatly appreciated. Below is part of my code.
Code:
import pyb
import utime

period = 1000000
start = utime.ticks_us()
speedBuffer = [0.0, 0.0, 0.0, 0.0]  # holds the 4 most recent speed samples

def callback(line):  # Encoder interrupt handler
    global period, start
    # I think the order of these arguments contradicts what the datasheet
    # says, but if they are reversed, it returns a negative value.
    period = utime.ticks_diff(utime.ticks_us(), start)
    start = utime.ticks_us()

extint = pyb.ExtInt("P3", pyb.ExtInt.IRQ_RISING, pyb.Pin.PULL_NONE, callback)

while True:
    if period > 100000:  # prevents division by 0
        actual_speed = 0
    else:
        actual_speed = 10000 / period
    speedBuffer.append(actual_speed)
    speedBuffer.pop(0)
    actual_speed = (speedBuffer[0] + speedBuffer[1] + speedBuffer[2] + speedBuffer[3]) / 4
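For reference, the averaging step described above can be sketched in plain Python (no pyb/utime, so it runs anywhere); `update_speed` and `SPEED_CONSTANT` are illustrative names, not part of the original code:

```python
# Sketch of the 4-period moving-average speed calculation.
# SPEED_CONSTANT stands in for the real scale factor (assumed 10000 here,
# matching the code above).

SPEED_CONSTANT = 10000

def update_speed(speed_buffer, period_us):
    """Append the newest speed sample and return the 4-sample average."""
    if period_us > 100000:  # very long period: treat the motor as stopped
        speed = 0
    else:
        speed = SPEED_CONSTANT / period_us
    speed_buffer.append(speed)
    speed_buffer.pop(0)  # keep only the 4 most recent samples
    return sum(speed_buffer) / len(speed_buffer)

buf = [0.0, 0.0, 0.0, 0.0]
for p in (2300, 2300, 2300, 2300):  # ~2.3 ms period, as seen on the scope
    avg = update_speed(buf, p)
# after four identical periods, the average equals 10000/2300
```

With a steady 2.3 ms period the average converges to a constant, so a clean sensor signal should not by itself produce the outliers described above.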