I'm relatively new to this topic and hope someone can help me out.
I would like to detect defective displays on a device.
The camera is permanently mounted in the test equipment and shows the display.
I would now like to detect defective parts of the display via blob detection.
Here is a picture of what the camera sees.
I've tried to recognize the gray areas via blob detection with this code:
```python
import sensor, image, time, pyb

areax = 150
areay = 100
hys = (0, 100, 8, 127, -128, 127)   # LAB threshold (currently unused)
hys_sw = (200, 255)                 # grayscale threshold (min, max)
#hys_sw = (80, 100, 120, 140, 140, 180)
# 150x100 region in the center of the VGA frame
r = ((640//2)-(areax//2), (480//2)-(areay//2), areax, areay)
are_thr = areax * areay
print(r)

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.VGA)
sensor.skip_frames(time = 2000)
clock = time.clock()

while(True):
    clock.tick()
    img = sensor.snapshot()
    # img.draw_rectangle(r, color = (127,127,127))

    # Find blobs with a minimal area of 150x100 = 15000 px.
    # Overlapping blobs will be merged.
    blobs = img.find_blobs([hys_sw], area_threshold=are_thr, merge=True)

    for blob in blobs:
        # Draw a rectangle where the blob was found
        img.draw_rectangle(blob.rect(), color=(127,127,127))
        # Draw a cross in the middle of the blob
        img.draw_cross(blob.cx(), blob.cy(), color=(127,127,127))
        print("blobs found")
        pyb.delay(100)  # Pause the execution for 100 ms

    print(clock.fps())
```
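For context, here is what my area filter amounts to. As I understand it, `area_threshold` in `find_blobs` drops every blob whose bounding-box area is below the given value, so with my settings only fairly large regions survive (plain Python below, just the arithmetic from the script, not OpenMV code):

```python
areax = 150
areay = 100

# Only blobs covering at least 150 * 100 = 15000 pixels
# pass the area_threshold filter in my script.
are_thr = areax * areay
print(are_thr)  # 15000

# A small defect region, e.g. 40 x 30 pixels, would be rejected:
print(40 * 30 >= are_thr)  # False
```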
All I get is this:
Only two white areas, left and right.
What am I doing wrong? I've changed the threshold between 100 and 130, but that didn't help.
Can somebody help me?
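In case my understanding of the threshold format is part of the problem: for a grayscale image, I assume `find_blobs` keeps pixels whose value lies inside the inclusive `(min, max)` tuple, so `(200, 255)` would only match near-white pixels, while the gray areas I'm after sit somewhere in the middle of the range. A minimal pure-Python sketch of that selection logic (not OpenMV code, just illustrating what different tuples match):

```python
def in_threshold(pixel, thresh):
    """Mimic a grayscale threshold test: a pixel is kept
    if its value lies inside the inclusive (min, max) range."""
    lo, hi = thresh
    return lo <= pixel <= hi

# A mid-gray pixel around 115 is missed by the white
# threshold (200, 255) but caught by e.g. (100, 130):
print(in_threshold(115, (200, 255)))  # False
print(in_threshold(115, (100, 130)))  # True
print(in_threshold(230, (200, 255)))  # True
```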