Setting which fb is displayed in the IDE

Ok, so I make a copy of the fb using the following code:

extra_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.RGB565)
extra_fb.replace(img)

I then do some processing on this copy, and I want to display the copy in the IDE for some debugging. How can I do this?

Do:

print(extra_fb.compress_for_ide(), end="")

This forces the IDE to display that image. Note that compress() destroys the image in the FB. Use compressed_for_ide() instead to make a copy, but at lower quality.

Also, when you call snapshot the IDE will display whatever is in the main FB. So, add some type of pause or similar.

Ok, I will have a play with print(extra_fb.compress_for_ide(), end="")

Is there a way to disable the IDE displaying the image in the main fb when I call snapshot?

Even better than disabling the IDE's display of the main fb when snapshot is called would be if I could set a pointer to which fb is sent, and point it at the other fb that I have created.

I have a couple of questions now to help me understand how the OpenMV cam and IDE work.

All data is sent via the serial-to-USB connection, both the data to be displayed in the serial terminal and the image to be displayed in the frame buffer window. How does the IDE know which data to send to which window in the IDE?

Hi, the print method just sends a JPG compressed data stream to the PC which the IDE knows how to parse. Otherwise, it asks to read from the frame buffer.

Um, so, there's a workaround you can do right now by using the Open Serial Terminal method under Tools to program the OpenMV Cam. See this video here: OpenMV IDE & Generic MicroPython Board Support - YouTube

The print method then controls what frame buffer is printed and snapshot no longer streams.

Thanks I just watched the video and this should do what I need even though it is a little work around :slight_smile:

Um, I can make the IDE accept print data even when the FB is disabled. Let me do that. Then this will allow you to always force a frame buffer display when you like.

Thanks :slight_smile:

Just starting to play with displaying a different fb in the IDE.

I get an error from the print statement, "can't compress in place", from this script:

    img = sensor.snapshot()
    new_fb = img.binary([white_threshold], copy=True, to_bitmap=True)
    print(new_fb.compress_for_ide(), end="")
    time.sleep(0.2)

If I use the compressed() method instead of compress_for_ide() it no longer gets an error, but I don't seem to be able to display it in the IDE.

This is my output from the following script - YouTube

time.sleep(200) gives a sleep of 0.2 s

here is the script for the output in video

import sensor, image, time

sensor.reset()
sensor.set_pixformat(sensor.RGB565) 
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
clock = time.clock()

white_threshold = (60, 100, -128, 127, -128, 127)

while True:
    clock.tick()
    img = sensor.snapshot()
    new_fb = img.binary([white_threshold], to_bitmap=True, copy=True)
    print(new_fb.compressed(), end="")
    time.sleep(200)
    for r in new_fb.find_rects(threshold = 5000):
        img.draw_circle(r.x()+r.w()//2, r.y()+r.h()//2, r.w()//2, color = (0, 255, 0))
    print("FPS %f" % clock.fps())

You can't compress a bitmap image in place because you are trying to turn a bitmap into a jpeg image, and the jpeg data is less dense -- it takes more space than the 1-bit-per-pixel bitmap buffer it would have to fit into.
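A quick back-of-the-envelope check of the sizes involved for a QQVGA frame (the raw-format numbers are exact; the jpeg size is only a typical ballpark, not a guaranteed figure):

```python
# QQVGA = 160x120 pixels
w, h = 160, 120

rgb565_bytes = w * h * 2   # 2 bytes per pixel for RGB565
bitmap_bytes = w * h // 8  # 1 bit per pixel after binary(to_bitmap=True)

assert rgb565_bytes == 38400
assert bitmap_bytes == 2400

# A jpeg of this frame is typically a few KB, so it cannot be
# written back into the 2400-byte bitmap buffer in place --
# hence the "can't compress in place" error.
```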

Use the compressed_for_ide() method, which compresses out of place (these methods are all documented in the docs and are right next to each other). Yes, the nomenclature is confusing due to us deciding weird names back in 2015 for this stuff.

There are 4 compress methods (there should really only be one, but the cat's out of the bag now).

compress()
compressed()
compress_for_ide()
compressed_for_ide() - Use this one.
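As a toy pure-Python illustration of the in-place vs out-of-place distinction (zlib stands in for jpeg here; this is not the OpenMV implementation, and FakeImage is a made-up stand-in class):

```python
import zlib

class FakeImage:
    """Toy stand-in for an image object, illustrating naming only."""
    def __init__(self, pixels: bytes):
        self.pixels = pixels  # raw pixel data

    def compress(self):
        # In-place: the object's own buffer is overwritten with
        # compressed data, so the original pixels are destroyed.
        self.pixels = zlib.compress(self.pixels)
        return self

    def compressed(self):
        # Out-of-place: returns a new compressed copy and leaves
        # the original pixel buffer untouched.
        return FakeImage(zlib.compress(self.pixels))

img = FakeImage(b"\x00" * 64)
copy = img.compressed()
assert img.pixels == b"\x00" * 64   # original intact after compressed()
img.compress()
assert img.pixels != b"\x00" * 64   # original destroyed by compress()
```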

So, the difference between the regular and the for-IDE ones is that the regular one yields a jpeg byte stream, whereas the for-IDE one yields a data stream built by packing every 6 bits of the jpeg data into 8 bits. This is done so that the jpeg data can be sent to the IDE along with UTF-8 characters through the text channel of the VCP port without corrupting the text. This method works extremely well and you'll never see garbage in your serial terminal.
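The 6-bits-into-8-bits idea can be sketched in plain Python. Note the output offset (0x30) and framing below are assumptions for illustration only, not the real OpenMV IDE protocol:

```python
def pack_6_to_8(data: bytes) -> bytes:
    """Pack each 6-bit group of a byte stream into its own byte,
    offset into a printable ASCII range so the result never
    collides with normal terminal text (base64-like idea)."""
    out = bytearray()
    bits = 0
    nbits = 0
    for b in data:
        bits = (bits << 8) | b
        nbits += 8
        while nbits >= 6:
            nbits -= 6
            out.append(((bits >> nbits) & 0x3F) + 0x30)  # assumed offset
    if nbits:  # flush remaining bits, zero-padded on the right
        out.append(((bits << (6 - nbits)) & 0x3F) + 0x30)
    return bytes(out)

jpeg = bytes(range(256))  # stand-in for jpeg data
packed = pack_6_to_8(jpeg)
assert all(0x30 <= c <= 0x6F for c in packed)  # all printable ASCII
```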

Something is still wrong, because I don't see a binary image for 200 ms.

here is my output - YouTube

from this simple test script

# sensor setup, clock, and white_threshold as in the previous script
while True:
    clock.tick()
    img = sensor.snapshot()
    new_fb = img.binary([white_threshold], to_bitmap=True, copy=True)
    print(new_fb.compressed_for_ide(), end="")
    time.sleep(200)
    print("FPS %f" % clock.fps())

Even if I try something like this, using an extra RGB565 fb, I get the same result of not showing the extra fb:

import sensor, image, time

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
clock = time.clock()

extra_fb = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.RGB565)
white_threshold = (20, 100, -45, 127, -13, 127)

while True:
    clock.tick()
    img = sensor.snapshot()
    extra_fb.replace(img)
    extra_fb.binary([white_threshold])    
    print(extra_fb.compressed_for_ide(), end="")
    time.sleep(200)
    
    print("FPS %f" % clock.fps())

I see the issue. Okay, this is not possible using the default OpenMV IDE terminal, because of the limited text buffer size on the OpenMV Cam when opened in debug mode using the IDE. Instead you have to use the Open Terminal feature, which treats the OpenMV Cam more like a serial port. In this mode the camera will block on I/O access while the data is being transferred, whereas in normal mode image data is tossed so as not to block waiting for I/O. I.e., when you call print using the normal connect mode, the camera does not block for data being written to the serial port.
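A toy pure-Python model of the two behaviors described above (the buffer size and drain callback are made up for illustration; this is not the actual firmware logic): normal debug mode tosses image data rather than block, while Open Terminal mode waits until the host drains the buffer, so nothing is lost.

```python
from collections import deque

BUF_SLOTS = 4  # assumed buffer capacity, for illustration only

def send_nonblocking(buf: deque, chunks):
    """Normal debug mode: if the buffer is full, the chunk is
    tossed so print() never stalls the running script."""
    dropped = 0
    for c in chunks:
        if len(buf) >= BUF_SLOTS:
            dropped += 1        # data tossed, no waiting
        else:
            buf.append(c)
    return dropped

def send_blocking(buf: deque, chunks, drain):
    """Open Terminal mode: when the buffer is full, wait for the
    host to read (simulated here by calling drain()) so no chunk
    is ever lost."""
    for c in chunks:
        while len(buf) >= BUF_SLOTS:
            drain(buf)          # stand-in for blocking on the host
        buf.append(c)

buf = deque()
lost = send_nonblocking(buf, range(10))
assert lost == 6                        # only 4 chunks kept

received = []
buf2 = deque()
send_blocking(buf2, range(10), lambda b: received.append(b.popleft()))
assert len(received) + len(buf2) == 10  # nothing lost
```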

So, click disconnect, go to Open Terminal, go through the prompts to open a serial terminal, and then click the run arrow with the script you want to run in the main window. This will then display the data. As for the script being a little slow to execute, I can speed that up in a future IDE release. In Open Terminal mode all our special debug code is bypassed and you're talking directly to the MicroPython OS, which has trouble dealing with lots of bytes in bursts. So, the IDE sends the bytes slowly, like a serial port, to prevent any issue from happening. I think I may have made it too slow, however. But it's reliable.

The above workaround isn't that great… I guess the thing to do would be to add a method that allows you to force which image is sent to the PC as the frame buffer.

Thank you, I am now getting much further; it is displaying what I want in the open terminal.

What's the best way to write programs and run them in the Open Terminal? What I just did was write my script in Notepad++, save it to the USB drive, and then in the Open Terminal REPL type import filename to run my script (much like using MicroPython on other platforms).

When I want to make a change to my script, I make the change in Notepad++ and click save; the red light on the cam comes on, goes off for a bit, then comes on again. I then press Ctrl-D to do a soft reset and type import filename again to run the updated script.

Is there a better way to do the writing and debugging than what I did, or is that the best way to do it?

You can write the script in the IDE with Open Terminal. When you click the green run button, the script in the IDE gets run.

Great, that's perfect. I am now cooking with gas; time to tap out some code.

I made a bug tracker for a new method. Will get around to it.

This specific issue should be doable with copy_to_fb or Image.flush().