Transfer frame buffer instead of snapshots via RPC

Is there a way to transfer the frame buffer via RPC instead of sensor snapshots? When transferring just snapshots, any adjustments to the image (coloring, drawing shapes or strings) don't seem to be carried over. My code basically follows the "" example: it displays a grayscale picture except for objects that exceed the threshold, whose temperatures are displayed in the Ironbow color palette. I tried combining the "image_transfer_jpg_streaming_as_the_remote_device_for_your_computer" and "image_transfer_jpg_streaming_as_the_controller_device" example programs with the added logic from the "lepton_get_high_temp" program and got an output like this:
when I’m expecting an output like this:

Hi, it depends on when you transfer the image. The frame buffer is constantly updated; when you make the RPC call, it transfers whatever the frame buffer contained at that moment.

So, you just need to transfer it later in your code.
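Here's a plain-Python sketch of the ordering issue (this is an analogy, not the OpenMV API): the transfer sends a copy of whatever the frame buffer holds at the moment it runs, so drawings made after the transfer never appear in the received image.

```python
# Plain-Python analogy (not OpenMV code): the RPC call sends whatever
# the frame buffer holds when it runs, so drawings made afterwards
# never reach the other side.

class FrameBuffer:
    def __init__(self):
        self.contents = []

    def snapshot(self):
        # A new capture replaces everything, including old drawings.
        self.contents = ["raw frame"]

    def draw_overlay(self, label):
        self.contents.append(label)

def rpc_transfer(fb):
    # Sends a copy of the current frame buffer state.
    return list(fb.contents)

fb = FrameBuffer()

# Wrong order: transfer first, draw after -> overlay is missing.
fb.snapshot()
sent_early = rpc_transfer(fb)
fb.draw_overlay("ironbow hotspot")

# Right order: draw first, transfer after -> overlay is included.
fb.snapshot()
fb.draw_overlay("ironbow hotspot")
sent_late = rpc_transfer(fb)

print(sent_early)  # ['raw frame']
print(sent_late)   # ['raw frame', 'ironbow hotspot']
```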

Thanks for the reply. I was thinking that too, but shouldn't the frame buffer be updated after the first call anyway? Right now, my code basically does this:

while True:
    update frame buffer
    remotely run script  # similar to

Shouldn't this technically do the same thing as running the script before updating the frame buffer, except for the first cycle? After the Lepton program on the H7 runs, the frame buffer would be updated and then transferred on the next iteration of the while loop. Or is there another way you were suggesting this should be done?

It happens strictly in order: when you call the RPC system to send an image, it sends whatever the current frame buffer state is.

At any point in time you might draw on the frame buffer and modify it.

What snapshot does is that, when called, before updating the frame buffer, it grabs whatever the last state was and puts it into a JPEG buffer for the IDE to sample asynchronously.
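A plain-Python sketch of that sequencing (an analogy, not the actual OpenMV firmware): the previous frame buffer state, drawings included, is stashed for the IDE before the new capture overwrites it.

```python
# Plain-Python sketch (not the actual OpenMV firmware) of the
# snapshot sequencing described above: before capturing a new frame,
# it stashes the previous frame buffer state (including any drawings)
# into a JPEG buffer that the IDE samples asynchronously.

class Camera:
    def __init__(self):
        self.frame_buffer = "empty"
        self.jpeg_buffer = None  # what the IDE sees
        self._frame_no = 0

    def draw(self, overlay):
        # Drawings modify only the live frame buffer.
        self.frame_buffer += " + " + overlay

    def snapshot(self):
        # 1) Grab the last state (with drawings) for the IDE...
        self.jpeg_buffer = self.frame_buffer
        # 2) ...then overwrite the frame buffer with a fresh capture.
        self._frame_no += 1
        self.frame_buffer = "frame%d" % self._frame_no
        return self.frame_buffer

cam = Camera()
cam.snapshot()           # frame buffer now holds "frame1"
cam.draw("ironbow")      # overlay exists only in the frame buffer
cam.snapshot()           # IDE buffer now holds "frame1 + ironbow"
print(cam.jpeg_buffer)   # frame1 + ironbow
print(cam.frame_buffer)  # frame2
```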