Camera interfacing using I2C

I have a TelosB platform. It consists of an MSP430 microcontroller and a CC2420 radio. I want to send a snapshot from an OpenMV camera to the TelosB over I2C and then transmit it wirelessly to another TelosB. The problem is that the MSP430’s buffer size is 250 bytes and the max transfer rate is 250 kbps. Any help will be appreciated.

Hi, sorry for not answering this sooner. I just saw this.

So, first, is it possible to use async serial or SPI? If not, then you can use I2C, but your OpenMV Cam will have to be the master device.

As for the buffer size: that isn’t really a concern. We can transmit the image in small chunks. Do you need help with how to go about this in particular? You basically just compress the image and then use the array index functionality to send a few JPEG bytes at a time. You’ll basically be transmitting a byte array.
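
For example, something along these lines is the idea (just a rough sketch, untested; the I2C bus number, the 0x12 slave address, and the 250-byte chunk size are assumptions you’d adjust for your setup):

import sensor
from pyb import I2C

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)

i2c = I2C(2, I2C.MASTER, baudrate=100000)  # assumed bus and speed

img = sensor.snapshot().compressed()  # JPEG-compress the frame
chunk = bytearray(250)  # fits the MSP430 buffer of 250 bytes
for i in range(250):
    chunk[i] = img[i]  # index out one JPEG byte at a time
i2c.send(chunk, addr=0x12)  # placeholder slave address

You’d repeat the copy-and-send for each successive 250-byte piece of the image.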

SPI is available, but the problem is that I would need to solder a wire to an IC, and I am afraid that I might burn that IC.
The MSP430 can only be used in master mode, so can I use I2C multi-master mode?

Yes! I was looking for help with sending a few JPEG bytes at a time. Help with this would be really appreciated.

I know you are busy with website development and documentation, so this is just a reminder.

Oh, yeah, I totally forgot about this. I’ll do this tonight.

Hi, I can’t get to this tonight. Something came up for me.

Um, so, did you see this? It’s a good example of how to transmit a JPEG-compressed image via SPI: location by openMV - OpenMV Products - OpenMV Forums.

MicroPython has the rest of the docs: class I2C – a two-wire serial protocol — MicroPython 1.19.1 documentation.

No problem! I am not in a hurry either; in the meantime I will look into the docs and the SPI code.
Thanks again for all the help.

Is there any possibility of using the OpenMV as an I2C slave, perhaps by installing new drivers for it? Also, please send me the documentation for transferring the image byte by byte.

Hi, I’m going to work on this today. As for I2C slave: we use MicroPython’s device drivers, so if they don’t support it, we don’t either. That said, if you want to change the firmware, you can add the feature, but that’s somewhat hard.

Okay, I finally did it:

I’ve attached two scripts. One makes the OpenMV Cam into a master module that reads data from the slave device; that’s how I tested this all out. The second is the slave script, which is what you want. The slave script sends images out of the camera.
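
I won’t paste the attachments inline, but the slave side roughly has this shape (a sketch only, not the actual attached script; it assumes pyb.I2C’s slave mode and a placeholder 0x12 address):

import sensor
from pyb import I2C

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)

i2c = I2C(2, I2C.SLAVE, addr=0x12)  # placeholder address

while True:
    img = sensor.snapshot().compressed()
    buf = bytearray(img.size())
    for i in range(img.size()):
        buf[i] = img[i]  # copy the JPEG bytes into a plain buffer
    i2c.send(buf, timeout=10000)  # blocks until the master clocks the data out

The attached scripts are the complete versions; this is just the general idea.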

I forgot about your requirement that the I2C buffer is only 250 bytes. I trust you can modify the code to deal with that issue? You just have to split the massive image read into multiple parts, each part being 250 bytes of the image. I don’t think our image methods support accessing images via slices, so you’ll have to transfer the image data into a temporary byte array, I think using “data = bytearray(img)”, so that you can slice the data up.

Let me know if you’re having trouble.

We don’t have a way in our library to turn bytes back into an image, so that’s why the master script doesn’t reconstruct what the slave sees for you. We’ll have to add that to the firmware.

Make sure to attach pull-up resistors to the I2C lines.

is2_slave.py (1.38 KB)
is2_master.py (1.27 KB)

Actually, it looks like images support being sliced, so just use array access for the bytes you need, i.e. img[0:250] to get the first 250 bytes starting from index 0, etc.

OK, I’m wrong about that. You have to do this to transfer the data:

img = sensor.snapshot().compressed()  # JPEG-compress the frame
b = bytearray(250)  # holds one 250-byte chunk
for i in range(250):
    b[i] = img[i]  # copy the bytes out one at a time

So, basically, just transfer the bytes you need into a new array manually, and do that for all the bytes until you reach the last ones.

Note that the data in the image is JPEG data, so don’t try to do anything with it until it’s delivered to a system that can decompress JPEG.
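
Putting it all together, the send loop ends up something like this (again just a sketch; it reuses the i2c object and 250-byte chunking from above, and the last chunk will usually be shorter):

img = sensor.snapshot().compressed()
size = img.size()  # total number of JPEG bytes
sent = 0
while sent < size:
    n = min(250, size - sent)  # the final chunk may be partial
    chunk = bytearray(n)
    for i in range(n):
        chunk[i] = img[sent + i]  # manual byte-by-byte copy
    i2c.send(chunk, timeout=10000)
    sent += n

You’d probably also want to transmit the size first so the receiving side knows how many chunks to expect.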

Oh, I just looked at the files you sent. Thank you so much for the help, and I am extremely sorry for thanking you so late; I have been busy with exams. I might disturb you again in the near future. Thanks again.