This is just an idea for OpenMV to think about.
The Lego Mindstorms community is very large and is used to paying Lego prices for its hardware. There have been a couple of aftermarket machine vision cams released for Mindstorms. NXTcam was the best liked, and the Pixy cam has a Lego interface as well. The Pixy cam was less popular because the Lego community didn’t like that it had no case and the electronics were exposed for kids to damage. The NXTcam was much better liked, but its interface was only for the NXT; when the next-generation EV3 was released, programmed with LabVIEW, there was no interface for LabVIEW at all.
The Lego interface for the Pixy cam can only report (x, y, w, h) of the first colour blob detected. I have used and tested it myself.
I haven’t used the NXTcam as I have never owned the Lego NXT, but I believe it does colour blob tracking and line following.
The biggest feature the Lego community keeps asking for is face detection. If there were a Lego version of OpenMV with face detection, colour blob tracking and line following, it would be the best Lego machine vision cam on the market.
I believe the Lego EV3 can receive sensor data from sensors via UART or I2C.
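To make the I2C route concrete, here is a minimal register-map emulation in plain Python, loosely modelled on how NXT-style I2C sensors expose values to a master. The register address 0x42 and one-byte fields are my assumptions for illustration, not the LEGO or NXTcam protocol:

```python
# Hypothetical register layout (an assumption): x, y, w, h of the
# first detected blob stored as one byte each, starting at 0x42.
REG_BLOB = 0x42

def read_regs(regs, start, count):
    """Emulate an I2C combined read: 'count' bytes starting at 'start'.

    Unwritten registers read back as 0, which mimics a sensor that
    zeroes its result registers when no blob is visible.
    """
    return bytes(regs.get(start + i, 0) for i in range(count))

regs = {}
x, y, w, h = 40, 30, 12, 8            # pretend this came from blob detection
for i, v in enumerate((x, y, w, h)):
    regs[REG_BLOB + i] = v

data = read_regs(regs, REG_BLOB, 4)   # what the EV3 master would read
```

The EV3 (as I2C master) would simply poll those four registers each loop iteration, the same way it reads any NXT-style digital sensor.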
I think that if OpenMV were to make a Lego LabVIEW interface, it could sell a lot of cams to the Lego community at a slightly higher price to cover the cost of developing the Lego version.
NXT cam in action: NXT Cam ball follower - YouTube
Lego Pixy cam in action: Pixy Cam - powered LEGO MINDSTORMS EV3 Mecha twins - YouTube
Hi, yes, I suppose we could do this. Would you be willing to help? It’s quite literally just a script to make the camera work with Lego. I guess I could write it too. Do you know any contacts to talk to about making a camera?
Sorry, but we need evangelists and help. It’s just me and Ibrahim running the OpenMV Cam company as a side job. It’s quite helpful when folks push the project in new directions.
I am happy to help where I can, and I think this is a great project: the more cameras you sell, the lower the price for everyone and the more development will be supported.
Obviously, the OpenMV software side of this is only a short script that exposes these features as a slave to the Lego EV3 brick (the Lego processor). The OpenMV hardware side is just a case with Lego attachment holes, plus a lead to interface between the OpenMV and the EV3 brick.
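As a sketch of what that short slave script might look like, here is the framing side in plain Python. The 10-byte frame layout (0xAA header, four little-endian uint16 fields, one-byte checksum) is entirely my assumption, not an established protocol; the commented-out loop shows how it would plug into the real OpenMV API (`sensor.snapshot()`, `find_blobs()`, `pyb.UART`) on the camera itself:

```python
import struct

HEADER = 0xAA  # hypothetical start-of-frame marker (an assumption)

def make_frame(x, y, w, h):
    """Pack one blob as a 10-byte frame: header, 4 x uint16 LE, checksum."""
    payload = struct.pack("<4H", x, y, w, h)
    checksum = sum(payload) & 0xFF
    return bytes([HEADER]) + payload + bytes([checksum])

# On the real camera the slave loop would be roughly (untested sketch):
#   import sensor, pyb
#   uart = pyb.UART(3, 115200)          # UART 3 is broken out on the OpenMV Cam
#   while True:
#       blobs = sensor.snapshot().find_blobs([threshold])
#       if blobs:
#           b = blobs[0]
#           uart.write(make_frame(b.x(), b.y(), b.w(), b.h()))

frame = make_frame(80, 60, 20, 10)
```

The checksum lets the EV3 side discard frames corrupted on the wire, which matters because the brick has no flow control on its sensor ports.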
The harder part is the Lego side of things, as you will need to make a Lego LabVIEW block for users to access the data the OpenMV cam is pushing out.
Some info on the Lego Mindstorms EV3 platform: it is a Linux OS that runs a Lego virtual machine, which auto-boots a LabVIEW programming environment. See the EV3 developer tool kit at the bottom of this page: https://www.lego.com/en-us/mindstorms/downloads
I do know one of the guys who maintain ev3dev, an alternative Linux OS for the Lego EV3 that lets you program the EV3 brick in Python. He loves machine vision; I have already shown him your cam, and he was excited and thought it would be great for Lego.
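On the ev3dev side, a Python program would just need to resynchronise on the camera's byte stream and validate each frame. Here is a parser sketch for a hypothetical 10-byte frame (0xAA header, four little-endian uint16 fields, one-byte checksum); the frame layout is my assumption, and how you open the serial device on ev3dev depends on its port drivers, so treat the whole thing as illustrative:

```python
import struct

HEADER = 0xAA  # must match whatever marker the camera script emits (assumption)

def parse_frames(buf):
    """Scan a byte buffer for valid 10-byte frames, yielding (x, y, w, h)."""
    i = 0
    while i + 10 <= len(buf):
        if buf[i] != HEADER:
            i += 1
            continue
        payload = buf[i + 1:i + 9]
        if buf[i + 9] == sum(payload) & 0xFF:
            yield struct.unpack("<4H", payload)
            i += 10
        else:
            i += 1  # false header byte in the data; resynchronise

# On an actual EV3 brick running ev3dev, 'buf' would come from reading
# the serial device for the sensor port the camera is plugged into.
```

Scanning byte-by-byte until a header with a valid checksum appears means the program can attach mid-stream, which is exactly the situation when the EV3 boots after the camera is already running.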
Old thread, but did anything ever happen here to improve Lego connectivity? And what is the script you suggested here?
To answer this, I found/built a simple example.