This is based on the genx320_grayscale_with_tracking example with the sensor biases set for active marker detection. It looks great indoors and in controlled demos.
I’ve had a hard time finding any examples of a GENX320 simply pointed outdoors in bright conditions, so I wanted to ask if anyone has tried pointing one out a window or outside on a sunny day with these biases enabled.
I’m mainly curious about whether the sun, skyglow, or specular reflections from cars, windows, or foliage generate a lot of spurious events, and what those events look like spatially. Even just waving the camera around with the sun in frame would be interesting to see.
My use case is long-range flashing-LED beacon tracking, where the beacon might be only one or two pixels in apparent size. I've had really decent results with frame-based cameras, but things get difficult when there is hand shake or imperfect gimbal motion and the payload is extremely small because it's far away. The appeal of the GENX320 here is that it responds to changes over time rather than capturing full frames, which seems like it could be much more tolerant of observer motion.
I’m not asking for a full benchmark or a flashing LED demo. Even a short clip or screenshot of a sunny outdoor scene with the active marker biases enabled and some motion would be really helpful. I’ve searched around but haven’t been able to find any outdoor examples like this.
Thank you! I'll stay tuned; this could be a really cool module, down to the silicon, for this use case. It's hard to imagine how good a "tunable discrete bandpass filter" for every pixel could be.
I checked in with the sensor supplier, and here’s what I learned.
Your use case is actually a very good fit for event-based sensors like this. In their Metavision examples (such as frequency estimation and active marker tracking), they detect blinking temporal patterns per pixel, so tracking can work with just a handful of pixels — in some cases even a single pixel is sufficient.
They offer two relevant pieces:

- A pre-programmed LED "active marker" (each LED has a unique blinking pattern)
- Reference code to track and identify the LEDs on Kria

They report that tracking is very stable and relies on only a few pixels, making this the fastest way to get a working system if you want something turnkey.
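To make the per-pixel temporal detection idea concrete, here is a minimal sketch of estimating a blink frequency from one pixel's event stream. The `(timestamp_us, polarity)` event layout and the function name are my own illustration, not the Metavision API; real code would work on the SDK's event buffers.

```python
# Hypothetical sketch: estimate a blink frequency at a single pixel
# from event polarity flips. The (timestamp_us, polarity) tuple
# format is an assumption for illustration, not the Metavision API.

def estimate_blink_frequency(events):
    """Estimate blink frequency (Hz) from one pixel's events.

    events: list of (timestamp_us, polarity) tuples, polarity in {0, 1}.
    One full blink cycle produces one ON (positive) edge, so we measure
    the mean interval between consecutive positive-polarity events.
    """
    rising = [t for t, p in events if p == 1]
    if len(rising) < 2:
        return None  # not enough edges to estimate a period
    intervals = [b - a for a, b in zip(rising, rising[1:])]
    period_us = sum(intervals) / len(intervals)
    return 1e6 / period_us

# Synthetic 1 kHz blinker: ON edge every 1000 us, OFF edge 500 us later.
events = []
for k in range(20):
    events.append((k * 1000, 1))
    events.append((k * 1000 + 500, 0))
print(estimate_blink_frequency(events))  # 1000.0
```

This is why a single pixel can be enough: the signature lives in the timing of the events, not in any spatial neighborhood.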
For outdoor data: they don’t currently have public recordings of the active marker used outdoors, nor datasets captured with identical sensor biases. However, they do provide several outdoor event datasets here: https://docs.prophesee.ai/stable/datasets.html
In particular:

- driving_sample
- courtyard_walk
- automotive
Those datasets were captured outdoors. Keep in mind they’re provided in RAW event format and require the Metavision SDK to replay properly.
For convenience, they also shared a couple of rendered outdoor examples (AVI videos at 30 fps with a 33 ms accumulation window). These are meant purely for visual inspection, not processing — the real processing happens on the underlying event stream. These videos do not include active markers and use different sensor biases.
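For anyone curious what a "33 ms accumulation window" means in practice, here is a toy sketch of bucketing events into count frames at 30 fps. The `(x, y, t_us, p)` tuple layout is an assumption for illustration; actual RAW recordings need the Metavision SDK to decode.

```python
# Minimal sketch of rendering events with a fixed accumulation window,
# like the 33 ms / 30 fps videos mentioned above. The raw-event tuple
# layout (x, y, t_us, p) is an assumption; real RAW files need the SDK.

def accumulate_frames(events, width, height, window_us=33_000):
    """Bucket events into per-window 2D count frames (sign = polarity)."""
    frames = {}
    for x, y, t, p in events:
        idx = t // window_us
        frame = frames.setdefault(idx, [[0] * width for _ in range(height)])
        frame[y][x] += 1 if p else -1
    return [frames[i] for i in sorted(frames)]

# Toy stream: 3 events over ~45 ms on a 4x4 grid -> 2 frames.
events = [(1, 1, 5_000, 1), (2, 2, 20_000, 0), (1, 1, 40_000, 1)]
frames = accumulate_frames(events, 4, 4)
print(len(frames))  # 2
```

The key point from their note stands: these frames are only a visualization; the actual processing runs on the event stream itself.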
I'll see if it's possible to build an active marker; maybe OpenMV can sell this. I think it's just LEDs with a local oscillator at a particular frequency, which could be done with hardware chips alone.
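As a back-of-the-envelope check on the "LEDs plus a local oscillator" idea: a classic 555-timer astable circuit gives roughly f = 1.44 / ((R1 + 2*R2) * C). The component values below are illustrative, not a tested design.

```python
# Sanity check for the "LED + local oscillator" idea, assuming a
# classic 555-timer astable circuit (standard approximation
# f = 1.44 / ((R1 + 2*R2) * C)). Component values are illustrative.

def astable_555_frequency(r1_ohm, r2_ohm, c_farad):
    """Approximate output frequency (Hz) of a 555 astable oscillator."""
    return 1.44 / ((r1_ohm + 2 * r2_ohm) * c_farad)

# 1 kOhm + 2 * 6.7 kOhm with 100 nF -> a ~1 kHz blink
f = astable_555_frequency(1_000, 6_700, 100e-9)
print(round(f))  # 1000
```

So a fixed-frequency marker really is just an LED, a timer chip, and a handful of passives; per-LED unique patterns would need a bit more logic.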
Thank you very much for taking the time to check with Prophesee and for putting together such a thorough response. I really appreciate it and this is extremely helpful.
The fact that tracking can potentially work down to a single pixel via temporal pattern detection is especially encouraging. That’s exactly the aspect of event-based sensors that makes them compelling for long-range beacon tracking, where apparent size under inconsistent motion becomes the dominant constraint.
I’m also very interested in exploring the active-marker approach experimentally, particularly how modulation parameters affect event generation at range. I’ve spent some time looking through the datasets you’ve shared and it’s been useful to see how to characterize the spurious detections.
Thanks again for sharing all of this. It gives me a much clearer picture of what’s practical to explore next.
I talked with them about making my own active marker board, and it appears they want you to buy their SDK for that. I guess we'll leave this to our customers to build.