OpenMV as Autoguider for astrophotography?

You guys seem like the perfect hardware for a headless astrophotography autoguider.

Background: the Earth spins, so the stars in the sky are always moving. If you want to photograph something really dim and far away in the night sky, you need long exposures, but since the stars are moving, you need to rotate the camera to counteract the Earth's rotation. You can buy a star tracker for this, and some star trackers let you add an optional secondary camera, an autoguider camera, so the mount tracks more accurately and you get a sharper image.

The problem is that they're $150 for just the camera, plus another $50 for the guide scope, plus you need a laptop to run the software.

I’m playing around with the idea of making all of this cheaper, or at least get rid of the laptop.

Anyway, the algorithm should be very simple: find a dot, then follow the dot by giving the star tracker (or EQ mount) one of four basic signals: left, right, up, down. (It's controlled via an ST4 cable: just 4 signals, active low, very simple.)

Usually during the initial setup it does a calibration: it locks onto a dot, tells the tracker/mount to go left for a few milliseconds, and sees how far the dot moved and in what direction. This should be easy too.
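Something like the sketch below is what I'm picturing. It's untested; the pin names P0–P3 and the pulse/settle times are placeholders, and get_star_xy() is a stand-in for whatever centroid detection I end up with.

from pyb import Pin, delay

# The four ST4 lines are active low and open collector, so use
# open-drain outputs. Pin names P0-P3 are placeholders.
st4 = {}
for name, pin_name in [("ra+", "P0"), ("ra-", "P1"), ("dec+", "P2"), ("dec-", "P3")]:
    pin = Pin(pin_name, Pin.OUT_OD)
    pin.value(1)  # idle: released (high-impedance)
    st4[name] = pin

def pulse(direction, ms):
    # Assert one ST4 line (drive it low) for ms milliseconds.
    st4[direction].value(0)
    delay(ms)
    st4[direction].value(1)

def calibrate(get_star_xy, ms=500):
    # Pulse RA+ once and return the (dx, dy) pixel shift it caused.
    x0, y0 = get_star_xy()
    pulse("ra+", ms)
    delay(1000)  # let the mount settle
    x1, y1 = get_star_xy()
    return (x1 - x0, y1 - y0)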

My options are:

  • a smartphone with a telephoto lens and a Bluetooth nRF51 doing the ST4 signalling (it sounds cheap, but I'd rather still have my phone in my hand all night)
  • a Raspberry Pi with the HQ Camera and a CS-mount telephoto lens (the price gets expensive)
  • an ESP32-CAM (where do I get lenses for these? The camera quality is usually terrible, dead pixels and such. Plus, I'm not excited about writing image processing in C++)

OpenMV hits a good price point, offers a super telephoto lens, runs image processing on board, and has GPIOs. Optionally I can add the LCD shield or the Bluetooth shield.

Now I have no idea what the OpenMV camera would actually see when it’s pointed at a star. This is why I’m here on this forum.

My next bit of research revealed that OpenMV is limited to 0.5 s of exposure. This should be OK, I think… my 350 mm camera lens shows star trails at 2.5 s, so 0.5 s should be fine if I point it at a super bright star like Vega.

The max gain is 32. I have no idea what that looks like in terms of brightness and noise. I'm honestly not sure how to convert gain to ISO; I know "base ISO" is supposed to be a gain of 1, right?

I’m guessing the super telephoto lens is fixed focus at infinity? As long as the star looks round, my idea should work, I just need a point or circle detector algorithm.

Can somebody with the super telephoto lens take a couple of photos of the night sky to share with us? Extra points if you can run the circle detector and provide a plot of XY position frame to frame so we can get an idea of the motion noise. (Don't worry, we can filter this noise out; the Earth doesn't change speed.)

Has anybody experimented with adding a heat spreader to the bottom of the OpenMV camera sensor PCB?

Thanks, hope you all had a chance to see Comet NEOWISE

Hi, you're going to need to buy the camera to get the answers to most of these questions.

Anyway, you can make the max exposure longer than 0.5 seconds. If you turn the camera's PLL off and do other tricks you can make it very long, since we give you direct hardware control of the system. E.g. slow the clock way down, etc.

As for the camera to use, the global shutter module will give you the most precise picture. However, it’s expensive. The default camera is cheap and has large pixels for good low light response. So, it will probably do the job too.

Tracking the stars can be done with find_blobs().
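A minimal sketch, assuming a grayscale frame and a bright-pixel threshold that you'd tune against real sky frames:

import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

img = sensor.snapshot()
# (200, 255) is a guess for "bright star on dark sky"; tune it.
for blob in img.find_blobs([(200, 255)], merge=True):
    print("star candidate at", blob.cx(), blob.cy(), "pixels:", blob.pixels())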

Hi, thanks for the info so far. I ordered the Plus version just now.

My calculations tell me that it'll be a pretty crappy guider: 11.6 arcsec per pixel. I might end up turning this into a polar alignment camera instead. I'm going to need to write a fake plate solver.
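For anyone checking my math: plate scale in arcsec/pixel is 206265 × pixel pitch / focal length. I'm assuming the OV5640's 1.4 µm pixels behind a 25 mm telephoto; those are my guesses, not official specs:

# Plate scale in arcsec/pixel = 206265 * pixel_pitch / focal_length.
# 1.4 um pixels and a 25 mm lens are my assumptions, not official specs.
pixel_pitch_m = 1.4e-6
focal_length_m = 25e-3
print(206265 * pixel_pitch_m / focal_length_m)  # ~11.6 arcsec/pixel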

Testing is going to be a pain, lol. A few nights ago a firetruck showed up to kick all the photographers and astronomers away from a spot on Skyline.

Hey, I just read this in the docs:

The OpenMV Cam has two memory areas for images. The classical stack/heap area used for normal MicroPython processing can store small images within its heap. However, the MicroPython heap is only about ~100 KB, which is not enough to store larger images. So, your OpenMV Cam has a secondary frame buffer memory area that stores images taken by sensor.snapshot(). Images are stored at the bottom of this memory area. Any memory that's left over is then available for use by the frame buffer stack, which your OpenMV Cam's firmware uses to hold large temporary data structures for image processing algorithms.

If you need room to hold multiple frames you may “steal” frame buffer space by calling sensor.alloc_extra_fb().

If sensor.set_auto_rotation() is enabled this method will return a new already rotated image object.

Note: sensor.snapshot() may apply cropping parameters to fit the snapshot in the available frame buffer space given the pixformat and framesize. The cropping parameters will be applied to maintain the aspect ratio and will stay until sensor.set_framesize() or sensor.set_windowing() are called.
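If I'm reading that right, stealing frame buffer space to keep the previous frame around for a frame-to-frame comparison would look something like this (my untested sketch, not from the docs):

import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# "Steal" an extra frame buffer to hold the previous frame.
prev = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)

while True:
    img = sensor.snapshot()
    # ... compare star positions between img and prev here ...
    prev.replace(img)  # save this frame for the next iteration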

The H7 Plus has enough RAM to perform find_blobs() on the full 5MP resolution image, right? I don’t need high frame rate.

Yep! It will be super slow. But, no problem.

I guess I’ll find out how slow when it gets here

My plan is to sort the blobs by brightness. It looks like I can get a list of blobs easily. My next questions:

Is there a limit on the number of blobs detected? There could be… ahem… a lot of stars.

If a blob’s bounding box area is less than area_threshold it is filtered out.

If a blob’s pixel count is less than pixel_threshold it is filtered out.

I need the exact opposite of that: I need to keep small blobs but remove big blobs. I'm guessing I have to use threshold_cb? Or do I feed it a negative number or something? (This isn't very important; a big blob would mean I need to retry the capture anyway.)

I see I can get a circle representing each blob. What’s the best way to determine the “brightness of a star”? I think I see a b_and() function that accepts a circular mask, and I can probably call get_histogram() to get the statistics, but that sounds like it’d operate on an entire 5MP image, whereas I only need to sum up like maybe 10 pixels. Would you suggest I just iterate over the rectangular bounding box with get_pixel() instead?

Is there a more lightweight call for get_histogram if I only need min and max? Would it be faster if I iterated over the entire image myself? Honestly I just want a function that tells me if I have at least one true white pixel and one true black pixel.

Hi, just let find_blobs() run with no area/pixel thresholds and it will return all the blobs.

The callback method is useful, however, for stopping a blob before it's added to the output list and executing functions on that blob. You can target the ROI of get_histogram to lie on the blob and then filter the blob out by its brightness.

E.g. get_histogram(roi=blob.rect())
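So a sketch along those lines; the size and brightness cutoffs are placeholders you'd tune:

import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

img = sensor.snapshot()

def star_only(blob):
    # Reject anything too big to be a star (clouds, street lamps).
    if blob.pixels() > 100:  # placeholder cutoff
        return False
    # Keep only blobs containing genuinely bright pixels.
    stats = img.get_histogram(roi=blob.rect()).get_statistics()
    return stats.max() > 200  # placeholder cutoff

blobs = img.find_blobs([(200, 255)], threshold_cb=star_only)
# Sort the survivors by mean brightness, brightest first.
blobs.sort(key=lambda b: img.get_statistics(roi=b.rect()).mean(), reverse=True)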

Do you guys have any specs on the optical errors of the lens and the soldering planarity of the sensor? I need the view to be perfectly parallel with the scope holder, or I need a way to calibrate that error away. So I want to know how straight the lens is and how level the soldering is on the sensor.

Pretty sure polar scopes are usually machined aluminum. Commercially available polar alignment cameras and autoguide cameras all look like machined aluminum.

Guide scopes don't really care about the parallelism, since there must be a calibration routine anyway. I'm ignoring this use case for now.

I don't own a lathe. I have my own FDM 3D printer, and at work we have a Form 2. My current plan is to just do as good a job as possible and then run a timelapse capture session overnight, calibrating based on the centroids of the captured circles.

Plan B is to swap it with an already-aligned polar scope and see what the difference is. But just touching the whole setup after alignment is risky. This would need to be done during the day with a stationary object on Earth, not a star, so maybe I can just bolt the scope holder to something sturdier than a three-legged tripod.

Haha, no.

There's a reason the camera is cheap… because it's not calibrated. At all :slight_smile:. We could add two 0s to the price for all of that, if you want.

I envisioned the WiFi shield providing a preview of the image to a smartphone. It's gotta be full resolution for the exposure check. The actual plate-solving bit can be just SVGs.

I am wondering if there are performance penalties. I'm considering using an ESP8266 as a co-processor just to be the HTTP server, if the performance penalty of the ATWINC1500 is too high.

Very simply: is the ATWINC1500 driver completely interrupt-driven, or is there a periodic SPI poll?

I’m sure you optimized the crap out of everything for your race cars but just wanna make sure.

Hi, the ATWINC1500 used to be bad, but when doing the interface library updates for it I fixed a lot of its issues. It works using 40 MHz SPI. Transmit is very fast and doesn't block. RX works but is not as performant; the WINC1500 cannot actually handle a large packet stream coming in from another device over WiFi without falling over. Generally what happens is that UDP messages are dropped. TCP works better for RX, since the WINC1500 can slow the external device down. Transmit can do 15 Mb/s if you're just moving a large pre-allocated array, but you should get about 7.5 Mb/s in practice on average.

Anyway, use TCP to do transmit. You should get above 5 Mb/s easily.
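E.g., something like this minimal push sketch (the SSID, key, host, and port are placeholders):

import sensor, network, usocket

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# Join a network with the WINC1500; credentials are placeholders.
wlan = network.WINC()
wlan.connect("MY_SSID", key="MY_KEY", security=wlan.WPA_PSK)

# Push JPEG-compressed frames to a listening host over TCP.
s = usocket.socket(usocket.AF_INET, usocket.SOCK_STREAM)
s.connect(("192.168.1.100", 8000))
while True:
    s.send(sensor.snapshot().compress(quality=90))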

I’m sure you optimized the crap out of everything for your race cars but just wanna make sure.

Working on it. At the start of this year a lot of things weren't optimized. Right now SDRAM is as fast as it can be, and we've got the camera DCMI driver working as fast as it can. We will be adding double buffering soon to finish maximizing the performance there. Regarding I/O: when I made the interface library we fixed all the issues with SPI/I2C/CAN/UART to get the best speeds out of those interfaces, and verified with a logic probe that the bandwidth is maxed out.

SD Bandwidth needs to be worked on still (will do after double buffering is enabled).

Just got it in the mail

The M12 thread is really loose unless the lens is screwed all the way in, but if it’s all the way in, it’s definitely not focused right. This is a big problem but I might just solve it with teflon tape or thread locker. Or maybe I could stick an o-ring between the housing and the lens?

Really wishing the WiFi shield did not come with the pins soldered

Sensor was dirty on arrival. Sensor cleaner for my camera works wayyyy better than rubbing alcohol. But I had to clean it while the view was live.

Wish I could do more today, but it's cloudy.

Could you please send me the code that enables exposure times longer than 0.5 s? Thanks.

Hi,

The M12 thread is really loose unless the lens is screwed all the way in, but if it’s all the way in, it’s definitely not focused right. This is a big problem but I might just solve it with teflon tape or thread locker. Or maybe I could stick an o-ring between the housing and the lens?

You have to focus the lens and then tighten the set screw. I think we will switch to a screw-on M12 thread-lock ring.

Really wishing the WiFi shield did not come with the pins soldered

Lots of folks wish the opposite! :slight_smile:

Sensor was dirty on arrival. Sensor cleaner for my camera works wayyyy better than rubbing alcohol. But I had to clean it while the view was live.

Hi, we get the camera sensor cleaned by the factory nowadays. Where did you buy this unit? Our official modules are all cleaned now.

Could you please send me the code that enables exposure times longer than 0.5 s? Thanks.

What model are you using? The OV7725? Just use sensor.__write_reg() to change settings:

sensor.__write_reg(0x0D, 0x00)      # COM4: turn the PLL off
sensor.__write_reg(0x11, 0x80 + 3)  # CLKRC: divide the input clock by (3+1) * 2 (the divider can go up to 63)

This will really slow down the clock… note that if you make a single image take longer than 3 seconds, our driver will say the camera timed out. You can make that timeout longer if you need to.

After which you can use sensor.set_auto_exposure(False, exposure_us=1000000)

To get a 1 second exposure.
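Putting those pieces together, the whole sequence would be roughly this (a sketch for the OV7725; untested end to end):

import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

sensor.__write_reg(0x0D, 0x00)      # COM4: turn the PLL off
sensor.__write_reg(0x11, 0x80 + 3)  # CLKRC: divide the input clock down
sensor.set_auto_exposure(False, exposure_us=1000000)

img = sensor.snapshot()  # one ~1 s exposure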

Here’s how exposure works: https://github.com/openmv/openmv/blob/master/src/omv/ov7725.c#L403

This is the H7 Plus, so the sensor is an OV5640; I don't see any documentation about its registers at a glance. Is it the same code, though?

The lens I’m using is the telephoto lens. What screw? There’s a small hole on the top of the housing but it’s not a screw. I know the default lens has that ring that sets the focus but that doesn’t exist on the telephoto lens.

The camera is from Elmwood, SN appears to be 10888

Ah, yeah, the H7 plus has a screw lock ring and not the regular set screw. For our first production run with it the sensor might not have been totally clean.

You can take the screw lock ring off the default lens and put it on the telephoto lens thread.

As for the H7 plus. Give me a second.

Here’s how exposure works:

https://github.com/openmv/openmv/blob/master/src/omv/ov5640.c#L791

The OV5640 is actually a very complex camera SoC. Much more so than the OV7725. As such, it has a very complex clock tree. If you work out the math for the clock we are feeding it… it’s actually running at 800 MHz internally.

You’re going to want to edit this stuff:

0x3036 SC PLL CONTRL2 → Bit[7:0]: PLL multiplier (4–252); any integer from 4–127, only even integers from 128–252
0x3037 SC PLL CONTRL3 → Bit[3:0]: PLL pre-divider (1, 2, 3, 4, 6, or 8)

The default firmware pushes the camera literally as fast as it can go before the image quality starts to really degrade. For an easy slowdown that our firmware understands, just lower the PLL multiplier value. By default it's set to 0x64 (100 decimal), so lowering it to something like 0x08 slows all the clocks down by about 12.5x. The auto exposure code will then calculate the requested exposure time based on the lowered clock.

sensor.__write_reg(0x3036, 0x08)  # lower the PLL multiplier
pyb.delay(500)
sensor.set_auto_exposure(False, exposure_us=1000000)
pyb.delay(500)

You can take the screw lock ring off the default lens and put it on the telephoto lens thread.

No I can't: the correct focal point is like 1 mm away from the bottom, there's no thread in that region, and the ring is too thick anyway. I think some Teflon tape would work. I would permanently glue it down once I make sure that thermal expansion/contraction from the night weather doesn't affect the focus too much.

The code you posted is a bit wrong: the address 0x3036 should be 0x3037, and then it worked. I can push it past 0.5 s, but I can't seem to get anything longer than 1 s now; even setting it to 2 s reports back about 1.06 FPS in the serial terminal. I thought the timeout was 3 seconds?

I turned off my room lights and it’s picking up some street lamps… damn clouds but hey we need some rain lol

Yeah, the timeout in snapshot is 3 seconds.

That datasheet I linked to is how we developed our firmware. It took a lot of effort, because it's not really enough documentation. Pretty much for any of these modern-ish camera SoCs nowadays, big companies just use FAEs to develop customer products and there really isn't documentation about how things work; it's all kept internal to the company.

Anyway, just try out different PLL values to slow the camera down.

If you want precision control, get the global shutter camera. It has clear register-setting documentation, so you can control it precisely.

OK, weird stuff happened. I swear I pushed the PLL multiplier down to 66 at my desk and the whole thing worked fine, and I almost got as long as 3 seconds' worth of exposure.

Then… night time… I couldn't push it past 1.4 s, because the PLL wouldn't let me change the multiplier without causing an initialization error.

But I got this shot of some power lines, oh, and Jupiter


Plate solving with Astrometry.net looks correct; I confirmed it with Stellarium.

(camera is upside down due to 3D printed tripod mount)

Still plenty more work to do, including that WiFi AP problem I posted in "WINC soft AP causes crash" on the OpenMV Products forum.

Thanks for all the help so far

COOOOOOOL, you can see Callisto and Ganymede with this thing!