Sunday, October 15, 2017

Double Cluster, Heart Nebula and Soul Nebula

This is the second image with my Pentax-based imaging system:

(click on image for full resolution image)
The three main objects in this image are the Double Cluster, the Soul Nebula (top) and the Heart Nebula (below). They are all in the Perseus arm of our galaxy (earth is in the Orion arm) and at similar distances: the Double Cluster is 7500 light years away, both nebulae 6500 light years.
The Double Cluster is relatively young (12.8 million years) and appears to be slightly blueshifted. This is a result of its movement - it races towards earth at 38-39 km/sec!
The cluster can be seen with the naked eye in really dark areas and easily with binoculars. It was discovered around 130 B.C. by the Greek astronomer Hipparchus.
Both nebulae are actually one gigantic complex that is 300 light years wide! They are connected by a bridge of gas. Both are birthplaces of stars in their center (which is why their centers are less red: a lot of gas has already been consumed by new stars). The stars in their centers are just a few million years old - and they are younger the further they are away from the center.

Processing this image was made difficult by the bloated stars in the LRGB images:
Luminance / Ha

I compensated for this already in the linear state by shrinking the stars using the MorphologicalTransformation process in Pixinsight (again, using one of the awesome tutorials on lightvortexastronomy.com):

First, I created a "contour star mask" from the stretched image:

This star mask should cover exactly the stars. Here is how the inverted mask looks:

Now, we apply the MorphologicalTransformation process:
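Conceptually, the star shrinking is a morphological erosion applied through the star mask. Here is a rough Python/numpy sketch of the idea - not the actual PixInsight implementation (which offers more refined operators), and the file names are just placeholders:

```python
import numpy as np
from scipy.ndimage import grey_erosion
from astropy.io import fits

# Placeholder file names - just for illustration
image = fits.getdata("luminance_linear.fits").astype(np.float64)
star_mask = fits.getdata("contour_star_mask.fits").astype(np.float64)  # 1 on stars, 0 elsewhere

# Erode the image: every pixel becomes the minimum of its 3x3 neighborhood,
# which pulls the bright star profiles inward (i.e. shrinks them).
eroded = grey_erosion(image, size=(3, 3))

# Apply the erosion only where the star mask is white - the same idea as
# running MorphologicalTransformation through a mask.
shrunk = image * (1.0 - star_mask) + eroded * star_mask

fits.writeto("luminance_stars_shrunk.fits", shrunk.astype(np.float32), overwrite=True)
```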

And here is the result before and after:

It's a subtle difference (which is good as we don't want to completely change the image) - but makes a huge difference further downstream.

Sunday, October 8, 2017

The North America Nebula

This is the first successful image of my Pentax-based imaging rig:
(click on image for full resolution)
Processing this image was a challenge - mostly because the imaging scale is so different than with my Takahashi TOA-130 scope. It has A LOT of stars and very little true background.

This nebula is four times the size of the full moon (which demonstrates the HUGE field of view of the 55mm lens!!!) The nebula and the Pelican Nebula (the lower, smaller nebula) are part of the same interstellar cloud of ionized hydrogen - which is forming stars. Between the nebulae and us are dust lanes that create the shape. We don't know for sure how far away the nebula is and what its dimensions are. Some speculate that Deneb (the very bright star in the lower part of the image and one of the brightest stars in our skies) ionizes this nebula, which would put it at a distance of 1800 light years and a diameter of 100 light years!

Comet Tracking

I read about C/2017 O1 ASAS-SN. As this comet is currently nice and high in the sky (at least in the second half of the night) I decided to give it a shot.

First, I just pointed the mount to the point in the sky where the comet is and started imaging:

(OK, ignore the dust motes for a second :-) But you can clearly see the comet in the middle right. Zoomed in:

Yep! There it is. But of course elongated as the scope is tracking the stars - not the comet.

So, I used Horizons (part of APCC Pro) to track the comet. I think Ray Gralak did a great job with Horizons. It's a little complicated to set up at first, but once you've gone through it once or twice, it's actually very easy!

With that, now, my images looked like this:

The comet is in the middle - and you can already see that the stars are elongated. Zoomed in:

:-(

It looks as if Horizons adjusts the mount only every few minutes (these are 10 minute exposures) ...

... on second thought, this could also be my PEC gone wrong. Tonight, I'll try 10 minute unguided exposures with and without PEC to see what happens ...

Sunday, October 1, 2017

A Pentax lens based imaging rig

More than a year ago I started to work on a Pentax lens based imaging system. I saw some wide angle shots that were made with 30-60mm lenses that looked amazing. And because of the coarse image scale, these images can be taken without guiding - just with a good enough pointing model. Almost "point and shoot" style.

Why Pentax lenses?
Pentax 67 lenses have three properties that make them suitable for astro-imaging:
  1. Backfocus
    These lenses have a backfocus of 84.95mm(!!!) - enough to put a focuser and filter wheel between the lens and the camera
  2. Imaging circle
    These lenses have an imaging circle of almost 8cm(!!!) - more than what most CCD cameras have.
  3. Quality
    Being made by Pentax, the optics of these lenses are remarkable. Definitely good enough for astro-imaging.
I purchased a Pentax SMC 67 45mm F4 lens - it's very short and will create a LARGE field of view of more than 700 arc minutes!!!
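As a quick sanity check on that field of view, here is the back-of-the-envelope calculation (a sketch; the roughly 12.5mm x 10mm sensor size of the MLx694 is my assumption):

```python
import math

def fov_arcmin(sensor_mm: float, focal_length_mm: float) -> float:
    """Angular field of view covered by one sensor dimension, in arc minutes."""
    return 2.0 * math.degrees(math.atan(sensor_mm / (2.0 * focal_length_mm))) * 60.0

focal_length = 45.0               # Pentax 67 45mm lens
width_mm, height_mm = 12.5, 10.0  # approximate MLx694 sensor dimensions (assumption)

print(f"{fov_arcmin(width_mm, focal_length):.0f} x {fov_arcmin(height_mm, focal_length):.0f} arcmin")
# -> roughly 950 x 760 arcmin, i.e. more than 700 arc minutes even on the short side
```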

Adapter

First, I had to get an adapter. I was planning to use a PDF focuser from FLI and the CFW2-7 filter wheel from FLI plus my MLx694 camera:

MLx694: 21.00 mm
CFW2-7: 20.90 mm
PDF: 29.84 mm
___________________
Total: 71.74 mm

That means that the adapter has to be 84.95mm - 71.74mm = 13.21mm thick - from the Pentax bayonet to the PDF focuser thread. Unfortunately, PreciseParts can't make the adapter shorter than 24.00mm. I actually ordered the 24mm adapter, but couldn't achieve focus with it.
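In code form, that backfocus budget is just this little bit of arithmetic (same numbers as above):

```python
# Back focus budget: Pentax 67 flange distance vs. the imaging train
backfocus = 84.95      # mm, Pentax 67
train = {
    "MLx694": 21.00,   # mm
    "CFW2-7": 20.90,   # mm
    "PDF":    29.84,   # mm
}
used = sum(train.values())     # 71.74 mm
adapter = backfocus - used     # 13.21 mm
print(f"Imaging train: {used:.2f} mm, adapter must be {adapter:.2f} mm thick")
```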

I checked with some local companies that make adapters like these. But most didn't even get back to me - they probably only make adapters in MUCH larger quantities.

Next, I ordered a Pentax bayonet adapter with a different connection on the other side and tried to make an adapter myself - but my skills on the lathe weren't nearly good enough to make one that is rigid and precise enough ...

A different approach would be to use a Robofocus motor to turn the focus ring of the lens itself. This would remove almost 30mm from the system - plenty of margin (see here for an example). But that sounded tricky, as I would have to mount the focuser next to the lens rigidly enough that it can turn the ring. This didn't pass my test for a simple, robust system...

Finally, Richard lent me the adapter that he had made for himself for the same purpose a few years back.


Problem #1 solved.

Mounting

Now, assembled, this is the whole imaging train:

My first attempt was to 3D-print an adapter to put this on top of a Losmandy dovetail:


This worked, but it moved the whole assembly pretty high up, which made it less stable and also required A LOT of counterweight to balance it.

Luckily, I realized that the mounting holes in the FLI camera have exactly the same spacing as the holes in the dovetail!!! That made mounting it much easier!


First Light - weird halos

Finally, I could take out my camera and try it. This was the result:


Not too bad! But when you zoom in:

There are some weird halos around the stars. This turned out to be a major pain in the ...

First, I tried to use a UV cut filter. These lenses are very sensitive to UV light and it could be that this is the cause (though unlikely as this is a narrowband image, i.e. UV should already be filtered out).

Next, I thought that this might be stray light and put a lens cap on - same result.

I tried different f-settings - no improvement.

Finally, I took my setup to Richard's ranch and we worked on it. He couldn't find anything wrong either. But he checked the design of the lens, and in a diagram he noticed that the last element almost protrudes from the lens and is curved. He suspected that what I was seeing were reflections from this element.

First Light 55mm lens

To check this, I purchased a Pentax 67 SMC P 55mm F/4 lens. I'm glad that I didn't go the route of using a Robofocus focuser - otherwise I'd have to futz around with the setup, as the focus ring of this lens is in a different position than on the 45mm lens ...

First light:

Looking good! Zooming in:


Yei! No halos. Some fuzziness remains around the bright stars - which is probably a result of the UV sensitivity (I should get a UV cut filter for this lens).

Plate Solving

I want to use this rig with my MyT mount, so that I can build a pointing model and use that to track without a guidescope.

Sounds easy, but plate solving at this scale is a challenge!

Using the entire image usually doesn't work as there are WAY too many triangulations (it's kind of fun to watch this in TheSkyX which shows which frames are considered - it's jumping around like crazy!!!)

Pinpoint doesn't work at this scale at all to begin with.

Through a lot of trial and error, I determined that these are the best settings:

Filter: Luminance
Exposure: 8 sec
Binning: 2x2
Image Crop: 50%
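For reference, the image scale at this focal length is enormous - which is exactly why the solvers struggle. A quick sketch of the numbers (the 4.54 micron pixel size of the MLx694 is my assumption):

```python
def image_scale_arcsec(pixel_um: float, focal_length_mm: float, binning: int = 1) -> float:
    """Image scale in arcsec/pixel: 206.265 * pixel size (um) * binning / focal length (mm)."""
    return 206.265 * pixel_um * binning / focal_length_mm

pixel_size = 4.54    # um, MLx694 (assumption)
focal_length = 55.0  # mm

print(f"1x1: {image_scale_arcsec(pixel_size, focal_length):.1f} arcsec/pixel")
print(f"2x2: {image_scale_arcsec(pixel_size, focal_length, binning=2):.1f} arcsec/pixel")
# -> roughly 17 arcsec/pixel unbinned and 34 arcsec/pixel binned 2x2
```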

One challenge remains if I image from our backyard: light pollution! Because of the relatively small aperture, the stars don't stand out from the light-polluted background and are often missed. Nothing I can do about that :-(

Another challenge is that terrestrial objects (trees, our house...) often cover part of the image and create "artificial stars". Though the resulting images are sometimes really cool:

Focusing

This was (and still is) really tricky.

Because of the short focal length, the critical focus zone is 78 microns(!!!)

This means three challenges:
  1. Focusing has to be very exact
  2. The focuser can start only a little bit away from the focal point - otherwise the stars are so far out of focus that they aren't recognized anymore.
  3. I have to refocus a lot during the night.
And of course, the focusing routines suffer from the same problem as plate solving: our light-polluted backyard - which often makes stars not stand out enough ...

I am using the following settings (in SGPro):
Number of data points: 9
Step size: 70 (these two settings are the most sensitive ones: too small, and I have to start the focuser too close to the focus point and it doesn't move out far enough to create a good V-curve; too large, and the focuser moves so far out that SGPro doesn't recognize any stars)
Minimum Star Diameter: 6
Binning: 4x4
Exposure time: 20 sec for LRGB, 45 sec for narrowband filters

Flats

Taking flats is something I have not figured out at all:

  • Sky flats always have gradients (because of the large image circle). Even if I put a white T-shirt over the lens to diffuse the light - there are still gradients.
  • Flats with a flat panel create weird patterns (probably because of the proximity of the panel to the lens).
:-(

First Images

With all this figured out just in time for OSP, I took my first two images: the North America Nebula and the Double Cluster/Heart Nebula/Soul Nebula together. I took the LRGB data for both images at OSP and augmented it with narrowband data taken from our backyard.

Result

I have a lightweight, wide-field imaging rig!!!

But because of all the remaining instabilities in focusing, plate-solving, flats... it's pretty far from the point-and-shoot vision that I had :-( I'll keep trying, but maybe instead I should reconsider getting that awesome RH 200 Veloce from Officina Stellare :-)

Sunday, September 17, 2017

NGC 7822

NGC 7822 is a young, star-forming region in Cepheus - some of its regions are no more than a few million years old. It is 40 light years across and lies 3000 light years away, above the plane of our galaxy. Inside the region is a supernova remnant - which indicates that a massive star in it has already exploded. It also contains one of the hottest stars discovered near our sun - it has a surface temperature of 45000 Kelvin (the surface temperature of our sun is 5778 Kelvin). Its luminosity is about 100,000(!!!) times that of the sun.

(click for full resolution)
I took the LRGB data (4.5 hours) at OSP and the Ha and OIII data (13 hours) in our backyard in San Jose. I used the Ha to enhance Red and Luminance, and the OIII to enhance Green and Blue.
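There are several ways to do such a blend; here is a minimal numpy sketch of one common lighten-style approach (a conceptual sketch only - not necessarily the exact PixelMath used for this image; all channels are assumed to be linear and normalized to [0, 1]):

```python
import numpy as np

def enhance(broadband: np.ndarray, narrowband: np.ndarray, weight: float = 0.6) -> np.ndarray:
    """Blend a narrowband channel into a broadband one.

    Wherever the narrowband signal is stronger, pull the broadband channel
    towards it by 'weight'; elsewhere the broadband data stays dominant.
    """
    boosted = np.maximum(broadband, narrowband)
    return broadband * (1.0 - weight) + boosted * weight

# red   = enhance(red, ha)
# lum   = enhance(lum, ha)
# green = enhance(green, oiii)
# blue  = enhance(blue, oiii)
```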

When processing this image, I ran into something that I hadn't experienced before. After stacking, color correction, background extraction and enhancing the images with my narrowband data, I ended up with the following RGB and Luminance images (stretched):
But when I combined them (using LRGBCombination in Pixinsight), I got this:

Zooming in reveals that applying the Luminance image indeed increased the detail, but it took out almost all color!

I tried this several times and finally asked around. It turns out that LRGBCombination only works on stretched images! I have no idea why I didn't encounter this before - I am sure I have done this before. But once I stretched the images first, everything else went smoothly!!!

Thursday, August 31, 2017

LBN 468

This nebula is 1,600 light years away from earth. One of the most interesting parts is the bright patch in the middle right: Gyulbudaghian's Nebula, a bipolar reflection nebula that is illuminated by a protostar.

(click on image for full resolution)
Unfortunately, this is only one half of the whole nebula. I wanted to take a mosaic of it, but the skies during the last night at OSP weren't good and I couldn't take enough data of the other half.

Eclipse Corona Image

Here is my main image from the solar eclipse:
(click on image for full resolution)

This was my biggest challenge: processing the 3 bracketed image sets that captured the corona of the sun. The corona has a very high dynamic range, and my first tries with the usual HDR algorithms (in Lightroom, Nik HDR Efex...) did not go very far.

I found a couple of useful tutorials on the web. They all employ a similar workflow:
  1. Align all images precisely
  2. Create a composite image of the sum of all individual images
  3. Run a radial blur filter or a Larsen-Sekanina filter on the composite. This creates an image with all the detail but low contrast (mostly grey or black) - see the sketch after this list
  4. Multiply the sum image with the detailed, low-contrast image
  5. Make final adjustments (stretch, curves...)
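To make step 3 a bit more concrete, here is a minimal numpy/scipy sketch of the rotational-gradient form of the Larsen-Sekanina filter (radial shift set to zero, and the sun assumed to be centered in the frame - a conceptual sketch, not what Fitswork does internally):

```python
import numpy as np
from scipy.ndimage import rotate

def rotational_gradient(image: np.ndarray, angle_deg: float = 1.31) -> np.ndarray:
    """Larsen-Sekanina style rotational gradient with zero radial shift.

    Rotating about the image center approximates rotating about the sun,
    so this assumes the summed corona image is centered. The result is a
    low-contrast image containing mostly the fine radial streamer detail.
    """
    plus = rotate(image, +angle_deg, reshape=False, order=1)
    minus = rotate(image, -angle_deg, reshape=False, order=1)
    return 2.0 * image - plus - minus

# Step 3: detail = rotational_gradient(summed_image)
# Step 4 then multiplies the summed image by this detail image
# (after rescaling the detail back to a positive range).
```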
Fitswork has an interesting approach to the alignment: pick two images and overlay them by subtracting one from the other, which makes it easy to align them by hand. It does work ... but I found it too cumbersome. Looking around for another solution, I found the FFTRegistration (Fast Fourier Transformation) script in Pixinsight. It is often used to align comet images.


Enter a reference image, add all the images, check the option to store the registered images (I wanted that) and click "OK". ... It takes a while, and this was the result:

I took this image into Fitswork and selected the Larsen-Sekanina filter:

There are only two settings:

Rotation and Radius. The tutorial recommended to start with Radius=2.0 and Rotation=1.31. I chose those settings and got this result:

It has a lot of detail - and almost no other information (low contrast, wrong colors...)

But multiplying both images gives this result:

Doing a simple stretch:

And some curves, saturation and color adjustments:

And then some final tweaks in Lightroom (devignetting, cropping...)

Sunday, August 27, 2017

No Temperature reading from secondary FLI focuser port (flifoc1)

Last night, I took narrowband images of Cederblad 214 to augment the LRGB images that I took at OSP. This morning I was surprised that most of them were badly out of focus. On further inspection, I noticed that SGPro never refocused. The only time it did was when it changed from the Ha filter to the OIII filter.

.. I checked the autofocus settings and was surprised that the temperature-based option was greyed out ...

... then I realized that the focuser did not report any temperature reading at all ...

... then I remembered that the camera disconnected last night and I had to reconnect through the secondary FLI port.

And, yes, that was the reason: when using the focuser through the secondary FLI port (flifoc1), there is no temperature reading - only on the primary one (flifoc0).

Learnt something - and I have to throw away most of the Ha images (the OIII images were taken after 3:40am when the temperature was much more stable).


Monday, August 21, 2017

Solar Eclipse - quick and unprocessed

I have to do some processing and will write up more about the amazing experience that was this solar eclipse. But here are some of my favorite photos:
Our camp at OSP during totality

The Diamond Ring

The corona during totality

Baily's Beads


Friday, July 28, 2017

Weird APCC failure

Wanted to take more data for Sharpless-86, but ran into weird issues:
  1. Polar Alignment worked
  2. Started SGPro Sequence
When SGPro tried to center the object, APCC complained that the correction to be made was too large. Tried again - same result. Then I tried to do just a "Solve and Sync"; first I noticed that SGPro failed over to Blind Solve, and then I got the same error again.

I then tried to do an Image Link in TSX and ran into the same issue. Only an All-Sky link worked ...

Restarted the scope, restarted the computer - same result.

At some point I noticed that when a slew command was issued, the scope started slewing but the position in TSX did not update (it just sat at the Park 3 position). I tried a couple of things.

When I disabled APCC (i.e. connected the Astro-Physics ASCOM driver directly to the serial port of the mount) everything worked!!!

I have absolutely no idea what happened.

First, I didn't think too much about it and figured that I could just use the mount without APCC.

... but then I remembered that I wanted to use Horizons for the solar eclipse to make sure that the sun stays centered... and for that I'll need APCC. So, I still have to figure that one out...

---

After I posted to the ap-gto mailing list, Ray Gralak replied and a) asked which versions of APCC and the ASCOM driver I'm using, and b) recommended updating both to the latest versions.

I realized that I had updated APCC recently but not the ASCOM driver. After doing that, everything seems to work. Yei!!

Monday, July 24, 2017

How to solar charge Li-Ion batteries

So, I bought two Li-Ion batteries for the field. They are definitely smaller and much lighter. I used them last night for my scopes and everything worked great. Then I charged them with my 100W solar panels during the (very sunny) day. In the evening I tried to use them again, but both gave up very quickly.

Did some research and found out that they need special charge controllers as their charge profile is different. After lots of research I bought a new charge controller from Morningstar. In about a week I will find out if and how much of a difference that makes...

Wednesday, July 12, 2017

Mauna Kea Observatories

My personal highlight of our Hawaii vacation was a summit tour of Mauna Kea to watch the sunset. Initially, I had SO many ideas for photos and videos to take. But I ended up taking just a few and then setting the camera into auto mode to take images for a sunset timelapse ...
... and then just taking it all in!!!

First, the timelapse:


And then some of the amazing photos that can't capture the moment.
The Subaru telescope (left) and Keck I and Keck II.

My friend Patrick in front of UH88

The CFHT

Taking Photos

Me and Patrick
Star gazing under the Milky Way on our way back down

Tuesday, June 13, 2017

Blew my MyT electronics board and NUC...

The disconnects of my MyT mount were (apparently) related to the 12V->48V converter that I used. When I wanted to check it out further, the converter shorted and took the MyT control board and the NUC down ... SH###!!!

So, I need fuses (as Richard always told me) - and I need them set up such that everything works at home and in the field. This led to a project to create a setup that I can use at home and in almost exactly the same way in the field.

Eclipse MegaMovie

I love what seemingly random projects we sometimes do!!!

In this case it's Eclipse MegaMovie 2017 - a lot of info about the upcoming eclipse, including a simulator for how the eclipse will look from arbitrary locations.

Sunday, June 4, 2017

PDF focuser weirdness

Suddenly my (fairly new) FLI PDF focuser acted up:
  1.  I move the focuser to 2500
  2. Then I move it to 2501 (using the FLIFocuser app to make sure that this isn't a problem with some other software) - I can see that it moves first to 2506 and then back to 2501 (I guess it has a 5 step built-in backlash compensation?)
  3. Then I move it to 2502 - but now it only moves to 2496 !!!!
    It somehow didn't execute my +1 step or the +5 step from the first part of the backlash compensation - only the -5 step at the end.
And it happens A LOT! I tried all three speeds (Slow, Medium, Fast) and I tried other step sizes - always the same.

Asked Richard, Richard, Jim and John and also sent an email to FLI...


Saturday, June 3, 2017

Same setup at home and in the field

I find it always tricky to move my scope into the field and make sure that everything works. The main two challenges are:
  1. Powering everything from 12V batteries
  2. Using my laptop vs. the NUC on the scope for control.
1. Powering everything from 12V batteries
I decided to start using a 12V power source for everything. So, when I get into the field, I just replace that one with 12V batteries and everything should work exactly as before:
  • Most equipment runs off 12V anyway (focuser, filter wheel...) - Just need coaxial plugs (2.1mm and 2.5mm) with Anderson connectors.
  • The NUC computers can run off 12V.
  • The FLI cameras need 12V-12V DC/DC converter
  • The MyT mount needs a 12V-48V DC/DC converter
All connections need to be secured by fuses. Actually, I should have done that a long time ago. But was "reminded" of it when I blew up my NUC AND the control board of the MyT mount :-(

For that, I measured the power usage of all components:
Component | Peak Power Usage | Fuse
FLI ML16070 (cooling at 100% and taking continuous images) | 5A | 7.5A
FLI MLx694 (cooling at 100% and taking continuous images) | 2.8A | 5A
NUC computer (network usage, storing, CPU usage) | 2.2A | 3A
MyT mount (when full slewing) | 4.2A | 5A
MyT equipment (FLI FW, FLI PDF, Dew Heater, Grasshopper3, USB Hub) | 1.3A | 2A
Mach1 mount (when full slewing) | 1.2A | 2A
Mach1 equipment (FLI FW, FLI Atlas, Lodestar, Dew Heater, USB Hub) | 1.7A | 2A
Lunt Pressure Controller, Microtouch Focuser | <1A | 1A

I bought a RigRunner to power all components on the scope and have just one main power line from the 12V power supply. The RigRunners are great for this as they have blade fuses integrated:

Now, with the NUC computer on the scope, I have all USB cables running there and only one cable (actually two - see below) running from the scope!!!:

Now, when I take the scope into the field, I can either replace the main line with one battery, or depending on power usage at night, power individual components (then with inline fuses) from a battery!
2. Using Laptop vs. NUC for imaging
The other thing that always changed in the field was the imaging computer: I used the NUC when I was at home and my laptop when I was in the field. Which meant that I had to copy profiles etc. over to the laptop - and often realized too late that I was using an older profile or such (e.g. with an older focusing setup).

So, instead, I want to use the NUC always for imaging. They are great as they can be powered by 12V, i.e. don't need any converter or such.

At home, I usually used the wireless network card of the NUC to connect to it, which of course meant that I would need a different setup in the field (different IP address!)

I did a little bit of research and found out that Ethernet is a little less power hungry than WiFi anyway. So, in the field I will use Ethernet to connect the scope and the laptop for remoting into the NUC. In order to use Ethernet at home too, I use the following trick: I put one of our Google Wifi devices on the scope and connect the NUC to it via Ethernet - and now I can connect from my laptop from inside the house via Wifi.

Finally, I use reserved (static) IP addresses for my laptop and the NUC, so that I can use the same IP addresses at home or in the field.

To connect both of my scopes, I bought a small Ethernet switch that needs only 2W of input power (at 12V, that's less than 0.2A!!!). At home, I plug the Google Wifi into the switch; in the field, I plug my laptop into the switch. No IP address changes or anything!!!

And with all this, most cables and components are now on the scope and won't change. Which makes the whole setup WAY less cluttered!!!

Monday, May 15, 2017

Sudden disconnects from MyT mount

On Sunday, when I tried to take some sun images, my MyT mount suddenly started to disconnect. At first, I thought that it was just a fluke, but then it happened pretty reliably every time I gave a slew command from TSX.

I saw two error messages:
Device: Mount
Error, poor communication, connection automatically
terminated. Error = 213.

Receive time-out.COMM_TIMEOUT . Error = 21002.


After the slew ended I could connect TSX again...

Tried a different USB port or a different cable, both with the same result.

I then tried with my other NUC and my laptop - both worked fine. So, it seems to be something about the computer. I then tried to power the mount and/or the NUC separately to make sure that there is enough power, but always with the same results.

Finally, I posted in the Bisque forum.

... maybe it is the 12-48V converter ...

---

Yes, it was the converter. And when I investigated it more, I blew the controller board and the NUC !!! :-(

Wednesday, May 10, 2017

PHD2 needs "screen" ?!

When I set up the new NUC, I initially did not plug in an HDMI headless plug. Everything seemed to work OK.

But in the first two nights of imaging, it seemed as if PHD2 at some point stopped responding/working. Which meant that a) the guide star wandered off, but also that b) SGPro could not get a response and aborted the sessions.

When I tried to figure out what happened at the times when PHD2 stopped responding, I noticed that these were the times when I closed my laptop from which I had remoted into the NUC - effectively ending the screen session.

Weird!

Last night, I kept the laptop connected - and everything worked fine. Tonight I'll try with the headless plug and see if that works...

... no luck either.

Next, I checked if the USB ports might get powered down (although this is unlikely to be the cause, as other software like SGPro or TSX keeps working). I made sure that for all USB hubs, the checkbox next to "Allow the computer to turn off this device to save power" is disabled.

... still the same ...

Running out of ideas, I posted to the Open-PHD-Guiding Group.

Andy Galasso suspected that this has something to do with "Adaptive Hibernate" mode - though I couldn't find it. He also created a binary of PHD2 that tries to keep the computer alive. But that also didn't work.

So, I need to dig deeper (Windows 10 setting, BIOS...) to find out what causes this...

---

The crazy thing is that every other program seems to work fine. I did a whole T-Point run with TSX with Teamviewer detached. SGPro continues to take images (but then hangs when it wants an update from PHD2 after dithering)...

---

Andy asked if I would see the same behavior with other remoting software. I tried out "Windows Remote Desktop" ...
... and everything worked!!!

Really interesting: after I used Windows Remote Desktop once, I could then use TeamViewer again and PHD2 would not stop after disconnect ... really strange ...

So, between just switching to Windows Remote Desktop and spending more time on trying to figure out what TeamViewer / PHD2 is doing, I take the shortcut and use Remote Desktop (even more so as there is now a Mac and an Android client too!) I will still summarize my observations and send them to TeamViewer support. Maybe they can figure something out.

With that, my TOA-130 scope is finally back in full service. Yei!!!

Monday, May 1, 2017

Eclipse Imaging #5 Aerial video / time lapse of moon shadow creeping over the land

This is probably the most far-out idea. I want to fly the Mavic as high as possible and take a video or time lapse of the final minutes before totality. Point the camera towards the horizon and hopefully see how the shadow of the moon creeps over the land. And then rely on the Mavic to auto-land once the batteries are low.

Equipment: Mavic Drone - that's it!


Things to figure out:
  • Can I take images from the horizon to our location from the maximum height?
  • The shadow will be VERY fast (supersonic speed!!!) - what frame rate do I need to use? (see the rough estimate after this list)
  • How long of a video can I shoot with that frame rate?
  • How long can the Mavic stay up there?
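A rough back-of-the-envelope for the first two questions (my assumptions: flying at roughly the 390 ft / ~120m ceiling, and an umbral shadow ground speed on the order of 1 km/s over Oregon):

```python
import math

R_EARTH = 6_371_000.0   # m
altitude = 120.0        # m, roughly 390 ft (assumption)
shadow_speed = 1_000.0  # m/s, order of magnitude over Oregon (assumption)

horizon = math.sqrt(2.0 * R_EARTH * altitude)   # distance to the horizon, ~39 km
approach_time = horizon / shadow_speed          # ~39 seconds from horizon to overhead

print(f"Horizon: {horizon/1000:.0f} km away, shadow covers that in ~{approach_time:.0f} s")
# So the visible approach lasts well under a minute - a normal 30 fps video
# already gives 1000+ frames of it.
```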
What's scary is that it could get quite windy during the eclipse. Will the drone just be blown away and/or miss its home landing zone?

Eclipse Imaging #4 360 video

To capture the event AND us, I want to take a 360 video:

  • Nikon KeyMission 360
  • Manfrotto tripod
The only thing to figure out is how to take a 2.5 hour video with the camera (SD card size, resolution).

Setup should be easy: put camera on tripod. Start.

Eclipse Imaging #3 Ultra-wide angle Timelapse

Of course I want to take a timelapse of the event. And what better equipment than:

  • Nikon D750 and the 14-24mm lens
  • qDSLRDashboard (Holy Grail mode) + Intervallometer
  • On top of Really Right Stuff tripod
A couple of things to figure out:
  • The totality will last for 1min 28sec. In order to get a good timelapse, I need to take images every 5 seconds, which means that the shutter speed can't exceed ~3 seconds! Is that enough for the light during totality? (I can ramp up the ISO!) See the quick sketch after this list.
  • The brightness will change before/after totality MUCH faster than at sunset/sunrise. Should I take the average of 2 images or just the last image for the exposure adjustment?
  • Can I use (again) a cable to control the camera, or do I need to control it via Wi-Fi (which sucks up much more power...)?
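A quick sketch of the arithmetic behind the first bullet (the ~2 second download/readout overhead per frame and the 24 fps playback rate are just assumptions):

```python
totality_s = 60 + 28   # totality lasts 1 min 28 sec
interval_s = 5         # one frame every 5 seconds
overhead_s = 2         # rough allowance for readout/buffering per frame (assumption)

frames = totality_s // interval_s            # 17 frames during totality
max_shutter = interval_s - overhead_s        # ~3 s maximum exposure per frame

playback_fps = 24
clip_seconds = frames / playback_fps         # totality is less than a second of video

print(f"{frames} frames, max shutter ~{max_shutter}s, ~{clip_seconds:.1f}s of video at {playback_fps} fps")
```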
Setup will be fairly straightforward:
  • Mount and orient camera the day before (for perfect framing)
  • Focus as always
  • Configure qDSLRDashboard
  • Connect and start Intervallometer
---

Before leaving for OSP:
  • Buy fresh AAA batteries for intervallometer
  • Try the Auto setting for an eclipse-like event (bright-dark-bright) - simulate by putting something in front of the lens
  • Can I set qDSLRdashboard to use only 2 frames or even only 1 frame to determine new exposure/ISO?
  • Estimate max ISO / Exposure time (sent an email to the SEML group). With that, calculate the time and number of images to take and test if it can be done with one battery or if I need to buy a second grip for a second battery (my other one will be used with the Nikon D7000).
At OSP before August 21:
  • Setup tripod, check out framing, decide if I want to use slider or not (interesting object in foreground?)
  • Recharge slider the night before!
On August 21:
  • Setup camera
    • Frame
    • Focus
    • Fresh batteries into intervallometer
    • Fresh batteries into camera - attach grip with second fresh battery
  • Start qDSLRDashboard
    • Set Auto (both directions!!!)
    • Set level
    • Set max ISO=XXX max exposure time=XXX
  • Start intervallometer
    • Every XXX seconds

Eclipse Imaging #2 Ha-images with the moon covering the Sun

Well, this is what I bought my Lunt scope for :-)

I hope that these images provide some interesting detail (craters and mountains on the edge of the moon against the sun with all its surface details). And/or will make a good time lapse of the event. After reading more about it, the one thing I won't see here is the outer layers of the corona during totality (I would have to remove the filter for that ...)

So, the equipment I will use is:
  1. Lunt scope with automated pressure tuner
  2. Grasshopper camera from Point Grey
  3. MyT mount
  4. FlyCap software (from Point Grey)
Things to figure out:
  • For the sun surface vs. corona I need different exposure times. For my images so far, I took many images of the surface and then many images of the corona and then combined them. But here I need to take both at the same time. I.e. I need a program that can alternate between exposure times. Can FlyCap do that?
  • I also need different pressure settings for surface vs. corona. The app from Lunt to control the pressure can't do that - and even if it could, I doubt that I could synchronize the pressure alternation with the exposure alternation.
    Maybe I need to write a script that triggers both (but I need to find out whether the pressure tuner has an API, and which one)
  • How can the MyT mount track the sun over 2.5 hours?
    • TSX has a "sun tracking" speed. Need to try that out and how accurately it keeps the sun centered.
    • "proper" sun guiding solutions (like LuSol) might not work as the sun will be distorted.
  • Need to setup the Lunt scope side-by-side on the MyT mount
    • Need a better way to rotate the versa plate (without pinching the cables)
    • Need to route the RJ11 cable for the focuser through the mount
    • Need to run the whole setup from the NUC
Things I'll have to do onsite:
  • Accurately polar align the MyT mount (should not be a problem as I will use the MyT mount for astro imaging the nights before)
  • Focus and pressure tune as good as possible (I can do that the day before as I will keep the scope mounted)
  • That's it! This should be pretty easy! ... famous last words!!!
---

Update 5/28:
I tried to track the Sun with "Sun tracking" enabled. It worked very well in RA, but in DEC the mount drifted by ~1 arcmin per hour. Which is expected, as this only adjusts the RA axis. But it should be enough for Ha imaging as I only need the body of the sun (I won't be able to see the corona around it). Maybe I will have to adjust it slightly once or twice to bring the sun back to the middle.

Update 6/12:

After trying a lot of things (Bahtinov mask, Hartmann mask, FireCapture...), I think the best way to focus is to do it manually. Use Live View of the camera, zoom into the edge of the sun and then carefully move the focuser into and out of focus until I find the right position.

---

So, here is the plan for these images:

Before leaving for OSP:
  • Let the camera run for the entire solar eclipse (from before first contact until after fourth contact) - how much memory will I need? (might need to attach a separate storage device)
  • Do I need to alternate exposures to capture surface and flares? Or can I stretch images enough to get flares? On solarchatforum, I got a reply to my post how to get both with one shot. Need to try this out.
At OSP before August 21:
  • Enter the exact GPS coordinates of our site into TSX using the mobile GPS
On August 21:
  • Assuming that I have a great model and polar alignment from the previous nights, point the scope to sun. If necessary, center sun manually and synch into TSX.
  • Set tracking rates in TSX to sun.
  • Start the Lunt pressure tuner, set it to ~9.5
  • Focus manually on outer sun.
  • Setup FireCapture
    • Set exposure time low enough that histogram peaks at 60% (should be able to leave gain at 0!)
    • Set delay between images to 1 second
    • Start recording

Eclipse Imaging #1 Individual images

I was reading a lot about how to take individual real-color images of a solar eclipse. But the prospect of constantly adjusting exposure times, changing filters... wasn't too appealing. I want to enjoy the eclipse as much as possible and pay as little attention as possible to my gear.

So, I was delighted to learn about Eclipse Orchestrator. It creates a whole script and executes it along a very tight timeline (it needs the exact location and time). The only thing it needs from me is to remove the filter just before second contact and put it back on after third. I can do that!!!

Initially, I was thinking of using my 300mm lens with my Nikon camera (I can only use the D7000 as the D750 is not supported), mounting it on top of the TOA-130 scope and using the Mach1 mount to track. But then I realized that I could also use the TOA-130 scope itself with the Super Reducer to take the images. The field of view should be perfect, and the non-flat field shouldn't be too bad for these images (especially considering that the D7000 isn't a full-frame camera!)

So, the equipment that I want to use is:
  1. TOA-130 scope with Super Reducer
  2. Nikon D7000 (connected with DSUSB from shoestring astronomy to short the time between exposures)
  3. Mach1 mount
  4. Eclipse Orchestrator
There are a number of things to figure out:
  • I have an adapter from PreciseParts that should work (might be a few millimeters too short). Need to try it out and measure the distortion in the corners.
  • Need to spend a lot of time trying out the script that Eclipse Orchestrator generates - maybe adding some frames if possible (this turned out to be very involved and I decided to write an extra blog post about it)
  • How can the Mach1 mount track the sun over 2.5 hours?
    • The AP driver has a "sun tracking" speed. Need to try that out and how accurately it keeps the sun centered.
    • I could use APCC Horizons to accurately track the sun
    • "proper" sun guiding solutions (like LuSol) might not work as the sun will be distorted.
  • Need a new Serial-USB adapter (my old ones don't work with Windows 10)
Things I would have to do to set this up onsite:
  • Good polar alignment (not a problem as I will use the TOA scope the nights before for astro imaging)
  • Focus the sun extremely well
  • Enter exact coordinates from GPS (unfortunately Eclipse Orchestrator can't read from my GPS - have to enter them manually)
  • Keep NMEATime running while imaging to make sure that time is as accurate as possible (and disable Dimension4 - it shouldn't do anything as it won't find any internet. but better be on the safe side)
  • Exchange my normal imaging train (flattener, filter wheel, CCD camera) in the morning with the Super Reducer and Nikon camera WITHOUT affecting polar alignment or such.
---

Update 05/01:
Received the new Serial-USB adapter - works. Yei!!!

Update 05/28:
1. Camera Setting
Tried to figure out what exposure time / ISO to use. Full sun with the Baader filter is best at:
  • ISO 100
  • Exposure 1/400 sec
I need to use that as the basis for Eclipse Orchestrator.

2. Adapter
The adapter that I had was too short (to match the backfocus of the Super Reducer). I ordered one with the proper length - but that one brought the camera so far out that I couldn't reach focus ...
So, I went with the shorter one. Need to measure the aberration in the corners - hopefully it's not too bad...

3. Focusing
Focusing is surprisingly difficult. What I ended up doing:
  1. Focus on a distant object without the filter.
  2. Put on filter and slew to sun.
  3. Use Live View (Nikon Camera Control Pro 2) to focus the sun well (zoom into outer areas or spots and focus on them).
I will have to do this early enough - hopefully the temperature difference before/during/after the eclipse won't change focus too much.

Update 6/4:

1. Centering the sun
Last night, I polar aligned my scope really well. I thought that would make centering the sun a piece of cake ... Not so much. It took me some time to get the sun into the center. The easiest way I found was using the hand control and semi-systematically moving the scope around ... Need to think about whether I should get a simple solar centering device like my Lunt has ...

2. Keeping the sun centered
I tried using Horizons, which comes with APCC Pro. This worked REALLY well - the whole day, the sun stayed more or less in the center. The only thing is that TSX at some point shows the scope far away from the sun, as it probably sees these adjustments but interprets them wrong ...
But overall, the process was simple enough!!!
So, I already created the ephemeris file and can then use it when I'm there. It will be slightly off as I have to use the default GPS coordinates, but they should do.

3. Focusing
Focusing is still tricky. I ended up ordering a combined Hartmann/Bahtinov mask for my scope. Hopefully that will help (will need to figure out how to use that on top of my solar filter!)

4. Eclipse Orchestrator
With everything setup, I worked through the Eclipse Orchestrator script again.
  • Eclipse Orchestrator uses ISO 100 and 1/400 sec exposure time as the initial setting. I know that I didn't set that, so it's good to see that we agree here :-)
  • Need to remember to set the camera to Mirror Up - otherwise it will try to take two images every time. The good news is that the camera didn't fall behind - even when taking two images. The fast SD card is great!
So, except for focusing, I feel pretty good about this now.

Update 6/12:

After trying a lot of things (Bahtinov mask, Hartmann mask, FireCapture...), I think the best way to focus is to do it manually. Use Live View of the camera, zoom into the edge of the sun and then carefully move the focuser into and out of focus until I find the right position.

Update 7/7:

I ran the script over and over again. Unfortunately, it always had delays:
  • using MirrorUp increased the delays by more than 0.5 sec
  • using the D750 vs. the D7000 did not make a difference
The weird thing is that even the first image has already a delay ...

One thing left to try is a faster SD card. I am currently using a SanDisk Extreme Plus, which has a write speed of 90 MB/sec. The Extreme Pro has 300 MB/sec. I ordered one of those to see if that makes a difference.

I also ran the entire script and it uses:
  • 6GB (240+ images)
  • 60% battery
So, I should be good running it from one battery and the SD capacity isn't an issue at all.

Eclipse Orchestrator is using ISO 200 and 1/1600 sec exposure. I could drop this to ISO 100 and 1/800. But there is no easy way to set this to 1/400 (as I measured). I guess I'll leave it as is.

Update 7/16:

The script will take images as follows:

Exposure | ISO
1/4000 | 100
1/3200 | 100
1/1600 | 100
1/800 | 100
1/400 | 100
1/200 | 100
1/100 | 100
1/50 | 100
1/25 | 100
1/13 | 100
1/6 | 100
1/3 | 100
1/1.6 | 100
1.3 | 100
1.6 | 160, 320
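Just to get a feel for how much dynamic range this bracket spans (a quick sketch):

```python
import math

shortest = 1.0 / 4000.0   # fastest exposure in the bracket
longest = 1.6             # slowest exposure
iso_boost = 320 / 100     # highest ISO relative to the base ISO 100

stops_exposure = math.log2(longest / shortest)       # ~12.6 stops from shutter speed alone
stops_total = stops_exposure + math.log2(iso_boost)  # ~14.3 stops including the ISO bump

print(f"{stops_exposure:.1f} stops from exposure, ~{stops_total:.1f} stops total")
```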

I will need to take dark frames for these (and will probably use the 1/4000 dark frame as a bias frame).

Not sure if and how I want to take flat frames (especially considering that I want to image at night the night before and after). Maybe I'll skip these...

Update 7/17:
I tried the faster SD card with 300 MB/sec but had exactly the same delays as before. Looks like I have to live with it (though I sent an email to Eclipse Orchestrator support and asked about this).

Update 7/22:
I updated the firmware in my cameras and now at least the D7000 is in the range of 0-0.6 sec delay. The D750 only improved a little :-(
Also, I think I found out why EO sometimes initializes really slow or seemingly not at all. When I removed the secondary SD card, everything was much faster. When I removed all images from the primary SD card everything worked really fast. It seems as if EO uses some initialization call that reads in all images on the SD cards ...

Update 7/25:
I compared the capture info (exposure, ISO...) of the images with what the script specifies. And in order to get reliably good results, I have to increase the time between images to 1.75 seconds!!! With that, I get only one set of corona bracketing (instead of 2-3). Sch...!!!

Update 7/30:
Upon reading more in the manual and various forums (why doesn't Eclipse Orchestrator have its own forum???!!!), it seems as if the EXIF information in the images might not be correct. So, I'll need to check the images based on what they show - not the EXIF information. Luckily we'll have a full moon soon and I can try out the whole sequence on a real object.

I also tried to work on focusing. Many people pointed out that it is difficult to use Live View for focusing as it overexposes the image. I thought about four different ways:

  1. Camera Control Pro 2
    It only has Live View and, yes, it seems overexposed. I couldn't find any settings to lower the ISO or exposure time.
  2. Sequence Generator Pro
    This would work. But I can only use FITS or RAW as file formats (not JPEG), which means that the download time is really slow (several seconds). But with ISO 100 and 0.01 sec exposure, I can get the sun image (and especially the border) pretty clear.
  3. qDSLRDashboard
    I installed the Windows version, which needed the Nikon driver to be replaced using a tool called Zadig. I tried that, but the installation of the new driver failed - and then the original Nikon driver didn't work anymore either!!! Tried uninstalling and disconnecting/reconnecting the camera and such. No luck. Finally, I remembered that the tool had created a restore point; I restored back to it and everything was as before. Phew!
    Next I tried to connect the camera to the computer through Wifi. Unfortunately, I can't use Wifi while the camera is connected via USB. And I don't think that I want to first disconnect the camera, focus through Wifi and then connect USB again - especially during the eclipse (before totality), that would throw Eclipse Orchestrator off...
  4. Eclipse Orchestrator
    Unfortunately, Eclipse Orchestrator only uses Live View and can't adjust the exposure for Nikon cameras either.
So, my best option seems to be SGPro. I'll have to practice that more. Unfortunately, there are no sun spots on the sun right now, i.e. it's not easy to check if the sun is really in focus ...

Update 8/6:
After many trials and errors, I decided to use Live View focusing with Camera Control Pro. Today there was a sun spot and it was actually surprisingly easy to focus on it. When the eclipse happens, I should be able to use the moon in a similar way.

Also, in a last attempt, I ordered an XQD card for my camera in the hope that it will help with the delays ...
---

So, here is the plan for these images:

Before leaving for OSP:
  • Clean Nikon Sensor and Super Reducer
  • Run Eclipse Orchestrator script several times
    • How much memory will I need for all images? (shouldn't be a problem)
    • How much battery will I need for all images?
  • Take bias and dark frames (according to all exposure times from the EO script)
  • Try out focusing routine of EO
At OSP before August 21:
  • Recreate the whole Eclipse Orchestrator script with the exact GPS coordinates from our place using the mobile GPS.
  • Run the whole script (including guiding) the day before
On Monday, August 21:
  1. Assume that from the previous night(s) the scope is very well polar aligned.
  2. In the morning, exchange the CCD camera with the Nikon camera
  3. Start AP Horizons and load Sun ephemeris. Start tracking on Sun (might need to manually center sun)
  4. Focus manually on edge of the sun - take a lot of time for this!
  5. Connect GPS to scope, start NMEATime to constantly correct the time
  6. Make sure that the Nikon is set to:
    • MirrorUp
    • Bulb exposure
  7. Start Eclipse Orchestrator
  8. Before first contact, replace battery for fresh one
  9. Before second contact: check battery (and replace if necessary) plus check focus
  10. During Eclipse, make sure to listen to commands and remove and put back the Baader filter.

Eclipse Imaging at OSP

After thinking and reading a lot, I decided to image the solar eclipse in 4-5 (automated!!!) ways. One of the main goals was that I could set everything up in advance and then just let it run, so that I can just enjoy the eclipse itself.

1. Individual Images of the Eclipse
  • TOA-130 scope on Mach1 mount with my Nikon D7000*
  • Controlled by Eclipse Orchestrator
2. Ultra-wide angle Timelapse
  • Nikon D750 with 14-24mm lens
  • Maybe on slider
  • Controlled by qDSLRDashboard
3. 360 Video
  • Nikon KeyMission 360 on Manfrotto tripod
  • Just shoot - no control
4. Aerial video / time lapse of the moon shadow
  • Mavic Drone high up (390 ft)
  • Fly the drone up, start video / images, forget about it, let it land automatically when the batteries get low
There is still a lot to figure out for each of these. With 2 months before we are leaving on vacation, I don't have too much time to plan for all of this...

* I had to use the D7000 for this as Eclipse Orchestrator does not support the D750 camera.

Sunday, April 30, 2017

Weird Lodestar driver message "Beta release has expired."

After I set up the new NUC for the TOA-130 scope, I got a weird message when I tried to connect the Lodestar X2 via ASCOM:

""Beta release has expired. Please update your driver"

I received this message both in PHD2 and SequenceGenerator. After clicking the message away, everything seemed to work fine though ...

As always, I tried restarting, re-installing ... - always with the same results. Posted on the sx mailing list.

When I went back to the download page from Bret McKee, I noticed that there are two drivers: a version 6 beta driver and a version 5 production driver. Initially, I had used the v6 beta driver. When I downloaded the version 5 driver and installed it (after uninstalling the v6 driver), everything seemed to work fine ...

Monday, April 24, 2017

Exact Location and Time in the field

Several tools (e.g. TheSkyX - or especially T-Point) need the exact location and time. With an internet connection it's not a problem at all:

  • Android Phone for location
  • Dimension 4 for exact time
But how do I get these when we are in the field without an internet connection? I'm always surprised how many Android apps need an internet connection - even if their functionality doesn't really require it. And Dimension 4 will not work at all as it connects to internet time servers.

The best solution I found is described in Using GPS Receivers to Set Computer Time for In-the-Field Logging by Gene Hinkle. It requires a USB GPS receiver plus the GPSInfo and NMEATimePanel tools.
Setting up the GPS device is simple: install the driver, plug in the device, install and start the GPSInfo program. It might take a couple of minutes until the program has found enough GPS satellites to determine its location:

(It should automatically detect the right COM number)

Now, close GPSInfo and start NMEATimePanel. It will first synchronize:

... and then after a while, it will lock to the satellites:

And now it will correct the computer time. Usually it gets it down to a few milliseconds.
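For the curious: the time that NMEATimePanel locks onto arrives as standard NMEA sentences on the GPS receiver's serial port. Here is a minimal Python sketch of reading that time directly (the COM port and baud rate are assumptions, and this is of course not what NMEATimePanel does internally):

```python
from datetime import datetime, timezone
import serial  # pyserial

# Typical settings for a USB GPS puck - adjust port and baud rate for your device
with serial.Serial("COM5", 4800, timeout=2) as gps:
    while True:
        line = gps.readline().decode("ascii", errors="ignore").strip()
        # $GPRMC carries the UTC time (field 1), fix status (field 2) and date (field 9)
        if line.startswith("$GPRMC"):
            fields = line.split(",")
            if len(fields) > 9 and fields[2] == "A":  # 'A' = valid fix
                hhmmss, ddmmyy = fields[1], fields[9]
                utc = datetime.strptime(ddmmyy + hhmmss[:6], "%d%m%y%H%M%S").replace(tzinfo=timezone.utc)
                print("GPS UTC time:", utc)
                break
```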