
Annular Eclipse 2023

Description

At last!  I've finally finished the Eclipse timelapse.  This was no ordinary timelapse.  I didn't follow some blog or tutorial online.  Nobody taught me how.  I stumbled through it and had fun along the way.  This was the single largest photography and scripting project I've taken on yet.

This final timelapse may not look like much, but the value is in what I learned and the Python scripts I built to accomplish it.  Now I can take those scripts and use them on future timelapse projects like this.  I also made some crucial mistakes in capturing these images.  As you can see, the quality of the data has a lot of room for improvement.  Where should I start???

I attended my first star party at X-bar Ranch, where I camped out for 3 nights in a tent with the Eclipse occurring on my last day.  I spent the previous nights testing out new gear and trying to solve various problems.  I had never taken proper pictures of the sun before.

I was having trouble obtaining a good focus when using my solar filter.  It turns out I was getting some internal reflections in my scope.  I noticed there was a blurry haze that would rotate when I locked my focus ring.  I simply shifted the solar film so it wasn't parallel with the camera sensor, and I was finally able to get a sharp focus.  I checked the film for pinholes but couldn't find any problems.  I'm not sure why this occurs.  I believe the artifacts seen at the end of the timelapse around the edges of the sun were caused by this solar film shifting in the wind.

This was my first time using an H-Alpha filter.  My Sony a6000 has a color sensor.  While preparing and taking the photos, I used the histogram to make sure I wasn't overexposing my image.  It didn't dawn on me until reviewing my images the next day that the histogram shows all three channels combined: red, green, and blue.  This means that when I increased the exposure to bring the sun's bump to the middle of my histogram, my red channel was completely washed out, and I was gathering enough light in the other channels to bring the average of all three into the middle.  In other words, ALL of my H-Alpha detail was washed out in ALL my photos!!!  Hard lesson learned.
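This trap is easy to reproduce numerically.  Below is a hypothetical illustration (made-up pixel values, not my actual frames): a combined histogram looks safely exposed even though the red channel, which carries all the H-Alpha signal, is fully clipped.

```python
import numpy as np

# Hypothetical sensor readout through an H-alpha filter: nearly all of
# the real signal lands in the red channel, which is clipped at 255.
red   = np.full((100, 100), 255, dtype=np.uint8)  # blown out
green = np.full((100, 100), 90,  dtype=np.uint8)
blue  = np.full((100, 100), 80,  dtype=np.uint8)

# A combined histogram averages the channels, so the "sun bump"
# sits comfortably mid-range...
combined_mean = (red.astype(int) + green + blue) / 3
print(combined_mean.mean())   # ~141.7 -- looks safely exposed

# ...while the channel that actually carries the H-alpha detail
# is 100% clipped.
red_clipped = (red == 255).mean()
print(red_clipped)            # 1.0 -- every red pixel is blown out
```

The fix is simply to expose against the red channel's own histogram (or a raw per-channel histogram) rather than the combined one.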

Data Acquisition:  I took one-second bursts every 15 seconds for the entire event, for a total of 7346 images.  Each one-second burst acquired 9-11 images.  The raw ARW images took a total of 168 GB of storage space.  The 15 seconds between bursts gave me a little bit of time to adjust focus as the temperatures changed.

Processing Plan:  I knew that I could take pictures of the moon, even if only a few, and stack them to reduce noise.  Since my camera has its best dynamic range at ISO 1600, I knew I was going to get a bit of raw image noise with such a fast shutter speed.  There is also turbulence in the atmosphere that can be reduced by stacking sequential images.  So the plan was to stack each burst of images to give me more detail, less noise, and more room to stretch the image detail.
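The noise argument for stacking can be sanity-checked with a quick simulation.  This is a minimal sketch with made-up numbers (not the real exposure data): averaging N frames cuts random noise by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated burst: a flat 100-unit signal plus Gaussian sensor noise
# with sigma = 10, across 9 frames (illustrative values only).
signal, sigma, n_frames = 100.0, 10.0, 9
frames = signal + rng.normal(0.0, sigma, size=(n_frames, 200, 200))

single_noise = frames[0].std()             # noise in one raw frame
stacked_noise = frames.mean(axis=0).std()  # noise after stacking 9

# Mean-stacking N frames reduces random noise by ~sqrt(N); a median
# stack behaves similarly for Gaussian noise while also rejecting
# outliers like hot pixels or passing artifacts.
ratio = single_noise / stacked_noise
print(round(ratio, 2))  # close to sqrt(9) = 3
```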

Reality:  To stack these images, I had to first convert them to TIFF format.  Converting the raw ARW images to TIFF increases each image's size by almost 6 times!  That means converting all 7346 images would turn my 168 GB into 980 GB!  So, I needed at least 1 TB of free space, or I could simply crop the images down so the final TIFFs weren't so big.  Thank you, Darktable, for making this crop-and-convert-to-TIFF process EASY.

First Python Script:  I needed a way to group the images by burst.  Since there are 4 bursts per minute, you can't even tell where one burst ends and the next begins without opening each photo and seeing where the moon moved slightly.  Adding a Windows Explorer column doesn't even show the timestamp in seconds, only minutes, and the same detail is lacking in the file properties.  Luckily, the timestamp data is still there down to the second, embedded in each file.  I wrote a script that takes a directory as input, makes a list of all the image files, compares each image's timestamp to the previous one, starts a new folder whenever the difference is more than 3 seconds, and saves each image into the current folder as it goes.  Thanks to the PIL (Pillow) library for Python for making it easy to read the EXIF data from the images.
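The grouping logic might look something like the sketch below.  All the names here (`capture_time`, `assign_bursts`, `group_into_folders`), the file pattern, and the `burst_NNN` folder naming are my own illustrative assumptions, not the author's actual script:

```python
import shutil
from datetime import datetime
from pathlib import Path

# Standard EXIF tag IDs for capture time.
EXIF_DATETIME_ORIGINAL = 36867
EXIF_DATETIME = 306

def capture_time(path):
    """Read the capture timestamp (down to the second) from a file's EXIF data."""
    from PIL import Image  # Pillow is only needed for the EXIF read
    with Image.open(path) as img:
        exif = img.getexif()
        # DateTimeOriginal usually lives in the Exif sub-IFD (0x8769);
        # fall back to the plain DateTime tag if it isn't there.
        raw = (exif.get(EXIF_DATETIME_ORIGINAL)
               or exif.get_ifd(0x8769).get(EXIF_DATETIME_ORIGINAL)
               or exif.get(EXIF_DATETIME))
    return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")

def assign_bursts(timestamps, gap_seconds=3):
    """Map a sorted list of timestamps to burst indices: a gap larger
    than gap_seconds between neighbours starts a new burst."""
    bursts, current = [], 0
    for i, t in enumerate(timestamps):
        if i > 0 and (t - timestamps[i - 1]).total_seconds() > gap_seconds:
            current += 1
        bursts.append(current)
    return bursts

def group_into_folders(src_dir, pattern="*.tif"):
    """Copy each image into a burst_NNN subfolder of src_dir."""
    files = sorted(Path(src_dir).glob(pattern))
    bursts = assign_bursts([capture_time(f) for f in files])
    for f, b in zip(files, bursts):
        dest = Path(src_dir) / f"burst_{b:03d}"
        dest.mkdir(exist_ok=True)
        shutil.copy2(f, dest / f.name)
```

Keeping the gap logic in its own small function (`assign_bursts`) makes it easy to test without touching any real files.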

Second Python Script:  Now that I had my images grouped by burst into 679 folders, I needed to stack each folder of images into its own single image.  I knew I could use the G'MIC plugin for GIMP to do this, but I wasn't going to do it by hand for hundreds of folders.  Python to the rescue again.  You can script these processes in GIMP using Python-Fu.  Having never done this before, the task was a bit daunting at first, so I broke it down into parts and tested them one by one.  I learned how to make a basic plugin and run it in GIMP, how to pass parameters from the GUI to my script, and how to debug in the Python console in GIMP.  I figured out how to not only call GIMP procedures from Python-Fu, but also how to pass commands to the G'MIC plugin.  (This was much harder to debug.)  Finally, I made my final script, and it works like a dream.  I open GIMP and run my "StackEmUp" program from the Tools menu; it prompts for the parent directory and alignment options, opens the first image in the first folder, opens the remaining images in that folder as layers, runs the align-layers G'MIC filter, runs the Blend Median G'MIC filter to stack the layers into one image, exports a single image into the parent directory, then moves on to the next folder.  It ran for a little over 6 hours, and my 679 final images were ready to be made into a timelapse video.
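The actual automation here lives inside GIMP as a Python-Fu plugin driving G'MIC, which only runs inside GIMP itself.  As a stand-in, here is the core idea behind the "Blend Median" step in plain NumPy (a sketch of the technique, not the author's code): a per-pixel median across aligned frames, which also rejects outliers that a plain average would smear in.

```python
import numpy as np

def median_stack(frames):
    """Stack a list of equally sized images into one by per-pixel median."""
    return np.median(np.stack(frames, axis=0), axis=0)

# Three toy 2x2 "frames"; one has an outlier (e.g. a hot pixel or a
# passing artifact) that the median rejects.
frames = [
    np.array([[10, 10], [10, 10]], dtype=float),
    np.array([[12, 12], [12, 12]], dtype=float),
    np.array([[11, 11], [255, 11]], dtype=float),  # outlier at (1, 0)
]
stacked = median_stack(frames)
print(stacked)  # [[11. 11.] [12. 11.]] -- the 255 outlier is rejected
```

Alignment is the hard part in practice (hence the align-layers G'MIC filter before blending); the median itself is this simple.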

Video Editing:  Now for the easy part, right?  I thought it was downhill from here.  I've done timelapses before, but I was mistaken.  I use DaVinci Resolve because it's free and has some of the best image stabilization and color grading on the market, but unfortunately, because my images suck, there isn't enough detail for the tracking and stabilization black magic to do its thing in Resolve.  The images are very broken up by clouds in the beginning, and the image moves around enough to make the auto tracking fail.  It took me some time to learn how to correct the tracking points and start tracking from different points throughout the clip.  I probably could have done a little better with the tracking, but the images weren't worth that many extra hours of manual tracking.  When I do this again, my images will have more detail to track, and hopefully I won't have problems with the clouds again.

Finally, a finished product.  You're probably thinking I'm crazy.  You're probably right.  But I wanted to create something more than just an image, more than just a video, something unique.  I had fun learning Python, and I've learned from my narrowband imaging mistake.  Hopefully I've got the mistakes out of the way and our next eclipse in 2024 won't give me too many problems (since it is a total eclipse and not just an annular one).  I'll be posting my GitHub link later.  I have some slight changes to make before publishing, but I'll post it here when I'm done.

Please comment if you've read this far.  I hope it was entertaining or helpful in some way.  I love to get feedback too, even if you just want to tell me this was crazy.  Though, I have a feeling there are more people like me on AstroBin than not.  Thanks for reading!
