Whatever I do, I end up with coarse details in galaxies - where is my process failing? [Deep Sky] Processing techniques · Arny

afjk 3.58
·  1 like
I have tried several routes processing 17h of integration time, but I never end up with a satisfying result, as the images below show:
the fine details in the galaxy arms are always coarse and somewhat bloated. I tried reducing denoising to a minimum, without effect.

As I have had this with most of my galaxy images, I wonder whether I have a flaw in my process or setup.

What could I do better?

Setup:
- EdgeHD 11, well collimated
- ASI2600MM Pro camera at -10 °C cooling
- EAF for focus and EFW 7x2" with ZWO RGB and Ha filters
- ASIAIR Plus controlling the setup
- 200 images of 180s and 300s in RGB and Ha
- Bortle 4 sky at 2-4 °C
- excellent guiding of 0.2-0.3" on my AM5
- good automatic focus using the EAF, with FWHM of 1.31"
- raw images look OK, even though not too strong an SNR from what I can tell visually …

Process in Pixinsight
- thorough deleting of suboptimal frames before stacking
- WBPP stacking with 100% of the images being processed
- AutoHistogram on the R, G and B masters to align them to the red channel
- continuum extraction on the Ha channel (Toolbox script)
- ChannelCombination to RGB
- GraXpert gradient reduction for RGB and Ha
- BlurXTerminator for RGB and Ha
- NoiseXTerminator for RGB and Ha
- StarXTerminator for RGB and Ha
- LRGBCombination of Ha into RGB
(up to here everything looks fine; the issue starts during stretching, when trying to eliminate background brightness)
- GHS stretching of starless and stars RGBHa
- Curves adjustment increasing color contrast and making the image pop
- screen-blend re-integration of stars

Here are some revisions of the image, representing several processing alternatives, where I changed
- the order of when to integrate Ha (early vs late)
- simpler stretching with AutoHistogram
- integrating Ha only towards the very end of the process
https://www.astrobin.com/wrouwh/

Any ideas or recommendations on where the cause might be in the process, or where I need to look in the raw images?
Especially @Gary Imm never seems to run into these problems, even taking shorter exposures in a similar Bortle 4.5 sky … :-)

Arny

[two attached images]
Dcolam 3.31
·  1 like
Hi Arny,

Not a galaxy imager here, but to me it seems that you are clipping the black points too much. I would look into your GHS processing step, as too aggressive adjustments can cause these issues.

Also color calibration seems a bit off. Maybe try to play around first with RGB only and then add Ha at the end when you are happy with the RGB processing. 

If you want to add Ha, use minimally processed raw and linear files. This means only gradient extraction, no BlurX or NoiseX. Then do a continuum subtraction and proceed with the processing of the continuum-subtracted Ha.

I hope that helps. I learned that in processing, less is more. Better invest in more and cleaner data.
Die_Launische_Diva 11.14
·  2 likes
Hi Arny,

what does a simple RGB integration look like?

By simple I mean just combine the R, G, B channels with ChannelCombination, remove gradients with the new GradientCorrection tool, and perform SPCC with background normalization. You can probably reverse the order of GC/SPCC, as we are still waiting to see what the best practice is for using SPCC with GC, but this won't matter much. At the moment it is important to follow a rudimentary processing workflow.
afjk 3.58
Dcolam:
Not a galaxy imager here, but to me it seems that you are clipping the black points too much. […]

Thanks Dcolam - I will follow your suggestion to be more careful, not clip too much, and do Ha later once the RGB is doing fine - and will report :-)

Regarding data quality:
does the raw image on the ASIAIR look OK, or is that already too faint?
Austronomer76 5.77
·  1 like
Arny,

I don't see the point why you would do these two steps:
- AutoHistogram on the R, G and B masters to align them to the red channel
- continuum extraction on the Ha channel (Toolbox script)

Usually, RGB and Ha are stretched on their own and combined later in their non-linear form.

As already suggested, try to process an RGB image without Ha and without the two steps mentioned above.
In the first step you should manage to get a properly colour-corrected RGB image before thinking of adding Ha.

I also suggest using a mask when adding Ha, so you don't change the colour of the galaxy core (in case the continuum subtraction doesn't perfectly cancel out the core, which it probably won't).

All the best!
Chris
afjk 3.58
Christian Koll:
I don't see the point why you would do these two steps: […]

Thanks Chris,

- AutoHistogram serves to align the RGB channels to the same median value in Statistics - that usually allows for better color balance in RGB. But I will now try without!
- Continuum subtraction reduces the contribution of ordinary broadband red to the Ha narrowband channel, i.e. the part of the Ha signal that is driven by the red end of a broad white continuum (see the rough sketch at the end of this post).
- Using a Ha mask: I did use that, sorry for not mentioning it.

I will follow your suggestion of getting the RGB right first - and only combine Ha in the non-linear phase!
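Roughly, the idea behind the continuum subtraction is something like this numpy sketch (not the actual Toolbox script; the scale factor q is just a placeholder that would have to be calibrated, e.g. against the stars):

```python
# Rough numpy sketch of continuum subtraction - not the Toolbox script itself.
# 'q' is a placeholder scale factor; in practice it has to be calibrated
# (e.g. from star photometry or trial and error) so that broadband stars cancel.
import numpy as np

def continuum_subtract(ha, red, q=0.12):
    """Remove the broadband (continuum) contribution from a linear Ha frame."""
    ha = np.asarray(ha, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    ha_cs = ha - q * red                 # subtract the scaled broadband red signal
    return np.clip(ha_cs, 0.0, None)     # avoid negative pixels after subtraction
```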
Dcolam 3.31
Arny:
AutoHistogram serves to align the RGB channels to the same median value in Statistics - that usually allows for better color balance in RGB. But I will now try without!

I agree with Chris here. If you want a better color balance, you could either use SPCC or similar on the combined image, or if you really want to pre-calibrate your channels, you could try LinearFit with your weakest channel. I don't see the point, though, as you are processing a broadband target.
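Just to illustrate what LinearFit does conceptually: it finds a linear relation between a channel and the reference and rescales the channel accordingly. A bare-bones numpy sketch (ignoring the rejection limits and other details of the actual PixInsight process):

```python
# Conceptual sketch of fitting one linear channel to a reference channel.
# PixInsight's LinearFit also applies rejection limits; this is only the bare idea.
import numpy as np

def linear_fit_to_reference(target, reference):
    """Rescale 'target' so that it matches 'reference' in a least-squares sense."""
    t = np.asarray(target, dtype=np.float64).ravel()
    r = np.asarray(reference, dtype=np.float64).ravel()
    slope, offset = np.polyfit(t, r, 1)   # best-fit line: reference ≈ slope * target + offset
    fitted = slope * np.asarray(target, dtype=np.float64) + offset
    return np.clip(fitted, 0.0, 1.0)
```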

Cheers, 
David
afjk 3.58
Dcolam:
Or if you really want to pre-calibrate your channels, you could try LinearFit with your weakest channel. […]

Interesting that LinearFit is more for narrowband targets.
I also thought that AutoHistogram was doing the same thing as LinearFit - what is the difference?
GaryI
·  3 likes
Arny,  I will send you a private message on this subject.   I have a few ideas but need more information.
jrista 8.59
·  14 likes
Arny:
I have tried several routes processing 17h of integration time, but never end up with a satisfying result: the fine details in the galaxy arms are always coarse and somewhat bloated. […]

You are doing a lot of processing. Not necessarily an excessive amount, although it seems as though your stretch is overly aggressive.

Something I like to teach imagers who come to me for help is to see what their data can do for itself, before they go through and process it with a bunch of steps. Sometimes, the baseline data is actually quite good, before you go and make a lot of changes to it.

I would try this. Get the latest version of PI. Combine the raw channels together right after integration. Don't do any other processing. Use the new gradient correction process to correct the gradient. Then, do NOTHING else to your image, other than a simple and lightweight stretch. Don't crush the blacks, don't do any noise reduction. Before you mess with anything, just take a look at what your gradient-corrected image looks like. In other words, let the data speak for itself, and let it tell you what you have.

You may very likely be surprised to find that your raw data is a lot better than you think it is. Once you have an idea of what your data looks like on its own, at least without the gradient...THEN decide on what processing you need, and to what degree. Before you do anything else, I would just experiment with your stretching...see how far you can push the image before you feel it starts to break down. That should help you decide what to do, and to what degree, once you decide to do more processing (which would involve backing up to, say, just after gradient correction).
GaryI
·  2 likes
As usual, Jon has provided excellent advice.   Everything he says above is spot on.
ghatfield 1.51
·  1 like
I've processed several galaxies where I have added Ha to the RGB (e.g., https://www.astrobin.com/10w5bb/ ) and I've had good results using the method from YouTube's Crazed Conceptions (https://www.youtube.com/watch?v=LwuiS8N1A-E).  Others have described similar processes, but I like the simple explanation provided in this video.  In this method the linear Ha (modified to remove broadband red) is added to the linear RGB.   So it is done very early in processing.   Note that I don't typically add a lot of Ha.  I go for the "natural look"... which may differ from your desired look.  But this method allows one to easily adjust the amount of Ha added to the RGB.

General process:

- Combine R, G and B to give linear RGB; the R, G and B are untreated other than cropping. I do gradient removal on the RGB:
- Remove gradients from the linear RGB; I have been using GraXpert but might switch to the new PixInsight process.
- Remove gradients from the linear Ha.
- Use the method referenced above to combine a portion of linear Ha into the linear RGB (see the sketch after this list).
- Color calibrate the linear HaRGB using SPCC.
- Run BlurX on the resulting HaRGB.
- Remove stars using StarXTerminator or StarNet++.
- Treat the starless image with NoiseXTerminator.
- Stretch the starless image using arcsinh (minimal) and GHS; do the same with the stars.
- If I create an LRGB image, I now process the luminance, synthetic luminance or superluminance by removing gradients, treating with BX, removing stars and stretching as above.
- Now combine the nonlinear HaRGB and nonlinear luminance; before combining, I try to adjust the two images to have similar "means" in Statistics.
- Add back the stars.
- Minimal processing in PS (mainly contrast adjustments).
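As a rough illustration of the Ha step above (this is not the exact recipe from the video, just the idea; the blend weight is arbitrary and tuned to taste):

```python
# Rough illustration only - not the exact recipe from the referenced video.
# 'amount' is an arbitrary blend weight you would tune to taste.
import numpy as np

def add_ha_to_red(red, ha_continuum_subtracted, amount=0.3):
    """Blend a fraction of continuum-subtracted, linear Ha into the linear red channel."""
    r = np.asarray(red, dtype=np.float64)
    ha = np.asarray(ha_continuum_subtracted, dtype=np.float64)
    return np.clip(r + amount * ha, 0.0, 1.0)
```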

You might ask a friend to process your data (by their methods) to see if they get the same outcome.   Jon has some good advice in this regard.

George
HegAstro 11.91
·  2 likes
Jon Rista:
I would try this. Get the latest version of PI. Combine the raw channels together right after integration. Don't do any other processing. Use the new gradient correction process to correct the gradient. Then, do NOTHING else to your image, other than a simple and lightweight stretch.


This is exceptionally good advice. I can usually tell, just by looking at a linear, gradient-corrected image, whether the final processed image is going to be good or not. I would personally consider your image heavily overprocessed. You need to get a sense for what the data can do; going beyond that just results in an unnatural look.
vercastro 4.06
·  1 like
Jon has some good advice. Go back to basics. I'll add some extra tips in no particular order.
  • Don't combine luminance and/or narrowband in a linear state.
  • With regards to narrowband addition, I strongly recommend reading this guide as it delivers by far the best results: https://www.nightphotons.com/guides/advanced-narrowband-combination
  • Don't bother using LinearFit or a similar process for RGB. The first thing I do with RGB after stacking is to combine the three channels, then run DBE or some other gradient extraction technique. Then I run SPCC to colour calibrate, then BlurX, noise reduction, etc., all in the linear state.
  • On the topic of noise reduction, I recommend running the AI tools before removing stars. It's not strictly necessary, but I prefer this methodology because these tools were trained with stars and I want the noise profile between stars and starless to match.
  • Ensure that the nonlinear intensity matches roughly before attempting to combine luminance.
  • Don't stretch in such a way as to clip blacks. Contrast should be low until very late in the process; usually one of the last things I'll do is increase contrast with Curves.
  • Don't combine stars and starless with a simple addition - this will create all kinds of artifacts. I strongly suggest reading this guide: https://www.nightphotons.com/guides/star-addition (see the rough sketch at the end of this post).

If you want to see an image similar to yours which combines all these techniques, check this out: https://www.astrobin.com/xwhle5/
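For reference, here is the difference, in numpy terms, between a plain addition and the common screen blend; this is only to illustrate why plain addition clips star cores - the linked guide covers a more careful approach:

```python
# Simple addition vs. screen blend for recombining stars with a starless image.
# Both assume stretched (non-linear) data normalised to [0, 1].
import numpy as np

def add_stars_simple(starless, stars):
    # Plain addition: bright stars on a bright background easily exceed 1.0 and clip.
    return np.clip(starless + stars, 0.0, 1.0)

def add_stars_screen(starless, stars):
    # Screen blend: approaches 1.0 asymptotically, so star cores don't blow out as hard.
    return 1.0 - (1.0 - starless) * (1.0 - stars)
```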
timopro 1.81
Hi Arny, 

First, I must admit that I am not a super expert.

Would you be willing to share your RAW data? I’d like to take a look and see what improvements can be made.

It seems there might be a problem somewhere in the process since, as others have pointed out, the background appears clipped and the color calibration seems off. Importantly, there's no evident presence of Ha information, which there should be.

Usually, the Ha continuum subtraction is performed with the Red channel (a sort of Ha - R, scaled somewhat arbitrarily) so that broadband red signal doesn't color the rest of the image as if it were Ha.

In general I apply LinearFit to balance the RGB data, using the R channel as a reference (case by case).

I'm curious about how you integrate the Ha - it's not clear whether you're using LRGBCombination to add the Ha into the RGB data in the linear phase.

Today, there are more sophisticated methods available: there are scripts, and even the new Color Mapper, which can be used effectively during the linear phase.

Please let me know if you're open to sharing your data for checking. 😉 

Best regards and CS, Timo
afjk 3.58
Wow, some great advice guys - thank you all very much!

The common denominator of your advice is to stick more to the basics, use fewer tools, and not to overcorrect or clip.

Regarding stretching, I saw GHS and arcsinh as recommendations - is any other method, or even just a manual HistogramTransformation, favored?
vercastro 4.06
·  1 like
Yes. Personally I've tried every method of stretching and the very best results by a long shot come from either GHS or HT (or a combination of both). I recommend reading the documentation/watching the videos on how GHS works and how to use it. It's extremely powerful and flexible once you understand how to use it.
jrista 8.59
·  2 likes
vercastro:
Personally I've tried every method of stretching and the very best results by a long shot come from either GHS or HT (or a combination of both). […]

I haven't had a chance to use GHS yet. I've pretty much always used HT, however my approach is to apply many passes with the express intent of managing the final, resulting apparent noise. A single-pass stretch with HT is rarely the most effective, but if you do multiple passes to extract details from different parts of the spectrum more carefully, you can get a pretty good stretch without over-emphasizing the noise. 
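To make the multi-pass idea concrete, here is a rough numpy sketch of the midtones transfer function behind HT, run twice with mild midtones values instead of once aggressively (the parameter values are just examples):

```python
# Rough sketch of HT's midtones transfer function, applied in two gentle passes
# instead of one aggressive one. The midtones values are just example numbers.
import numpy as np

def mtf(x, m):
    """Midtones transfer function: maps 0 -> 0, m -> 0.5, 1 -> 1 for data in [0, 1]."""
    x = np.asarray(x, dtype=np.float64)
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def two_pass_stretch(image, m1=0.25, m2=0.35):
    """Two mild stretches usually lift faint signal with less noise blow-up than one hard one."""
    return mtf(mtf(image, m1), m2)
```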

I've read a bit about GHS, and it sounds really interesting. I'll be done with some alternative palette processing with my latest image, and will move onto reprocessing another old image, and I'll give GHS a try.
afjk 3.58
·  1 like
Die Launische Diva:
What does a simple RGB integration look like? By simple I mean just combine the R, G, B channels with ChannelCombination, remove gradients with the new GradientCorrection tool and perform SPCC with background normalization. […]

This is the raw version with just the RGB combined and no other processing than cropping:

[attached image]
afjk 3.58
·  1 like
Timothy Prospero:
Please let me know if you're open to sharing your data for checking. 😉

Hi Timothy,

thanks for your advice - here is the link to the stacked master files of RGB and Ha. For RGB I took 180s and 300s exposure times

https://drive.google.com/drive/folders/1WOuY-nbnuP9z07yPrF9tJsHUg_g9BKMP?usp=sharing

Arny

PS: give it some time to upload the 1 GB ...
Geoff 2.81
·  2 likes
Just using HDRMultiscaleTransform makes a big difference.
[attached image: Image35.jpg]
HegAstro 11.91
·  1 like
Jon Rista:
I've read a bit about GHS, and it sounds really interesting. I'll be done with some alternative palette processing with my latest image, and will move onto reprocessing another old image, and I'll give GHS a try.


The nice thing about GHS is that it allows you to add contrast in a localized and controlled way, much better than HT. To my eyes it made a big difference to a galaxy image I recently reprocessed, bringing out contrast and color better than I could in my previous attempts with the same data. Adam Block has an excellent tutorial on it with a couple of great examples.
janvalphotography 4.36
·  3 likes
The linear data looks good and sharp, but it comes off quite purple. I wonder if the channels might be mis-labelled? It seemed a lot better when I did RBG (based on the file names) rather than RGB.

Edit: Added a screenshot of the unlinked STF for that particular combination
[attached screenshot]
GaryI
·  1 like
Wow, Jan Erik, you may be right!  Working with Arny's masters, I could never achieve a decent color-calibrated image.  I told Arny that I felt there was a calibration issue somewhere in the process.   But now switching those 2 channels made all of the difference.  Only Arny will know if that is the case, but it is the best explanation so far.

Jan Erik, how did you decide to try that?
ghatfield 1.51
·  2 likes
I processed the 300 second R, G and B masters. Before combining into the RGB I ran LinearFit with red as the reference, since there was a huge difference in the means. I then ran the RGB through GraXpert, which seemed to work fine, then color calibration with SPCC. I used Astrodon filters; I don't think the filter type makes much of a difference. Anyway, the SPCC curves were very unusual - I have not seen them running in opposite directions in any of the images I have processed. Also, the central core is way too purple even when calibrated, and the arms of the galaxy look sort of "coagulated." Could there be a problem with the WBPP processing? I don't think any amount of processing is going to help this image. I would go back to the very beginning and start over.
[attached image: astrobin image and spcc curve.jpg]

George
 