PixInsight Master Light Not Generated - can I salvage something? · Pleiades Astrophoto PixInsight

PiersPalmer 2.15
I've managed a 7-hour session on the Heart Nebula over the past couple of nights and I'm excited about seeing the result. After three hours of my PC chugging away, I got a message saying "Master Light Not Generated" and something about insufficient memory.

Argh.

Is there something I can do with all the registered and calibrated frames to create the final image? I don't fancy running all that again and risking exactly the same thing happening.

In the Master folder, there is an LN_Reference_Light etc. file and I could fiddle with that, but that's not the final image, is it?

Looks ok still!
LN_Reference_Light_BIN_1_6248x4176_EXPOSURE_120_00s_FILTER_NoFilter_RGB_integration_ABE.jpg
PiersPalmer 2.15
Just in case this is of interest.

20230904080008.log
andreatax 7.72
Stop using WBPP, that is my advice to you. The bias (which you shouldn't use) and master dark should be done once in a while, not as part of the normal pre-processing run. The image above is a "preliminary" integrated light used for Local Normalization purposes. It is in the better-than-nothing category. Finally, integrate each night separately and things will look up for you.
PiersPalmer 2.15
But apart from that...

I guess the two main queries I have are: if not WBPP, then what? I started using this rather than Deep Sky Stacker and figured it must be better as it took x7000 longer. Also, would processing the individual nights separately just make things run smoother, or is there a technical reason too?

I took a batch of bias frames (which I now know aren't needed, but do they do any harm?) and darks a while back, and just put them in the WBPP setup thingy each time.

In the end, I ran Image Integration and got something I'm quite pleased with.  

integration_ABE.jpg
andreatax 7.72
Piers Palmer:
But apart from that...

I guess the two main queries I have are: if not WBPP, then what? I started using this rather than Deep Sky Stacker and figured it must be better as it took x7000 longer. Also, would processing the individual nights separately just make things run smoother, or is there a technical reason too?

I took a batch of bias frames (which I now know aren't needed, but do they do any harm?) and darks a while back, and just put them in the WBPP setup thingy each time.


To answer the first query: run the processes driven by the script on their own, one at a time. It's faster, and at least you get to understand what is going on and why.

To answer the second query: I suppose if the problem is available memory then that is the answer, but otherwise no difference (in the main). Technically you would, by preference, keep the several imaging runs separate, as each might have its own best reference frame to look up to. Plus, in the final integration, it gives you leeway to define the weights you might want to attach to each imaging run.
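As a rough illustration of that last point, here is a minimal NumPy sketch of weighting per-night masters (not anything PixInsight exposes as code, and the 4 h / 3 h split below is made up):

import numpy as np

def combine_masters(masters, weights):
    """Weighted mean of per-night master lights (same dimensions, linear data)."""
    masters = np.asarray(masters, dtype=np.float64)
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()               # normalize so the weights sum to 1
    return np.tensordot(weights, masters, axes=1)   # weighted sum over the "night" axis

# e.g. night 1 contributed 4 h of data, night 2 contributed 3 h
night1 = np.random.rand(100, 100)                   # stand-ins for real master lights
night2 = np.random.rand(100, 100)
final = combine_masters([night1, night2], weights=[4.0, 3.0])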

It is quite possible that using bias frames will actually harm the results (at a minimum you increase the noise), as the bias signal is already contained in the dark signal (and you aren't going to scale the dark anyway).
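In other words (a toy sketch of the arithmetic, assuming a matched, unscaled master dark; the arrays below are illustrative stand-ins, not real data):

import numpy as np

light       = np.random.rand(100, 100) + 0.1   # sky + target signal, plus the camera offset
master_dark = np.full((100, 100), 0.1)         # dark current + the same offset (bias)
master_flat = np.ones((100, 100))              # assumed already scaled to a mean of 1

# Subtracting the unscaled master dark removes dark current and bias in one go;
# subtracting a separate master bias as well would remove the offset twice and
# only add the bias frames' noise.
calibrated = (light - master_dark) / master_flat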

Adding a repeated and potentially expensive run just to generate a master bias and master dark, when you already have those masters in your library, would certainly slow down the script, which is no speed demon either.

BTW, do you know that you are integrating 172 out of 210 images you took?
PiersPalmer 2.15
Brilliant - thanks Andrea!

And yes, 38 of the frames were rejected by WBPP, but it's still the longest integration I've managed. 

I shall try again with the data, using the procedure you suggest.
andreatax 7.72
Piers Palmer:
And yes, 38 of the frames were rejected by WBPP, but it's still the longest integration I've managed.


At just over 18% that is a pretty high rejection rate, nearly one in five. I'd be curious to know why (if I were you).
Die_Launische_Diva 11.14
Don't use LocalNormalization, and use something less computationally demanding than Generalized Extreme Studentized Deviate for pixel rejection (Winsorized Sigma Clipping is fine). Both algorithms are computationally demanding and not available in DSS, so it is unfair to compare WBPP to DSS. Your system may not have the necessary memory for them to run.
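If you are curious what winsorized clipping actually does to one pixel's stack, here is a simplified toy in Python (my own sketch; PixInsight's implementation differs in the details, e.g. its sigma estimate):

import numpy as np

def winsorized_sigma_clip(stack, kappa=1.5, iterations=5):
    """Clamp outliers to median +/- kappa*sigma instead of discarding them, then average."""
    x = np.asarray(stack, dtype=np.float64).copy()
    for _ in range(iterations):
        med, sigma = np.median(x), np.std(x)
        x = np.clip(x, med - kappa * sigma, med + kappa * sigma)  # winsorize the tails
    return x.mean()

# one pixel seen in 10 frames, with a cosmic-ray hit in one of them
print(winsorized_sigma_clip([0.21, 0.20, 0.22, 0.19, 0.21, 0.20, 0.95, 0.22, 0.20, 0.21]))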

As @andrea tasselli has said, skip WBPP (for the moment, at least) and try to learn to do things manually. The guide from Light Vortex is helpful, albeit a bit outdated: https://www.lightvortexastronomy.com/tutorial-pre-processing-calibrating-and-stacking-images-in-pixinsight.html. Also, Bernd's guide is a must-read: https://pixinsight.com/forum/index.php?threads/guide-to-preprocessing-of-raw-data-with-pixinsight.11547/

There are many things enabled by default in WBPP that are computationally demanding and are not absolutely necessary.
PiersPalmer 2.15
andrea tasselli:
Piers Palmer:
And yes, 38 of the frames were rejected by WBPP, but it's still the longest integration I've managed.


At just over 18% that is a pretty high rejection rate, nearly one in five. I'd be curious to know why (if I were you).

Me too, especially as I'd used the Blink thingy and taken out anything less than perfect... or so I thought. That's a question for a cloudy night.
PiersPalmer 2.15
Die Launische Diva:
Don't use LocalNormalization, and use something less computationally demanding than Generalized Extreme Studentized Deviate for pixel rejection (Winsorized Sigma Clipping is fine). Both algorithms are computationally demanding and not available in DSS, so it is unfair to compare WBPP to DSS. Your system may not have the necessary memory for them to run.

As @andrea tasselli has said, skip WBPP (for the moment, at least) and try to learn to do things manually. The guide from Light Vortex is helpful, albeit a bit outdated: https://www.lightvortexastronomy.com/tutorial-pre-processing-calibrating-and-stacking-images-in-pixinsight.html. Also, Bernd's guide is a must-read: https://pixinsight.com/forum/index.php?threads/guide-to-preprocessing-of-raw-data-with-pixinsight.11547/

There are many things enabled by default in WBPP that are computationally demanding and are not absolutely necessary.

I did initially when I got PixInsight, but then I thought that WBPP was the crème de la crème of this sort of thing.

I'm actually going through the Light Vortex advice now and will see how that looks...and how long it takes!

As for the PC, it shouldn't prove a bottleneck as it's an AutoCAD-geared workstation with 32GB of RAM, but who knows!
jeffreycymmer 0.00
Have you set up swap files?
Hâthor 0.90
Hi,

Line 36764 of your log file:
[2023-09-04 10:38:24] * Integrating channel 1 of 3:
[2023-09-04 10:38:24] Integrating pixel rows:     0 ->   674:   2%
[2023-09-04 10:38:51] *** Error: Out of memory
[2023-09-04 10:38:51] <* failed *>
[2023-09-04 10:38:51] ** Warning: ImageIntegration failed.

So there it is.

I also see that you left the minimum weight at the default, which is 0.05 (basically "all the frames"; no regrets for those which have been rejected). You can try something really aggressive like 0.5 or even 0.8 (I often use 0.5). With fewer frames it is easier to handle, but...
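Conceptually that setting is just a cut on the computed frame weights; a tiny sketch (the file names and weights below are hypothetical, not taken from your log):

weights = {"light_001.xisf": 0.92, "light_002.xisf": 0.31, "light_003.xisf": 0.04}

def select_frames(weights, min_weight=0.05):
    """Keep only the frames whose normalized weight reaches the threshold."""
    return [name for name, w in weights.items() if w >= min_weight]

print(select_frames(weights, min_weight=0.05))  # keeps nearly everything
print(select_frames(weights, min_weight=0.5))   # a much more aggressive cut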

4 GB of RAM is really not much these days.
You should get at least 8 GB if it's possible for you.

BTW, just in case: you don't have to re-run WBPP. You can just load the script via the Script Explorer; it will bring up a process container. Then just pick ImageIntegration. You can try different settings and see what you get.
PiersPalmer 2.15
jeffreycymmer:
Have you set up swap files?

I don't have a clue what they are (or what it is) so I'm guessing not! If it's a setting on the computer, I've not touched it since Dell sent it to me.
andreatax 7.72
François Guiot:
4 GB of RAM is really not much these days.
You should get at least 8 GB if it's possible for you.


It must be more than that, as the log reports upwards of 5 GB in use at times, so the machine probably has at least 8 GB.
apennine104 3.61
Piers,

If WBPP made it to the integration phase, you should have the registered lights. You can just go to Process>Image Integration. Click "Add Files" and add the registered lights. For Pixel Rejection, try Winsorized Sigma Clipping since that is less memory intensive. 

I agree there are other issues at play. Your WBPP log points to way less RAM than 32GB. I have 64GB of RAM, and my latest WBPP log says ~58GB available. So, something is either eating up a ton of memory in the background, or something is physically wrong. If after a fresh reboot you go CTRL+ALT+DEL, click Task Manager > Performance > Memory, it should report how much is in use/how much you have. For example, mine reports 11.4/63.8.
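For a back-of-the-envelope sense of scale (my own rough estimate, not how PixInsight actually buffers the data, which it reads in row chunks), using the 6248x4176 frame size from your file names:

frames, width, height = 172, 6248, 4176        # frame count and size from this thread
bytes_per_pixel = 4                            # 32-bit floating point
gib = frames * width * height * bytes_per_pixel / 2**30
print(f"~{gib:.1f} GiB per colour channel if it all sat in RAM at once")  # ~16.7 GiB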

The swap file performance increase also mentioned above is within PixInsight. You add more folders for PI to work with, to take advantage of multi-threaded processes. This explains it well.
jeffreycymmer 0.00
https://pixinsight.com/sysreq/
PiersPalmer 2.15
Thanks all - much appreciated. I was wrong; I have 16 GB of RAM, but something is clearly amiss. AutoCAD and Photoshop would have been running, so that could be eating into the amount available?

I need a separate imaging computer!
Alan_Brunelle
Piers Palmer:
Thanks all - much appreciated. I was wrong; I have 16 GB of RAM, but something is clearly amiss. AutoCAD and Photoshop would have been running, so that could be eating into the amount available?

I need a separate imaging computer!

To your point that I copied above, you may not need a new separate computer. My experience with WBPP has been mostly good, but on some occasions it does cause my system to crash and reboot. Typically, this is during the drizzle integration step. As suggested above, it is likely due to limited resources (RAM) on my machine. I do have a lot of RAM, but I also often initiate a WBPP session while doing other work on my computer. Each additional application opened on the computer reserves a certain amount of RAM and adds "stress" to the RAM. In particular, I find that the Aladin app will cause a crash. It is a RAM hog. And it is known that once Aladin has been opened, even if closed, it somehow still impacts the RAM. So, I now reboot my computer prior to starting a run and I have few if any problems in such a situation. So try doing such intensive work on a computer that is newly booted. And don't browse, etc. while it is running. If it then fails, you will need to upgrade your system. Be sure to configure the swap files as per the PI instructions, and if you are working with an old spinning hard drive, you will need to at least add an SSD for those swap files.

Certainly, ensure that your computer is up to the task for any of the PI functions that are used within the WBPP script. There is nothing within WBPP that inherently will cause a failure of your system to execute that doing each function separately will solve. After all, WBPP only really executes one function at a time. I think the comments to not use WBPP and to do PP using each individual function instead are kind of an odd suggestion (per the PI philosophy, see below). But I will add that I do not completely disagree with this suggestion. If you learn the PP process using each script separately, you will learn and better understand just what it is you are doing along the process train, and learn what each option does to affect the outcome. This also depends on the data you feed into the process. A valuable lesson to be learned!

However, the understanding of the Pre-Processing "process" was precisely the reason for the creation of WBPP! If you read the early threads upon release of WBPP, the feedback to questions by the PI insiders was that you should only use WBPP as a means to learn how to do PP. Or at the very least, it should only serve as a means for the knowledgeable to do simple "quick and low effort" integrations just to see how the data will look. The expectation being that one would always, always go back and redo the PP manually. In light of these facts, it is odd that anyone should suggest not to use WBPP as a starting point. However, what I disagree with is not that advice, but the PI experts' point of view. I find PI in this regard to be very disingenuous and elitist (from a photo processing perspective)! WBPP is written in a way that you can set up a calibration/integration quickly in the same way you would do it manually. It also saves many of the actions chosen and outcomes in a log file that the beginner can review. It also saves many of the intermediate calibrated/registered image files for the beginner to review, so that if something goes wrong, you can trace back to where it went wrong. So in that, it certainly works as a very good, logical learning tool. BUT, that can also be done (maybe better) when using each script individually. What I find disingenuous is that if one can set up WBPP to work well, and if one does understand just exactly what choices are made by the user within WBPP, then why would a user not use WBPP as a primary PP tool?

Assuming you get your hardware system up to the task, as far as PP goes, whichever path you take to understanding what each step accomplishes and how those relate to what you want to achieve in the end, you should learn that you can take a path where you sit at the computer, execute each step individually, wait for the process to finish and then go on to the next step, or you can set up WBPP to do the exact same processes, hit the "go" button, walk away and get on with your life! If I see two finished images that both look great, one done by individual effort and one done by WBPP, I couldn't care less which choice was made. AstroBin doesn't ask us to list which way we do PP.

For those that use PI as their choice to do PP, anecdotal evidence suggests to me that many here, if not most, use WBPP as their primary tool and as they wish, not as the PI WBPP founders suggested. And my guess is that, with the many updates and upgrades to WBPP over the years, it is clear that even PI has succumbed to the concept of using WBPP as a primary tool for PP.

BTW, when my WBPP crashes at the drizzle integration step, it is a trivial fix. WBPP saves the registered images, the drizzle files and even the normalization files, and I then just go to the Drizzle script in PI and do the drizzle integration manually. It then never fails.

I have also used WBPP often to troubleshoot issues in the final results. It is easy to read the log file, and also to open the intermediate calibrated/processed/debayered/etc. files and just look at the results. But I agree with those who say you should learn the process by starting out doing each step individually and learning that way.

Bottom line, do the best to learn what is going on.  Generating a history of experience with success and failures is the best learning experience.  Choose how you do your processing based on your outcomes and desires.  Remember these are your choices.  Never feel forced to make changes based on guilt or other non-objective pressures.  But, solicit and review comments and criticisms and try different things you see and fold them into your process as you have time and energy to do so.  This takes time and should never cease as a process of improvement.
Leonardo-Ruiz 0.00
Piers, good afternoon. I had exactly the same problem. My integration times with the ASI2600 MC reached 17 hours. I mistakenly thought that I needed a new PC and invested in a very powerful one. But before receiving it, I found the solution. Your problem is in the calibration configuration parameters. When you load the bias, flats, darks and lights into WBPP, you must also set the calibration. The file you attached clearly shows that you are not processing with master darks or master flats, but with all the individual dark and flat frames, which multiplies the work and therefore the time.
Page 17.jpg
You must select in the calibration settings to use the generated master.
Calibration.jpg
Bias is automatic. With just this adjustment in calibration, I went from 17 hours to 24 minutes. With the new equipment (68 GB RAM, Intel i9), 4 minutes.
NOTE:
If you want to try dark flats, don't use the bias in WBPP and put the darks and dark flats in the Darks tab. PixInsight will recognize them, don't worry, and again, in calibration, select darks and flats. Try both methods and decide which one works best for you.
jrfreireq 0.00
Leonardo Ruiz:
Piers, good afternoon. I had exactly the same problem. My integration times with the ASI2600 MC reached 17 hours. I mistakenly thought that I needed a new PC and invested in a very powerful one. But before receiving it, I found the solution. Your problem is in the calibration configuration parameters. When you load the bias, flats, darks and lights into WBPP, you must also set the calibration. The file you attached clearly shows that you are not processing with master darks or master flats, but with all the individual dark and flat frames, which multiplies the work and therefore the time.
Page 17.jpg
You must select in the calibration settings to use the generated master.
Calibration.jpg
Bias is automatic. With just this adjustment in calibration, I went from 17 hours to 24 minutes. With the new equipment (68 GB RAM, Intel i9), 4 minutes.
NOTE:
If you want to try dark flats, don't use the bias in WBPP and put the darks and dark flats in the Darks tab. PixInsight will recognize them, don't worry, and again, in calibration, select darks and flats. Try both methods and decide which one works best for you.

I did not know about the Dark Flats being in the Darks folder. I will try that next time. 

To the OP @Piers Palmer , I would check how much free space you have on your hard disk; when you run WBPP, it creates a ton of files and that eats your disk space quickly. Maybe that is the case.
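As a rough, assumption-laden estimate of those intermediates (assuming WBPP writes 32-bit copies at each stage; the counts and frame size come from this thread, and the real figures depend on your settings):

frames, width, height = 210, 6248, 4176               # counts and frame size from this thread
mono_stage = frames * width * height * 4 / 2**30       # calibrated frames: 1 channel, 32-bit float
rgb_stage = mono_stage * 3                             # after debayering (3 channels); likewise for registered
print(f"calibrated ~{mono_stage:.0f} GiB, each debayered/registered stage ~{rgb_stage:.0f} GiB")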
pfile 1.81
Alan Brunelle:
The expectation being that one would always, always go back and redo the PP manually.


i don't know about that. the suggestion was to go back and do the light integration manually, not the whole preprocessing flow. the point of BPP (and then WBPP) was to automate the drudge work of matching flats with lights, darks with lights, darks with flats, etc.
PiersPalmer 2.15
Alan Brunelle:
So, I now reboot my computer prior to starting a run and I have few if any problems in such a situation.  So try doing such intensive work on a computer that is newly booted

Thanks for your very detailed and really helpful post. It's much appreciated. 

I do need to reboot my machine more often, but the trouble is I'm lazy, often have many AutoCAD drawings open at once and sort of rely on open drawings as my aid to remembering what I'm doing! I suppose I could easily up the amount of RAM in my machine too, but WBPP has given me some results I'm very happy with in the past and that's what counts. I'm happy to try both that and the method where I'd do each step myself, and see if it makes any difference I can notice.
PiersPalmer 2.15
Leonardo Ruiz:
Your problem is in the calibration configuration parameters. When you load the bias, flats, darks and lights into WBPP, you must also set the calibration. The file you attached clearly shows that you are not processing with master darks or master flats, but with all the individual dark and flat frames, which multiplies the work and therefore the time.
Page 17.jpg
You must select in the calibration settings to use the generated master.
Calibration.jpg
Bias is automatic. With just this adjustment in calibration, I went from 17 hours to 24 minutes. With the new equipment (68 GB RAM, Intel i9), 4 minutes.
NOTE:
If you want to try dark flats, don't use the bias in WBPP and put the darks and dark flats in the Darks tab. PixInsight will recognize them, don't worry, and again, in calibration, select darks and flats. Try both methods and decide which one works best for you.

Got it - thank you, although I thought, using the default settings, it would have done this itself? I guess not!
Alan_Brunelle
Alan Brunelle:
The expectation being that one would always, always go back and redo the PP manually.


i don't know about that. the suggestion was to go back and do the light integration manually, not the whole preprocessing flow. the point of BPP (and then WBPP) was to automate the drudge work of matching flats with lights, darks with lights, darks with flats, etc.

I am not able to go back to the source of my recollection, but it was a post on the PI forum not long after WBPP was released. I tried, but there is just too much stuff about WBPP to sort through it all and the search function is not able to tease out that post. For what it is worth, it was a statement from one of the PI staff, and may well have been from one of the developers of WBPP. Just not sure. But it was a very strong, clear, philosophical statement made by someone in the organization. Whether the rest of the developers at PI agreed with that statement is irrelevant, since it was not quickly followed up by anyone else from PI refuting it, within my recollection. But it was a clear enough statement that it has stuck with me through the years. And the point was that, in his words (my recollection), WBPP was not intended as a means to generate a final product, but to work out the issues to get the best conditions for preprocessing. However, the construction of WBPP then and now certainly is such that what you say makes much more sense to me. Except that I don't know what your first statement means about going back and doing the light integrations manually. Why wouldn't I just let WBPP do the light integration? Or why wouldn't I just use the product of integration if it meets my standard? It's probably the least problematic or complex part of the process.
pfile 1.81
Alan Brunelle:
Alan Brunelle:
The expectation being that one would always, always go back and redo the PP manually.


i don't know about that. the suggestion was to go back and do the light integration manually, not the whole preprocessing flow. the point of BPP (and then WBPP) was to automate the drudge work of matching flats with lights, darks with lights, darks with flats, etc.

I am not able to go back to the source of my recollection, but it was a post on the PI forum not long after WBPP was released. I tried, but there is just too much stuff about WBPP to sort through it all and the search function is not able to tease out that post. For what it is worth, it was a statement from one of the PI staff, and may well have been from one of the developers of WBPP. Just not sure. But it was a very strong, clear, philosophical statement made by someone in the organization. Whether the rest of the developers at PI agreed with that statement is irrelevant, since it was not quickly followed up by anyone else from PI refuting it, within my recollection. But it was a clear enough statement that it has stuck with me through the years. And the point was that, in his words (my recollection), WBPP was not intended as a means to generate a final product, but to work out the issues to get the best conditions for preprocessing. However, the construction of WBPP then and now certainly is such that what you say makes much more sense to me. Except that I don't know what your first statement means about going back and doing the light integrations manually. Why wouldn't I just let WBPP do the light integration? Or why wouldn't I just use the product of integration if it meets my standard? It's probably the least problematic or complex part of the process.


in the forums, they were adamant about the fact that there are too many options involved in ImageIntegration for BPP to be able to automatically select the right ones, at least for light frame integration. in particular, how to normalize the input images and what pixel rejection algorithm to use. in fact, BPP used to pop up a warning dialog box saying exactly what i'm saying - that the recommendation was to go and re-do the light integration yourself. many people objected to this dialog box on the grounds you are saying here, that if i like the output, why not use it? but that's not how Juan and company roll. they believe in letting the math guide you - in this case, meaning that you should tweak the ImageIntegration controls to maximize the SNR of the final product. the fear being that BPP would select too aggressive or wrong rejection parameters and compromise the SNR of the result. in the end, the master light integration was felt to be an iterative process and BPP could not do the iteration for you.

you may wonder if this applies to bias, darks and flats, but the proper integration parameters for those types of frames are pretty cut-and-dried and there's not much need to tweak them. for instance with bias and dark you never use normalization and you don't need really fine-tuned rejection parameters as the outliers you are rejecting (cosmic ray hits) are usually super-outliers and easy to identify. for flats you always use multiplicative normalization. for panel flats at least, there's no need to do any pixel rejection.
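to make the flats point concrete, here's a quick numpy toy of multiplicative normalization before combining (stand-in data; ImageIntegration does this internally when you pick multiplicative normalization):

import numpy as np

# three flats whose overall illumination drifted between exposures
flats = [np.random.rand(100, 100) * level for level in (0.9, 1.0, 1.2)]

ref_median = np.median(flats[0])
normalized = [f * (ref_median / np.median(f)) for f in flats]   # multiplicative scaling only
master_flat = np.mean(normalized, axis=0)
master_flat /= np.mean(master_flat)                             # scale the master to mean 1 for calibration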

really, there's no point in even having the BPP script if you are going to go back and manually run ImageIntegration on your bias/dark masters, ImageCalibration on your lights 4 times, once per filter, being sure to carefully select the right dark, bias and flats each time, etc. the process of matching dark durations, gains, offsets, flats, etc. is very error prone and matching these up properly is something that a computer is very good at doing and there's no risk that the results will be suboptimal.

there are lots of people who recommend doing the whole flow manually at least once so that you understand what WBPP is doing behind the scenes, but that's not an argument for not using BPP/WBPP for the preprocessing task. i think the warning that BPP used to produce about the master light and the recommendation to understand the flow have been conflated here. i'm certain the PI devs never said to only use BPP or WBPP to figure out the correct preprocessing pipeline. if they ever did, it is certainly not true now.

WBPP has become a beast of a script, and if you notice, two things have emerged since BPP was created so long ago. first is LocalNormalization, which the developers believe can run "unattended" as long as a good normalization reference image exists. the other is the ESD rejection method, which again is felt to produce good results without fine-tuning. unfortunately it is *very* processor and memory intensive. these two things together have relaxed the requirement to go back and re-do the light integration.
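for a flavour of what local normalization is trying to do, here's a conceptual toy in python (numpy + scipy; emphatically not PI's actual algorithm, which is far more sophisticated): estimate a smooth per-region scale and offset that maps each frame's local background onto a reference, then apply it before integration.

import numpy as np
from scipy.ndimage import zoom

def local_normalize(frame, reference, tiles=8):
    h, w = frame.shape
    th, tw = h // tiles, w // tiles
    scale = np.empty((tiles, tiles))
    offset = np.empty((tiles, tiles))
    for i in range(tiles):
        for j in range(tiles):
            f = frame[i*th:(i+1)*th, j*tw:(j+1)*tw]
            r = reference[i*th:(i+1)*th, j*tw:(j+1)*tw]
            s = np.std(r) / max(np.std(f), 1e-9)             # match the local dispersion
            scale[i, j] = s
            offset[i, j] = np.median(r) - s * np.median(f)   # match the local background
    # interpolate the coarse tile grid up to full resolution so the correction varies smoothly
    scale_map = zoom(scale, (h / tiles, w / tiles), order=1)
    offset_map = zoom(offset, (h / tiles, w / tiles), order=1)
    return frame * scale_map + offset_map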