ASI2600 File Size Management · ZWO ASI2600MC Pro · Ed Litoborski

jerryyyyy 9.03
Hi,
I just built a new imaging computer with the Ryzen 7950 and built-in storage. In W11 you can mirror (RAID 1) drives easily, so I bought two 18 TB drives and mirrored them. I use OneDrive for work in progress, with a common folder for the images from my telescope... 1 TB comes with Office 365.
JY
Gamaholjad 3.31
The more the merrier; this hobby will chew through whatever you have, so buy what you can afford. Save in PixInsight's compressed format, then save your final images in whatever is good for you (TIFF, JPEG, etc.). I have a spare 1 TB SSD just for processing the current target; once finished, I move it to backup drives for later additions.
jerryyyyy 9.03
I am very interested in XISF compression. Here it says how to turn it on, but it does not seem to save new files in any smaller format... BTW, my ASI6200 files start out at 100k in FITS and are 200k in XISF (huh?).

https://pixinsight.com/forum/index.php?threads/can-xisf-files-always-saved-compressed.15323/

Seems a good strategy would be to take my FITS files into WBPP, convert them to the registered/calibrated images in XISF, and just save those, if they were smaller... hope I am missing something obvious.
vercastro 4.06
jerryyyyy:
I am very interested in XISF compression. Here it says how to turn it on, but it does not seem to save new files in any smaller format... BTW, my ASI6200 files start out at 100k in FITS and are 200k in XISF (huh?).

https://pixinsight.com/forum/index.php?threads/can-xisf-files-always-saved-compressed.15323/

Seems a good strategy would be to take my FITS files into WBPP, convert them to the registered/calibrated images in XISF, and just save those, if they were smaller... hope I am missing something obvious.

The NINA live stack plugin can calibrate each sub-frame automatically and save it as a compressed .xisf. Then you could simply transfer the calibrated subs off your imaging PC to wherever you have long-term storage. Saves time and space.
jerryyyyy 9.03
Well, time to use NINA... check the numbers... it would be nice to be able to do this automatically in WBPP.

Here are some real numbers after saving files individually:

Original FITS from ACP/Maxim      119k
FITS WBPP Master (PixInsight)     239k
XISF WBPP Master (PixInsight)     239k
XISF LZ4 compressed               179k
XISF LZ4-HC                       166k
XISF ZLIB                         155k

Looks like the compressed files come in at roughly 65-75% of the uncompressed master, i.e. 25-35% savings.
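For what it's worth, the ratios in the table work out like this (the units cancel, so it doesn't matter whether the sizes are k or MB); a quick Python check:

```python
# Sizes from the table above (any unit; only the ratio matters)
uncompressed = 239
compressed = {"LZ4": 179, "LZ4-HC": 166, "ZLIB": 155}

for codec, size in compressed.items():
    ratio = size / uncompressed
    print(f"{codec}: {ratio:.0%} of the master, saves {1 - ratio:.0%}")
```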
Die_Launische_Diva 11.14
jerryyyyy:
Well, time to use NINA... check the numbers... it would be nice to be able to do this automatically in WBPP.

Here are some real numbers after saving files individually:

Original FITS from ACP/Maxim      119k
FITS WBPP Master (PixInsight)     239k
XISF WBPP Master (PixInsight)     239k
XISF LZ4 compressed               179k
XISF LZ4-HC                       166k
XISF ZLIB                         155k

Looks like the compressed files come in at roughly 65-75% of the uncompressed master, i.e. 25-35% savings.

Is this list in kilobytes or in megabytes? Anyway, keep in mind that XISF usually stores the image in 32-bit format, so XISF files are expected to be larger than a FITS (or DSLR/MILC camera raw) file.
vercastro 4.06
XISF can also be 16-bit; NINA saves it as such. Only the masters will (or should) be saved as 32-bit.
Linwood 5.76
Also keep in mind that a flow like live stack, which produces a calibrated output, may not be the ideal thing to save permanently. PixInsight does periodically change how files are calibrated (notably the calculations saved in metadata, not so much the image itself), so PixInsight recommends that, to permit subsequent re-processing of image data, you save the original files (in whatever format).

I used to calibrate and then discard my flats; this broke the ability to use some of the new integration ranking functions when they came out, so now I save the originals and flats (and my dark library) and discard all intermediate work products.
jerryyyyy 9.03
Hi,
Oops, I truncated the ",000" from the k; those sizes are in MB, not KB.

Brought this up over at the PI Forum.  Maybe they can automatically use the compressed files in WBPP.

https://pixinsight.com/forum/index.php?threads/compression-in-wbpp.19765/

I guess the question is: are 32-bit files twice the size of 16-bit files? If so, you lose half your storage capacity. I am not sure what the point is of having 32-bit precision if in reality you only have 16-bit accuracy (at best). I guess some of the processes may benefit, and you probably want to keep the masters in 32-bit. But it probably makes sense to just keep the 16-bit FITS originals rather than the 32-bit calibrated/registered images? If you do that, I guess you also do not need the "projects" that take up a lot of space. But then, as stated above, you need to keep the flats etc...

Nothing is simple.
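On the factor-of-two question: yes, uncompressed pixel data scales directly with bit depth, so a 32-bit file is exactly twice a 16-bit one. A quick sketch, assuming the ASI6200's 9576 x 6388 sensor (an assumed spec; check your own camera):

```python
# Uncompressed pixel-data size of a mono frame, by bit depth.
width, height = 9576, 6388  # assumed ASI6200 sensor dimensions
pixels = width * height

for bits in (16, 32):
    size_mb = pixels * (bits // 8) / 1e6
    print(f"{bits}-bit frame: ~{size_mb:.0f} MB (plus headers)")
```

That works out to roughly 122 MB at 16-bit and 245 MB at 32-bit, which lines up with the ~119 MB subs and ~239 MB masters reported earlier in the thread.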
Linwood 5.76
Remember, files in PI may benefit from added precision as they are processed, since you are dividing, subtracting, and changing values as you stretch and de-noise and such.

Also, it's not necessarily desirable to feed compressed files into WBPP itself, since that slows it down quite a bit. You can always do a batch conversion to compress any files you want to retain (and you can compress those aggressively; the most aggressive setting is quite compute-intensive).
jerryyyyy 9.03
Linwood Ferguson:
Remember, files in PI may benefit from added precision as they are processed, since you are dividing, subtracting, and changing values as you stretch and de-noise and such.

Also, it's not necessarily desirable to feed compressed files into WBPP itself, since that slows it down quite a bit. You can always do a batch conversion to compress any files you want to retain (and you can compress those aggressively; the most aggressive setting is quite compute-intensive).

Thanks, I have a high-performance computer, so I could compress a lot. Can you explain how you do a batch compression? Not just zipping a folder?
Linwood 5.76
jerryyyyy:
Thanks, I have a high-performance computer, so I could compress a lot. Can you explain how you do a batch compression? Not just zipping a folder?

Go to Script > Batch Processing > Batch Format Conversion.

For the output format hints, use "compression-codec zlib+sh compression-level 100" to get the slowest but greatest compression.

Set an output directory; if you want to overwrite the input files, use the same directory as the input and check the overwrite box.

I just ran it on a flat that started at 233 MB and ended at 147 MB. Image data will probably not compress as much.

Compressed files do not get a different file type; PI just recognizes them and uses them (almost) as though they were the uncompressed versions. Obviously it takes some time internally to decompress, and I think some aspects of big processes like integration work slightly differently (not different output, just how it optimizes reading the files).

But a simple way to use all this is to leave everything uncompressed throughout processing for speed, and then heavily compress any files you plan to archive at the end. Having the original source subs compressed from capture has a mild effect, but since they are used only once to produce the calibrated subs, the impact does not persist through processing.
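If you're curious why the "+sh" (byte shuffle) part of zlib+sh helps: 16-bit camera data tends to have a nearly constant high byte and a noisy low byte, and grouping like bytes together lets zlib exploit that. A toy sketch with synthetic data (not a real sub):

```python
import random
import zlib

random.seed(42)

# Fake 16-bit pixels: a steady background (~20000 ADU) plus noise that
# mostly churns the low byte, like read/photon noise in a real sub.
values = [20_000 + random.randrange(256) for _ in range(100_000)]
raw = b"".join(v.to_bytes(2, "little") for v in values)

# Byte shuffle: store all low bytes, then all high bytes (conceptually
# what the "+sh" in PixInsight's zlib+sh codec does).
shuffled = raw[0::2] + raw[1::2]

plain = len(zlib.compress(raw, 9))
shuf = len(zlib.compress(shuffled, 9))
print(f"no shuffle: {plain:,} bytes   with shuffle: {shuf:,} bytes")
```

On this synthetic frame the shuffled stream compresses noticeably better than the interleaved one; real subs vary, but smooth backgrounds behave similarly.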
StuartT 4.69
Linwood Ferguson:
Thanks, I have a high performance computer, so could compress a lot.  Can you explain how you do a batch compression?  Not just Zipping a folder?

Go to Script > Batch Processing > Batch Format Conversion.

For the output format hints, use "compression-codec zlib+sh compression-level 100" to get the slowest but greatest compression.

Set an output directory; if you want to overwrite the input files, use the same directory as the input and check the overwrite box.

I just ran it on a flat that started at 233 MB and ended at 147 MB. Image data will probably not compress as much.

Compressed files do not get a different file type, but PI just recognizes them and uses them (almost) as though they were the uncompressed versions.  Obviously it takes some time internally to decompress, and I think there are some aspects of big processes like integration that work slightly differently (not different output, just how it optimizes reading the files). 

But a simple way to use all this is leave everything uncompressed throughout processing for speed, and then any files  you plan to archive you can compress heavily at the end.  Having the original source subs compressed from capture has a mild effect but since they are used only once to provide the calibrated subs, the impact does not persist through processing.

This is an excellent suggestion! Thanks. I never thought of just compressing all the archived files. I currently archive all my lights, flats, etc. to a USB drive, but in the original format. I'll compress them all instead and probably save a ton of space! I rarely revisit them anyway (unless I make some major step forward with PI and decide to re-process).
jerryyyyy 9.03
The compression worked. I had a folder with 97 O-III files from a current project; compressing them took it from 22 GB to 16 GB. These were the registered O-III files... they are so huge... I really need to think through the strategy, as eventually I will use up my 18 TB...
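To keep an eye on how much a strategy like this is saving, a small helper that sums a folder's size is handy (the archive path in the comment is hypothetical):

```python
from pathlib import Path

def folder_size(path):
    """Total size in bytes of every file under `path`, recursively."""
    return sum(p.stat().st_size for p in Path(path).rglob("*") if p.is_file())

# Hypothetical usage, before and after batch-compressing a target folder:
# print(folder_size("E:/archive/OIII") / 1e9, "GB")
```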
astrodad1954 3.31
A 14 TB WD external hard drive is $240 delivered. I've used ~5 TB so far, primarily on planetary imaging.