ASI2600 File Size Management · ZWO ASI2600MC Pro · Ed Litoborski

Edski 2.81
So I just upgraded from the ASI533MC Pro to the 2600MC Pro, and just to be able to run WBPP on the first light image I had to run out and get a 5TB external HD. I had a 1TB before.
Question is: going forward, what can I expect? What do you all use? Any long-term file storage tips and tricks?
Thanks in advance and clear skies.
Mikey_G 0.90
I use an expandable NAS to store the data. At 50MB per image, it adds up quick. For my workstation, I have a 1TB SSD for OS and apps, and a 1TB SSD for images that I'm currently processing. Once I'm done processing, I move the data to the NAS.
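The arithmetic behind "it adds up quick" is easy to sketch. A rough Python back-of-the-envelope, where the nightly sub count and clear-night count are assumptions for illustration (only the ~50 MB per-sub size comes from this thread):

```python
# Back-of-the-envelope disk budget for ~50 MB subs.
SUB_MB = 50                # per-sub size quoted in this thread
subs_per_night = 120       # e.g. two hours of 60 s subs (assumption)
nights_per_year = 60       # clear nights actually imaged (assumption)

gb_per_night = SUB_MB * subs_per_night / 1024
tb_per_year = gb_per_night * nights_per_year / 1024
print(f"{gb_per_night:.1f} GB/night, {tb_per_year:.2f} TB/year")
```

Even at that modest pace it's several GB a night before calibration frames, so a dedicated working SSD plus NAS archive is a sensible split.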
StuartT 4.69
I also have a 2600 and, as Mikey says, the unbinned files are 50MB or so each. I keep the most recent datasets on my laptop until they are processed, then archive all the raw subs, flats, etc. to a Seagate 5TB USB 3.0 external drive.
hbastro
I operate remotely, using ASI6200 cameras; subframe files are 120MB. For each camera I have 1TB of local storage on the respective control computer, and a 10TB NAS at the remote site for backup. For processing I have a 14TB working drive and a larger rack-mounted NAS mirror for archival storage.
Linwood 5.76
Buy more disk. 

I use a large working area, for all current projects, and then offload the subs for finished projects to two separate USB drives.

But fundamentally, like video, this is a hobby that takes large amounts of disk.  It's just part of the expense.  There is no magic.
vivian.budnik@umassmed.edu 3.01
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian
Linwood 5.76
Vivian Budnik:
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian

That's only viable for people with really fast internet connections (including up). While I'm at 300 Mbps or so down, I'm only 11 Mbps up, and that takes a LONG time for anything large. Though the US is so far behind other places in internet speeds that this may be more of a US thing.
afd33 4.65
Vivian Budnik:
I use Dropbox to store everything. You need to buy space there, but everything is secure, retrievable at any time, and not that expensive.
best,

Vivian

The problem I have with that is the cost. I could put together a NAS with about 10TB of storage for about $525. From what I can see, Dropbox costs $16.58/month for only 3TB of storage, so it would only take 2.5-ish years to break even.

It's the same reason I bought PixInsight instead of going with Photoshop: $260 once vs. $21/month for the rest of time. Only 12 months to break even on that one.
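That break-even point checks out; a quick sketch with the prices quoted above:

```python
# One-time NAS purchase vs. renting cloud storage (prices from the post).
nas_cost = 525.00          # USD, ~10 TB DIY NAS
dropbox_per_month = 16.58  # USD/month for 3 TB

months = nas_cost / dropbox_per_month
print(f"break-even after {months:.1f} months (~{months / 12:.1f} years)")
```

This ignores drive replacement and electricity on the NAS side, and price changes on the cloud side, so treat it as a rough comparison rather than a full cost model.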
dmkusz 2.11
Hello Ed,

I have just over 320 hours of integration, plus all the associated calibration frames for each image, in 2022. Using my 2600MM, that has taken up 1.8TB of storage space. I use a 5TB HD and have two others for backup. I generally use only one hard drive per year, so in 2023 I have a fresh 5TB drive ready to go. It might be a bit much, but I have had HD failures and lost thousands of frames, so I am extra cautious with my files. I may start using a cloud service as backup for ultimate redundancy.

CS
Dan.
vercastro 4.06
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease disk usage.
StuartT 4.69
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease disk usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?
vercastro 4.06
Stuart Taylor:
If you are using NINA, consider enabling .xisf file compression. It will drastically decrease disk usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?

Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be a negative, depending on your workflow), and they take slightly more CPU horsepower to decompress.
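Why byte shuffling helps is easy to demonstrate. A sketch using Python's stdlib `zlib` as a stand-in for XISF's LZ4 (same principle, different codec), on a synthetic "sub-frame" invented for illustration: astro subs are 16-bit values where neighbouring pixels usually share the same high byte, so grouping the high bytes together hands the compressor long repetitive runs.

```python
import random
import struct
import zlib

random.seed(42)

# Synthetic 16-bit "sub-frame": flat background plus small noise, so the
# high byte of each pixel is nearly constant while the low byte varies.
pixels = [12000 + random.randint(0, 255) for _ in range(100_000)]
raw = struct.pack("<%dH" % len(pixels), *pixels)

# Byte shuffling: store all low bytes, then all high bytes, instead of
# interleaving them as little-endian uint16 does.
shuffled = raw[0::2] + raw[1::2]

plain = zlib.compress(raw, 6)
shuf = zlib.compress(shuffled, 6)
print(len(raw), len(plain), len(shuf))
```

The shuffled buffer compresses noticeably smaller than the interleaved one, which is the same effect the "Byte Shuffling" option gives LZ4 in NINA's XISF output.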
TurtleCat 4.62
I aggressively cull subs to keep storage down, and they all zip very nicely, usually a 20-40% reduction in size, so periodically I'll just zip up the raw files for a project I'm basically done working on. I'm also not someone who keeps files around forever, so I'll delete stuff that's hopeless, lol.
olaskarpen 1.20
Dropbox
StuartT 4.69
Stuart Taylor:
If you are using NINA consider enabled .xisf file compression. It will drastically decrease data usage.

This sounds interesting. I use NINA (2.0) but I can't see this feature. Can you point me to it?
Is the compression lossy at all? There is presumably some downside?

Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be a negative, depending on your workflow), and they take slightly more CPU horsepower to decompress.

Great! Thanks. I only use PixInsight, so that sounds good!
Semper_Iuvenis 2.10
The good news is that 16TB drives are cheap. As a photog I have many of these. I use EMC Retrospect to keep a backup of all the data as well.
jhayes_tucson 22.76
hbastro:
I operate remotely, using ASI6200 cameras; subframe files are 120MB. For each camera I have 1TB of local storage on the respective control computer, and a 10TB NAS at the remote site for backup. For processing I have a 14TB working drive and a larger rack-mounted NAS mirror for archival storage.

Like Dave, I use a NAS system to back up the 120MB files from my remote system in Chile. It is set up to be fully automatic, so the data just appears on my NAS drive as it is taken. That one scope typically generates 5-8 GB/day, and with nearly 300 clear nights/yr that requires a capacity of 1.5-2.5 TB/yr. I'm about to install a second scope, which will double the storage requirements. NAS systems are FAR less expensive than something like Google Drive or Dropbox. My system currently has 4x12 TB drives configured as SHR + RAID5 to give 36 TB of storage, and I have an additional, identical system offsite for backup. I use these NAS systems only for image data, so they should last roughly 18 years with one scope and maybe 10 years with two. The system is expandable if I need more space. Here's the system that I use: https://www.synology.com/en-us/products/DS923+. With four Seagate IronWolf 12 TB drives this system cost around $1500. You'll also want to install a good-quality UPS and, in my case, I had to upgrade my internet service to increase the upload speed to make it easier to download data remotely. I may eventually upgrade my home internet to fiber to get symmetric data speeds (same upload and download) but that's in the future. It's not super cheap, but it's a one-time investment that lasts for a decade. If you look at what it would cost to store that much data on Google or Dropbox over that period of time, it's far more.

One other thing about both Google Drive and Dropbox: both services have dropped their backup offerings and replaced them with "sync" services. A sync service is not suitable as a pure backup, since it keeps the target drive synchronized with the source drive. That means that if you erase data on the observatory drive, it gets erased on the synchronized drive, and you lose it all! That's bad. NAS systems can be configured the same way, or they can be configured as a true backup system, which is what you want! The other problem with both Google Drive and Dropbox is that it's hard to download a folder with all of the ~6 GB of data taken in a single night. There are ways to do it, but it's common to have the download stop and need to be restarted. It is NOT a smooth process.
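Those capacity figures are easy to sanity-check; a quick sketch using only the numbers from this post:

```python
# Capacity check: 5-8 GB per clear night, ~300 clear nights/year,
# 36 TB usable (4x12 TB in SHR + RAID5).
gb_low, gb_high = 5, 8
nights = 300
usable_tb = 36

tb_low = gb_low * nights / 1024
tb_high = gb_high * nights / 1024
print(f"{tb_low:.1f}-{tb_high:.1f} TB/year")
print(f"{usable_tb / tb_high:.0f}-{usable_tb / tb_low:.0f} years of capacity")
```

The midpoint of that range lands right around the "roughly 18 years with one scope" figure, and halves, as stated, once a second scope doubles the data rate.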


-John
Linwood 5.76
Incidentally, one space management technique I don't see mentioned: get to where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.
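The disk saving is straightforward to sketch, since per-sub file size is fixed by the sensor regardless of exposure length (the ~50 MB figure is taken from earlier posts):

```python
# Same total integration at different sub lengths; fewer, longer subs
# mean fewer fixed-size files on disk.
SUB_MB = 50
total_s = 200 * 60  # 200 minutes of integration

for exp_s in (60, 120, 240):
    n = total_s // exp_s
    print(f"{n:3d} x {exp_s:3d}s -> {n * SUB_MB / 1024:.2f} GB")
```

Quadrupling the exposure length cuts the storage for the same integration time to a quarter, at the cost of tighter guiding requirements and less headroom for rejecting bad subs.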
jhayes_tucson 22.76
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned: get to where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

Agreed.  This is one strategy, but the other that I've suggested to SGP is to allow sub-frame imaging. This is easily done in MaxIm DL and a few other programs, but it is notably lacking in SGP, which I use for my 20" in Chile. I often work on small targets that don't require the full field, and I could save on both storage space and bandwidth if I could simply specify a cropped region of the sensor to work with. That capability would allow lucky imaging on very small targets, even with a remote system. It will be interesting to see how long it takes for the guys at SGP to get around to it.

John
jewzaam 3.01
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned: get to where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

For this reason I shoot at gain 0 for broadband filters: better dynamic range and longer exposures. As long as you can swamp the read noise, why not?
Linwood 5.76
John Hayes:
Linwood Ferguson:
Incidentally, one space management technique I don't see mentioned: get to where you can shoot longer exposures.

200 x 60s exposures vs 100 x 120s vs 50 x 240s may not be perfectly equivalent, but can be similar (depending on target and sky).  If you can keep the stars round at longer exposures, this can save quite a lot of space.

Agreed.  This is one strategy, but the other that I've suggested to SGP is to allow sub-frame imaging. This is easily done in MaxIm DL and a few other programs, but it is notably lacking in SGP, which I use for my 20" in Chile. I often work on small targets that don't require the full field, and I could save on both storage space and bandwidth if I could simply specify a cropped region of the sensor to work with. That capability would allow lucky imaging on very small targets, even with a remote system. It will be interesting to see how long it takes for the guys at SGP to get around to it.

John

NINA has added some support for that, though another alternative with programs like PixInsight (with good batch processing) is to do a batch crop on all your images after the fact. That doesn't save transfer time or space on the imaging computer, but for long-term storage it does help.

A downside of subframes or crops is that, if done pre-calibration, you need similarly sized flats, darks, and such.
jhayes_tucson 22.76
Linwood Ferguson:
NINA has added some support for that, though another alternative with programs like PixInsight (with good batch processing) is to do a batch crop on all your images after the fact. That doesn't save transfer time or space on the imaging computer, but for long-term storage it does help.

A downside of subframes or crops is that, if done pre-calibration, you need similarly sized flats, darks, and such.

I use SkyGuard for guiding, and when I set up my system NINA didn't support it. Post-acquisition cropping is certainly a possibility, but it doesn't solve how much data would be required in the local data buffer with really short exposures, how long it would take to transfer all that data, or the daily data limit set by the observatory. Flats and darks could easily be cropped, but they are also easily retaken, so I don't consider that a problem. The actual programming required to add this capability to SGP is pretty trivial compared to a lot of things already in that software. The real problem is that they have a thousand users all screaming for their own features or bug fixes, so it could take a long time for them to ever get around to adding this capability.

John
Teo_ 0.00
I work on a local 4TB SSD. Then, once processed, I move everything to an external 14TB HDD; I keep the masters and the final image on the internal SSD.
The 14TB HDD has multiple backups on a NAS, which has a slower connection but more space (about 36TB).
paolostivanin 0.00
Under Options > Imaging > "Save image as" set to xisf. Then set compression to LZ4 and SHA-1, and enable "Byte Shuffling".

The compression is lossless. The downside is that you can only open the files in PixInsight (which may not be a negative, depending on your workflow), and they take slightly more CPU horsepower to decompress.

This was an amazing tip, thank you very much!
I wasn't aware of this feature. It drastically decreased the amount of space required for my QHY268M (-60% for calibration frames, -42% for lights).
 