NAS File Organization

Setup
As I mentioned at the beginning of my file organization post, I purchased a Synology DS923+ on Black Friday. The main objective was adding a ton of file storage and potentially replacing a few storage subscriptions with one solution.
Here’s the setup:
- Synology DS923+
- 2 x Western Digital Ultrastar 14TB HDD
- 2 x random drives I had laying around (500GB and 2TB)
I used Synology Hybrid RAID (SHR) with 1-disk redundancy, which tolerates a single drive failure and provides a great deal of flexibility in mixing drive sizes. This configuration provides 14.4 terabytes of usable storage, which should tide me over for a while. If you want to play around with drive configurations, Synology’s RAID calculator is a great tool.
Once the NAS was configured, I used Robocopy to move my raw astrophotography files over. My scripted approach generates a lot of shortcuts, which I had Robocopy skip. Those links would either be broken or would result in the actual duplicate files being copied instead, neither of which was desirable.
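Robocopy can exclude files by wildcard with its `/XF` flag, so something like `/XF *.lnk` handles the shortcut skipping. For illustration, here’s the same filtering logic sketched in Python (the directory layout and the `.lnk` extension check are just examples, not my actual script):

```python
from pathlib import Path

def files_to_copy(root: str, skip_exts=frozenset({".lnk"})):
    """Yield files under root, skipping Windows shortcut files --
    mirroring what a robocopy /XF *.lnk exclusion would do."""
    for p in Path(root).rglob("*"):
        if p.is_file() and p.suffix.lower() not in skip_exts:
            yield p
```

Copying the real files and skipping the shortcuts avoids both broken links and accidental duplicates on the NAS.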
Once all of the files were copied over, I set up Synology Drive, which allows clients to sync files locally from the NAS and generally interact with the filesystem as if it was local. If you are familiar with Google Drive’s desktop client, this one behaves nearly identically. You can pin folders to always be synced to your computer and otherwise it will download files on demand as needed.
Great, now we’ve got all the files off our local hard drive and on the network. I’ll walk through a few scenarios and how they have changed with the NAS.
Scripting changes
I was pleasantly surprised to find I didn’t have to change much with my scripting approach after swapping to the NAS. The main thing I did was change the directories I was looking at for library files and flats to be within the Synology Drive file structure rather than my local location. In my case, this transitioned from `D:\astrophotography\_shared` to `D:\SynologyDrive\astrophotography\_shared`. Now `astro_ccd` will work just fine and link to other files via Synology Drive.
Unfortunately, `astro_summary` can no longer be run locally because it will attempt to download every single file it inspects. I might be able to work around this by changing the behavior of the script itself to just use filenames, but in the meantime, I SSH onto the NAS and run it from there.
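A filename-only approach would let the summary run locally, since listing directory entries doesn’t force Synology Drive to hydrate file contents. A rough sketch of the idea (the `Target_Filter_NNNN.fits` naming scheme here is hypothetical, not necessarily what `astro_summary` or N.I.N.A. actually use):

```python
from collections import Counter
from pathlib import Path

def summarize_subs(root: str) -> Counter:
    """Count light frames per (target, filter) using only filenames.
    Assumes a hypothetical 'Target_Filter_0001.fits' naming scheme;
    reading names alone never opens the files themselves."""
    counts = Counter()
    for p in Path(root).rglob("*.fits"):
        parts = p.stem.split("_")
        if len(parts) >= 3:
            target, filt = parts[0], parts[1]
            counts[(target, filt)] += 1
    return counts
```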
Stacking and Processing
I use PixInsight, so WeightedBatchPreProcessing is my weapon of choice. When I want to stack up a target, I have two options:
- Pin local copies of the files to download them upfront - this approach might make sense if you were taking your files on the go or wanted to download them overnight
- Just let Synology Drive download the files as they’re added to WBPP - this often means a couple of gigabytes of downloads. My Wi-Fi is fast, so this typically only takes a couple of minutes, but I could plug into Ethernet if I wanted to accelerate the process.
After this initial download, all of the files are stored on your local drive. In the second case, they are cached, but Synology Drive is free to evict them if it wants to reclaim space at some point (I haven’t yet discovered if/when it does this). From there, everything is normal in WBPP - I just make sure to select an output folder that is on my local hard drive and not in the SynologyDrive structure, since there’s no need to sync or back up the calibrated/debayered/registered images or the stacked master files.
Syncing after a session
My favorite change I’ve made with the NAS is to have my N.I.N.A. session end by copying all of the subs from that night up to the NAS. When I wake up in the morning, there’s no file copying to do! I borrowed some ideas from Patriot Astro, but it still took a bit of doing to get the commands and structure right for my setup.
Here’s what the template looks like in N.I.N.A.
I store my sessions by date, so I wanted to ensure I only copy the latest session. Because I wanted to do date math, I needed a little PowerShell magic. I replicated N.I.N.A.’s “date of current time - 12 hours” to ensure that the script still works before and after midnight:
```powershell
$session_date = [DateTime]::Now.AddHours(-12).ToString("yyyy-MM-dd")
echo on
robocopy C:\Users\Brian\Pictures\$session_date \\nasty\home\astrophotography\_staging\$session_date /njs /njh /e /xx /xo /r:2 /w:10
```
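The “now minus 12 hours” trick maps any timestamp from one noon to the next onto the date the session started, so an evening capture and a post-midnight capture land in the same folder. The same date math, shown in Python purely to illustrate the behavior:

```python
from datetime import datetime, timedelta

def session_date(now: datetime) -> str:
    """Replicate N.I.N.A.'s 'date of current time - 12 hours':
    anything between noon and the following noon maps to the
    calendar date the imaging session began on."""
    return (now - timedelta(hours=12)).strftime("%Y-%m-%d")

# 10 PM and 1:30 AM the next morning resolve to the same session date.
session_date(datetime(2024, 3, 15, 22, 0))   # -> "2024-03-15"
session_date(datetime(2024, 3, 16, 1, 30))   # -> "2024-03-15"
```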
Yes, I nicknamed my NAS “nasty” because that’s how I roll. This script sticks the session folder in my `_staging` directory, where I will filter out bad subs before integrating into the main file structure. I pinned the `_staging` directory on my laptop, so when I turn it on in the morning, it pulls the files from the NAS and then I can clean up the subs as normal. The one thing to remember is to “unpin” the folders as you move them into the main file structure to ensure Synology Drive can clean them up as needed.
I also added some handy scripts for copying over Moon and Sun images (swap the word “Moon” for “Sun” in that one spot) from where I told FireCapture to put them:
```bat
echo off
set /p "date=Enter Date: "
call set formattedDate=%%date:-=_%%
echo on
robocopy C:\Users\Brian\Pictures\Moon\%formattedDate% \\nasty\home\astrophotography\_staging\ /njs /njh /e /xx /xo /r:2 /w:10
```
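The `call set formattedDate=%%date:-=_%%` line is batch’s substring substitution: it turns an entered `2024-03-15` into `2024_03_15` (assuming that underscore-separated format matches how FireCapture names its folders). The equivalent transformation, in Python for clarity:

```python
def to_firecapture_folder(entered: str) -> str:
    """Mirror the batch '%date:-=_%' substitution: hyphens -> underscores."""
    return entered.replace("-", "_")
```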
Lastly, I added a script to back up N.I.N.A.’s templates and targets:
```bat
echo on
robocopy "C:\Users\Brian\Documents\N.I.N.A" "\\nasty\home\NINA Backups" /njs /njh /e /xx /xo /r:2 /w:10
```
Backups
The NAS provides redundancy in case of drive failure, but files stored in one single physical location are still subject to local conditions (natural disasters, fire, etc.). Synology has many different cloud backup offerings, but I chose to use AWS Glacier. It works well, but my main motivation for choosing it is that I get a monthly personal AWS budget from my company, so this storage is effectively free for me. It costs under $15 per month for over 3 terabytes of data, so that’s not too bad. There may be more cost-effective options depending on how much storage you need, but I didn’t really bother to explore.
Summary
The NAS transition has been pretty easy and I’ve been very happy with Synology’s hardware and software thus far. It gives me great peace of mind not relying on a single laptop hard drive and having both local and cloud redundancy. Happy file management and clear skies!