I want to create a copy of my NAS-hosted media folder, which is about 30 TB. I have a bunch of 4-8 TB local disks (mostly USB 3, some SATA), and I would like to copy the files to these various destinations, maximizing space used and minimizing time required. Since I have a 10 GbE network, I can read data far faster than I can write to any single destination, so multiple simultaneous file copies are required to make the most of that bandwidth. Doing this manually is painful: trying to select the maximum set of files that fits on (but does not exceed) each destination is a pain. Any thoughts on a script or an app I could use to assist here are appreciated. I want to leave the files in their native format, so I am looking for a file copy, not a block-based backup.
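The "fill each disk as full as possible without going over" part is a classic bin-packing problem, and a greedy first-fit-decreasing pass usually gets close to optimal for large media files. Below is a minimal sketch of that packing step (the paths, capacities, and function names are my own illustrative choices, not a specific tool's API); the resulting per-disk file lists could then be fed to `rsync` or similar, one copy job per destination, to run the writes in parallel.

```python
#!/usr/bin/env python3
"""Sketch: assign files to destination disks with first-fit-decreasing
bin packing, so each disk is filled as full as possible without going over."""
import os


def collect_files(root):
    """Return a list of (path, size_in_bytes) for every regular file under root."""
    files = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path):
                files.append((path, os.path.getsize(path)))
    return files


def pack(files, capacities):
    """First-fit-decreasing: sort files largest-first, place each on the
    first destination with enough free space. Returns (bins, leftover),
    where bins is a list of (assigned_paths, remaining_bytes)."""
    bins = [([], cap) for cap in capacities]
    leftover = []
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        for i, (assigned, free) in enumerate(bins):
            if size <= free:
                assigned.append(path)
                bins[i] = (assigned, free - size)
                break
        else:
            leftover.append(path)  # did not fit on any disk this round
    return bins, leftover


if __name__ == "__main__":
    # Hypothetical example: source share and two destination capacities.
    files = collect_files("/mnt/nas/media")
    bins, leftover = pack(files, [8_000_000_000_000, 4_000_000_000_000])
    for i, (assigned, free) in enumerate(bins):
        print(f"disk {i}: {len(assigned)} files, {free} bytes free")
    print(f"unplaced: {len(leftover)} files")
```

Each per-disk list could then be written to a file and passed to `rsync --files-from=...`, with one rsync process per destination running concurrently so the slow USB 3 writes overlap.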


This is an interesting option but is it suitable for a one time copy?


Data Hoarder

!datahoarder@selfhosted.forum


We are digital librarians. Among us are represented the various reasons to keep data – legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they’re sure it’s done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time ™ ). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.
