I once had a very annoying problem when trying to back up my computer: less than 50 GB could take an entire night!
So I almost never made backups, which was bad both for safety and for distro-hopping. That was before a guy on Discord showed me a simple trick that turned those annoying hours of backup into less than 5 minutes.
```shell
sudo su
mount /dev/sdb1 /mnt   # Mount the backup drive
cd /mnt
tar cvf <backup name>.tar <path to the stuff to back up, e.g. /home>
tar tf <backup name>.tar > <backup name>.txt
```
The first three commands mount the drive and move into it. The first tar command packs all the files into a single archive; the second writes a "table of contents" listing every file that was backed up.
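Here is the same idea as a safe, self-contained sketch: it uses temporary directories as stand-ins for the mounted drive and for /home (all the names below are hypothetical), so you can try it without touching a real disk.

```shell
# Stand-ins for the backup drive and the data to back up (hypothetical paths).
BACKUP_DIR=$(mktemp -d)   # plays the role of /mnt
SRC_DIR=$(mktemp -d)      # plays the role of /home
mkdir -p "$SRC_DIR/docs"
echo "hello" > "$SRC_DIR/docs/note.txt"

# Pack everything into one archive, then write the table of contents.
tar cvf "$BACKUP_DIR/backup.tar" -C "$SRC_DIR" .
tar tf  "$BACKUP_DIR/backup.tar" > "$BACKUP_DIR/backup.txt"
```

The `-C` flag just tells tar to change into the source directory first, so the archive stores relative paths.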
Why does it work?
The reason my computer was so slow to back up was that I had a LOT of very tiny (often empty) files, and the backup drive paused briefly after each file it wrote, which slowed the whole process down.
So that's where tar comes in: all tar does is pack those thousands of tiny files into one single huge file.
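To see the shape of the problem, here is a small sketch (file names and counts are made up) that builds the kind of tree that makes a file-by-file copy crawl, then packs it in one go:

```shell
# Create 1000 tiny files in a temp directory (hypothetical names).
TREE=$(mktemp -d)
for i in $(seq 1 1000); do
    echo "$i" > "$TREE/file_$i.txt"
done

# Copying these one by one means 1000 separate writes to the drive;
# tar turns them into a single sequential stream.
tar cf "$TREE.tar" -C "$TREE" .
```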
We need a table of contents because tar has a catch: the archive is very fast to create, but slow to search, since there is no index inside it. The text file lets you find the exact path of a file, so you can extract just that file from the archive.
How to extract some files
First, let's mount the drive:
```shell
sudo su
mount /dev/sdb1 /mnt
cd /mnt
```
Then, search the table of contents for the file you want (unless you already know its exact path):
```shell
grep "<file name or regex>" <backup name>.txt
```
Finally, take the path printed by the previous command and extract it with:
```shell
tar xvf <backup name>.tar <path to file or directory to extract>
```
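Putting the lookup and the extraction together, here is a full round trip in a temporary directory (all paths and file names below are hypothetical):

```shell
# Build a tiny sample backup to search and extract from.
WORK=$(mktemp -d)
mkdir -p "$WORK/src/photos"
echo "img" > "$WORK/src/photos/cat.jpg"
tar cvf "$WORK/backup.tar" -C "$WORK/src" .
tar tf  "$WORK/backup.tar" > "$WORK/backup.txt"

# Look up the exact path in the table of contents...
grep "cat" "$WORK/backup.txt"

# ...then extract just that file into a restore directory.
mkdir "$WORK/restore"
tar xvf "$WORK/backup.tar" -C "$WORK/restore" ./photos/cat.jpg
```

Note that the path you pass to `tar xvf` must match the archive's listing exactly (here `./photos/cat.jpg`, with the leading `./`), which is exactly why the table of contents is so handy.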
With those three simple tar commands, you can make your backups very, very fast if, like me, you have tons of very small files on your computer :)
```shell
tar cvf <backup path and name>.tar <path to backup>
tar tf  <backup path and name>.tar > <backup path and name>.txt
tar xvf <backup path and name>.tar <path to extract>
```