How to Use the Command 'Duplicity' (with Examples)
Duplicity is a powerful and reliable command-line utility designed for creating encrypted, compressed, and incremental backups. It supports storing these backups on a variety of backend services, offering an excellent solution for both personal and professional data preservation needs. Its capabilities include not only backing up directories but also managing and restoring these backups effectively. Duplicity is versatile and secure, catering to various backup configurations and use-case requirements.
Use Case 1: Backup a Directory via FTPS to a Remote Machine, Encrypting it with a Password
Code:
FTP_PASSWORD=ftp_login_password PASSPHRASE=encryption_password duplicity path/to/source/directory ftps://user@hostname/target/directory/path/
Motivation:
In today’s digital age, ensuring the security and integrity of data is crucial. Backing up directories to remote locations via FTPS (File Transfer Protocol Secure) provides an extra layer of security by encrypting data in transit. By encrypting the backup with a password, users ensure that the data remains confidential and protected from unauthorized access.
Explanation:
FTP_PASSWORD=ftp_login_password
: This environment variable sets the password for accessing the FTP server. Since FTPS is a secure connection, a password is required to authenticate and establish the connection to the remote server.
PASSPHRASE=encryption_password
: This sets the password used to encrypt the backup, ensuring data security and confidentiality. The encryption prevents unauthorized access to the backup contents.
duplicity path/to/source/directory
: The duplicity command specifies the local directory that you want to back up. It reads all the files within this directory and prepares them for transfer.
ftps://user@hostname/target/directory/path/
: This specifies the remote FTPS destination where the backup will be stored. The user is the username for the FTP account, and hostname is the address of the remote server. The rest of the URL specifies the target directory path on the remote server.
Example Output:
Transaction started
Copying files to FTP location...
Uploading encrypted data...
Backup finished successfully
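The one-liner above can grow unwieldy once it is run regularly. A minimal wrapper sketch (the secret file locations and all paths are assumptions for illustration, not duplicity requirements) keeps the passwords out of shell history and prints the composed command for review before you commit to running it:

```shell
#!/bin/sh
# Sketch of the FTPS backup as a small script; adjust all placeholders.
SRC="path/to/source/directory"
DEST="ftps://user@hostname/target/directory/path/"

# Reading secrets from files (assumed locations) avoids exposing them in
# shell history; fall back to the inline placeholders if the files are absent.
FTP_PASSWORD=$(cat "$HOME/.secrets/ftp_password" 2>/dev/null || echo "ftp_login_password")
PASSPHRASE=$(cat "$HOME/.secrets/backup_passphrase" 2>/dev/null || echo "encryption_password")
export FTP_PASSWORD PASSPHRASE

# Print the command for review; replace 'echo' with a real invocation
# once the endpoint and paths are confirmed.
echo duplicity "$SRC" "$DEST"
```

Exporting the two variables is what lets duplicity pick them up; nothing secret appears on the duplicity command line itself.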
Use Case 2: Backup a Directory to Amazon S3, Doing a Full Backup Every Month
Code:
duplicity --full-if-older-than 1M path/to/source/directory s3://bucket_name[/prefix]
Motivation:
Utilizing cloud storage services like Amazon S3 for backups is a strategic choice given its reliability and scalability. By setting up full backups every month, users can have peace of mind knowing their data resilience is regularly refreshed without relying solely on incremental changes. This strategy combines efficient use of storage with periodic data integrity checks.
Explanation:
duplicity
: This initiates the duplicity command, readying it for a backup operation.
--full-if-older-than 1M
: This flag instructs duplicity to perform a full backup if the latest full backup is older than one month (1M); otherwise an incremental backup is made. This ensures that your backup remains up-to-date, providing a comprehensive snapshot of your data at least once a month.
path/to/source/directory
: The local directory whose contents will be backed up.
s3://bucket_name[/prefix]
: This specifies the Amazon S3 bucket and optional prefix path where the backups will be stored. The bucket_name is the specific S3 storage space, and prefix can direct duplicity to store the backup in a specific location within that bucket.
Example Output:
Preparing to upload to Amazon S3...
Performing full backup as last full backup is older than 1 month...
Backup successful, files transmitted to S3
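To actually get a monthly full backup, the command has to run on a schedule; a hypothetical crontab entry (the 02:30 schedule, source path, and bucket layout are placeholders, and AWS credentials are assumed to be supplied elsewhere in the environment) could look like:

```shell
# Hypothetical crontab entry (edit with 'crontab -e'): run the S3 backup
# nightly at 02:30; duplicity itself upgrades the run to a full backup once
# the newest full backup is older than one month.
30 2 * * * PASSPHRASE=encryption_password duplicity --full-if-older-than 1M /home/user/data s3://bucket_name/daily
```

Because duplicity decides when a full backup is due, the cron schedule can be as frequent as you like without forcing redundant full uploads.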
Use Case 3: Delete Versions Older than 1 Year from a Backup Stored on a WebDAV Share
Code:
FTP_PASSWORD=webdav_login_password duplicity remove-older-than 1Y --force webdav[s]://user@hostname[:port]/some_dir
Motivation:
Over time, storage space can become constrained as older versions of backups accumulate. Deleting backups that are older than necessary helps manage storage costs and optimize space usage. In environments utilizing WebDAV shares, it becomes crucial to maintain only the necessary backup data, discarding what is no longer needed.
Explanation:
FTP_PASSWORD=webdav_login_password
: This password is used to authenticate the connection to the WebDAV share, ensuring that only authorized users can manage the backups stored in the specified directory.
duplicity remove-older-than 1Y
: This command is issued through duplicity to systematically remove backup versions that are older than one year (1Y). This conservative retention policy balances the need for historical data recovery with storage limitations.
--force
: This option is required to actually delete the old backup sets; without it, duplicity only lists what would be removed.
webdav[s]://user@hostname[:port]/some_dir
: This specifies the WebDAV URL where the backups are stored. It includes the user name, the hostname of the remote server, an optional port, and the directory path within the WebDAV location.
Example Output:
Connecting to WebDAV share...
Identifying backups older than 1 year...
Deleting older backups...
Cleanup successful
Use Case 4: List the Available Backups
Code:
duplicity collection-status "file://absolute/path/to/backup/directory"
Motivation:
Understanding what backups are available in your repository is essential for informed decision-making, whether it’s for restoration purposes or routine verification of backup strategies. Knowing the available backups ahead of time is invaluable in assessing the resilience of your data management approach.
Explanation:
duplicity collection-status
: This command is used with duplicity to display detailed information about the collection of backups available in a specific storage location.
"file://absolute/path/to/backup/directory"
: This specifies the file path where the backups are stored locally, leveraging the file:// scheme to denote a local directory structure.
Example Output:
Collection Status:
- Full Backup: [2023-01-01]
- Incremental Backup: [2023-01-15, 2023-02-01, 2023-02-15]
Total backed up data: 200GB
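If you script around this status report, the dates can be pulled out with standard tools. The sketch below parses the stylized sample output shown above (real collection-status output is more verbose, so the patterns would need adjusting for it):

```shell
#!/bin/sh
# Sample text matching the stylized collection-status output above.
status_output='Collection Status:
- Full Backup: [2023-01-01]
- Incremental Backup: [2023-01-15, 2023-02-01, 2023-02-15]
Total backed up data: 200GB'

# Extract the full backup date from between the brackets.
full_date=$(printf '%s\n' "$status_output" | sed -n 's/.*Full Backup: \[\(.*\)\].*/\1/p')
# Extract the comma-separated incremental dates.
incr_dates=$(printf '%s\n' "$status_output" | sed -n 's/.*Incremental Backup: \[\(.*\)\].*/\1/p')

echo "full: $full_date"
echo "incrementals: $incr_dates"
```

A wrapper like this can feed monitoring: for example, alerting when the newest full backup date falls too far in the past.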
Use Case 5: List the Files in a Backup Stored on a Remote Machine, via SSH
Code:
duplicity list-current-files --time YYYY-MM-DD scp://user@hostname/path/to/backup/dir
Motivation:
Being able to list the contents of a backup without restoring it is vital for quick verification and inventory purposes. This use case illustrates checking specific backup contents remotely via SSH, which offers a secure means of interacting with or verifying the backup repository.
Explanation:
duplicity list-current-files
: This is the duplicity command that lists the files contained in a backup; without --time, it defaults to the most recent one.
--time YYYY-MM-DD
: This option specifies the date of the backup to inspect, providing the user with a temporal checkpoint for file inventory.
scp://user@hostname/path/to/backup/dir
: The SCP URL here denotes a secure copy protocol path used to connect and list files stored in a remote backup directory via SSH.
Example Output:
Current files as of [YYYY-MM-DD]:
- /home/user/Documents/report.pdf
- /home/user/Photos/vacation.jpg
Total files: 1002
Use Case 6: Restore a Subdirectory from a GnuPG-Encrypted Local Backup to a Given Location
Code:
PASSPHRASE=gpg_key_password duplicity restore --encrypt-key gpg_key_id --path-to-restore relative/path/restoredirectory file://absolute/path/to/backup/directory path/to/directory/to/restore/to
Motivation:
Restoring specific parts of a backup allows for targeted recovery and reduces unnecessary data restoration. When dealing with encrypted backups, having the capability to restore a subdirectory using GnuPG encryption ensures secure and efficient data recovery for sensitive or critical information.
Explanation:
PASSPHRASE=gpg_key_password
: This environment variable provides the passphrase necessary for decrypting the GnuPG-protected backup files, offering a layer of security during the restoration process.
duplicity restore
: This initiates the duplicity command for restoration operations, focusing on retrieving data from existing backups.
--encrypt-key gpg_key_id
: Specifies the GnuPG key ID used during backup encryption, ensuring the correct key is paired with the backup for successful decryption.
--path-to-restore relative/path/restoredirectory
: This denotes the specific subdirectory to restore within the backup, leveraging a relative path to focus the restoration on targeted data.
file://absolute/path/to/backup/directory
: The source location for the local backup from which the subdirectory will be restored.
path/to/directory/to/restore/to
: The destination path where the restored subdirectory will be unpacked, ensuring the data is reinstated at the desired location.
Example Output:
Decrypting and validating the backup...
Restoring subdirectory to the specified location...
Restore completed successfully
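After any restore, it is worth spot-checking the recovered files before relying on them. The sketch below demonstrates the comparison step only, using stand-in files created on the fly rather than a real duplicity restore:

```shell
#!/bin/sh
# Stand-in directories: orig_dir plays the original data, rest_dir the
# restore target (in a real run, duplicity restore would populate it).
orig_dir=$(mktemp -d)
rest_dir=$(mktemp -d)
echo "important data" > "$orig_dir/report.txt"
cp "$orig_dir/report.txt" "$rest_dir/report.txt"

# Byte-for-byte comparison of the original and restored copies.
if cmp -s "$orig_dir/report.txt" "$rest_dir/report.txt"; then
  result="restore verified"
else
  result="mismatch - investigate before trusting the restore"
fi
echo "$result"
rm -rf "$orig_dir" "$rest_dir"
```

For larger trees, the same idea extends to recursive checks (for instance, comparing checksum lists of both directories) before the restored data is put back into service.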
Conclusion:
Utilizing duplicity for backup operations showcases its robust capability in managing and securing data across multiple storage solutions and configurations. From encrypting and securing sensitive data to efficient data management and restoration, duplicity provides comprehensive coverage for various backup requirements. By understanding and applying each use case effectively, users can enhance their data protection strategies significantly.