Full backup – definition of a full backup

A full backup is at the core of any backup method. There are many other types of backups, but all of them require the initial complete backup to work as intended. Creating a full backup is straightforward, though inefficient in the long run. Read on to find out more about full backups as part of our series of articles on backup strategies.

Note

We have answered the question ‘what is a backup?’ in detail here.

What is a full backup and how is it created?

A full backup, also known as a complete backup, creates an exact copy of an entire dataset. Since a full data backup refers to a defined dataset, it is a relative term. The dataset from which a backup is created can include:

  • all data on a laptop
  • all payroll records for one year
  • the entire root directory of a website.

So what is the difference between a full backup and a normal copy? First, a backup is usually created on a different physical medium than the one that contains the original data set. For example, one copies the data from the internal laptop SSD to an external hard drive. Furthermore, a backup is an exact copy, which is why the process of creating a backup is also called ‘cloning’ or ‘mirroring’.

It should be possible to faithfully reconstruct the original state from a full backup. The purpose of a full backup is to create redundancy and to store the resulting copies in a distributed manner in order to minimise data loss.

To create a copy in digital systems, data must be transferred. The data is read from the source and written to the target. Depending on the size of the data set and the speed of the connection, a complete backup can take a very long time.
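
For example, transferring 500 GB over a 100 Mbit/s connection takes at least 500 × 8,000 / 100 = 40,000 seconds, or roughly 11 hours, before any protocol overhead is taken into account.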

To create a full backup, data is copied to physical storage such as an external hard drive or USB stick, or uploaded to cloud storage. Depending on the operating system and requirements, specialised software is used for this purpose. However, often existing ‘on-board’ tools in an operating system are sufficient. Below are a few examples of commands that can be used to create a full data backup using the command line.

Tip

Use HiDrive, the high-performance cloud storage from IONOS for your business to create professional backups.

Create a copy of a directory (Linux)

To back up a directory on the command line, we use the Linux copy command cp. We copy the source directory to a destination path. The destination can be on any volume mounted on the system:

cp -a <source-dir> <target-path>

With the -a option (‘archive’), the copy command creates an exact copy: timestamps, access rights and ownership of the copied files are set to the same values as the corresponding sources. This is critical, among other things, for systems that include code, such as WordPress. If the copy is created without the archive option, there is a risk of security vulnerabilities or loss of functionality.
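
For instance, to back up a website’s root directory to an external drive mounted on the system, the command could look like this (the paths are purely illustrative):

cp -a /var/www/html /mnt/backup-drive/html-backup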

Create tarball archive

A widely used method for creating a complete backup is to create a tarball archive. To do this, we use the Linux tar command (the name stands for ‘tape archive’) to bundle multiple files or directories into a single file. The target file is specified first, directly after the options, followed by the sources:

tar -czf <target-file>.tar.gz <source-1> <source-2> <source-3>

The -z option instructs the tar command to compress the archive with Gzip. Depending on the type of data, the resulting .tar.gz file is usually a factor of 2 to 10 smaller than the total size of the data included in the backup.
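
Restoring the full backup simply means unpacking the archive again. With the same placeholders, extracting it into a directory of your choice looks like this:

tar -xzf <target-file>.tar.gz -C <restore-path>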

Mirror directory with Rsync

Rsync is a sophisticated tool for copying data sets. The source and destination can be located on the same physical system or on remote systems connected via the network. In the latter case, Rsync typically uses the proven SSH protocol, among other transport options.

This is also how server backups with Rsync are created and restored. For large data sets, it is a further advantage that Rsync can resume aborted transfers. Let’s look at the simplest example for creating an exact directory copy:

rsync -a <source-dir>/ <target-path>

As with the cp command, the -a (‘archive’) option is used to create an exact copy. Our example assumes that the directory to be copied does not yet exist at the destination path. If it already exists at the destination, Rsync intelligently transfers only the changes since the last copy operation. Strictly speaking, the result is then no longer a full backup, but a differential backup.
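
The same command also works with a remote destination reached over SSH. A minimal sketch, with a purely illustrative user and host name:

rsync -a <source-dir>/ backup-user@backup-host:<target-path>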

Mirror directory under Windows with Robocopy

The cp and Rsync commands are Linux tools. They are also available under Windows via WSL2 (‘Windows Subsystem for Linux’). With Robocopy, a Windows-specific alternative is available. This is particularly useful because the NTFS file system used under Windows has some special features.

Like its Linux counterparts, Robocopy is used from the command line. Besides an obvious connection to RoboCop, the name of the tool stands for ‘Robust File Copy’. Let’s look at an example Robocopy command that mirrors a source directory to a destination path:

robocopy <source-dir> <target-path> /MIR

The /MIR option stands for ‘mirror’: Robocopy creates a complete copy of the source directory and also removes files from the destination that no longer exist in the source. Source and destination can be located on the local system or on a Windows share connected via the network. Unlike Rsync, Robocopy does not support copying over an SSH connection.
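
As a concrete sketch with purely illustrative paths, the following command mirrors a user’s documents folder to an external drive and writes a log file of the copy process:

robocopy C:\Users\Alice\Documents E:\Backup\Documents /MIR /LOG:E:\Backup\robocopy.log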

Create a full backup of a Mac or Windows computer

To create a backup on a Mac or a Windows 10 computer, you can use the system-integrated tools Time Machine or Windows Backup. With both tools, a complete backup is created the first time they run. In subsequent backups, only the changes since the last backup are transferred. The system can be completely restored from the full backup.
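
Such backups can also be triggered from the command line. A minimal sketch, assuming Time Machine (macOS) or the wbadmin backup tool (Windows) has already been configured with a backup destination; the drive letters are purely illustrative:

tmutil startbackup --block
wbadmin start backup -backupTarget:E: -include:C: -quiet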

Secure data in the cloud

The tools presented so far write the backup data to a local or remote file system. But what happens when you want to store your backups in the cloud? You could use a service like Cloud Backup from IONOS to this end. This protects your business data against data loss and other threats such as ransomware encryption.

What are the pros and cons of a full backup?

Depending on usage, it can make more or less sense to create a complete backup of a data set. Other specialised backup methods exist. Let’s look at the advantages and disadvantages of a complete backup.

Advantages of a full data backup

There are three main advantages to creating a full backup:

  1. Easy to set up: onboard tools are enough
  2. Most reliable backup method: low risk of data loss
  3. Easy to restore: reversing the copy process is usually enough

One advantage of a complete backup is that it is relatively easy to create. As a rule, you do not need any specialised software, but can use existing on-board tools. Because a full backup includes the entire data set, creating one requires no special preparation. Only two requirements must be met:

  1. There is enough storage space available on the target system.
  2. There is sufficient bandwidth to complete the copying process in an acceptable amount of time.

Once you have ensured that both requirements are met, you can begin the backup process. Then you have to wait: a full backup can take a while to complete. Once copying has finished, you should check that the backup is complete and free of errors.
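
On Linux, both checks can be made before starting, for example by comparing the size of the source with the free space on the target (using the placeholders from the earlier examples):

du -sh <source-dir>
df -h <target-path>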

Another big advantage of a full backup is reliability. Since a complete backup includes the entire data set to be backed up, it is impossible to forget or overlook data. However, for this to hold, it is important to stick to the original intention: a full backup, by definition, means that the entire data set is backed up.

Especially with large amounts of data, it may be tempting to find arguments for excluding certain data from the backup: ‘We don’t need that anyway’, ‘it was all backed up last time’, etc. Unfortunately, this can lead to data loss. If no current backup of the excluded data exists, the damage is done. Therefore, it is necessary to back up the entire data set, even if it takes a long time.

Creating a backup is only half the battle. A backup is only valuable as long as it can be restored true to the original. Here, too, the full backup is the simplest: to restore data, only the backup itself is needed; it is usually sufficient to swap the source and destination and run the copy process again.
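
With the Rsync example from above, a restore would simply swap the two paths; a sketch:

rsync -a <target-path>/ <source-dir>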

However, with full backups, as with all backup methods, you should not assume that the original data can be restored from the backup without testing it first. Test restores are part of any solid backup strategy. You do not want to discover only after data loss has occurred that your carefully created backups are worthless.
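
A simple way to check that a freshly created copy matches the original is a recursive comparison, here again using the placeholders from the Rsync example; the command lists any files that differ or are missing:

diff -r <source-dir> <target-path>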

Disadvantages of a full data backup

The main disadvantage of a full backup is inefficiency. Creating a full data backup takes a long time and tends to take up a lot of space on the target medium. This is because a full backup, by definition, backs up the entire data set.

Mac users will be familiar with this issue from using the native backup software Time Machine. The first backup can take hours, or even an entire night depending on the amount of data to be backed up. This is because a complete backup of all data on the Mac is created during the first run.

Tip

Protect yourself from data loss and try automated online backups from MyDefender!

When is a full backup used?

First of all, creating a full backup is the basis for subsequent differential or incremental backups. Without a full backup, there is no backup strategy to speak of. In addition, there are a few circumstances that require or favour the creation of a full backup. Let’s take a look at a few examples.

If it is unknown which data should be backed up

Sometimes it may be unclear which components of a data set are important. In this case, it is advantageous to back up fully first. With this safety net in place, the data can be viewed and sorted into categories such as ‘bin it’ and ‘keep it’. If any errors occur, the previously created full backup can be accessed.

As a safety measure before making changes

A similar situation emerges when changes are made to a system. Web administrators will be familiar with the following: a new customer has an existing system that was previously maintained by another admin. Now, a new admin is supposed to make changes. But what happens if something goes wrong? The new admin does not know the system, and the old admin is no longer available. Here, it is advantageous to back up completely first. If the changes fail and damage the system, the admin can fall back on the full backup. Again, it is important to test the restore before making the changes.

In preparation for a system migration

If you want to migrate a system, e.g. move a WordPress site from one server to another, you need a complete copy of the website data. In the case of WordPress, this includes at least the WordPress root directory and the WordPress database. To prepare for the migration, one creates a complete backup of the data and transfers it to the new server. There, the site is reconstructed from the backup. Once you have ensured that the site is running smoothly, it can go live. In most cases, the old system is left in place for a while so that it can be used in an emergency. It acts as a full backup, so to speak.
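
A minimal sketch of such a backup on the old server, assuming a hypothetical database name, database user and the default WordPress root directory:

mysqldump -u wp_user -p wordpress_db > wordpress_db.sql
tar -czf wordpress-full-backup.tar.gz /var/www/html wordpress_db.sql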

To prepare for rebuilding a system

Sometimes it is necessary to set a system up again from scratch. Think, for example, of a laptop with hard disk errors, or a website whose server needs to be reinstalled. In these cases, the procedure is similar to that used for system migration: create a complete backup, set up the underlying hardware or software again, and then restore the backup.

However, there is a crucial difference to system migration: when rebuilding, the original system is lost. It is best to follow the 3-2-1 backup rule and create at least two complete backups of the system in separate locations. You should also test the restore process before wiping the original system.

To archive project data

When a project is completed, all important data must be archived for the future. In research and development projects, change-proof archiving is a basic requirement for the availability or reproducibility of the results. In photo or film production, large quantities of unique raw data are generated. These also need to be archived. The simplest measure is to create a full backup as a tarball or ZIP archive and store it on redundant data storage.
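
A minimal sketch of such an archive, with a checksum file that makes later integrity checks possible (file names are illustrative):

tar -czf project-archive.tar.gz <project-dir>
sha256sum project-archive.tar.gz > project-archive.tar.gz.sha256

The archive can later be verified against the checksum with sha256sum -c.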

Even web projects reach the end of their life at some point. Before taking a website offline, it is standard practice to create a final, complete backup. This allows the site to be reconstructed in the future if necessary. Since a website includes code in addition to static data, you must include the execution environment in the backup. For example, one creates an image based on a virtual machine or container. This ensures that the system can be started in the future with all its dependencies.
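
If the site runs in a Docker container, for example, the image can be exported alongside the data backup; a sketch with purely illustrative names:

docker save -o legacy-site-image.tar legacy-site:final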

For data recovery from an unstable system

You might be familiar with the following: an older computer shows increasing signs of failure. At first, the computer crashes occasionally, then ever more frequently, until it rarely starts up. Here you need to act fast to salvage what can be saved. If you can get the system to run stably again, you should create a complete backup as soon as possible. This might be the last chance to do so. Once the backup has been created, you can sort through the data.

When a small, rapidly changing data set is highly important

A common scenario that prompts users to intuitively create full backups is when they are dealing with an important data set that is regularly updated. Consider a folder containing a doctoral dissertation in the editorial process. The folder may contain only a handful of documents that collectively comprise a few megabytes. The doctoral student works on the documents every day and backs up any changes that occur by copying the entire folder to a USB flash drive in the evening. Although relatively inefficient, this approach is widely used among less technically savvy users because of its simplicity.
