Backing up your Route 53 DNS records

Using a command line tool to back up your Route 53 DNS records in time-stamped folders.

In both my personal and professional life I use Amazon Route 53 to set up DNS records. Whilst I’m not here to sell the product, I strongly recommend looking into their services as a fantastic resource for hassle-free (and super cheap) DNS.

If you are already using Route 53 and need a backup of your DNS records, you can use the following tutorial to download your data and store it in a secure location for future reference. All you need is a little script and a couple of command line tools.


The following was performed on a MacBook Pro running macOS Sierra. You also need Homebrew installed. See my “Useful homebrew commands” if you’re not sure how to get this set up.

1. Set up cli53

We’re going to use a command line tool called cli53 to interact with Amazon Route 53. To install this, go to your command line and type:

$ brew install cli53

Once installed, you can check cli53 is working by typing

$ cli53

and you should see

  cli53 - manage route53 DNS

  cli53 [global options] command [command options] [arguments...]

First we need to set up our credentials file (I’m using VS Code to edit this file):

$ code ~/.aws/credentials

2. Setting up an IAM user

Once in the text editor we need to add our AWS credentials. You can create these in the IAM Management Console and store them securely for future reference.

We want to add a user with a unique name, give them Programmatic access and assign the AmazonRoute53ReadOnlyAccess permissions.
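If you already have the AWS CLI installed, you can create this user from the command line instead of the console. This is just a sketch, and the user name route53-backup is a placeholder; note that create-access-key prints the secret key once only, so store it straight away:

```shell
# Create a dedicated backup user (the name is a placeholder)
aws iam create-user --user-name route53-backup

# Attach the Route 53 read-only managed policy
aws iam attach-user-policy \
  --user-name route53-backup \
  --policy-arn arn:aws:iam::aws:policy/AmazonRoute53ReadOnlyAccess

# Generate an access key pair (the secret is shown only once)
aws iam create-access-key --user-name route53-backup
```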

You will then need to paste your credentials into the file, following this template (note the [default] profile header and the = signs, as the AWS credentials file uses INI syntax):

[default]
aws_access_key_id = AKID1234567890
aws_secret_access_key = MY-SECRET-KEY

You should now be able to get a response from Amazon:

$ cli53 list
ID             Name           Record count  Comment
ABC123   2
ABC234   2

3. Backup Script

You can back up individual zones by following the tutorials in the cli53 wiki on GitHub, but I prefer to back up all my zones in one go. I have modified a script by Tomas Nevar @ Lisenet to enable me to run daily backups in dated folders. You could run this with a cron job if you prefer, or even zip the files when you are done, but I’m keeping it simple for this tutorial.
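For reference, a single zone can be exported on its own, like so (example.com here is a placeholder for one of your own domains; the output is a standard BIND-format zone file):

```shell
# Export one zone's records to a BIND-format zone file
# (example.com is a placeholder for one of your own zones)
cli53 export --full example.com > example.com.txt
```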

First I created an AWS/Route53 folder inside my Sites directory and moved into it:

$ cd ~/Sites && mkdir -p AWS/Route53 && cd AWS/Route53

I then created the script, saving it in this folder as backup.sh (the filename is up to you):

$ code backup.sh

#!/bin/sh

# Declare backup path & master zone files
# (the two filenames are placeholders, call them whatever you like)
BACKUP_PATH="$(date +%F)"
ZONES_FILE="zones.txt"
DNS_FILE="domains.txt"

# Create date-stamped backup directory and enter it
mkdir -p "$BACKUP_PATH" && cd "$BACKUP_PATH" || exit 1

# Create a list of all hosted zones
cli53 list --format text > "$ZONES_FILE" 2>&1

# Create a list of domain names only
sed '/Name:/!d' "$ZONES_FILE" | cut -d: -f2 | sed 's/^..//' | sed 's/.\{3\}$//' > "$DNS_FILE"

# Create backup files for each domain
while read -r line; do
  cli53 export --full "$line" > "$line.txt"
done < "$DNS_FILE"

exit 0
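If you want to see what the domain-extraction pipeline is doing, you can feed it a fake line by hand. This assumes the text output from cli53 list contains lines shaped like Name: "example.com.", with the name quoted, a trailing dot, and a trailing comma, which is exactly what the two trimming sed commands strip off:

```shell
# Simulate one line of `cli53 list --format text` output and
# run it through the same filter chain as the backup script
printf 'Name: "example.com.",\n' \
  | sed '/Name:/!d' \
  | cut -d: -f2 \
  | sed 's/^..//' \
  | sed 's/.\{3\}$//'
# → example.com
```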

Finally you can run the script by entering (using whichever filename you saved it under):

$ sh backup.sh

4. All done!

Your backup should now begin. You should end up with two master files: one holding the zone information AWS sends back, and one listing just the domain names. Each zone should then have its own text file containing a full list of its DNS records.

Whilst this may not be the fastest backup, you can leave it running whilst you get on with some proper work. Or set up a cron job to automate this for you on a daily / weekly / monthly basis.
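As a sketch of the cron option: the line below (added via crontab -e) would run the backup every morning at 2am, assuming you saved the script as backup.sh in the folder we created earlier:

```shell
# m h dom mon dow  command
0 2 * * * cd ~/Sites/AWS/Route53 && sh backup.sh
```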