
Ansible roles for backup

This introduces two new roles for managing the backup-server and hosts
that we wish to back up.

Firstly, the "backup" role runs on hosts we wish to back up.  This
generates and configures a separate ssh key for running bup and
installs the appropriate cron job to run the backup daily.

The "backup-server" role runs on the backup server (or, indeed,
servers).  It creates users for each backup host, accepts the remote
keys mentioned above and initialises bup.  It is then ready to receive
backups from the remote hosts.

This eliminates a fairly long-standing requirement for manual setup of
the backup server users and keys; this section is removed from the
documentation.

Testinfra coverage is added.

Change-Id: I9bf74df351e056791ed817180436617048224d2c
Ian Wienand 8 months ago
15 changed files with 301 additions and 45 deletions

+ 25
- 0
.zuul.yaml

@@ -687,6 +687,30 @@
- testinfra/
- testinfra/

- job:
    name: system-config-run-backup
    parent: system-config-run
    description: |
      Run the playbook for backup configuration
    nodeset:
      nodes:
        - name:
          label: ubuntu-bionic
        - name:
          label: ubuntu-bionic
        - name:
          label: ubuntu-bionic
        - name:
          label: ubuntu-xenial
    files:
      - playbooks/service-backup.yaml
      - .zuul.yaml
      - playbooks/roles/backup.*
      - playbooks/zuul/templates/host_vars/backup.*
      - testinfra/

- job:
    name: system-config-run-mirror
    parent: system-config-run
@@ -870,6 +894,7 @@
- system-config-run-base
- system-config-run-base-ansible-devel:
voting: false
- system-config-run-backup
- system-config-run-dns
- system-config-run-eavesdrop
- system-config-run-lists

+ 18
- 45
doc/source/sysadmin.rst

@@ -215,53 +215,21 @@ OpenStack CI infrastructure for another project.

Off-site backups are made to two servers:
Infra uses the `bup <>`__ tool for backups.

Hosts in the ``backup`` Ansible inventory group will be backed up to
servers in the ``backup-server`` group with ``bup``. The
``playbooks/roles/backup`` and ``playbooks/roles/backup-server`` roles
implement the required setup.

Puppet is used to perform the initial configuration of those machines,
but to protect them from unauthorized access in case access to the
puppet git repo is compromised, it is not run in agent or in cron mode
on them. Instead, it should be manually run when changes are made
that should be applied to the backup servers.
The backup server has a unique Unix user for each host to be backed
up. The roles will set up the required users, their home directories
in the backup volume and the relevant ``authorized_keys``.

To start backing up a server, some commands need to be run manually on
both the backup server, and the server to be backed up. On the server
to be backed up::

sudo su -
ssh-keygen -t rsa -f /root/.ssh/id_rsa -N ""
bup init

And then ``cat /root/.ssh/`` for use later.

On the backup servers::

# add bup user
BUPUSER=bup-<short-servername> # eg, bup-jenkins-dev
sudo useradd -r $BUPUSER -s /bin/bash -d /opt/backups/$BUPUSER -m
sudo su - $BUPUSER

# initialise bup
bup init

# should be in home directory /opt/backups/$BUPUSER
mkdir .ssh
cat >.ssh/authorized_keys

write this into the authorized_keys file and end with ^D on a blank line::

command="BUP_DEBUG=0 BUP_FORCE_TTY=3 bup server",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty <ssh key from earlier>

Switching back to the server to be backed up, run::

ssh $

And verify the host key. Note this will start the bup server on the
remote end; you will not be given a pty. Use ^D to close the
connection cleanly. Add the "backup" class in puppet to the server
to be backed up.
Host backup happens via a daily cron job (managed by Ansible) on each
individual host to be backed up. The host to be backed up initiates
the backup process to the remote backup server(s) using a separate ssh
key set up just for backup communication (see ``/root/.ssh/config``).
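
For example, the stanza the role writes for each backup server looks
like the following (the server name and username here are hypothetical)::

    Host backup01.example.opendev.org
      HostName backup01.example.opendev.org
      IdentityFile /root/.ssh/id_backup_ed25519
      User bup-myhost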

Restore from Backup
@@ -276,9 +244,14 @@ how we restore content from backups::
mkdir /root/backup-restore-$DATE
cd /root/backup-restore-$DATE

Root uses a separate ssh key and remote user to communicate with the
backup server(s); the username and key to use for backup should be
automatically configured in ``/root/.ssh/config``. The backup server
hostname can be taken from there.

At this point we can join the tar that was split by the backup cron::

bup join -r bup-<short-servername> root > backup.tar
bup join -r root > backup.tar

At this point you may need to wait a while. These backups are stored on
servers geographically distant from our normal servers resulting in less

+ 15
- 0
playbooks/roles/backup-server/README.rst

@@ -0,0 +1,15 @@
Setup backup server

This role configures backup server(s) in the ``backup-server`` group
to accept backups from remote hosts.

Note that the ``backup`` role must have run on each host in the
``backup`` group before this role. That role will create a
``bup_user`` tuple in the hostvars for each host consisting of the
required username and public key.

Each required user gets a separate home directory in ``/opt/backups``.
Their ``authorized_keys`` file is configured with the public key to
allow the remote host to log in and only run ``bup``.
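
For example, a host in the ``backup`` group might export a tuple like
the following (username and key here are hypothetical)::

    bup_user:
      - bup-backup-test02
      - ssh-ed25519 AAAAC3... root@backup-test02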

**Role Variables**

+ 1
- 0
playbooks/roles/backup-server/defaults/main.yaml

@@ -0,0 +1 @@
bup_users: []

+ 21
- 0
playbooks/roles/backup-server/tasks/main.yaml

@@ -0,0 +1,21 @@
- name: Create backup directory
  file:
    path: /opt/backups
    state: directory

- name: Install bup
  package:
    name:
      - bup
    state: present

- name: Build all bup users from backup hosts
  set_fact:
    bup_users: '{{ bup_users }} + [ {{ hostvars[item]["bup_user"] }} ]'
  with_inventory_hostnames: backup

- name: Create bup users
  include_tasks: user.yaml
  loop: '{{ bup_users }}'
  loop_control:
    loop_var: bup_user
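
The ``Build all bup users`` task accumulates one ``bup_user`` tuple per
backup host into the ``bup_users`` list (default ``[]``). In plain
Python the loop amounts to this sketch (the inventory data shown is
hypothetical):

```python
# Hypothetical hostvars, standing in for what Ansible gathers: each
# host's "backup" role exported a [username, public_key] tuple.
hostvars = {
    'backup-test01.opendev.org':
        {'bup_user': ['bup-backup-test01', 'ssh-ed25519 AAAA...']},
    'backup-test02.opendev.org':
        {'bup_user': ['bup-backup-test02', 'ssh-ed25519 BBBB...']},
}

bup_users = []  # role default from defaults/main.yaml
for item in hostvars:  # with_inventory_hostnames: backup
    # Each iteration appends that host's exported tuple.
    bup_users = bup_users + [hostvars[item]['bup_user']]

print(len(bup_users))  # → 2
```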

+ 32
- 0
playbooks/roles/backup-server/tasks/user.yaml

@@ -0,0 +1,32 @@
# note bup_user is the parent loop variable name; this works on each
# element from the bup_users global.
- name: Set variables
  set_fact:
    user_name: '{{ bup_user[0] }}'
    user_key: '{{ bup_user[1] }}'

- name: Create bup user
  user:
    name: '{{ user_name }}'
    comment: 'Backup user'
    shell: /bin/bash
    home: '/opt/backups/{{ user_name }}'
    create_home: yes
  register: homedir

- name: Create bup user authorized key
  authorized_key:
    user: '{{ user_name }}'
    state: present
    key: '{{ user_key }}'
    key_options: 'command="BUP_DEBUG=0 BUP_FORCE_TTY=3 bup server",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty'

# ansible-lint wants this in a handler, but it should be done here and
# now; this isn't like a service restart where multiple things might
# call it.
- name: Initialise bup  # noqa 503
  shell: |
    BUP_DIR=/opt/backups/{{ user_name }}/.bup bup init
  become: yes
  become_user: '{{ user_name }}'
  when: homedir.changed

+ 23
- 0
playbooks/roles/backup/README.rst

@@ -0,0 +1,23 @@
Configure a host to be backed up

This role sets up a host to use ``bup`` for backup to any hosts in the
``backup-server`` group.

A separate ssh key will be generated for root to connect to the backup
server(s), and the backup servers' host keys will be accepted on the
host.

The ``bup`` tool is installed and a cron job is set up to run the
backup periodically.

Note the ``backup-server`` role must run after this one to create the
user correctly on the backup server. This role sets a tuple
``bup_user`` with the username and public key; the ``backup-server``
role uses this variable for each host in the ``backup`` group to
initialise users.

**Role Variables**

.. zuul:rolevar:: bup_username

The username to connect to the backup server. If this is left
undefined, it will be automatically set to ``bup-$(hostname)``.
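
The automatic derivation takes the short hostname (everything before
the first dot) and prefixes ``bup-``. A Python sketch of the
equivalent logic (the function name is hypothetical):

```python
def bup_username(inventory_hostname: str) -> str:
    # Jinja equivalent used by the role:
    #   'bup-{{ inventory_hostname.split(".", 1)[0] }}'
    # i.e. "bup-" plus the short hostname, before the first dot.
    return 'bup-' + inventory_hostname.split('.', 1)[0]

print(bup_username('backup-test02.opendev.org'))  # → bup-backup-test02
```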

+ 22
- 0
playbooks/roles/backup/files/bup-excludes

@@ -0,0 +1,22 @@

+ 61
- 0
playbooks/roles/backup/tasks/main.yaml

@@ -0,0 +1,61 @@
- name: Generate bup username for this host
  set_fact:
    bup_username: 'bup-{{ inventory_hostname.split(".", 1)[0] }}'
  when: bup_username is not defined

- debug:
    var: bup_username

- name: Install bup
  package:
    name:
      - bup
    state: present

- name: Generate keypair for backups
  openssh_keypair:
    path: /root/.ssh/id_backup_ed25519
    type: ed25519
  register: bup_keypair

- name: Initialise bup  # noqa 503
  command: bup init
  when: bup_keypair.changed

- name: Configure ssh for backup server
  blockinfile:
    path: /root/.ssh/ssh_config
    create: true
    mode: 0600
    block: |
      Host {{ item }}
        HostName {{ item }}
        IdentityFile /root/.ssh/id_backup_ed25519
        User {{ bup_username }}
  with_inventory_hostnames: backup-server

- name: Generate bup_user info tuple
  set_fact:
    bup_user: '{{ [ bup_username, bup_keypair["public_key"] ] }}'

- name: Accept hostkey of backup server
  known_hosts:
    state: present
    name: '{{ item }}'
    key: '{{ item }} ssh-ed25519 {{ hostvars[item]["ansible_ssh_host_key_ed25519_public"] }}'
  with_inventory_hostnames: backup-server

- name: Write /etc/bup-excludes
  copy:
    src: bup-excludes
    dest: /etc/bup-excludes
    mode: 0444

- name: Install backup cron job
  cron:
    name: "Run bup backup"
    job: "tar -X /etc/bup-excludes -cPF - / | bup split -r {{ bup_username }}@{{ item }}: -n root -q"
    user: root
    hour: '5'
    minute: '{{ 59|random(seed=item) }}'
  with_inventory_hostnames: backup-server
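
The cron minute uses Jinja's ``random`` filter seeded with the backup
server name, so each host gets a stable but spread-out start time and
repeated Ansible runs do not rewrite the crontab. A Python sketch of
the idea (not necessarily Ansible's exact filter internals):

```python
import random

def backup_minute(seed: str, maximum: int = 59) -> int:
    # Deterministic pick in [0, maximum]: the same seed always yields
    # the same minute, so the cron entry stays stable across runs.
    return random.Random(seed).randint(0, maximum)
```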

+ 10
- 0
playbooks/service-backup.yaml

@@ -0,0 +1,10 @@
# This needs to happen in order. Backup hosts export their username/key
# combos, which are installed onto the backup server.
- hosts: "backup:!disabled"
  name: "Base: Generate backup users and keys"
  roles:
    - backup

- hosts: "backup-server:!disabled"
  name: "Generate bup configuration"
  roles:
    - backup-server

+ 2
- 0
playbooks/zuul/run-base.yaml

@@ -87,6 +87,8 @@
- host_vars/
- host_vars/
- host_vars/
- host_vars/
- host_vars/
- name: Display group membership
command: ansible localhost -m debug -a 'var=groups'
- name: Run base.yaml

+ 7
- 0
playbooks/zuul/templates/gate-groups.yaml.j2

@@ -9,3 +9,10 @@ groups:



+ 1
- 0
playbooks/zuul/templates/host_vars/

@@ -0,0 +1 @@
bup_username: bup-backup01

+ 2
- 0
playbooks/zuul/templates/host_vars/

@@ -0,0 +1,2 @@
# Intentionally left blank to test autogeneration of name
#bup_username: bup-backup-test02

+ 61
- 0
testinfra/

@@ -0,0 +1,61 @@
# Copyright 2019 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os.path
import pytest

testinfra_hosts = ['',

def test_bup_installed(host):
    package = host.package("bup")
    assert package.is_installed


def test_server_users(host):
    hostname = host.backend.get_hostname()
    if hostname.startswith('backup-test'):
        for username in 'bup-backup01', 'bup-backup-test02':
            homedir = os.path.join('/opt/backups/', username)
            bup_config = os.path.join(homedir, '.bup', 'config')
            authorized_keys = os.path.join(homedir, '.ssh', 'authorized_keys')

            user = host.user(username)
            assert user.exists
            assert user.home == homedir

            f = host.file(authorized_keys)
            assert f.exists
            assert f.contains("ssh-ed25519")

            f = host.file(bup_config)
            assert f.exists


def test_backup_host_config(host):
    hostname = host.backend.get_hostname()
    if hostname == '':
        f = host.file('/root/.ssh/id_backup_ed25519')
        assert f.exists

        f = host.file('/root/.ssh/ssh_config')
        assert f.exists
        assert f.contains('Host')

        f = host.file('/root/.bup/config')
        assert f.exists