Sanitized mirror from private repository - 2026-04-18 11:19:59 UTC
# 💾 Backup Strategies Guide
## Overview
This guide covers comprehensive backup strategies for the homelab, implementing the 3-2-1 backup rule and ensuring data safety across all systems.
---
## 🎯 The 3-2-1 Backup Rule
```
┌──────────────────────────────────────────────────────────────┐
│                    3-2-1 BACKUP STRATEGY                     │
├──────────────────────────────────────────────────────────────┤
│                                                              │
│    3 COPIES          2 DIFFERENT MEDIA        1 OFF-SITE     │
│    ────────          ─────────────────        ──────────     │
│                                                              │
│   ┌─────────┐       ┌─────────┐              ┌─────────┐     │
│   │ Primary │       │   NAS   │              │ Tucson  │     │
│   │  Data   │       │  (HDD)  │              │ (Remote)│     │
│   └─────────┘       └─────────┘              └─────────┘     │
│        +                 +                                   │
│   ┌─────────┐       ┌─────────┐                              │
│   │  Local  │       │  Cloud  │                              │
│   │ Backup  │       │ (B2/S3) │                              │
│   └─────────┘       └─────────┘                              │
│        +                                                     │
│   ┌─────────┐                                                │
│   │ Remote  │                                                │
│   │ Backup  │                                                │
│   └─────────┘                                                │
│                                                              │
└──────────────────────────────────────────────────────────────┘
```
---
## 📊 Backup Architecture
### Current Implementation
| Data Type | Primary | Local Backup | Remote Backup | Cloud |
|-----------|---------|--------------|---------------|-------|
| Media (Movies/TV) | Atlantis | - | Setillo (partial) | - |
| Photos (Immich) | Atlantis | Calypso | Setillo | B2 (future) |
| Documents (Paperless) | Atlantis | Calypso | Setillo | B2 (future) |
| Docker Configs | Atlantis/Calypso | Syncthing | Setillo | Git |
| Databases | Various hosts | Daily dumps | Setillo | - |
| Passwords (Vaultwarden) | Atlantis | Calypso | Setillo | Export file |
---
## 🗄️ Synology Hyper Backup
### Setup Local Backup (Atlantis → Calypso)
```bash
# On Atlantis DSM:
# 1. Open Hyper Backup
# 2. Create new backup task
# 3. Select "Remote NAS device" as destination
# 4. Configure:
# - Destination: Calypso
# - Shared Folder: /backups/atlantis
# - Encryption: Enabled (AES-256)
```
### Hyper Backup Configuration
```yaml
# Recommended settings for homelab backup
backup_task:
  name: "Atlantis-to-Calypso"
  source_folders:
    - /docker      # All container data
    - /photos      # Immich photos
    - /documents   # Paperless documents
  exclude_patterns:
    - "*.tmp"
    - "*.log"
    - "**/cache/**"
    - "**/transcode/**"  # Plex transcode files
    - "**/thumbs/**"     # Regeneratable thumbnails
  schedule:
    type: daily
    time: "03:00"
  retention:
    daily: 7
    weekly: 4
    monthly: 6
  options:
    compression: true
    encryption: true
    client_side_encryption: true
    integrity_check: weekly
```
### Remote Backup (Atlantis → Setillo)
```yaml
# For off-site backup to Tucson
backup_task:
  name: "Atlantis-to-Setillo"
  destination:
    type: rsync
    host: setillo.tailnet
    path: /volume1/backups/atlantis
  source_folders:
    - /docker
    - /photos
    - /documents
  schedule:
    type: weekly
    day: sunday
    time: "02:00"
  bandwidth_limit: "50 Mbps"  # Don't saturate WAN
```
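If the remote end cannot run Hyper Backup, the same weekly job can be approximated with a plain rsync push over the tailnet. This is only a sketch: the SSH user and destination path are assumptions, and rsync's `--bwlimit` is expressed in KiB/s, so 50 Mbps works out to roughly 6250.

```shell
# Weekly off-site push via rsync over SSH (sketch; user and path are assumptions)
# --bwlimit is in KiB/s: 50 Mbps ≈ 6250 KiB/s
rsync -az --delete --bwlimit=6250 \
    /volume1/docker /volume1/photos /volume1/documents \
    backup@setillo.tailnet:/volume1/backups/atlantis/
```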
---
## 🔄 Syncthing Real-Time Sync
### Configuration for Critical Data
```xml
<!-- syncthing/config.xml -->
<!-- Note: in current Syncthing releases, ignore patterns normally live in the
     folder's .stignore file rather than config.xml -->
<folder id="docker-configs" label="Docker Configs" path="/volume1/docker">
    <device id="ATLANTIS-ID"/>
    <device id="CALYPSO-ID"/>
    <device id="SETILLO-ID"/>
    <minDiskFree unit="%">5</minDiskFree>
    <versioning type="staggered">
        <param key="maxAge" val="2592000"/>  <!-- 30 days -->
        <param key="cleanInterval" val="3600"/>
    </versioning>
    <ignorePattern>*.tmp</ignorePattern>
    <ignorePattern>*.log</ignorePattern>
    <ignorePattern>**/cache/**</ignorePattern>
</folder>
```
### Deploy Syncthing
```yaml
# syncthing.yaml
version: "3.8"
services:
  syncthing:
    image: syncthing/syncthing:latest
    container_name: syncthing
    hostname: atlantis-sync
    environment:
      - PUID=1000
      - PGID=1000
    volumes:
      - ./syncthing/config:/var/syncthing/config
      - /volume1/docker:/data/docker
      - /volume1/documents:/data/documents
    ports:
      - "8384:8384"        # Web UI
      - "22000:22000"      # TCP sync
      - "21027:21027/udp"  # Discovery
    restart: unless-stopped
```
---
## 🗃️ Database Backups
### PostgreSQL Automated Backup
```bash
#!/bin/bash
# backup-postgres.sh
set -o pipefail  # so a pg_dump failure is not masked by gzip succeeding

BACKUP_DIR="/volume1/backups/databases"
DATE=$(date +%Y%m%d_%H%M%S)
RETENTION_DAYS=14

# List of database containers to back up (container:database)
DATABASES=(
    "immich-db:immich"
    "paperless-db:paperless"
    "vaultwarden-db:vaultwarden"
    "mastodon-db:mastodon_production"
)

for db_info in "${DATABASES[@]}"; do
    CONTAINER="${db_info%%:*}"
    DATABASE="${db_info##*:}"
    echo "Backing up $DATABASE from $CONTAINER..."
    if docker exec "$CONTAINER" pg_dump -U postgres "$DATABASE" | \
        gzip > "$BACKUP_DIR/${DATABASE}_${DATE}.sql.gz"; then
        echo "$DATABASE backup successful"
    else
        echo "$DATABASE backup FAILED"
        # Send alert
        curl -d "Database backup failed: $DATABASE" ntfy.sh/homelab-alerts
    fi
done

# Clean old backups
find "$BACKUP_DIR" -name "*.sql.gz" -mtime +"$RETENTION_DAYS" -delete
echo "Database backup complete"
```
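The loop above splits each `container:database` pair with bash parameter expansion; a standalone illustration of the two expansions:

```shell
# Splitting "container:database" pairs with bash parameter expansion
db_info="immich-db:immich"
CONTAINER="${db_info%%:*}"   # remove the longest ':*' suffix -> immich-db
DATABASE="${db_info##*:}"    # remove the longest '*:' prefix -> immich
echo "$CONTAINER $DATABASE"
```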
### MySQL/MariaDB Backup
```bash
#!/bin/bash
# backup-mysql.sh
BACKUP_DIR="/volume1/backups/databases"
DATE=$(date +%Y%m%d_%H%M%S)
# Backup MariaDB
docker exec mariadb mysqldump -u root -p"$MYSQL_ROOT_PASSWORD" \
--all-databases | gzip > "$BACKUP_DIR/mariadb_${DATE}.sql.gz"
```
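Restoring that dump is the reverse pipe; a sketch, assuming the same `mariadb` container name, root password variable, and backup directory as above:

```shell
# Restore the most recent MariaDB dump back into the container (sketch)
BACKUP_DIR="/volume1/backups/databases"
LATEST=$(ls -t "$BACKUP_DIR"/mariadb_*.sql.gz | head -1)
gunzip -c "$LATEST" | docker exec -i mariadb mysql -u root -p"$MYSQL_ROOT_PASSWORD"
```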
### Schedule with Cron
```bash
# /etc/crontab or Synology Task Scheduler
# Daily at 2 AM
0 2 * * * /volume1/scripts/backup-postgres.sh >> /var/log/backup.log 2>&1
# Weekly integrity check
0 4 * * 0 /volume1/scripts/verify-backups.sh >> /var/log/backup.log 2>&1
```
---
## 🐳 Docker Volume Backups
### Backup All Named Volumes
```bash
#!/bin/bash
# backup-docker-volumes.sh
BACKUP_DIR="/volume1/backups/docker-volumes"
DATE=$(date +%Y%m%d)

# Get all named volumes
VOLUMES=$(docker volume ls -q)

for volume in $VOLUMES; do
    echo "Backing up volume: $volume"
    docker run --rm \
        -v "$volume":/source:ro \
        -v "$BACKUP_DIR":/backup \
        alpine tar czf "/backup/${volume}_${DATE}.tar.gz" -C /source .
done

# Clean old backups (keep 7 days)
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +7 -delete
```
### Restore Docker Volume
```bash
#!/bin/bash
# restore-docker-volume.sh
VOLUME_NAME="$1"
BACKUP_FILE="$2"

# Create the volume if it does not already exist
docker volume create "$VOLUME_NAME"

# Restore from backup
docker run --rm \
    -v "$VOLUME_NAME":/target \
    -v "$(dirname "$BACKUP_FILE")":/backup:ro \
    alpine tar xzf "/backup/$(basename "$BACKUP_FILE")" -C /target
```
---
## ☁️ Cloud Backup (Backblaze B2)
### Setup with Rclone
```bash
# Install rclone
curl https://rclone.org/install.sh | sudo bash
# Configure B2
rclone config
# Choose: New remote
# Name: b2
# Type: Backblaze B2
# Account ID: <your-account-id>
# Application Key: <your-app-key>
```
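The interactive wizard can also be scripted with `rclone config create`. A sketch that sets up the B2 remote plus a crypt remote layered on top of it; the `b2-crypt` name and bucket path are assumptions, and `--obscure` tells rclone to obscure the passphrase before storing it in its config:

```shell
# Scripted equivalent of the wizard (sketch; remote names are assumptions)
rclone config create b2 b2 account="<your-account-id>" key="<your-app-key>"

# Crypt remote wrapping the B2 bucket, so uploads are client-side encrypted
rclone config create b2-crypt crypt \
    remote="b2:homelab-backups" \
    password="<your-encryption-passphrase>" --obscure
```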
### Backup Script
```bash
#!/bin/bash
# backup-to-b2.sh
# Assumes a crypt remote named "b2-crypt" wrapping b2:homelab-backups has been
# created with `rclone config` (the encryption password lives in rclone's config)
SOURCE="/volume1/backups"
DEST="b2-crypt:"

# Sync; encryption is handled transparently by the crypt remote
rclone sync "$SOURCE" "$DEST" \
    --transfers=4 \
    --checkers=8 \
    --bwlimit=50M \
    --log-file=/var/log/rclone-backup.log \
    --log-level=INFO

# Verify sync (one-way: every source file must exist at the destination)
rclone check "$SOURCE" "$DEST" --one-way
```
### Cost Estimation
```
Backblaze B2 Pricing:
- Storage: $0.005/GB/month
- Downloads: $0.01/GB (first 1GB free daily)
Example (500GB backup):
- Monthly storage: 500GB × $0.005 = $2.50/month
- Annual: $30/year
Recommended for:
- Photos (Immich): ~500GB
- Documents (Paperless): ~50GB
- Critical configs: ~10GB
```
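A quick sanity check of these numbers for the full recommended set (photos + documents + configs ≈ 560 GB at the storage rate above):

```shell
# B2 storage cost estimate at the $0.005/GB/month rate quoted above
STORAGE_GB=560   # 500 photos + 50 documents + 10 configs
RATE=0.005       # USD per GB per month
MONTHLY=$(awk -v g="$STORAGE_GB" -v r="$RATE" 'BEGIN { printf "%.2f", g * r }')
ANNUAL=$(awk -v m="$MONTHLY" 'BEGIN { printf "%.2f", m * 12 }')
echo "Monthly: \$$MONTHLY, Annual: \$$ANNUAL"   # 560 GB -> $2.80/month, $33.60/year
```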
---
## 🔐 Vaultwarden Backup
### Automated Vaultwarden Backup
```bash
#!/bin/bash
# backup-vaultwarden.sh
BACKUP_DIR="/volume1/backups/vaultwarden"
DATE=$(date +%Y%m%d_%H%M%S)
CONTAINER="vaultwarden"

# Stop container briefly for a consistent backup
docker stop "$CONTAINER"

# Backup data directory
tar czf "$BACKUP_DIR/vaultwarden_${DATE}.tar.gz" \
    -C /volume1/docker/vaultwarden .

# Restart container
docker start "$CONTAINER"

# Keep only the last 30 backups
ls -t "$BACKUP_DIR"/vaultwarden_*.tar.gz | tail -n +31 | xargs -r rm

# Also create an encrypted export for offline access
# (requires the admin token)
curl -X POST "http://localhost:8080/admin/users/export" \
    -H "Authorization: Bearer $VAULTWARDEN_ADMIN_TOKEN" \
    -o "$BACKUP_DIR/vaultwarden_export_${DATE}.json"

# Encrypt the export, then remove the plaintext copy
gpg --symmetric --cipher-algo AES256 \
    -o "$BACKUP_DIR/vaultwarden_export_${DATE}.json.gpg" \
    "$BACKUP_DIR/vaultwarden_export_${DATE}.json"
rm "$BACKUP_DIR/vaultwarden_export_${DATE}.json"

echo "Vaultwarden backup complete"
```
---
## 📸 Immich Photo Backup
### External Library Backup Strategy
```yaml
# Immich backup approach:
#   1. Original photos stored on Atlantis
#   2. Syncthing replicates to Calypso (real-time)
#   3. Hyper Backup to Setillo (weekly)
#   4. Optional: rclone to B2 (monthly)
backup_paths:
  originals: /volume1/photos/library
  database: /volume1/docker/immich/postgres
  thumbnails: /volume1/docker/immich/thumbs  # Can be regenerated
```
### Database-Only Backup (Fast)
```bash
#!/bin/bash
# Quick Immich database backup (without photos)
docker exec immich-db pg_dump -U postgres immich | \
gzip > /volume1/backups/immich_db_$(date +%Y%m%d).sql.gz
```
---
## ✅ Backup Verification
### Automated Verification Script
```bash
#!/bin/bash
# verify-backups.sh
BACKUP_DIR="/volume1/backups"
ALERT_URL="ntfy.sh/homelab-alerts"
ERRORS=0

echo "=== Backup Verification Report ==="
echo "Date: $(date)"
echo ""

# Check that a recent backup exists for a given backup type
check_backup() {
    local name="$1"
    local path="$2"
    local max_age_hours="$3"

    if [ ! -d "$path" ]; then
        echo "$name: Directory not found"
        ((ERRORS++))
        return
    fi

    # Newest archive in the directory ("*.gz" also matches "*.tar.gz");
    # uses GNU find/sort, as does stat -c below
    latest=$(find "$path" -type f -name "*.gz" -printf '%T@ %p\n' 2>/dev/null | \
        sort -rn | head -1 | cut -d' ' -f2-)
    if [ -z "$latest" ]; then
        echo "$name: No backup files found"
        ((ERRORS++))
        return
    fi

    age_hours=$(( ($(date +%s) - $(stat -c %Y "$latest")) / 3600 ))
    if [ "$age_hours" -gt "$max_age_hours" ]; then
        echo "$name: Latest backup is ${age_hours}h old (max: ${max_age_hours}h)"
        ((ERRORS++))
    else
        size=$(du -h "$latest" | cut -f1)
        echo "$name: OK (${age_hours}h old, $size)"
    fi
}

# Verify each backup type
check_backup "PostgreSQL DBs" "$BACKUP_DIR/databases" 25
check_backup "Docker Volumes" "$BACKUP_DIR/docker-volumes" 25
check_backup "Vaultwarden" "$BACKUP_DIR/vaultwarden" 25
check_backup "Hyper Backup" "$BACKUP_DIR/hyper-backup" 168  # 7 days

# Check Syncthing status
syncthing_status=$(curl -s http://localhost:8384/rest/system/status)
if echo "$syncthing_status" | grep -q '"uptime"'; then
    echo "✓ Syncthing: Running"
else
    echo "✗ Syncthing: Not responding"
    ((ERRORS++))
fi

# Check remote backup connectivity
if ping -c 3 setillo.tailnet > /dev/null 2>&1; then
    echo "✓ Remote (Setillo): Reachable"
else
    echo "✗ Remote (Setillo): Unreachable"
    ((ERRORS++))
fi

echo ""
echo "=== Summary ==="
if [ "$ERRORS" -eq 0 ]; then
    echo "All backup checks passed ✓"
else
    echo "$ERRORS backup check(s) FAILED ✗"
    curl -d "Backup verification failed: $ERRORS errors" "$ALERT_URL"
fi
```
### Test Restore Procedure
```bash
#!/bin/bash
# test-restore.sh - Monthly restore test
LATEST_DB=$(ls -t /volume1/backups/databases/immich_*.sql.gz | head -1)

# Spin up a throwaway PostgreSQL instance for the test; remove it on exit
docker run -d --name test-postgres -e POSTGRES_PASSWORD=test postgres:15
trap 'docker rm -f test-postgres > /dev/null 2>&1' EXIT

# Wait for the server to accept connections
until docker exec test-postgres pg_isready -U postgres > /dev/null 2>&1; do
    sleep 2
done

# Restore the latest dump into it
echo "Testing PostgreSQL restore..."
gunzip -c "$LATEST_DB" | docker exec -i test-postgres psql -U postgres

# Verify tables exist
if docker exec test-postgres psql -U postgres -c "\dt" | grep -q "assets"; then
    echo "✓ PostgreSQL restore verified"
else
    echo "✗ PostgreSQL restore failed"
fi
```
---
## 📋 Backup Schedule Summary
| Backup Type | Frequency | Retention | Destination |
|-------------|-----------|-----------|-------------|
| Database dumps | Daily 2 AM | 14 days | Atlantis → Calypso |
| Docker volumes | Daily 3 AM | 7 days | Atlantis → Calypso |
| Vaultwarden | Daily 1 AM | 30 days | Atlantis → Calypso → Setillo |
| Hyper Backup (full) | Weekly Sunday | 6 months | Atlantis → Calypso |
| Remote sync | Weekly Sunday | 3 months | Atlantis → Setillo |
| Cloud sync | Monthly | 1 year | Atlantis → B2 |
| Syncthing (configs) | Real-time | 30 days versions | All nodes |
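The daily and weekly jobs from this table collapse into one crontab; the script names follow the examples above, while the two monthly entries (B2 sync and restore test, run on the 1st) are assumptions about where those scripts live.

```shell
# Consolidated schedule (Synology Task Scheduler or /etc/crontab)
0 1 * * * /volume1/scripts/backup-vaultwarden.sh     >> /var/log/backup.log 2>&1
0 2 * * * /volume1/scripts/backup-postgres.sh        >> /var/log/backup.log 2>&1
0 3 * * * /volume1/scripts/backup-docker-volumes.sh  >> /var/log/backup.log 2>&1
0 4 * * 0 /volume1/scripts/verify-backups.sh         >> /var/log/backup.log 2>&1
0 5 1 * * /volume1/scripts/backup-to-b2.sh           >> /var/log/backup.log 2>&1
0 6 1 * * /volume1/scripts/test-restore.sh           >> /var/log/backup.log 2>&1
```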
---
## 🔗 Related Documentation
- [Disaster Recovery](../troubleshooting/disaster-recovery.md)
- [Synology Disaster Recovery](../troubleshooting/synology-disaster-recovery.md)
- [Offline Password Access](../troubleshooting/offline-password-access.md)
- [Storage Topology](../diagrams/storage-topology.md)
- [Portainer Backup](portainer-backup.md)