s3dbdump

I looked for a simple backup solution for dumping, compressing and uploading MariaDB databases to an S3 (or, more likely, S3-compatible) bucket. Sure, I could prepare all the dependencies, write a shell script, set up a crontab, cram it into an Alpine container, maybe?.. Meh

Better to just do it right the first time and write some software in Go.

s3dbdump is a tool that dumps a MariaDB (MySQL) database to a file, compresses it with gzip, and uploads it to S3 or MinIO. It can of course be compiled and run as a standalone binary, but I highly recommend Podman, Kubernetes or Docker (eew).
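If you do want the standalone binary, the build is the usual Go routine. A quick sketch, assuming the main package sits at the repository root; configuration uses the same environment variables as the container examples below:

git clone https://github.com/Stenstromen/s3dbdump.git
cd s3dbdump
go build -o s3dbdump .

DB_HOST=localhost DB_PORT=3306 DB_USER=root DB_PASSWORD=password \
DB_ALL_DATABASES=1 DB_DUMP_PATH=/tmp/dumps \
S3_ENDPOINT=https://minio.example.com S3_BUCKET=dbdumps \
AWS_ACCESS_KEY_ID='<access-key-id>' AWS_SECRET_ACCESS_KEY='<secret-access-key>' \
./s3dbdump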

To simply run it locally with Podman:
mkdir -p dumps # Important, since the container runs as non-root

podman run --rm \
  -v $(pwd)/dumps:/tmp/dumps:rw \
  -e AWS_ACCESS_KEY_ID='<access-key-id>' \
  -e AWS_SECRET_ACCESS_KEY='<secret-access-key>' \
  -e S3_ENDPOINT='https://minio.example.com' \
  -e S3_BUCKET='dbdumps' \
  -e DB_HOST='localhost' \
  -e DB_PORT='3306' \
  -e DB_USER='root' \
  -e DB_PASSWORD='password' \
  -e DB_ALL_DATABASES='1' \
  -e DB_DUMP_PATH='/tmp/dumps' \
  -e DB_DUMP_FILE_KEEP_DAYS='7' \
  ghcr.io/stenstromen/s3dbdump:latest
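To sanity-check the upload afterwards, any S3 client will do. For example with the AWS CLI, pointed at the same endpoint and bucket as above (and with the same credentials in the environment):

aws s3 ls s3://dbdumps/ --endpoint-url https://minio.example.com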
Or, more seriously, as a Kubernetes CronJob:
apiVersion: v1
data:
  db-password: QUtJQTVYUVcyUFFFRUs1RktZRlM=
  minio-access-key-id: QUtJQTVYUVcyUFFFRUs1RktZRlM=
  minio-secret-access-key: czNjcjN0
kind: Secret
metadata:
  name: db-dump-secrets
  namespace: default
type: Opaque

---

apiVersion: batch/v1
kind: CronJob
metadata:
  name: mariadb-backup-s3
  namespace: default
spec:
  schedule: "0 6 * * *"
  successfulJobsHistoryLimit: 0
  concurrencyPolicy: Replace
  failedJobsHistoryLimit: 1
  jobTemplate:
    spec:
      activeDeadlineSeconds: 3600
      backoffLimit: 2
      template:
        spec:
          containers:
            - env:
                - name: DB_HOST
                  value: database.default.svc.cluster.local
                - name: DB_USER
                  value: root
                - name: DB_PASSWORD
                  valueFrom:
                    secretKeyRef:
                      name: db-dump-secrets
                      key: db-password
                - name: DB_ALL_DATABASES
                  value: "1"
                - name: DB_DUMP_FILE_KEEP_DAYS
                  value: "7"
                - name: DB_DUMP_PATH
                  value: /tmp
                - name: S3_BUCKET
                  value: dbbak
                - name: S3_ENDPOINT
                  value: http://minio.default.svc.cluster.local:9000
                - name: AWS_ACCESS_KEY_ID
                  valueFrom:
                    secretKeyRef:
                      name: db-dump-secrets
                      key: minio-access-key-id
                - name: AWS_SECRET_ACCESS_KEY
                  valueFrom:
                    secretKeyRef:
                      name: db-dump-secrets
                      key: minio-secret-access-key
              securityContext:
                runAsUser: 65534
                runAsGroup: 65534
                privileged: false
                runAsNonRoot: true
                readOnlyRootFilesystem: true
                allowPrivilegeEscalation: false
                procMount: Default
                capabilities:
                  drop: ["ALL"]
                seccompProfile:
                  type: RuntimeDefault
              image: ghcr.io/stenstromen/s3dbdump:latest
              imagePullPolicy: IfNotPresent
              name: backup
              terminationMessagePath: /dev/termination-log
              terminationMessagePolicy: File
              volumeMounts:
                - name: tmp
                  mountPath: /tmp
          dnsPolicy: ClusterFirst
          restartPolicy: Never
          schedulerName: default-scheduler
          terminationGracePeriodSeconds: 30
          volumes:
            - name: tmp
              emptyDir: {}
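If you would rather not base64-encode the Secret values by hand, the same Secret (same name, namespace and keys as in the manifest above) can be created imperatively with kubectl; the values here are placeholders:

kubectl create secret generic db-dump-secrets \
  --namespace default \
  --from-literal=db-password='<db-password>' \
  --from-literal=minio-access-key-id='<access-key-id>' \
  --from-literal=minio-secret-access-key='<secret-access-key>'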

The resulting file(s) in the S3 bucket follow the naming format databasename-yyyymmddThhmmss.sql.gz, for example meow-20251015T040058.sql.gz.
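Restoring is the reverse. A rough sketch with standard tools, reusing the example file name above; the exact mysql flags, and whether you need to pass the database name, depend on your setup and on how the dump was produced:

aws s3 cp s3://dbdumps/meow-20251015T040058.sql.gz . --endpoint-url https://minio.example.com
gunzip -c meow-20251015T040058.sql.gz | mysql -h localhost -u root -p meow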

Visit https://github.com/Stenstromen/s3dbdump for more info.
