DokuWiki is an elegant and lightweight wiki. Since it stores pages as plain text files instead of in a database, an offline copy can be handy in some situations, such as reading the wiki on your Kindle while offline or publishing it as an ebook. So we want to back up the pages and media files daily and create symbolic links to the latest copies for download.
The official DokuWiki site provides various backup scripts. Unfortunately, the bash script no longer works because DokuWiki changed its project structure, so I modified it and added a few features. The script is tested under bash, and it will probably work in other shell environments as well.
Shell file: https://gist.github.com/luoxufeiyan/8d4ce8ffd2a7a8d83e63ea47c8f6e3a2
#!/bin/sh
# Automatically Backup Dokuwiki Data Using Shell Script
# $Id: dw-backup.sh 328 2004-12-22 13:15:20Z dp $
# Original script: https://www.dokuwiki.org/tips:backup_script
# Blog post: https://www.luoxufeiyan.com/?p=3949
# config
WIKIPATH="/var/www/wiki.luoxufeiyan.com/data" # path to your wiki data directory, no symbolic links are allowed!
# for debian etch: /var/lib/dokuwiki
BACKUPPATH="/var/www/wiki.luoxufeiyan.com/backup" # where do you save the backups?
DAILY_DATA_BACKUPS="8" # number of daily data backups to keep
DAILY_MEDIA_BACKUPS="3" # number of daily media backups to keep
FILE_PREFIX="lxfy_wiki" # filename prefix for the latest-backup download links
# no more config
# creates ${BACKUPPATH}/$1 if it does not exist
checkDir()
{
    if [ ! -d "${BACKUPPATH}/$1" ]
    then
        mkdir -p "${BACKUPPATH}/$1"
    fi
}
# 1 -> path
# 2 -> name
# 3 -> number of backups
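# shifts each existing $2.i.tar.gz up to $2.(i+1).tar.gz so index 1 is free for the
# new backup; the copy at index $3 is overwritten, keeping at most $3 backups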
rotateDir()
{
    for i in `seq $(($3 - 1)) -1 1`
    do
        if [ -f "$1/$2.$i.tar.gz" ]
        then
            mv "$1/$2.$i.tar.gz" "$1/$2.$((i + 1)).tar.gz"
        fi
    done
}
# make sure everything exists
checkDir "data"
checkDir "data/archive"
checkDir "data/daily"
checkDir "media"
checkDir "media/archive"
checkDir "media/daily"
# first step: rotate daily.
rotateDir "${BACKUPPATH}/data/daily" "data" "$DAILY_DATA_BACKUPS"
rotateDir "${BACKUPPATH}/media/daily" "media" "$DAILY_MEDIA_BACKUPS"
# then create our backup
# --exclude is not accepted for Linksys NSLU2 box, any alternative?
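# -C switches into ${WIKIPATH} first, so the archives contain only pages/ and media/
# at the top level; --exclude=".*" skips hidden files (names starting with a dot)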
tar --exclude=".*" -zcf "/tmp/data.1.tar.gz" -C "${WIKIPATH}" "pages"
tar --exclude=".*" -zcf "/tmp/media.1.tar.gz" -C "${WIKIPATH}" "media"
# for debian etch, replace "media" by "data/media" in line above
# and add --exclude="media" to first tar line
# on the first day of each month, also keep a long-term archive copy
if [ `date +%d` = "01" ]
then
    cp "/tmp/data.1.tar.gz" "${BACKUPPATH}/data/archive/data-"`date +%m-%d-%Y`".tar.gz"
    cp "/tmp/media.1.tar.gz" "${BACKUPPATH}/media/archive/media-"`date +%m-%d-%Y`".tar.gz"
fi
# add them to daily.
mv "/tmp/data.1.tar.gz" "${BACKUPPATH}/data/daily"
mv "/tmp/media.1.tar.gz" "${BACKUPPATH}/media/daily"
# add symbolic links for download
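# -f replaces an existing link, so these URLs always point at the newest daily backup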
ln -sf "${BACKUPPATH}/data/daily/data.1.tar.gz" "${BACKUPPATH}/${FILE_PREFIX}_data.tar.gz"
ln -sf "${BACKUPPATH}/media/daily/media.1.tar.gz" "${BACKUPPATH}/${FILE_PREFIX}_media.tar.gz"
In this script we compress the DokuWiki pages and media folders, which hold the text content and attachments such as pictures. Both folders live under the your_doku_path/data directory (NOT the DokuWiki root directory), so point WIKIPATH at your_doku_path/data, e.g. /var/www/wiki.luoxufeiyan.com/data.
All backups are stored under BACKUPPATH, and two symbolic links to the latest daily backups are created there for download. If you want others to be able to download your wiki, make sure the backup folder is reachable by visitors and share those two URLs; otherwise, comment out the last two lines of the script and restrict the folder permissions so visitors cannot access it.
To back up and refresh the files periodically, set up a cron job: add your schedule and the command /usr/bin/bash /your_script_path/dw-backup.sh to crontab, and that's it.
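As a quick illustration (the manual test and the 03:00 schedule are just suggestions; adjust the paths and timing to your setup), you could run the script once by hand to verify the configuration and then add a daily crontab entry:

# run once manually to confirm WIKIPATH and BACKUPPATH are correct
/usr/bin/bash /your_script_path/dw-backup.sh
ls -l /var/www/wiki.luoxufeiyan.com/backup

# open the crontab of a user that can write to BACKUPPATH
crontab -e

# add an entry, e.g. run the backup every day at 03:00
0 3 * * * /usr/bin/bash /your_script_path/dw-backup.sh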
I hope this script helps. If you find any issues, please contact me or leave a comment below.