How do I fix 100% disk usage when running a game wiki backup script?
Hi all, I have a script that backs up the wiki of a big casino game (a popular MMORPG), and every time I run it, disk usage spikes to 100%, which crashes everything! Has anyone else had problems getting their game wiki/discussion backups to work properly? How can I optimize my script or adjust server settings to avoid this?
5 Answers
Yeah, it sounds like your backup script is saturating disk I/O when it runs. I've seen that crash servers before, particularly ones hosting large MMORPG wikis.
For starters, take incremental backups rather than full snapshots, and run them at off-peak times when users aren't hitting the system.
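To make the incremental idea concrete, here's a minimal sketch using GNU tar's snapshot-file mechanism (the `--listed-incremental` option). The temp directories stand in for your real wiki and backup paths, which I obviously don't know:

```shell
#!/bin/sh
# Sketch: incremental backup with GNU tar's snapshot file.
# WIKI_DIR/BACKUP_DIR are placeholders -- point them at your real paths.
WIKI_DIR=$(mktemp -d); BACKUP_DIR=$(mktemp -d)
echo "page one" > "$WIKI_DIR/page1.txt"

# Level-0 (full) backup: tar records file state in the .snar snapshot.
tar --listed-incremental="$BACKUP_DIR/state.snar" \
    -czf "$BACKUP_DIR/full.tar.gz" -C "$WIKI_DIR" .

# On later runs, only files changed since the snapshot are written.
echo "page two" > "$WIKI_DIR/page2.txt"
tar --listed-incremental="$BACKUP_DIR/state.snar" \
    -czf "$BACKUP_DIR/incr.tar.gz" -C "$WIKI_DIR" .
```

The second archive contains only the new page, so each nightly run touches far less disk than a full snapshot would.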
Also consider batch compression: compressing a few files at a time rather than everything in one operation keeps the peak load down. And if your script is strictly sequential, adding multi-threading can shorten the total run time.
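A batch-compression loop can be as simple as the sketch below -- the dump files are stand-ins for whatever your export produces, and the batch size is something you'd tune for your disk:

```shell
#!/bin/sh
# Sketch: compress dump files in small batches with a pause in between,
# instead of one monolithic compression pass. Paths are placeholders.
DUMP_DIR=$(mktemp -d)
for i in 1 2 3 4 5 6; do echo "data $i" > "$DUMP_DIR/part$i.xml"; done

BATCH=2   # files per batch -- tune this to your hardware
count=0
for f in "$DUMP_DIR"/*.xml; do
  gzip "$f"                      # compress one file (replaces it with .gz)
  count=$((count + 1))
  if [ $((count % BATCH)) -eq 0 ]; then
    sleep 1                      # let the disk breathe between batches
  fi
done
```

The `sleep` between batches is what flattens the I/O spike: total wall-clock time goes up slightly, but the disk is never pinned for the whole run.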
Finally, adjust your server's I/O priorities: deprioritize the backup process so it doesn't stomp all over everything else, and monitor disk performance to see where the bottlenecks are. If you want a hand going through the script…
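On Linux the deprioritizing is a one-liner with `nice` (CPU priority, POSIX) and `ionice` (I/O class, part of util-linux). The `backup.sh` below is just a stand-in for your real script:

```shell
#!/bin/sh
# Sketch: run the backup at the lowest CPU priority and, where ionice is
# available, in the "idle" I/O class so it yields to the wiki's traffic.
# backup.sh is a dummy stand-in for the real backup script.
cat > backup.sh <<'EOF'
#!/bin/sh
echo "backup ran"
EOF
chmod +x backup.sh

if command -v ionice >/dev/null 2>&1; then
  nice -n 19 ionice -c 3 ./backup.sh > out.txt   # idle I/O class (-c 3)
else
  nice -n 19 ./backup.sh > out.txt               # CPU priority only
fi
```

With `-c 3` (idle class), the backup only gets disk time when nothing else wants it, which is usually exactly what you want for a background job.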
This is almost certainly caused by a backup script that isn't throttled or that runs during prime time. Schedule it with cron (Unix) or Task Scheduler (Windows), split it into partial backups within the same script, insert a sleep command between stages, use --bwlimit if you're using rsync, and remove duplicate backups. See what happens!
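Putting those pieces together, a throttled, scheduled setup might look like this (all paths and the 5 MB/s cap are hypothetical -- adjust for your server):

```shell
# Example crontab entry: run the backup at 03:30, when traffic is low.
# 30 3 * * * /usr/local/bin/wiki-backup.sh

# Inside wiki-backup.sh, something along these lines:
# rsync -a --bwlimit=5000 /var/www/wiki/ /mnt/backup/wiki/   # ~5 MB/s cap
# sleep 60                                                   # pause between stages
# rsync -a --bwlimit=5000 /var/db/dumps/ /mnt/backup/db/
```

rsync's `--bwlimit` takes KB/s by default, so 5000 keeps each transfer well under what a typical disk can sustain, leaving headroom for the live wiki.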
Are you sure your backup script isn't trying to copy or lock big files? The wiki's underlying database may not handle concurrent access well. You could schedule backups to run at night, or break the export process into batches. Lower the backup job's I/O priority on your server, or do incremental backups instead. If the wiki software is MediaWiki, try optimizing its database or beefing up the disk I/O of its hosting system. Ping me again if that doesn't fix things.
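If it is MediaWiki, the safer route is to export through its maintenance script and a consistent database dump, rather than copying live database files (paths and database name below are hypothetical):

```shell
# MediaWiki ships a dump script under maintenance/:
# php /var/www/wiki/maintenance/dumpBackup.php --full > wiki-pages.xml

# For the database itself (MySQL/MariaDB), --single-transaction gives a
# consistent InnoDB snapshot without locking the tables the wiki is using:
# mysqldump --single-transaction wikidb | gzip > wikidb.sql.gz
```

Copying raw database files while the wiki is writing to them is a common cause of both the I/O spike and corrupt backups, so the dump route addresses two problems at once.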
Run the backup script as a background process or during non-business hours, or even from another machine. The script may be interfering with the database, or reading the same files the wiki application is serving. Look through it for unnecessary writes or redundant loops, and pause between each step of the backup process. Use incremental backups instead of a full backup each time. Use nice or ionice to lower the backup process's CPU and I/O priority. Confirm that the disk isn't simply full, and clean up unneeded log files. Prune any cruft from the backup set; the smaller the data, the faster the process.
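The "disk isn't full" and "prune old files" checks are worth automating at the top of the script. A sketch, with a temp directory standing in for the real backup target and a made-up 90% threshold:

```shell
#!/bin/sh
# Sketch: sanity checks before the backup runs -- prune stale archives,
# then bail out if the target filesystem is nearly full.
BACKUP_DIR=$(mktemp -d)          # stand-in for your real backup target
touch "$BACKUP_DIR/old.tar.gz"

# Delete archives older than 14 days so the disk doesn't silently fill up.
find "$BACKUP_DIR" -name '*.tar.gz' -mtime +14 -delete

# Abort if usage on the target filesystem exceeds 90%.
used=$(df -P "$BACKUP_DIR" | awk 'NR==2 {gsub("%","",$5); print $5}')
if [ "$used" -gt 90 ]; then
  echo "backup target ${used}% full, aborting" >&2
  exit 1
fi
echo "disk check passed: ${used}% used"
```

Failing fast here is much better than letting the backup fill the disk mid-run, which is one way a script "crashes everything".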
100% CPU: your script is likely chewing up all the processing power. Break the job into smaller pieces and export them one by one with a delay in between. Move backups to an external drive or cloud storage to avoid hammering the local disk. Upgrade your VPS plan, or use a separate machine for backups if necessary.
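The "smaller pieces with a delay" idea, sketched with `split` (the 100 KB file and 25 KB chunk size are just for illustration, and the `cp` stands in for whatever upload or copy step you actually use):

```shell
#!/bin/sh
# Sketch: split one large export into chunks and process them one at a
# time with a pause, so no single burst pins the CPU or disk.
SRC=$(mktemp); DEST=$(mktemp -d)          # stand-ins for real paths
head -c 100000 /dev/zero > "$SRC"         # fake 100 KB export file

split -b 25000 "$SRC" "$DEST/chunk."      # cut it into 25 KB pieces
for chunk in "$DEST"/chunk.*; do
  cp "$chunk" "$chunk.bak"                # stand-in for the real copy/upload
  sleep 1                                 # breathing room between pieces
done
```

Each iteration is a small, bounded burst of I/O followed by idle time, which keeps the sustained load on the disk (or network) far below saturation.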