Author: tedks
Description:
I'm a user who wants to have a complete dump of English Wikipedia (or any other large wiki-project).
I could download a complete dump every time one was made, but re-downloading a mostly identical fileset would use a lot of bandwidth and be expensive for both me and the Wikimedia Foundation.
The easiest solution I (in my total ignorance) can see is to have a base archive and then release incremental diff archives that contain only the changed/added files (like a duplicity backup). These incremental archives would be a small fraction of the total dump in terms of space and bandwidth, and would make keeping a dump current much easier.
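To illustrate the kind of workflow I have in mind, here is a minimal sketch; the archive names, URL layout, and helper below are all made up for illustration and don't correspond to any existing dump service:

    # Hypothetical client: fetch the base dump once, then stay current
    # by unpacking small incremental archives over it.
    import tarfile
    import urllib.request
    from pathlib import Path

    DUMP_DIR = Path("enwiki-dump")          # local copy of the full dump
    BASE_URL = "https://dumps.example.org"  # hypothetical mirror

    def apply_archive(name: str) -> None:
        """Download an archive and unpack it over the existing dump,
        overwriting only the files it contains (changed/added files)."""
        archive = Path(name)
        urllib.request.urlretrieve(f"{BASE_URL}/{name}", archive)
        with tarfile.open(archive) as tar:
            tar.extractall(DUMP_DIR)  # replaces only the shipped files

    # One-time full download, then each month only a small diff archive.
    apply_archive("enwiki-base-20240101.tar.gz")
    apply_archive("enwiki-incr-20240201.tar.gz")
    apply_archive("enwiki-incr-20240301.tar.gz")

The point is just that the client-side work is trivial; the effort would be on the dump-generation side, producing archives that contain only files changed since the previous run.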
Version: unspecified
Severity: enhancement