* emijrp <emijrp(a)gmail.com> [Thu, 11 Nov 2010 16:04:53 +0100]:
There are some old dumps in Internet Archive,[1] but I guess you are
interested in the most recent ones. Also, I have a copy of all the
pages-meta-history.xml.7z from August 2010 at home. But I can't upload
them anywhere, they are 100 GB.
Why are such large dumps not incremental? Incremental dumps would:
1. Save a lot of disk space
2. Greatly increase dumping speed
3. Make it easier to upload and restore missing parts (something like P2P)
With XML import / export this could be done quite easily, and even with
SQL you could probably limit the exported rows of at least some tables
(revision, user, page); see the sketch below. Though I am unsure whether
incremental dumps are possible for all the tables - that would probably
require an SQL operations log instead of simple range inserts.
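For illustration, something along these lines might work (a rough
sketch only, untested; the page_id range, the cutoff timestamp and the
"wikidb" database name are made-up placeholders):

    # XML side: export a limited page_id range with MediaWiki's
    # dumpBackup.php, so parts can be fetched or re-fetched
    # independently (the P2P-style point above)
    php maintenance/dumpBackup.php --full --start=1000000 --end=2000000 \
        > pages-meta-history-part.xml

    # SQL side: dump only revision rows newer than the previous dump,
    # using mysqldump's --where option
    mysqldump --where="rev_timestamp > '20100801000000'" wikidb revision \
        > revision-since-august.sql

Each later dump would then only have to cover rows added since the
previous cutoff, instead of rewriting the whole history every time.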
Then again, the developers are smart people, so probably I am wasting
my time and there is a good reason not to make incremental backups?
Dmitriy