Many times, but that's not necessarily clear or
simple. The generic
delta-generation tools we've tried in the past just choke on our files; note
that the full-history dump of English Wikipedia -- the one we're most concerned
about keeping archival copies of -- is over 350 gigabytes uncompressed.
Actually, I just downloaded enwiki-20060518-pages-meta-history.xml.7z,
and when I ran "7za l enwiki-20060518-pages-meta-history.xml.7z" it
reported that the archive would be 692686106434 bytes (645 GB)
uncompressed. Is that inaccurate?
~MDD4696
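
For what it's worth, the arithmetic in that listing is self-consistent: 7za reports
sizes in bytes, and 692686106434 bytes works out to about 645 GiB (binary units) or
about 692.7 GB (decimal units). A minimal Python sketch of the conversion, assuming
only the byte count quoted above:

    # Convert the byte count 7za reports into binary (GiB) and decimal (GB) units.
    uncompressed_bytes = 692686106434
    gib = uncompressed_bytes / 1024**3   # 1 GiB = 1024^3 bytes
    gb = uncompressed_bytes / 1000**3    # 1 GB  = 1000^3 bytes
    print("%.1f GiB / %.1f GB" % (gib, gb))   # -> 645.1 GiB / 692.7 GB

So "645 GB" in the listing is really 645 GiB; whether the dump itself has grown
that much since the 350-gigabyte figure is a separate question.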