Wouldn't a more frequent backup mechanism help third parties who index
Wikipedia content, too? I'm thinking about
answers.com and others.
Erik, I'd like to say thank you in the name of the Hungarian Wikipedia.
We always look forward to the next update and greatly appreciate what
you're doing.
Thanks,
nyenyec
On 8/31/05, Erik Zachte <epzachte(a)chello.nl> wrote:
Finally new wikistats
See
http://mail.wikipedia.org/pipermail/foundation-l/2005-August/004043.html
for an explanation why they could not be run for so long.
For some projects, like Wikipedia, the newest dumps are already six weeks old.
Please don't ask me for the next update, as it is not in my hands.
Data for French and German wikipedias are missing.
For French I will have to look into why the job failed.
For German the full xml database export is corrupt.
Does this imply that no MediaWiki 1.5 full database backup for German is
available at all?
Or are sql dumps still made as well?
If not, a MediaWiki 1.4 dump from June 23 is the best we have if worst comes
to worst and a restore is needed.
Even when sql dumps are also made, I still have difficulty understanding why
database dumps have such low priority.
Sure, with distributed servers the chance of a total collapse diminishes,
but never say never.
We may survive a power outage more comfortably now (we survived the last one
by sheer luck alone), but other calamities may occur;
what about deliberate sabotage, to name an unlikely but devastating
scenario?
Total database export takes more than a week, so it must be heavy on the
database servers.
If a dedicated database server were needed to produce exports/dumps as
often as possible, that would seem a good investment to me.
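One common way to do this, just as a sketch, is to run the dump against a dedicated MySQL replication slave so the production masters see none of the load. The host, user, and database names below are purely illustrative, not Wikimedia's actual setup:

```shell
# Hypothetical sketch: dump a replicated copy of the database from a
# dedicated slave, compressing on the fly. Not the real configuration.
mysqldump --single-transaction --quick \
    --host=dump-slave.example.org --user=backup --password \
    wikidb \
  | gzip > wikidb-$(date +%Y%m%d).sql.gz
```

With --single-transaction the dump runs against a consistent snapshot without locking the tables, which matters when an export takes days.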
If a site admin would care to comment on this, thanks in advance.
Erik Zachte
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l