On Fri, Oct 29, 2004 at 12:37:05PM +0200, Magnus Manske wrote:
> * it won't get out of sync even if the master is down for a while
And if the slave is down for a while, it can download the archived hourly or
daily dumps to catch up.
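
Roughly what I have in mind for that catch-up case, as a sketch only; the
archive URL, the file naming and the import command are assumptions on my
part, not the real setup:

    # Sketch of a slave catching up from archived dumps after downtime.
    # The archive URL, file naming and target database are assumptions.
    import subprocess
    import urllib.request

    DUMP_BASE = "http://download.wikipedia.example/hourly"  # hypothetical archive

    def catch_up(missed_stamps):
        """Fetch each missed hourly dump and feed it into the local database."""
        for stamp in missed_stamps:                  # e.g. ["2004102912", "2004102913"]
            local = "/tmp/cur-%s.sql.gz" % stamp
            urllib.request.urlretrieve("%s/cur-%s.sql.gz" % (DUMP_BASE, stamp), local)
            # decompress and replay the dump into the slave's MySQL instance
            subprocess.run("gunzip -c %s | mysql wikidb" % local, shell=True, check=True)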
> * probably requires less bandwidth/database stress than MySQL
>   replication, since we would only sync the cur table
Yes, and the files can be cached on the squids or redistributed.
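
On the master side, producing a small hourly file of just the cur table could
look roughly like this; the database name, output directory and compression
are my assumptions:

    # Sketch of the master writing an hourly dump of only the cur table,
    # small enough to be cached on the squids. Paths and db name are assumptions.
    import subprocess
    import time

    def dump_cur_table(out_dir="/var/www/dumps"):
        """Write a gzip-compressed dump of just the cur table, named by the hour."""
        stamp = time.strftime("%Y%m%d%H")
        out_file = "%s/cur-%s.sql.gz" % (out_dir, stamp)
        subprocess.run("mysqldump --single-transaction wikidb cur | gzip > %s" % out_file,
                       shell=True, check=True)
        return out_file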
> * could provide "reformatted" information like a list of authors for
>   each article (more useful than a complete "old" table, and required
>   by GFDL anyway)
For usage as a slave I wouldn't consider that important; a link to Wikipedia's
history was and is enough on the internet.
But we could have a second output file which carries only the authors.
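
A rough sketch of how that authors-only file could be generated; the table and
column names follow my memory of the cur/old schema and should be treated as
assumptions, as should the connection details:

    # Sketch of a second output file listing the distinct authors per article.
    # Table/column names and connection settings are assumptions.
    import MySQLdb   # assumes the MySQLdb driver is available

    def write_author_file(out_path="/var/www/dumps/authors.txt"):
        """Write one line per article: the title, then its distinct contributors."""
        db = MySQLdb.connect(db="wikidb")
        cur = db.cursor()
        # collect contributors from both the current and the old revisions
        cur.execute("""
            SELECT cur_title, cur_user_text FROM cur
            UNION
            SELECT old_title, old_user_text FROM old
        """)
        authors = {}
        for title, user in cur.fetchall():
            authors.setdefault(title, set()).add(user)
        with open(out_path, "w") as fh:
            for title in sorted(authors):
                fh.write(title + "\t" + ", ".join(sorted(authors[title])) + "\n")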
If we write our own software, we could use a smaller webserver for
delivering the files (also for images), ain't there one in the kernel?
ciao, tom
--
== Weblinks ==
* http://shop.wikipedia.org - WikiReader Internet for sale
* http://de.wikipedia.org/wiki/Benutzer:TomK32
* http://www.hammererlehen.de - holidays in Berchtesgaden