* Tei <oscar.vives(a)gmail.com> [Tue, 21 Jul 2009 19:42:45 +0200]:
On Tue, Jul 21, 2009 at 7:17 PM, Chengbin Zheng <chengbinzheng(a)gmail.com> wrote:
...
>
> No, I know what parsing means. Even if it takes 2 days to parse them,
> wouldn't it be faster than to actually create a static HTML dump the
> traditional way?
>
> If it is not, then what is the difficulty of making static HTML dumps?
> It can't be bandwidth, storage, or speed.
Wikimedia works with limited resources: manpower, hardware, etc.
Things get done, but only when resources are available, human and
otherwise.
It's not only you; lots of people want to download Wikipedia (sometimes
in a periodic fashion).
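For anyone who wants the raw data rather than a static HTML dump, the
database dumps are published periodically on dumps.wikimedia.org. A
minimal sketch (the wiki name and "latest" file pattern follow the
public dump layout; the helper function is my own invention):

```python
# Hypothetical helper: build the URL of the most recent database dump
# for a given wiki on dumps.wikimedia.org, the official dump server.
# The <wiki>/latest/<wiki>-latest-<item> pattern is the public layout.

def dump_url(wiki="enwiki", item="pages-articles.xml.bz2"):
    """Return the 'latest' dump URL for a given wiki and dump item."""
    return f"https://dumps.wikimedia.org/{wiki}/latest/{wiki}-latest-{item}"

print(dump_url())
# A periodic downloader could then fetch this URL, e.g. with
# urllib.request or a cron job running wget -N.
```

The "latest" directory is just a convenience alias; dated directories
(e.g. enwiki/20090713/) hold the individual dump runs.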
There is a log somewhere with the daily work of the Wikipedia admins (-:
http://wikitech.wikimedia.org/view/Server_admin_log
Some of these entries are even quite fun, like:
02:11 b****: CPAN sux
01:47 d******: I FOUND HOW TO REVIVE APACHES
( names obscured to protect the innocent ).
Speaking of a compact off-line English Wikipedia, I liked the TomeRaider
version:
http://en.wikipedia.org/wiki/TomeRaider
I wish there were newer TR builds, because the English Wikipedia grows
really fast.
Dmitriy