Hi.
I know ideas for a distributed Wikipedia have been discussed here before, but I
haven't seen the following angle yet (though that may just be because I don't
read my mail carefully enough, of course):
Let's say you have X Wikipedia servers (ten, say). All of them dispense
articles for reading directly. When an article is about to be edited, its
title is hashed to pick one of the X servers, and that server is contacted.
That way, each article "belongs" to exactly one of the X servers, so a lot of
consistency problems disappear. That server can in turn notify the others
about changes to its "own" articles.
If a server goes down, it's not the end of the world: an Xth of the articles
just aren't editable for a while. The downed server's "lease" can be revoked
after an hour, perhaps.
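Continuing the sketch above (LEASE_SECONDS and the heartbeat table are again
just my assumptions about how such a lease could work):

    import time

    LEASE_SECONDS = 3600          # "an hour, perhaps"
    last_heartbeat = {}           # server -> time of its last heartbeat

    def current_owner(title, now=None):
        # Reuses SERVERS and owner_of() from the sketch above.
        now = time.time() if now is None else now
        owner = owner_of(title)
        if now - last_heartbeat.get(owner, 0) <= LEASE_SECONDS:
            return owner
        # Lease expired: hand the article to the next server in line
        # until the downed one comes back and heartbeats again.
        i = SERVERS.index(owner)
        return SERVERS[(i + 1) % len(SERVERS)]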
While I don't have the capacity to implement this myself (just fending off
the "send patches" replies in advance), it doesn't appear to be such a
gigantic departure from the existing system. (Yes, big. But not a rewrite.)
Wouldn't it be worth looking into? It seems increased demand eats up every
conventional scaling measure (kind of like congested highways becoming even
more congested when they're expanded).
-- Daniel