On Mon, Jul 23, 2012 at 7:25 AM, Derric Atzrott
<datzrott(a)alizeepathology.com> wrote:
>>> This is all a fantastic idea. Distributing Wikipedia in a fashion
>>> similar to git will make it a lot easier to use in areas where
>>> Internet connections are not so common.
>> It always surprises me when people express enthusiasm for
>> this kind of idea, since my instinctive assumption is the exact
>> opposite: that this couldn't possibly be feasible or practical.
>> Just out of curiosity, how large are the git-managed projects
>> that you have successfully handled this way? Number of files,
>> lines of code, bytes, or commits per day? Did you ever run into
>> a software project where a fully decentralized git solution was
>> impractical, e.g. because pulling in the daily updates took
>> more than an hour on your available bandwidth?
> I can't say that I've handled any large git-managed projects this way, but I
> am given to understand that this is the very thing for which git was designed.
> Given this, I would hope that a git-like model would be good for
> decentralized editing.
It's really not. Things that are (relatively) simple queries in the database
tend to require walking the entire revision history in Git to figure out
the same data.
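To make the contrast concrete, here is a minimal sketch in a throwaway repo. The article path, the repo layout, and the SQL in the comments are assumptions for illustration only, not how anyone has actually laid this out:

```shell
# Sketch: a hypothetical one-article-per-file repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
mkdir articles
echo "first draft"  > articles/Foo.wiki
git add articles/Foo.wiki
git -c user.name=t -c user.email=t@example.org commit -qm "v1"
echo "second draft" > articles/Foo.wiki
git -c user.name=t -c user.email=t@example.org commit -qam "v2"

# MediaWiki answers "how many revisions does this page have?" with one
# indexed query against the revision table, roughly:
#   SELECT COUNT(*) FROM revision WHERE rev_page = <page_id>;
# Git answers the same question by walking the commit graph and testing
# each commit for changes to the path:
count=$(git rev-list --count HEAD -- articles/Foo.wiki)
echo "$count"
```

On two commits the walk is instant; on millions of revisions it is not, and that is the cost difference being described.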
Git is awesome for software development, but trying to use it as an
article development tool is a bad solution in search of a problem.
We could have made the same argument years ago: "why use a database?
SVN stores information in a linear history that's useful for articles."
Having diverging articles may be cool/desirable, but using Git is not
the answer.
-Chad