We could have two Recent Changes pages: one that has everything, and another
that has only pages of interest -- a sort of shared watchlist listing all
pages embroiled in controversy.
Fred
From: "Karl A. Krueger"
<kkrueger(a)whoi.edu>
Reply-To: English Wikipedia <wikien-l(a)Wikipedia.org>
Date: Mon, 7 Feb 2005 13:51:15 -0500
To: English Wikipedia <wikien-l(a)Wikipedia.org>
Subject: Re: [WikiEN-l] Neo-nazis to attack wikipedia
On Mon, Feb 07, 2005 at 04:58:55PM +0000, Jake Waskett wrote:
> On Monday 07 February 2005 17:06, David Gerard wrote:
> > I'm not sure we need this for *all* pages - it's really the problematic
> > 0.1%.
>
> I'm sorry, I meant only the highly controversial (read: problematic) pages.
One place where a few more technical tools might be useful is in
ascertaining where the problematic pages are, and bringing it to more
people's attention when a page goes into edit-war.
As the volume of editing on Wikipedia increases, it's harder and harder
to discern anything about current problems by looking at Recent Changes
-- there are hundreds of changes every hour. I'll notice an edit war if
it shows up on my watchlist or WP:RFC, but if nobody is watching and the
editors involved don't post to WP:RFC (or don't know about it), the
problem can persist indefinitely.
Each of us can name some pages that have experienced edit wars, NPOV
controversies, and problems with advocates insisting on unverifiable
inclusions. But consider an editor who wants to help resolve these
problems but doesn't know where the worst of them are ... or an editor
who's in the middle of one of these conflicts and doesn't realize that
it's *unusual*, that the behavior they're facing is well outside of the
Wikipedia norm, and that it's time to ask for help.
It should be relatively simple to automatically discern if a given edit
is a revert to a previous version of the article. (Even if an abuser is
using cut-and-paste reverts instead of editing older versions, the
people *responding* to the abuse will probably use older versions.)
Given this, the software could calculate a "revert temperature" of each
article -- the number of times it's been reverted over the past 48 or 96
hours, say.
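As a rough sketch of how this could work (not actual MediaWiki code -- the
function names and data shapes here are invented for illustration), a revert
detector could compare an edit's full text against earlier revisions, which
catches cut-and-paste reverts and edits of older versions alike since the
resulting text is identical either way; the temperature is then just a count
of such matches within a trailing window:

```python
import hashlib
from datetime import datetime, timedelta

def is_revert(new_text, prior_texts):
    """An edit counts as a revert if its full text exactly matches
    any earlier revision of the article."""
    digest = hashlib.sha1(new_text.encode("utf-8")).hexdigest()
    return any(
        hashlib.sha1(t.encode("utf-8")).hexdigest() == digest
        for t in prior_texts
    )

def revert_temperature(revisions, window_hours=48, now=None):
    """Count reverts to an article within the trailing window.

    `revisions` is a chronological list of (timestamp, text) pairs.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(hours=window_hours)
    temperature = 0
    seen = []
    for ts, text in revisions:
        if ts >= cutoff and is_revert(text, seen):
            temperature += 1
        seen.append(text)
    return temperature
```

The special page mentioned below would then just be the articles sorted by
this count in descending order, above some threshold.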
Articles with a high revert temperature get listed on a special page --
which might serve as a short list for admins protecting articles.
(Automatic protection would probably be a BAD idea, since the software
can't tell whether a version is a blatantly vandalized one ... and it could
be deliberately gamed to lock in a maliciously edited version.)
Other technical cues might also kick in when an article's revert
temperature gets high ... such as a warning on the edit page that the
reverts are being noticed, recommending dispute resolution; or a
requirement that the edit summary be filled in; or the like.
--
Karl A. Krueger <kkrueger(a)whoi.edu>
Woods Hole Oceanographic Institution
_______________________________________________
WikiEN-l mailing list
WikiEN-l(a)Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/wikien-l