On Saturday 25 May 2002 12:01 pm, Brion wrote:
> As far as I know, the ability to protect pages from edits by non-sysop
> users was added to protect against frequent vandalism of the main page.
> Is it appropriate to be using this ability in a content war, for example
> [[Reciprocal System of Theory]]?
As a general rule, most definitely yes. But in this case the edit war had
degenerated into the outright replacement of one totally different version
with another, then back again, then another revert, etc. This amounts to a
type of vandalism and is counterproductive for all involved (although since
this "vandalism" is only being wrought on a single article, it makes FAR more
sense to protect the page than to even begin considering banning an IP).
The article can just as easily be unprotected as soon as things settle down
(the period of protection was never meant to last more than a few days --
just enough time for the partisans to talk about what to do next). If a sysop
ever has to do this, then the reason why it had to be done should always be
placed on the talk page for the page in question. As was done.
If that sysop makes a mistake in judgement, then another sysop can just as
easily reverse the protection. BUT the other sysops need to be informed that
such action is being taken. Perhaps it would be better to first bring this to
the mailing list instead of just placing a note on the talk page....
Just read the talk and especially the old talk for [[Reciprocal System of
Theory]]: one inherently POV version was being advocated by a single person
named Doug, and another NPOV version had been worked out and agreed upon by a
half dozen others. After a previous version war in January, Doug returned and
placed his POV version in the place of the agreed-upon NPOV version of
January. Then the version war started up again, with one version being
replaced by another in kind.
Some discussion should take place as to when this ability should be used. If
it is only appropriate to use the protection function of the software on the
main page, and maybe on special pages and some Wikipedia-specific pages, then
this function should be disabled by default for all but those pages.
Please reverse the protection of [[Reciprocal System of Theory]] if this
action was not appropriate.
maveric149
As far as I know, the ability to protect pages from edits by non-sysop
users was added to protect against frequent vandalism of the main page.
Is it appropriate to be using this ability in a content war, for example
[[Reciprocal System of Theory]]?
-- brion vibber (brion @ pobox.com)
My opinion would be no, because I thought we were never to use sysop privileges in a fight over content. I would like to hear what others think, though. I know there's been some conflict over the page.
KQ
You Wrote:
>As far as I know, the ability to protect pages from edits by non-sysop
>users was added to protect against frequent vandalism of the main page.
>Is it appropriate to be using this ability in a content war, for example
>[[Reciprocal System of Theory]]?
>
>-- brion vibber (brion @ pobox.com)
>
On Thursday 23 May 2002 12:01 pm, Axel wrote:
> That said, here's my minimalistic suggestion: everything works exactly
> as it does now, except that every page gets two additional links:
> "View last reviewed version of this article" and "I have reviewed this
> version of the article and I think it is ok". The history of every
> article would record who has reviewed which version of the article and
> when.
>
> The set of all "last reviewed versions" could then be seen as the
> "stable" Wikipedia and could be pressed on CD. This would at least
> guard against vandalism, stupid jokes and blatant propaganda and
> advertising that sometimes gets through.
>
Cool! What a fantastic idea! I also like the idea of having two versions of
an article, but I would simply call them Reviewed and Development. The
Reviewed one would be a static page that could be replaced with the
Development (editable) version of an article whenever a certain number of
reviewers give it the OK. It would also be most excellent to have an even
higher level of review, called "Peer Review", performed by somebody with a
related college degree.
This level of review would be roughly analogous to what Nupedia had (has?)
set up, but should be done in a better way. The devil is in the details here:
we may need to temporarily freeze an article while it is being peer reviewed
so the expert can fix any glitches, submit the article as peer reviewed, and
reopen the development version back to the masses (the reviewer(s) should
only have a very limited amount of time to review/fix the article while the
development version is locked). This might be a messy thing to do in practice,
though, and we should discuss it at length in order to work out the details
if we decide to do this at all.
So potentially we might have three versions of an article: one which is
world-editable, one which was voted to be OK by users, and one that was
tweaked and made to conform to higher Nupedia-like standards.
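Axel's two-link proposal quoted at the top amounts to one extra piece of
per-article state: which revisions have been marked reviewed, and by whom. A
minimal sketch in Python (every class and method name here is my invention,
not anything in the actual wiki software):

```python
from dataclasses import dataclass, field

@dataclass
class Article:
    title: str
    revisions: list = field(default_factory=list)   # revision texts, newest last
    reviews: dict = field(default_factory=dict)     # revision index -> reviewer names

    def review(self, revision_index, reviewer):
        # "I have reviewed this version of the article and I think it is ok"
        self.reviews.setdefault(revision_index, []).append(reviewer)

    def last_reviewed(self):
        # "View last reviewed version of this article"
        reviewed = [i for i, names in self.reviews.items() if names]
        return self.revisions[max(reviewed)] if reviewed else None

a = Article("Reciprocal System of Theory")
a.revisions += ["draft 1", "draft 2", "draft 3"]
a.review(1, "Axel")
print(a.last_reviewed())   # the newest revision anyone has reviewed: "draft 2"
```

The set of all `last_reviewed()` results across articles would be the
"stable" Wikipedia that could be pressed on CD.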
> The only issue is: who is allowed to review articles? The pragmatic
> answer would be: all sysops.
Somewhat disagree: I tend to agree with Chuck's previous comment here, in
that I think we should open this up to anybody who has been a user for over 3
months. I would add, though, that exceptions should be made for anybody who
has edited a reasonable set number of articles and who has the backing of at
least one sysop. We could update the software to give these users special
<i>additional</i> rights automatically after three months and/or after they
edit more than a set number of articles (thus making them eligible for sysop
promotion). We could then call these users "trusted hands" (which is already
in the wikiware code -- but a "trusted hand" doesn't have any more rights
than a regular user does in the current setup -- correct me if I am wrong,
Brion).
It would also be nice for a sysop to have the ability to "reset the clock",
as Chuck proposed, for anybody who doesn't follow established policy after
being warned. The warning process could be started by posting a message on a
special page by at least one person with "trusted hand" or greater status.
These warnings shouldn't do anything other than inform the user that "they
have been warned", with a reason why (there could be an automatic temporary
suspension of "trusted hand" status upon being warned -- but I'm not sure I
like that idea). These warnings could also automatically expire after a set
amount of time and/or number of edits unless somebody else renews the
warning. In addition, a "trusted hand" or sysop should be able to remove a
warning if a mistake has been made. If the user doesn't get his or her act
together during the warning period, a sysop could then reset the user's clock
(and this should have to be a <i>different</i> person than the one who
started the warning process).
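The workflow above (post a warning with a reason, let it expire or be
renewed, and have a <i>different</i> person reset the clock) can be sketched
roughly like this -- every name and threshold below is hypothetical, not
anything in the actual wiki software:

```python
import datetime

WARNING_LIFETIME = datetime.timedelta(days=30)   # assumed expiry; the text leaves this open

class UserRecord:
    def __init__(self, name, joined):
        self.name = name
        self.joined = joined        # the "clock" that trusted-hand status counts from
        self.warnings = []          # (reason, warner, issued_at)

    def warn(self, reason, warner, now):
        self.warnings.append((reason, warner, now))

    def active_warnings(self, now):
        # warnings expire automatically unless renewed
        return [w for w in self.warnings if now - w[2] < WARNING_LIFETIME]

    def remove_warning(self, reason):
        # a trusted hand or sysop can withdraw a mistaken warning
        self.warnings = [w for w in self.warnings if w[0] != reason]

    def reset_clock(self, sysop, now):
        # must be a *different* person than the one who started the warning
        if any(w[1] == sysop for w in self.active_warnings(now)):
            raise ValueError("the warner cannot also reset the clock")
        self.joined = now
```

The point of the sketch is only to show that the two-person rule and
automatic expiry are cheap to enforce in software.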
Blatant trolling or VANDALISM could also be logged on these warning pages.
The warner could choose from a checklist of options when warning the warnee.
Such a list might include these options: Introduced bias, Rambling
contributions, Falsification of facts, Nonsensical/inappropriate
contributions, Introduction of propaganda, Sloppiness, Violation of naming
conventions or VANDALISM. If VANDALISM is selected then in addition to the
warning page a VANDALISM in PROGRESS page can also be displayed on
RecentChanges with the "contributions of" the warnee automatically placed on
that page.
There should, however, be some safeguards to help ensure that honest mistakes
by contributors are not used as the basis to warn a person (perhaps have a
silent first warning that only the warner and warnee know of... but then how
would you make that work?). I'm concerned that such a system might be used
too much, and contributors might think twice about hitting the save button. If
misused, such a system could have negative effects on the number and extent
of contributions wikipedia receives. So, if something like this is deemed
necessary, then great care must be taken to minimize possible negative
side-effects. But something similar to this will probably be needed in the
coming years as the number of active contributors becomes larger than anyone
(including users like me who contribute at least 3-4 hours a day on average)
is able to keep track of.
> A code of honor is probably in order,
> saying that no sysop should review an article that they themselves
> substantially contributed to.
In principle this might be a good idea but in practice I don't think this
would work with the size of our current contributor-base (not to mention
sysop-base). As it is right now in many cases the people with the greatest
knowledge and ability to review an article have already significantly
contributed to it.
The "honor system" set-up may work after a few more years when wikipedia has
literally thousands of people contributing daily and has close to 100,000
articles (which is an arbitrary goal BTW -- I doubt we will all stop
submitting new articles when the project is "complete"). Then we could
reasonably expect that there would be enough "trusted hands" with sufficient
knowledge of the subject <i>and</i> who had not already contributed much to
the article to be able to vote it up or down.
Given all that -- I do have reservations that the power to warn and reset the
clocks of users may be abused and this could change the character of
wikipedia and discourage contributions. However, given that IP banning hasn't
<yet> been abused by any sysops I have hope that such a system might just
work.
If we could somehow ensure that we have reasonably competent reviewers, then
Wikipedia could (eventually) become what Nupedia was not able to be: an
extensive, trusted and useful source of human knowledge. We are almost at the
extensive stage -- should we work on a framework for establishing reasonable
trust, and therefore the usefulness, of our articles? I vote yes.
Just my 1.5 cents
maveric149
On Friday 24 May 2002 12:01 pm, Manning wrote:
> Beta-stable: In the words of Mr Horse (from Ren and Stimpy) "No Sir, I
> don't like it". One of LMS's early ideas was that when an article looked
> good enough it would get moved to Nupedia. Problem with that was that
> no-one ever actually went to Nupedia.
My opinion is that this particular idea of Larry's didn't work because there
wasn't <i>any</i> practical (or otherwise) integration of Nupedia with
wikipedia, <i>and</i> the two projects represented two different cultures and
methodologies -- not because the concept was inherently flawed.
If there is tight integration of this proposed functionality and it is made
very clear to viewers of the static pages that an editable version exists by
"clicking here", I think it just may work. Although to clarify, I think the
most current editable version of articles should be the one displayed by
default -- a user would have to click to view the reviewed version (the
difference between the two versions also needs to be made obvious and
intuitive).
> The same will happen here, if we create a Wikipedia that is static (can't
> be edited) then no-one will go there. Even the people who are truly only
> here to browse will gravitate to the dynamic site, simply because it is
> possible that that site "is more current".
Interesting idea, but I think this only really applies to casual visitors and
not to researchers or students who need reliable data. For example:
If I needed information about a topic I would put far more trust in a static
page that had been reviewed by 10 people with trusted status than I would
place on an "up to date" yet un-reviewed version. How could I trust the
validity of the world-editable version if I were writing a report? What use
would there be in citing and linking to the world-editable version if I can't
have a reasonable amount of confidence that the information I am citing will
be there when someone checking my work follows the link?
It would reflect very negatively on me if I cited something interesting I
found in a beautifully written and informative - yet world editable -
wikipedia article on say the Apollo Moon landings, and then a reviewer of my
work checked the link and instead of seeing what I saw, sees a totally
unrelated and unintelligible diatribe about CARROTS. Or worse, the version of
the article that the reviewer sees was rewritten by a pseudoscience freak who
is highly skeptical of NASA's "claim" that the Moon landings actually took
place. How is that going to make me and my work look to the reviewer?
> To prove this I'd recommend creating a one-off Static version, putting a
> link to it from the dynamic site and then monitoring web-traffic over a
> period of 2- 3 months. Even after the site has been around for a while I'd
> be willing to wager it would still be a very lonely place.
Sounds like a reasonable proposal -- I too am not totally convinced that any
proposed "review process" can actually work. But my concern is that the size
of our current contributor base is not yet large enough to make this
practical. There doesn't seem to be enough eyes yet to keep reviewed versions
of articles reasonably up-to-date. And there is of course a HUGE backlog of
articles that haven't been touched by any editor for multiple months. But
then, starting this process wouldn't hurt the current "unreliable" status of
wikipedia articles and may in time lead to wikipedia being viewed as being a
more trusted source of information (if only for the handful of reviewed
articles).
> If I am wrong, and the site does get a lot of traffic, well then we should
> spend the time and develop a proper management and quality control system.
> But until we have proved its viability, we would just be wasting effort -
> effort better spent on creating/editing articles.
I agree in part -- if we decide to do this, then we need to give it a lot of
thought and not rush into things (better to do it right than to do it fast).
But because this idea has such great potential, I think it <i>is</i>
something we should investigate and give a chance. This is something that
won't really get a lot of traffic at first - there needs to be a critical
mass of reviewed articles before traffic to them in general increases
significantly (even a hundred reviewed articles would be totally lost in the
current database).
If done correctly this could finally begin to make wikipedia a critically
respected source of human knowledge that people can depend on and come back
to over and over again.
-- maveric149
On Thursday 23 May 2002 12:01 pm, Bryan wrote:
> How about having a page which presents links to articles in decreasing
> order of time since they were last reviewed? If everyone every once in a
> while grabbed a few off the top to review, then eventually all of the
> articles would be cycled through on a regular basis.
Great idea!
maveric149
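Bryan's queue above is just a sort by last-review time; a minimal sketch with
made-up data (nothing here reflects the real wiki code):

```python
# List articles in decreasing order of time since they were last reviewed,
# so the longest-neglected ones float to the top of the review page.
def review_queue(last_reviewed):
    """last_reviewed maps article title -> timestamp of its last review."""
    return sorted(last_reviewed, key=lambda title: last_reviewed[title])

# Illustrative day numbers: smaller = reviewed longer ago.
queue = review_queue({"Apollo program": 5, "Carrot": 42, "NASA": 17})
print(queue)   # -> ['Apollo program', 'NASA', 'Carrot']
```

If everyone periodically grabbed a few titles off the front of this list, the
whole database would cycle through review on a regular basis, as Bryan says.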
This sounds like a trust metric, which the folks over at advogato.org are very fond of. There are some interesting ideas in this article: http://www.advogato.org/article/454.html
- Stephen Gilbert
----- Original Message -----
From: Axel Boldt <axel(a)uni-paderborn.de>
Date: Thu, 23 May 2002 01:17:39 +0200 (MET DST)
To: wikipedia-l(a)nupedia.com
Subject: [Wikipedia-l] beta/stable ideas
Honestly, I am not sure that anything in the direction of
review/stable release etc. should be done, at least as long as the
average article quality continues to rise, which I think it does.
People who complain that they can't trust anything they find in
Wikipedia should be commended for their healthy attitude; it is much
more dangerous to think "see, it says here, this is part of the stable
Wikipedia, so everything on this page must be right".
That said, here's my minimalistic suggestion: everything works exactly
as it does now, except that every page gets two additional links:
"View last reviewed version of this article" and "I have reviewed this
version of the article and I think it is ok". The history of every
article would record who has reviewed which version of the article and
when.
The set of all "last reviewed versions" could then be seen as the
"stable" Wikipedia and could be pressed on CD. This would at least
guard against vandalism, stupid jokes and blatant propaganda and
advertising that sometimes gets through.
The only issue is: who is allowed to review articles? The pragmatic
answer would be: all sysops. A code of honor is probably in order,
saying that no sysop should review an article that they themselves
substantially contributed to.
Axel
Hi all from extremely (and fatally) hot Hyderabad, India... 1037 dead so far in
this town alone. Anywhere else and this would be an international tragedy, here
it barely makes the papers.
Beta-stable: In the words of Mr Horse (from Ren and Stimpy) "No Sir, I don't
like it". One of LMS's early ideas was that when an article looked good enough
it would get moved to Nupedia. Problem with that was that no-one ever actually
went to Nupedia.
The same will happen here, if we create a Wikipedia that is static (can't be
edited) then no-one will go there. Even the people who are truly only here to
browse will gravitate to the dynamic site, simply because it is possible that
that site "is more current".
To prove this I'd recommend creating a one-off Static version, putting a link
to it from the dynamic site and then monitoring web-traffic over a period of
2-3 months. Even after the site has been around for a while I'd be willing to
wager it would still be a very lonely place.
If I am wrong, and the site does get a lot of traffic, well then we should
spend the time and develop a proper management and quality control system. But
until we have proved its viability, we would just be wasting effort - effort
better spent on creating/editing articles.
Cheers
Manning
(Back in civilisation around June 20)
*Doomsayers have successfully forecasted 35 of the last 2 problems in
Wikipedia's history*
> There are at least two big questions about how this
> ought to work:
>
> 1) How do we get new folks to dive in and
> contribute? Having your
> improvements show up *immediately* is a big draw to
> the wikipedia
> experience.
Well, we could call it beta and stable if we wanted to
and people could know that the beta version was more
up-to-date than the stable version. A lot of people
at Wikipedia are here just to read and they can keep
doing that and others are here to write and I suggest
we have a link on the bottom of each stable article
that says, "Edit this article in Beta". I imagine all
of us would still just watch the beta anyway. :)
> 2) How does the review process work? If very few
> people are interested in
> a topic, a new article or change might get
> completely ignored, and very
> good articles may never be seen; so limiting
> reviewing to certain trusted
> users would likely not be sufficient. On the other
> hand, it's child's
> play for organized vandals to set up secondary
> accounts to mod up their
> own work, as many discussion-oriented sites have
> discovered on
> establishing user-run comment moderation systems.
This is of course the hardest part. Let's form a
Cabal! :) When there are enough positive reviews from
enough experienced users (maybe those signed up for
over 3 months) and *no* negative reviews (all negative
reviews should have a reason why they voted negative),
then the article makes it to stable. If someone is
found abusing the system, they could be knocked back
to standard user with a 3 month waiting period to get
back.
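Chuck's rule above can be sketched in a few lines; the thresholds here are my
own guesses, not settled policy:

```python
MIN_POSITIVE = 3            # "enough positive reviews" -- assumed value
MIN_ACCOUNT_AGE_DAYS = 90   # "signed up for over 3 months"

def promotes_to_stable(reviews):
    """reviews: list of (account_age_days, is_positive) tuples."""
    positives = [r for r in reviews
                 if r[1] and r[0] >= MIN_ACCOUNT_AGE_DAYS]
    negatives = [r for r in reviews if not r[1]]
    # any negative review (which must come with a stated reason)
    # blocks promotion outright
    return not negatives and len(positives) >= MIN_POSITIVE
```

The "no negatives" veto is what distinguishes this from a simple vote count,
and it is also the part most exposed to the sock-puppet abuse Brion raises.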
This is only a suggestion! The values of course could
be tweaked, but I think this could work. It needs to
a) not encourage people to game their way into the
"Cabal" (um, reviewing committee) -- time-based
criteria work well for this; b) not be easily subject
to abuse; and c) be simple!
We need to have a base idea to start a system like
this, so I'm presenting one that we can discuss.
Chuck
=====
Come to my homepage! Venu al mia hejmpagxo!
http://amuzulo.babil.komputilo.org/
====
Come to the free, libre Esperanto encyclopedia
on the web! http://eo.wikipedia.com/