[WikiEN-l] Don't push POV into the Wikipedia software and policies

John Lee johnleemk at gawab.com
Thu Feb 24 12:02:05 UTC 2005


Karl A. Krueger wrote:

>On Wed, Feb 23, 2005 at 08:17:56PM +0800, John Lee wrote:
>
>>Karl A. Krueger wrote:
>>
>>> Therefore, any labeling of selected categories as "potentially
>>> offensive" is an NPOV violation and unacceptable on that regard.
>>
>>I refer you to the argument regarding disambiguation pages that has 
>>already been discussed by others on this list.
>
>	[snip]
>
>>It's not POV, and it's actually inconvenient for those of us looking
>>for other articles (imagine ...), but it's accepted because the
>>cost-to-benefit ratio is not that severe.
>
>You're right:  disambiguation isn't an NPOV problem.  It also isn't
>terribly relevant to the issue at hand, even if you repeat that claim
>five times in one post.  I'll respond to it just once.
>
>Here's the difference:  Disambiguation doesn't involve anyone trying to
>hide away material, or "protect" me from things "for my own good".  It
>also does not involve any judgment that some material is "less suitable"
>or "offensive" -- only that it is less likely to be looked-for.
>
Intent means nothing. The road to hell is paved with good intentions.
There is no difference between disambiguation pages and this proposal
beyond their intent.

>In contrast, any kind of censorship / filtering regime necessarily
>contains within it the assumption that some material is _lesser_ than
>others, by dint of offending the protected group ... and moreover, that
>some people's offense is unimportant, while other people's offense is
>important.  It is not presented as a guess, but as a moral injunction
>("this material IS offensive, it MUST BE turned off by default") which
>it is not within Wikipedia's purview to make.
>
We don't judge material to be offensive; rather, we judge it to be
likely undesirable for a good portion of our readers. Filtering images
does not compromise NPOV because readers still have the choice in
their own hands. By that logic, I suppose deliberately not linking
directly to Goatse/Tubgirl is also going against NPOV and assuming
that material is lesser. Likewise with not hosting any such images at
all for easy access (surely one of the most-viewed websites on the
internet is fair game for a screencap?), even though it's very easy to
obtain and upload them under fair use.

>>However, I have also pointed out before that photographs have more
>>psychological impact than illustrations, which have more psychological
>>impact than text (just try asking a psychologist). Therefore, a good
>>many of the people offended (or at least annoyed) by the photo formerly at
>>[[autofellatio]] probably wouldn't feel the same way about text.
>
>I don't think we actually have sufficient evidence on that.  Again,
>consider _Ulysses_, a work of text (no images) that was once banned in
>the U.S. for obscenity.
>
That was then; this is now. By that reasoning, we might as well
disregard all editorial decisions regarding propriety, but Wikipedia
should progress with society, not force it ahead.

>One of the past discussions related to this issue had to do with whether
>Wikipedia should link to sexually explicit material offsite, rather than
>keeping copies of them within the database.  The idea was that people
>who are judgmental about such material would thus not be able to decry
>Wikipedia as a collection of pornography.  However, it seemed to me (and
>still does) that such folks would just as easily call Wikipedia a Web
>directory of pornography.
>
If it's going to be as distasteful as the autofellatio image, that
would seem a reasonable objection to me, especially considering how
hard it is to find pornographic material online without a package of
spyware coming along with it. No, I think it's better to host such
material but filter certain things by default. That, IMO, is much
fairer to our readers.

>>I have never argued for catering to the whim of extremists. That's
>>stupid. There is no reason why we shouldn't filter certain images,
>>though.
>
>It seems to me that when challenged, you rather readily lean on what you
>call "common sense".  I'm not sure what you mean by "common sense",
>since some of the points that we're discussing are actually pretty deep
>political ideas and not everyday issues at all.  "Common sense" usually
>means people's knowledge of the everyday, like "things fall when you
>drop them".
>
Common sense as in "communism is an extremely unrealistic ideal because 
it presumes everyone will be honest and altruistic". That sort of thing.

>It sounds to me kind of like you're using "common sense" to mean that
>everyone reasonable must agree with your assumptions, and anyone who
>disagrees is an "extremist" and their views should be ignored.
>
>I don't think that's a very convincing mode of argument.
>
Extremists shouldn't be ignored, since something useful occasionally
comes from them, but seriously, how many things can you push to an
extreme without it becoming a bad thing? Extreme freedom results in
anarchy. Extreme eating results in obesity. Extreme dieting results in
malnutrition, anorexia or bulimia. I believe in striking a balance,
and a filtering system strikes that balance.

>>It's the cost-to-benefit ratio we're talking about here: You're far
>>more likely to find someone damn pissed to be looking at a photograph
>>of a clitoris than to find someone damn pissed to be looking at a
>>photograph of a caterpillar.
>
>That position generalizes very poorly to other classes of articles,
>though.  (And generalizability is a must for rules.)  The fact that
>people get "damn pissed" about something is no reason not to keep it in
>an encyclopedia.  There are an awful lot of _historical facts_ it's
>worth being damn pissed about.
>
The difference is that publishing them won't give people grounds to
attack you. If we had some sort of filtering system, we could always
point to it and say "We tried". That counts for something. If we let
images like the autofellatio one be, a decent number of publications
will carry the story when it breaks, quite likely with a neutral or
even slightly anti-Wikipedia bent. If we have a filtering system,
though, even a rudimentary one, we can always point to it for some
ass-coverage. It won't protect us from severe right-wingers, but at
least public opinion will be sufficiently divided.

>Moreover, it isn't clear to me that we can do a cost-benefit analysis of
>the principles on which Wikipedia was founded, such as NPOV.  Neutrality
>is not just a benefit, but a basic ground-rule of Wikipedia.
>
Of course. The problem is that taking it to an extreme is a bad idea.
An extreme definition of NPOV means we could never use weasel words
(even in the lead section). An extreme definition of NPOV leads to all
sorts of weird extrapolations, e.g. that letting one article contain
chart data but not another is POV, or that not giving an equal share
to all sides of an argument is POV. Like most Wikipedia policies so
far, NPOV can easily be boiled down to "Don't let any bias show in the
writing of the article".

>(NPOV for Wikipedia is not just a "benefit", but rather a "boundary".
>We don't ask, "Is this violation of NPOV worth what it gets us?"
>Rather, we take it as read that any violation of NPOV is a bad thing,
>and we try to avoid such violations wherever possible rather than making
>excuses for them.  We don't always _succeed_, but we don't give up.)
>
Since filtering images does not violate NPOV, I don't see what this has 
to do with it.

>>Besides, it's not really an imposition of someone's POV as to what is 
>>offensive. It's an editorial decision about what images would be 
>>potentially offensive or greatly distracting to a substantial number of
>>people.
>
>Sure, and I'll just go make an "editorial decision" that homeopathy is
>bunkum, and put that all over the article [[Homeopathy]].  When someone
>accuses me of an NPOV violation, I'll just tell them it was nothing of
>the sort, it was an "editorial decision".  :)
>
But it is an NPOV violation. That cannot be compared to filtering 
images, since all it takes is one click to view them, and no more 
technical expertise is required than knowing how to move and click a mouse.

>Calling a tail a leg doesn't make it one.  An NPOV violation is still
>against Wikipedia policy, even if you call it "editorial decision".
>
Yes, not using disambiguation pages on [[Hey Jude]] is an NPOV 
violation. We shouldn't be disenfranchising the 1% of visitors who might 
want the album instead of the song.

>>>1. All images are treated equally by default -- either displayed, or not
>>> displayed.  Thus, the system does not incorporate any POV biases such
>>> as the idea that nudity is "offensive" and caterpillars are not.
>>
>>I do not entertain the idea of such extremist pandering to NPOV. By 
>>virtue of that logic, we might as well host a disambiguation page at 
>>[[Chicago]] even though almost everyone will be looking for [[Chicago, 
>>Illinois]].
>
>If you think NPOV is "extremist" and that that's bad, you're on the
>wrong project.  This is Wikipedia, where NPOV is a basic ground-rule.
>
NPOV isn't extremist, but giving it an extremist bent, as I pointed
out above, is bad.

>Or, in another sense, there's nothing wrong with being extreme about
>NPOV.  It's what we do here.  Wikipedia _is_, in this sense, an
>extremist project; that's why there are so many people from Britannica,
>mainstream media, and other "normal" "middle-of-the-road" publications
>who can't figure us out, and assume that we're doomed to failure.
>
If Wikipedia were extremist, it would have no editorial policies, no 
arbcom, no administrators, no page protection. It'd be a giant 
free-for-all. After all, that's what an extreme interpretation of a wiki 
is, isn't it?

>>The trouble is as Wikipedia grows, we seem to have a penchant for
>>relying on legalistic issues. Instead of having a general policy of
>>"reverting except for vandalism is bad" we need to have the 3RR which
>>is easily gamed. Instead of having a policy on personal attacks, we
>>have to go through the whole merry-go-round of RFC, RFM and RFAr.
>
>Sounds to me like your proposal would be more of this, rather than less.
>
>It would end up with nasty, unresolvable arguments over what should go
>in the "off by default" categories, because of what those categories
>would really mean:
>
>If the category "Nudity" is off by default, then the meaning of that
>category would really be not just "these images contain nudity" but
>rather "these images contain nudity AND Wikipedia believes nudity is
>offensive".
>
No, it would imply "these images contain nudity, and the style of
their depiction has convinced Wikipedia's editors that they would be
unsuitable for the general viewing public, although if you want to
view this image, there's nothing stopping you". Basically, it's giving
readers a choice. The difference is that the default choice is to hide
images we know _will_ offend a good number of people.
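
To make this concrete, here's a minimal sketch in Python (my own
illustration -- the category names, the should_display function and
the preference map are all hypothetical, not anything from actual
MediaWiki code) of how a category-based, default-off filter with a
per-reader override might behave:

# Categories whose images start out collapsed by default.
HIDDEN_BY_DEFAULT = {"nudity", "graphic-violence"}

def should_display(image_categories, user_preferences):
    """Return True if the image is shown inline, False if collapsed.

    user_preferences maps a category name to True (always show) or
    False (always hide); unlisted categories fall back to the default.
    A collapsed image is still one click away, so the reader keeps
    the final choice.
    """
    for category in image_categories:
        if category in user_preferences:
            # An explicit reader preference always wins over the default.
            if not user_preferences[category]:
                return False
        elif category in HIDDEN_BY_DEFAULT:
            return False
    return True

print(should_display({"nudity"}, {}))                # False: collapsed by default
print(should_display({"nudity"}, {"nudity": True}))  # True: reader opted in
print(should_display({"caterpillar"}, {}))           # True: ordinary images unaffected

Note that nothing is ever removed; the software only changes which
images start out collapsed, and an explicit reader preference always
wins.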

>And that is a judgment that Wikipedia still has no business making; and
>it is a judgment that would cause endless dispute.  Just as Wikipedia
>does not have an opinion on the issue of whether homeopathy is bunkum,
>Wikipedia does not have an opinion on whether nudity is offensive.
>
Of course it doesn't. It just hides such images by default because a
great many readers _do_ find them offensive. For those who don't, it's
easy to turn the filter off.

John Lee
([[User:Johnleemk]])



