Karl A. Krueger wrote:
On Tue, Feb 22, 2005 at 07:24:35PM +0800, John Lee
wrote:
What's your problem with letting others
disable the viewing of certain
images for themselves?
I fully support end-users using whatever systems are at their disposal
to avoid seeing images that they do not want to see. Those systems are
chiefly at the client end, and include goofy censorware programs as well
as working facilities such as turning images off in the browser before
you open up an article whose title suggests that it might traumatize
your poor innocent eyebulbs.
The problem is these are all far too fuzzy, either catching too few
unwanted images or too many wanted ones. Most of them also don't provide
easy access to the images should the reader decide they want to see them.
For example, even if some censorware did figure out that all the images on
[[autofellatio]] should be blocked, and didn't do the same to a random
bunch of unrelated Wikipedia articles, it's hard to disable it only for
the images the reader does want to view (e.g. they may not mind an
illustration, but would find a photograph undesirable).
The problem is that you are still viewing this as an attempt to bowdlerise
Wikipedia so it's suitable for consumption by right-wing wackos or the
underaged. It will never be. This is an attempt to give readers fine
control over what they are viewing, to prevent unnecessary distraction
(images like the one originally on autofellatio distract too much from
the article for people not used to nudity, even if they aren't offended
by it) and at the same time (this is a side benefit: something nice to
have, but not the main pillar of the argument) to take a little of the
sting out of accusations that Wikipedia is offensive.
Let's try summarising your stance:
The core of my position is really simple: There isn't really a problem.
The status quo regarding images is actually working extremely well.
Gratuitous (uninformative) images do get removed or replaced, especially
those gratuitous images that also offend some people.
The problem is, these cases are beginning to take up a great deal of
editors' time, as Raul654 pointed out in his quote from my earlier
email. We should be building an encyclopedia, not squabbling over these
details. Putting the decision in readers' hands means the issue is out
of our hands. Sure, if we go on with the status quo, these crappy images
will disappear. They just disappear only after reams of discussion, maybe
a few blocks, and God knows what else.
At the same time, censorious editors are not allowed to
vandalize
Wikipedia by deleting or concealing informative images just because they
are offended by caterpillars or clitorises.
Good.
This is all as it should be. As the community gains
more experience and
precedent cases, and more understanding of what to watch for in order to
detect vandalism (of either kind), it will become easier rather than
harder.
Somehow I doubt that. Things rarely get solved without something
official. Michael and Wik have caused a lot of problems, and nothing was
solved until Jimbo or the arbcom stepped in. We got trapped in this
autofellatio quagmire as well even though we already knew from the
clitoris dispute that this would occur.
Because
that is what your (and others') "filtering" proposals are. They
are not really about making Wikipedia "safe" or "inoffensive", since
those goals have been demonstrated to be impossible. (As long as
Wikipedia is an editable wiki, there is no way to provide assurances
that it is inoffensive. As long as it is NPOV, there is no way to
decide _whose_ offense shall be considered significant. And those who
really militate against "indecency" and the like, in law and public
policy, are offended by text just as much as by images.)
So because it's impossible, we should never strive for it?
It's possible to do a remarkable amount of harm while pursuing
unreachable goals.
True.
I don't think you actually understood what I
wrote,
though, so I'll try to spell it out a little more clearly:
1. As long as Wikipedia is a wiki, bozos and jerks can put "offensive"
content into it. Vandalism may not last long, but it does not have
to. Therefore, we can _never_ provide a guarantee that any given
article will not have goatse on it. And thus, Wikipedia will _never_
be suitable for classrooms where such a guarantee is required.
Absolutely.
2. As long as Wikipedia has an NPOV policy, there is no
way to decide
whose view of what is "offensive" shall be used for labeling. Some
people are offended by sex, some by violence, some by caterpillars,
and some get seizures from viewing [[moire]] images.
Therefore, any labeling of selected categories as "potentially
offensive" is an NPOV violation and unacceptable on that regard.
I refer you to the argument regarding disambiguation pages that has
already been discussed by others on this list. Roughly, it goes like
this: [[NSA]] leads to the National Security Agency and not the
National Stuttering Association or any other particular article, or even
the disambiguation page (although that would be the most NPOV), for the
sake of readers' convenience. It's not strictly neutral, and it actually
inconveniences those of us looking for other articles, but it's accepted
because the cost-to-benefit ratio is not that severe.
Here's an example (purely theoretical; to the best of my knowledge it has
never occurred) of how not automatically using a disambiguation page
where there's more than one related article is POV:
Let's say the article [[Greenwar]] is about an organisation opposed to
Greenpeace's environmentalism (assume Greenwar is as well-known and
prevalent as Greenpeace). However, we also maintain an article about an
organisation of the same name that is pro-environmentalism (they just
favour violent methods of protest). Now, what do we do? People looking
for the article about the pro-environmentalism organisation will be
pissed off. Replacing [[Greenwar]] with a disambiguation page is the
most neutral solution, but it's simply unfair since most people will be
looking for the anti-environmentalism organisation.
3. Your and others' proposals for image censorship
have been defended in
part on the grounds that people will dislike or distrust Wikipedia
because it contains images that offend them. However, this is as
true of text as it is of images.
Some people, for instance, consider the sentence "Dave (age ten) and
Joe (age twelve) had sex" to be child pornography, just as some
people believe that a drawing can be child pornography. I believe
that sentence might even be illegal to publish in some jurisdictions.
That's just a side benefit (IMO) as I already pointed out. However, I
have also pointed out before that photographs have more psychological
impact than illustrations, which have more psychological impact than
text (just try asking a psychologist). Therefore, a good number of the
people offended (or at least annoyed) by the photo formerly at
[[autofellatio]] probably wouldn't feel the same way about text.
We should never try to please extreme right- or left-wingers, because
such a thing is simply impossible. But for those leaning to the centre
(I generally have my centre-right father in mind), it's not the text so
much as the photographs that disturb them.
Therefore, the need for any system for image censorship implies the
need for a system for text censorship. I personally suspect that
text censorship would drive off a lot of editors and would harm
Wikipedia greatly.
Obviously. Taking things to an extreme is a very, very bad thing (even
with NPOV, which is why for very prominent topics we do not put a
disambiguation page at the main title, but rather the article most people
will want to view). I have never argued for catering to the whims of
extremists. That's stupid. There is no reason why we shouldn't filter
certain images, though.
Wikipedia can
never be guaranteed to be safe for viewing, but we can
_try_ to mitigate the problem. The proposal will not even affect users
who explicitly declare they refuse filtering, so what's the fuss?
Censorship by default is still censorship, and still is an imposition of
someone's POV as to what is "offensive" or not. If you block clitorises
by default and don't block caterpillars, then you're offending against
those who find caterpillars more objectionable than clitorises.
Refer to the argument regarding disambiguation. It's the cost-to-benefit
ratio we're talking about here: you're far more likely to find someone
damn pissed to be looking at a photograph of a clitoris than to find
someone damn pissed to be looking at a photograph of a caterpillar.
Of course, the latter's opinion has to be respected, which is why we
have an NPOV policy (we never present even a near-universally held
opinion as fact). But in minor cases like this...
Besides, it's not really an imposition of someone's POV as to what is
offensive. It's an editorial decision about what images would be
potentially offensive or greatly distracting to a substantial number of
people, just as the decision to disambiguate, or to house a certain
article at a certain title, is an editorial decision about what content a
substantial portion of readers will be looking for.
Here's how to make an image-filtering system that is NPOV by design:
1. All images are treated equally by default -- either displayed, or not
displayed. Thus, the system does not incorporate any POV biases such
as the idea that nudity is "offensive" and caterpillars are not.
I do not entertain the idea of such extremist pandering to NPOV. By
virtue of that logic, we might as well host a disambiguation page at
[[Chicago]] even though almost everyone will be looking for [[Chicago,
Illinois]].
2. IF the system blocks all images by default: Both
anonymous and
logged-in users are able to display all images. The system does not
violate WikiNature by incorporating a bias against anonymous users.
If we are to avoid discriminating against anons (although we already do
so in extremely minor ways, like date formatting, colour scheme, etc.),
that can be done with a cookie, although people who disable cookies
would be disenfranchised. (It's still just a tradeoff, almost exactly
like the traditional disambiguation page tradeoff.)
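
For the sake of concreteness, here is a minimal client-side sketch (in
TypeScript) of what a cookie-backed preference for anonymous readers
might look like. The cookie name and the CSS class marking filtered
images are made up for illustration; this is not an existing MediaWiki
feature.

// A minimal sketch, NOT actual MediaWiki code: storing a "show all images"
// preference for anonymous readers in a cookie. The cookie name and the
// CSS class used to mark filtered images are hypothetical.
const PREF_COOKIE = "showAllImages"; // hypothetical cookie name

function readPreference(): boolean {
  // document.cookie is a "; "-separated list of name=value pairs.
  return document.cookie
    .split("; ")
    .some((pair) => pair === `${PREF_COOKIE}=1`);
}

function writePreference(showAll: boolean): void {
  // Persist for roughly a year. Readers with cookies disabled simply keep
  // the default behaviour, which is the tradeoff mentioned above.
  const maxAge = 60 * 60 * 24 * 365;
  document.cookie = `${PREF_COOKIE}=${showAll ? "1" : "0"}; max-age=${maxAge}; path=/`;
}

function applyPreference(): void {
  // "filtered-image" is a hypothetical class the wiki would put on images
  // that are hidden by default.
  const showAll = readPreference();
  document.querySelectorAll<HTMLElement>("img.filtered-image").forEach((img) => {
    img.style.display = showAll ? "" : "none";
  });
}

The point is only that the mechanism is tiny: one cookie, one toggle, and
no server-side account required.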
3. The system does not include any hard-coded categories reflecting the
POV of its creators. It uses the flexible Wikipedia category system.
Editors who contribute images are encouraged to categorize them
topically. "Uncategorized images" becomes a virtual category, which
can also be selected.
Er...I thought we *were* going to use the Wikipedia category system.
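
To illustrate what relying on the existing category system could mean in
practice, here is a rough sketch (again TypeScript, with made-up names)
of the per-image decision, treating "Uncategorized images" as a virtual
category as described above.

// A rough sketch of the per-image decision, assuming each image carries the
// names of the ordinary, topical categories it belongs to. All names below
// are made up for illustration.
interface ImageInfo {
  title: string;
  categories: string[]; // empty if the image is uncategorised
}

// "Uncategorized images" is the virtual category described above.
const UNCATEGORIZED = "Uncategorized images";

// True if the image should be shown to a reader who has chosen to hide the
// given set of categories; no category is special-cased by the software.
function shouldDisplay(image: ImageInfo, hidden: Set<string>): boolean {
  if (image.categories.length === 0) {
    return !hidden.has(UNCATEGORIZED);
  }
  return !image.categories.some((cat) => hidden.has(cat));
}

// Example: a reader who hides images of caterpillars and uncategorised images.
const hiddenByReader = new Set([UNCATEGORIZED, "Caterpillars"]);
console.log(shouldDisplay({ title: "Moth larva.jpg", categories: ["Caterpillars"] }, hiddenByReader)); // false
console.log(shouldDisplay({ title: "Chicago skyline.jpg", categories: ["Chicago"] }, hiddenByReader)); // true

Note that the software itself carries no opinion about which categories
are "offensive"; the reader supplies the list.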
4. Existing NPOV policy is applied in full force to
categories. This
means that it is a violation of Wikipedia policy to create categories
such as "images appropriate for children" or "images of sexual
perversions". Editors who do so persistently are treated the same as
any other systematic POV-pushers.
Agreed.
Although such a system would not be an NPOV violation,
it would still be
(A) prone to abuse, and (B) prone to cause NPOV conflicts among editors.
Of course it would be prone to cause conflicts, because it flies in the
face of common sense (the first requirement, I mean). The trouble is that
as Wikipedia grows, we seem to have a penchant for relying on legalistic
mechanisms. Instead of having a general principle that reverting anything
except vandalism is bad, we need the 3RR, which is easily gamed. Instead
of having a simple policy on personal attacks, we have to go through the
whole merry-go-round of RFC, RFM and RFAr.
Moreover, it is not necessary, as Wikipedia's
present editing regime
works great for dealing with the images which are most commonly cited as
problematic, that is, the _uninformative_ "offensive" ones.
Problems like these are going to crop up more often over the years, and
I am pessimistic that more than a very few editors will learn the
lessons of this debate. When the community is divided like hell on this
(in other words, you would need hell to freeze over before obtaining
anything even resembling consensus), we need something to defuse the
situation.
What works today probably won't work tomorrow. The question we must
always ask regarding policy is "Is it scalable?" Wikipedia is a
hundred times smaller than MSN today. It won't be that way in a few
years if we keep growing at this pace. Everything we have in place will
need to cope with problems a hundred times larger than they are now, and
a hundred times more of them.
Of course, if a scalable policy is detrimental to things in the
short-term, it can be delayed. We should never be afraid of questioning
the status quo, though, and right now, a good number of people don't
think the status quo will work for us when everything is a hundred times
bigger than it is now.
Thus, it is _possible_ to have an NPOV-safe
image-filtering system. But
it would be a constant struggle to *keep* it NPOV-safe.
Of course it would be, when such a system would make not using a
disambiguation page at [[Chicago]] taboo.
Moreover, such
a system is not what censorship advocates seem to want, and in any event
it would not improve Wikipedia.
Of course, because it flies in the face of common sense. Extremist
solutions always do that (it's the reason why pure capitalism/communism
and pure totalitarianism/anarchy will never ever work).
John Lee
([[User:Johnleemk]])