I'm going to ignore most of Mr. Lee's speculations and get right to the
point instead, in the hope that we are not causing too much harm to the
signal-to-noise ratio on this list.
On Tue, Feb 22, 2005 at 07:24:35PM +0800, John Lee wrote:
> What's your problem with letting others disable the viewing of
> certain images for themselves?
I fully support end-users using whatever systems are at their disposal
to avoid seeing images that they do not want to see. Those systems are
chiefly at the client end, and include goofy censorware programs as well
as working facilities such as turning images off in the browser before
you open up an article whose title suggests that it might traumatize
your poor innocent eyebulbs.
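(As an aside -- and purely as an illustrative sketch of my own, not
anything Wikipedia ships -- a comparable "don't show me images until I
ask for them" effect can be had with a few lines of client-side
scripting. The function name and behaviour below are invented for the
example, and unlike the browser's image-loading preference this only
hides images after they have already loaded.)

    // Hypothetical client-side sketch (TypeScript): replace every image
    // on the page with a placeholder, and restore an image only when
    // the reader deliberately clicks its placeholder.  Runs entirely in
    // the reader's own browser; the wiki itself is untouched.
    function hideImagesUntilClicked(doc: Document): void {
        doc.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
            const placeholder = doc.createElement("button");
            placeholder.textContent = "[image hidden -- click to show]";
            placeholder.addEventListener("click", () => {
                placeholder.replaceWith(img); // put the image back
            });
            img.replaceWith(placeholder);     // swap the image out
        });
    }

    // Run once the page has finished loading.
    document.addEventListener("DOMContentLoaded", () =>
        hideImagesUntilClicked(document));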
However, I suspect it would be possible to build an NPOV-safe image-
filtering system for Wikipedia ... although I do not think it would do
what you would want it to do. I'll deal with this in detail below.
> Let's try summarising your stance:
The core of my position is really simple: There isn't really a problem.
The status quo regarding images is actually working extremely well.
Gratuitous (uninformative) images do get removed or replaced, especially
those gratuitous images that also offend some people.
Vandalism of articles by inserting shock-site images is dealt with in
the same way as any other vandalism.
At the same time, censorious editors are not allowed to vandalize
Wikipedia by deleting or concealing informative images just because they
are offended by caterpillars or clitorises.
This is all as it should be. As the community gains more experience and
precedent cases, and more understanding of what to watch for in order to
detect vandalism (of either kind), this will become easier rather than
harder.
> > Because that is what your (and others') "filtering" proposals are.
> > They are not really about making Wikipedia "safe" or "inoffensive",
> > since those goals have been demonstrated to be impossible. (As long
> > as Wikipedia is an editable wiki, there is no way to provide
> > assurances that it is inoffensive. As long as it is NPOV, there is
> > no way to decide _whose_ offense shall be considered significant.
> > And those who really militate against "indecency" and the like, in
> > law and public policy, are offended by text just as much as by
> > images.)
>
> So because it's impossible, we should never try to strive for it?
It's possible to do a remarkable amount of harm while pursuing
unreachable goals. I don't think you actually understood what I wrote,
though, so I'll try to spell it out a little more clearly:
1. As long as Wikipedia is a wiki, bozos and jerks can put "offensive"
content into it. Vandalism may not last long, but it does not have
to. Therefore, we can _never_ provide a guarantee that any given
article will not have goatse on it. And thus, Wikipedia will _never_
be suitable for classrooms where such a guarantee is required.
2. As long as Wikipedia has an NPOV policy, there is no way to decide
whose view of what is "offensive" shall be used for labeling. Some
people are offended by sex, some by violence, some by caterpillars,
and some get seizures from viewing [[moire]] images.
Therefore, any labeling of selected categories as "potentially
offensive" is an NPOV violation and unacceptable in that regard.
3. Your and others' proposals for image censorship have been defended in
part on the grounds that people will dislike or distrust Wikipedia
because it contains images that offend them. However, this is as
true of text as it is of images.
Some people, for instance, consider the sentence "Dave (age ten) and
Joe (age twelve) had sex" to be child pornography, just as some
people believe that a drawing can be child pornography. I believe
that sentence might even be illegal to publish in some jurisdictions.
Therefore, any argument for a system of image censorship implies the
need for a system of text censorship as well. I personally suspect
that text censorship would drive off a lot of editors and would harm
Wikipedia greatly.
> Wikipedia can never be guaranteed to be safe for viewing, but we can
> _try_ to stem it. The proposal will not even affect users who
> explicitly declare they refuse filtering, so what's the fuss?
Censorship by default is still censorship, and still is an imposition of
someone's POV as to what is "offensive" or not. If you block clitorises
by default and don't block caterpillars, then you're offending against
those who find caterpillars more objectionable than clitorises.
Here's how to make an image-filtering system that is NPOV by design (a
rough code sketch follows the list):
1. All images are treated equally by default -- either displayed, or not
displayed. Thus, the system does not incorporate any POV biases such
as the idea that nudity is "offensive" and caterpillars are not.
2. IF the system blocks all images by default: Both anonymous and
logged-in users are able to display all images. The system does not
violate WikiNature by incorporating a bias against anonymous users.
3. The system does not include any hard-coded categories reflecting the
POV of its creators. It uses the flexible Wikipedia category system.
Editors who contribute images are encouraged to categorize them
topically. "Uncategorized images" becomes a virtual category, which
can also be selected.
4. Existing NPOV policy is applied in full force to categories. This
means that it is a violation of Wikipedia policy to create categories
such as "images appropriate for children" or "images of sexual
perversions". Editors who do so persistently are treated the same as
any other systematic POV-pushers.
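To make the shape of such a system concrete, here is a rough sketch.
It is my own illustration only: the type names, preference fields, and
category strings are invented for the example, and none of this is an
existing MediaWiki feature. Every image is treated identically unless
the reader has personally chosen to hide some ordinary topical
categories, images with no categories fall into the virtual
"Uncategorized images" category, and anonymous readers hold the same
kind of preference object as logged-in ones.

    // Sketch of an NPOV-safe filter: the software has no built-in
    // notion of "offensive", only the reader's own selection of
    // ordinary topical categories to hide.
    interface WikiImage {
        title: string;
        categories: string[]; // topical categories assigned by editors
    }

    interface ReaderPrefs {
        // Point 1: one default applied to *every* image equally.
        // Point 2: if a site blocked all images by default, this flag
        // would still be flippable by anonymous and logged-in readers
        // alike.
        showImagesByDefault: boolean;
        // Topical categories this reader has chosen to hide.  Ships
        // empty: the software itself hard-codes no categories (point 3).
        hiddenCategories: Set<string>;
    }

    // Virtual category for images whose contributors assigned none.
    const UNCATEGORIZED = "Uncategorized images";

    function effectiveCategories(img: WikiImage): string[] {
        return img.categories.length > 0
            ? img.categories
            : [UNCATEGORIZED];
    }

    // Decide whether to render a given image for a given reader.
    function shouldDisplay(img: WikiImage, prefs: ReaderPrefs): boolean {
        const hidden = effectiveCategories(img)
            .some((c) => prefs.hiddenCategories.has(c));
        return !hidden && prefs.showImagesByDefault;
    }

    // Example: a reader who, for their own reasons, hides caterpillars.
    const prefs: ReaderPrefs = {
        showImagesByDefault: true,
        hiddenCategories: new Set(["Caterpillars"]),
    };
    console.log(shouldDisplay(
        { title: "Monarch larva", categories: ["Caterpillars"] },
        prefs)); // -> false
    console.log(shouldDisplay(
        { title: "Clitoris diagram", categories: ["Anatomy"] },
        prefs)); // -> true

Point 4 is, of course, a policy matter rather than a software one: no
amount of code stops editors from creating a "not suitable for
children" category, which is exactly the kind of thing that would have
to be policed.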
Although such a system would not be an NPOV violation, it would still be
(A) prone to abuse, and (B) prone to cause NPOV conflicts among editors.
Moreover, it is not necessary, as Wikipedia's present editing regime
works great for dealing with the images which are most commonly cited as
problematic, that is, the _uninformative_ "offensive" ones.
Thus, it is _possible_ to have an NPOV-safe image-filtering system. But
it would be a constant struggle to *keep* it NPOV-safe. Moreover, such
a system is not what censorship advocates seem to want, and in any event
it would not improve Wikipedia.
--
Karl A. Krueger <kkrueger@whoi.edu>