On Wed, Feb 16, 2005 at 03:55:26PM -0500, Jim Trodel wrote:
"Karl A. Krueger" <kkrueger(a)whoi.edu>
wrote:
>> The class of images I suspect more people are concerned about is not
>> the class "offensive images", but rather the class "gratuitously
>> offensive images". Most everyone recognizes that there is also a
>> class of "informative images which also offend some people" -- for
>> instance, internal organs, caterpillars, swastikas, hammer-and-sickles,
>> Abu Ghraib, Jesus fish, etc. -- and that we must use these images in
>> articles where they are relevant.
> I agree; in fact, this is a much better description of what I meant
> earlier -- "gratuitously offensive image". This would meet that
> description, although it does convey the meaning that it can be done.
I agree. Images can be doctored, after all. Images taken from porn are
somewhat less than authoritative -- I believe airbrushing and other
image doctoring practices are relatively common there.
If the issue here is proof that the act is possible, a citation to a
reputable source on the subject would be more worthwhile.
I don't think that the "proof that it's possible" argument is a very
serious reason to include the image. We don't include pictures of
caterpillars in order to prove that caterpillars exist.
>> Unfortunately, the state of the world today is that "offensive" text
>> is much easier for automated systems such as censorware to recognize
>> than "offensive" images. It is easier for a program to pattern-match
>> the word "fellatio" than a picture of same.
> Doesn't some censorware do its job based on a page-by-page analysis
> rather than an entire-domain block? If not, maybe I have a new
> project to work on :)
Yes, for instance, according to their Web lookup tool, Cyber Patrol does
not block Wikipedia but does block /wiki/Autofellatio. I don't know if
that block is by URL or by content, though.
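The text-versus-image asymmetry described above can be sketched in a few
lines of Python. This is purely illustrative -- no real censorware is
this simple, and all names here are made up:

```python
# Illustrative sketch: a blocklisted word is a trivial regex match,
# while an image is opaque bytes with no comparably simple signature.
import re

BLOCKLIST = re.compile(r"\bfellatio\b", re.IGNORECASE)

def flag_text(page_text):
    """Censorware-style text check: does the page contain a listed word?"""
    return bool(BLOCKLIST.search(page_text))

def flag_image(image_bytes):
    """No comparably simple rule exists for pixels; recognizing an
    "offensive" picture would require genuine image recognition,
    which is far harder and more error-prone."""
    return False  # placeholder: raw bytes reveal nothing by themselves

print(flag_text("An article about Fellatio."))       # True
print(flag_text("An article about caterpillars."))   # False
```

The word check is one regex call; the image check has no equally cheap
equivalent, which is the asymmetry censorware vendors face.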
> I, for one, would prefer to read the article (especially as an editor)
> without having to consciously ignore the images -- or block all images
> with my browser, as has been suggested.
Blocking images at the browser is an excellent way for users to take
charge of their own browsing experience, whenever they knowingly go to
a site or page that might present something they don't want to see.
It is not an extreme measure and should not be characterized as such;
it is a perfectly normal feature of the software.
(It is also much easier than doing it on the server side, since it is a
matter of setting a preference rather than altering program code.)
> It seems perfectly reasonable to provide alternatives to people,
> especially if the upkeep is non-existent or minimal.
What kind of alternative are you looking for?
If it is an option of turning images on or off entirely, that already
exists, as noted above. Insisting that the programmers have to put
this function in the MediaWiki code, when users can already do it in the
browser, strikes me as a failure of the user to take responsibility.
The user who insists, "I don't WANNA change my browser settings, it's
too HAAARD, you SERVER PEOPLE have to quit sending me all these dirty
pictures" is, simply, whining. The request for the image is sent by the
browser. The server simply fulfills that request ... and the browser
can be easily set not to make that request, if the user so chooses.
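To make those request mechanics concrete, here is a minimal
standard-library Python sketch (illustrative only): an HTML page merely
*refers* to image URLs, and a client that never requests them never
receives them.

```python
# Illustrative sketch: list the image URLs a page refers to, without
# downloading any of them. A client with images turned off simply never
# issues requests for these URLs, so the server never sends the pictures.
from html.parser import HTMLParser

class ImageLister(HTMLParser):
    """Collect <img> src URLs from a page; nothing is fetched."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.image_urls.extend(v for k, v in attrs if k == "src")

page = '<p>Some text.</p><img src="/images/example.jpg">'
lister = ImageLister()
lister.feed(page)
print(lister.image_urls)   # ['/images/example.jpg']
```

The image bytes travel only if the browser asks for them, which is why
this is a client-side preference rather than a server-side feature.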
If it is an option of *selectively* turning off "offensive" images, then
the upkeep is neither nonexistent nor minimal -- indeed, it would be a
crippling source of constant disputes.
>> If an image has no educational or encyclopedic content, then it
>> doesn't belong on Wikipedia at all, regardless of whether it offends
>> people.
> Agreed -- and I don't see the encyclopedic content of even the pencil
> drawing.
Fair enough. That's what we have IfD for.
--
Karl A. Krueger <kkrueger(a)whoi.edu>