[Foundation-l] Letter to the community on Controversial Content

Tobias Oelgarte tobias.oelgarte at googlemail.com
Tue Oct 18 10:56:09 UTC 2011


On 18.10.2011 11:43, Thomas Morton wrote:
>>> And that is a mature and sensible attitude.
>>>
>>> Some people do not share your view and are unable to ignore what to
>>> them are rude or offensive things.
>>>
>>> Are they wrong?
>>>
>>> Should they be doing what you (and I) do?
>>>
>>> Tom
>> The question is whether we should support "them" in not even trying
>> to start this learning process. It's like saying: "That is all you
>> have to know. Don't bother with the rest, it is not good for you."
>>
>> nya~
>>
>   Which assumes that they want to, or should, change - and that our approach
> is better and we are right. These are arrogant assumptions, not at all in
> keeping with our mission.
I don't assume that. I say that they should have the opportunity to 
change if they want to. Hiding controversial content, or providing a 
button to hide it, is prejudicial. It deepens the viewpoint that this 
content is objectionable and that this judgment is generally accepted, 
even if it is not. It means that we would be patronizing the readers 
who have a tendency to enable a filter (not even particularly an image 
filter).
> It is this fallacious logic that underpins our crazy politics of
> "neutrality" which we attempt to enforce on people (when in practice we lack
> neutrality almost as much as the next man!).
... and that is exactly what makes me curious about this approach. You 
assume that we aren't neutral, and Sue described us on average as a 
little bit geeky, which points in the same direction. But if we aren't 
neutral at all, how can we even believe that a 
controversial-content-filter system based upon our views would be 
neutral in judgment, or, as proposed in the referendum, "culturally 
neutral"? (Question: Is there even such a thing as cultural 
neutrality?)
> It's like religion; I am not religious, and if a religious person wants to
> discuss their beliefs against my lack then great, I find that refreshing and
> will take the opportunity to try and show them my argument. If they don't
> want to, not going to force the issue :)
We also don't force anyone to read Wikipedia. If someone does not like 
it, he has multiple options. He could close it, he could still read it 
even if he doesn't like parts of it, he could participate and change 
it, or he could start his own project.

Sue and Phoebe didn't like the idea of comparing Wikipedia with a 
library, since we create our own content. But if we can't agree on 
that, then we should at least be able to agree that Wikipedia is like 
a huge book, an encyclopedia. If we apply this logic, then you will 
find the same words repeated over and over again. Pullman said it, and 
many authors before his lifetime said it as well. Have a look at his 
short and direct answer to this question:

* http://www.youtube.com/watch?v=HQ3VcbAfd4w
>> It's like saying: "That is all you have to know. Don't bother with
>> the rest, it is not good for you."
>
> Actually, no, it's exactly not that. Because we are talking about
> user-choice filtering. In that context, providing individual filtering tools
> for each user should not be controversial.
I indirectly answered this already at the top. If we can't be neutral 
in judgment, and if hiding particular content strengthens already 
present positions, then even a user-choice filtering system fails to 
be neutral and non-prejudicial.
> I understand where that becomes a problem is when we look at offering
> pre-built "block lists", so that our readers don't have to manually
> construct their own preferences, but can click a few buttons and largely
> have the experience they desire. So we have this issue of trading usability
> against potential for abuse; I don't have an immediate solution there, but I
> think we can come up with one. Although we do quite poorly at handling abuse
> of process and undermining of content on-wiki at the moment, this could be a
> unique opportunity to brainstorm wider solutions that impact everywhere in a
> positive way.
We can definitely think about possible solutions. But first I have to 
insist on an answer to the question: Is there a problem big and 
weighty enough to make it a main priority?

After that comes the question of (non-neutral) categorization of 
content. That means: Do we need to label offensive content, or could 
the same goal be reached without doing so?

In the other mail you said that you have a problem with 
reading/working on Wikipedia in public, because some things might be 
offensive to bystanders. That is a typical, widespread argument. But 
this problem can easily be solved without any need for categorization. 
The brainstorming sections are full of simple, non-disruptive 
solutions for exactly this potential problem.
> If an individual expresses a preference to hide certain content, it is
> reasonable for us to provide that option for use at their discretion.
>
> Anything else is like saying "No, your views on acceptability are wrong and
> we insist you must see this".[1]
>
> *That* is censorship.
>
> Tom
>
> 1. I appreciate that this is the status quo at the moment, I still think it
> is censorship, and this is why we must address it as a problem.
Nobody ever forced him to read the article. He could just choose 
something else. That isn't censorship at all. Censorship is the hiding 
of information, not the showing of information someone might not like.

The only way this could be seen as censorship by some is when they 
don't want to let others see content that "they should not see". This 
matches exactly the case noted by Joseph Henry Jackson:

“Did you ever hear anyone say, 'That work had better be banned because I 
might read it and it might be very damaging to me'?”

The only group claiming that not being able to hide something is 
censorship are the people who want to hide content from others.

nya~


