[Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

David Gerard dgerard at gmail.com
Tue May 11 21:00:44 UTC 2010


On 11 May 2010 21:42, Aryeh Gregor <Simetrical+wikilist at gmail.com> wrote:
> On Tue, May 11, 2010 at 12:48 PM, David Gerard <dgerard at gmail.com> wrote:

>> You're a developer. Write something for logged-in users to block
>> images in local or Commons categories they don't want to see. You're
>> the target market, after all.

> I'd be happy to do any software development if that were helpful.
> I've been thinking about how best to do it, on and off, for some time.
> However, I don't think it's reasonable to require opt-out for images
> that a large percentage of viewers don't want to see without warning.
> If the people who want to see it can see it with just a click anyway,
> they aren't losing anything if it's hidden by default.  Especially if
> it's just blurred out.


You're making an assumption there on no evidence: that a "large
percentage" of viewers want to be opted out by default.

If you write it, logged-in users could give you numbers. (A Western-world
"worksafe" filter set, for example, would undoubtedly be popular.)

Commons admins are in fact *painstaking* in accurate categorisation;
the filter sets should be stable.
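
To make that concrete, here's a minimal Python sketch of the core check
such a tool would need: ask the public Commons API for a file's
categories and test them against a per-user blocklist. The category
names in the blocklist and the should_hide helper are illustrative
assumptions on my part, not an actual filter set.

    # Minimal sketch: hide a Commons file if it sits in any category the
    # user has opted out of. The blocklist below is a made-up example.
    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def file_categories(filename):
        """Return the set of category titles for a Commons file."""
        params = {
            "action": "query",
            "titles": "File:" + filename,
            "prop": "categories",
            "cllimit": "max",
            "format": "json",
            "formatversion": "2",
        }
        data = requests.get(API, params=params, timeout=10).json()
        page = data["query"]["pages"][0]
        return {c["title"] for c in page.get("categories", [])}

    def should_hide(filename, blocklist):
        """True if the file is in any category on the user's blocklist."""
        return not blocklist.isdisjoint(file_categories(filename))

    # Hypothetical per-user preference, e.g. a "worksafe" set:
    blocklist = {"Category:Nudity", "Category:Violence"}
    print(should_hide("Example.jpg", blocklist))

(The real thing would presumably run as a gadget or extension hooked into
page rendering, with the blocklist stored in user preferences; the point
is that accurate categorisation makes the check itself trivial.)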


>> (If that isn't enough and you insist it has to be something for
>> default, then I fear you are unlikely to gain consensus on this.)

> Does that mean you disagree with me but aren't saying why, or that you
> agree but aren't bothering to say so because you're sure it won't
> happen?  The latter is where self-fulfilling prophecies come from.


I think it's a bad idea *and* you are unlikely to obtain consensus,
because filtering for people who *haven't* asked is quite a different
proposition from filtering for people who *have* asked, in what I'd
hope are fairly obvious ways.


wjohnson at aol.com wrote:

> I would suggest that any parent who is allowing their "young children", as one message put it, to browse without any filtering mechanism is deciding to trust that child, or else does not care if the child encounters objectionable material. The child's browsing activity is already open to five million porn-site hits as it stands; Commons isn't creating that issue, and Commons cannot solve it. It's the parent's responsibility to have the appropriate self-selected mechanisms in place, and I propose that all parents who care already *do*. So this issue is a non-issue. It doesn't actually exist in any concrete example, just in the minds of a few people with spare time.


Indeed.


- d.


