[Foundation-l] Spam blacklisting on Foundation wikis

Herby herbythyme at fmail.co.uk
Thu Aug 23 08:14:03 UTC 2007


Having discovered a backlog of requests on the Meta spam blacklist, Aphaia
and I have put some work into clearing it and now have it under control
(I hope!).

However, in the course of this I have learnt a lot, and some questions
have arisen that seem (to me) to be Foundation-level ones.

The policy at Meta has been to blacklist only those sites with a
persistent cross-wiki record of link placement.  However, some of the
sites for which blocking has been requested are fairly obviously
undesirable whether they are currently troubling just one wiki or many
(porn being the obvious example, but there are plenty more on whose
undesirability I think consensus would be found).

So should we be rejecting requests for such sites and telling the
requesters to use their own local blacklists?  I am aware that
preventive blocking of anything or anyone can be frowned upon
(personally, if I can see trouble coming, I prefer to take action before
it arrives rather than clean up afterwards).  It may be that, in asking
en wp, for example, to block locally, we are merely missing the
opportunity to avoid problems across wikis in the near future.  Spammers
are adept at exploiting any opening they can find.
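
(For anyone not familiar with the mechanism: whether an entry sits on
the Meta list or on a wiki's local one, it is in essence a regular
expression matched against any URL someone tries to add, and a matching
edit is refused.  The Python below is only a sketch of my understanding,
not the actual extension code, and the domain names are invented.)

    import re

    # Invented example entries, in the spirit of the regex lines kept on
    # the blacklist pages (these domains are made up for illustration).
    blacklist_entries = [
        r"\bspam-widgets-example\.com\b",
        r"\bcheap-links-example\.net\b",
    ]
    blacklist = [re.compile(e, re.IGNORECASE) for e in blacklist_entries]

    def is_blocked(url):
        # An edit adding a URL that matches any entry would be refused.
        return any(p.search(url) for p in blacklist)

    print(is_blocked("http://spam-widgets-example.com/buy"))  # True
    print(is_blocked("http://en.wikipedia.org/wiki/Spam"))    # False

The only difference between the Meta list and a local one is scope: an
entry on Meta applies to every Foundation wiki, a local entry to just
that wiki.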

There has been a tendency to treat the Meta blacklist as a place of last
resort - I question whether a more thoughtful approach to keeping
Foundation sites clean might not be desirable.

Herby
[[user:Herbythyme]] most places
-- 
  Herby
  herbythyme at fmail.co.uk

