"Aerik Sylvan" <aerik(a)thesylvans.com> wrote in
message news:355a36af0601212140i7bd35f8ao2802c84c968935a4@mail.gmail.com...
[snip]
> The idea is that the normal behaviour for a wiki is that it is very
> unlikely for a URL to be referenced in an edit more than X times in
> 24 hours (I'll postulate twice, just for fun), but it is very common
> behaviour for a spammer. Therefore, if one could ascertain (do some
> tests) that some large percentage (say 99.9%) of valid (non-spam)
> URLs posted via edits occur fewer than X times in 24 hours, you could
> put in a filter, inconveniencing a very small percentage of users,
> that would automatically add those URLs to a blacklist.

Would this be checking for identical URLs, or simply multiple URLs
referencing the same site?
I ask because I can think of one concrete example where I have been adding many
URLs: references to Placeopedia. Does the fact that I am using a small
template help?
Sometimes you might have a WikiProject bringing its articles up to a standard
that includes adding references; if that means a bunch of articles get similar
URLs added in a short space of time, it might trigger your filter.
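
For concreteness, here is a minimal sketch of how I read the proposal,
assuming a per-URL sliding 24-hour window; the names, the threshold X,
and the blacklist handling are only placeholders, not anything MediaWiki
currently implements:

    # Hypothetical sketch of the proposed rate-based URL filter: count how
    # many edits add a given URL within a 24-hour window and auto-blacklist
    # it once the count exceeds X. All names and values are illustrative.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 24 * 60 * 60   # 24-hour window from the proposal
    X = 2                           # postulated threshold: "twice, just for fun"

    _recent_additions = defaultdict(deque)   # url -> timestamps of edits adding it
    blacklist = set()

    def record_url_addition(url, now=None):
        """Record that an edit added url; blacklist it if added > X times in 24h."""
        now = time.time() if now is None else now
        times = _recent_additions[url]
        times.append(now)
        # Drop timestamps that have fallen outside the 24-hour window.
        while times and now - times[0] > WINDOW_SECONDS:
            times.popleft()
        if len(times) > X:
            blacklist.add(url)
        return url in blacklist

Whether the counter is keyed on the exact URL or only on the host would just
be a matter of normalising the url before counting it, which is really what
my question above comes down to.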
--
Phil
[[en:User:Phil Boswell]]