Forgive me if this has already been discussed, but has anyone done any
analysis of the frequency of posted urls to look into developing a spam
filter? What I'm getting at is this: it is fairly common for a spammer to
post a url (or a bunch of urls) to multiple pages, sometimes using proxies,
but in rapid succession. It would be interesting to collect data about how
often legitimate urls are added (via edits) to test the viability of a
filter that only blocks urls after they've been posted several times in the
last day (or whatever time period).
Like this: it is pretty unlikely that a legitimate edit pattern would add
the same url to 5 different pages in a single day, so when that does happen
it is a pretty good indication of spammy behavior.
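The filter described above could be sketched as a simple sliding-window counter. This is just a minimal in-memory illustration of the idea, not an existing implementation; the class name, threshold of 5, and 24-hour window are my own placeholder choices:

```python
import time
from collections import defaultdict, deque


class UrlRateFilter:
    """Flag a url once it has been added to pages more than
    `threshold` times within the last `window` seconds.

    In-memory sketch only; a real wiki would need to persist
    the counts and share them across servers.
    """

    def __init__(self, threshold=5, window=86400):
        self.threshold = threshold
        self.window = window
        # url -> timestamps of edits that added it
        self.sightings = defaultdict(deque)

    def record_edit(self, url, now=None):
        """Record one edit adding `url`; return True if it now
        crosses the spam threshold for the time window."""
        now = time.time() if now is None else now
        q = self.sightings[url]
        q.append(now)
        # Discard sightings older than the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) >= self.threshold
```

With the defaults, the fifth addition of the same url inside 24 hours trips the filter, while occasional legitimate additions never accumulate enough sightings before the window expires.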
So, anyone looked at anything like that?
Best Regards,
Aerik