On 4/11/03 10:46 AM, "Marco Krohn"
<marco.krohn(a)gmx.de> wrote:
> Hi all,
>
> I saw just by chance that [[Rust]] was vandalized by simply adding the
> word "bull shit" to the text. Thinking about this I asked myself if it
> wouldn't be possible to add an automatic detection for this kind of
> vandalism? Something in the way the spam detection programs work. For
> the beginning a very simple heuristic algorithm that detects certain
> word / word combinations would be sufficient.
>
> Of course the algorithm should not block the modification, simply
> because it is impossible to prevent false positives, but it could add
> the page to a new special page, making it easier to detect obvious
> cases of vandalism.
>
> Being still quite new to wikipedia I am not sure if this fits well in
> the wikipedia framework and if it is realizable, but nevertheless I
> would like to hear your comments.
Without calling it "vandalism", a word I still think is inappropriate for
Wikipedia, automatic sifting of contributions is certainly a good idea.
It would indeed be very interesting to see what would be the result of
Bayesian ranking of contributions.
We could also detect big decreases in article size (e.g. 1 KB -> 20 bytes).
I think the easiest way to show hypothetical "vandalism" is to use a
different font style (color, italics, or weight) in RecentChanges.
For example, displaying in red the articles where someone just added words
like "fuck", or where there was a big size decrease, could be very useful.
This could be an option in Preferences.
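To make the idea concrete, here is a minimal sketch of such a flagging
heuristic (Python; the word list, size threshold, and function name are
illustrative assumptions, not an existing MediaWiki feature). It only flags
edits for review, it never blocks them, since false positives are
unavoidable:

```python
# Words whose *addition* to an article should raise a flag.
# Purely illustrative; a real list would be maintained by the community.
SUSPICIOUS_WORDS = {"fuck", "bull shit", "bullshit"}

def is_suspicious(old_text, new_text, shrink_ratio=0.05):
    """Return True if an edit looks like possible vandalism.

    Two cheap heuristics:
      1. the article shrank drastically (e.g. 1 KB -> 20 bytes);
      2. a suspicious word appears in the new text but not the old.
    """
    # Heuristic 1: big size decrease on a non-trivial article.
    if len(old_text) > 500 and len(new_text) < shrink_ratio * len(old_text):
        return True

    # Heuristic 2: suspicious word newly added by this edit.
    old_lower, new_lower = old_text.lower(), new_text.lower()
    for word in SUSPICIOUS_WORDS:
        if word in new_lower and word not in old_lower:
            return True

    return False
```

Flagged edits could then be listed on a special page, or shown in red in
RecentChanges, as suggested above.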
Aoineko