On 9/2/05, Tony Molloy <tony.molloy@uconn.edu> wrote:
Last night we had about 59 new accounts created on our Wiki. They then
proceeded to make multiple edits to about 40 different pages, with
roughly 15-20 changes per page.
The weird thing is that the last spider/bot to visit each page erased
all the stuff that the others had created, leaving each page (from what
I can tell so far) as it was originally.
The site they were injecting into the pages was:
http://WTHP5.disney.com
Now for my questions:
I had enabled "only registered users can edit", but this didn't help
because they obviously automated the whole process. Is there something
stronger I can do while still preserving the spirit of the Wiki?
Some of the pages won't seem to revert back to my last edit. Can I
somehow completely delete the changes they made and remove their edits
from the page histories?
Is there a faster way to revert a bunch of pages at once? Reading each
page and verifying that everything is OK is taking forever.
I'm pretty disheartened by this. If it continues, I'll have to turn off
external internet access to our wiki (this is for an academic library).
We already see quite a few visits from other libraries, and we have some
valuable information to share.
I assume you're on MediaWiki 1.5-something. This won't work for 1.4,
which still uses the old cur/old tables rather than the page table this
query relies on.
This MySQL query MAY do the job. It is UNTESTED, and if it contains a
mistake, it MAY TRASH YOUR WIKI. I would STRONGLY recommend making a
backup first.
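For a quick safety net, something along these lines will snapshot the
page table inside MySQL before anything is touched (the backup table
name is just an example; this copies rows but not indexes, which is
fine for a backup):

-- Snapshot the page table before running any UPDATE.
CREATE TABLE page_backup_20050902 SELECT * FROM page;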
This will revert everything to the most recent version before
TIMESTAMP. (TIMESTAMP is in the format 'YYYYMMDDHHMMSS', with the
single quotes.)
I wouldn't recommend using this if you're not somewhat familiar with SQL.
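If you want to see what would be affected first, a read-only check
along these lines lists the pages with changes recorded at or after the
cutoff (the timestamp below is just an example; substitute your own):

-- Read-only preview: pages with recorded changes at or after the cutoff.
SELECT DISTINCT page_id, page_namespace, page_title
FROM page, recentchanges
WHERE page_id = rc_cur_id
AND rc_timestamp >= '20050902000000';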
-- Point each affected page back at a pre-TIMESTAMP revision.
-- (rc_this_oldid holds the rev_id recorded for that change.)
UPDATE page, recentchanges
SET page_latest = rc_this_oldid
WHERE page_id = rc_cur_id
AND rc_timestamp < TIMESTAMP;
Once again: this is UNTESTED. Use at your own risk.
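One caveat: if a page has several pre-TIMESTAMP entries in
recentchanges, the multi-table UPDATE above doesn't guarantee which of
them page_latest ends up pointing at. A more careful variant (equally
untested, and it needs MySQL 4.1 or later for subquery support) picks
the newest pre-cutoff change per page, and only touches pages that were
actually edited at or after the cutoff:

-- Untested sketch, MySQL 4.1+. '20050902000000' is an example cutoff.
UPDATE page
SET page_latest = (
    -- newest change recorded before the cutoff for this page
    SELECT rc_this_oldid FROM recentchanges
    WHERE rc_cur_id = page.page_id AND rc_timestamp < '20050902000000'
    ORDER BY rc_timestamp DESC LIMIT 1
)
WHERE EXISTS (
    -- only pages that were edited at or after the cutoff
    SELECT 1 FROM recentchanges
    WHERE rc_cur_id = page.page_id AND rc_timestamp >= '20050902000000'
)
AND EXISTS (
    -- and that still have a pre-cutoff entry to fall back to
    SELECT 1 FROM recentchanges
    WHERE rc_cur_id = page.page_id AND rc_timestamp < '20050902000000'
);

Pages whose pre-spam history has already aged out of recentchanges, or
that the spammers created from scratch, are skipped by this and will
need attention by hand.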
-- Josh