On Feb 9, 2004, at 14:49, Timwi wrote:
> Also, it says:
> # Friendly, low-speed bots are welcome viewing article pages, but not
> # dynamically-generated pages please.
> and yet you're not blocking Special:Recentchanges or stuff like that :)
Plain old recentchanges now and then is fine. Iterating through ten
billion possible combinations of result sets for display, edit, history
lists, contribs, and diffs for every page? That's not fine.
> Well, your robots.txt seems to disallow access to /w/ for all
> User-Agents. This would mean I would disrespect robots.txt if I were
> to submit edits, right?
That's right.
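(For reference, a well-behaved client can check such rules itself with Python's standard robotparser. The robots.txt lines and URLs below are illustrative stand-ins modeled on the /w/ rule under discussion, not the live file:)

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules, modeled on the /w/ Disallow discussed above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /w/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Plain article views are allowed...
ok_article = rp.can_fetch("Timwiscript",
                          "http://en.wikipedia.org/wiki/Kurt_Godel")
# ...but anything under /w/ (edits, diffs, histories) is not.
ok_edit = rp.can_fetch("Timwiscript",
                       "http://en.wikipedia.org/w/index.php?title=Sandbox&action=edit")
```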
> Then I would humbly like to ask to be allowed to run my little script
> on your server. The sole purpose of the script is to replace
> occurrences of one text with another. Most of the time, I use it for
> common spelling corrections, or fixing links like [[Kurt Godel]] where
> someone omitted the diacritic marks.
>
> So what User-Agent string should I use? "Timwiscript"? (I don't like
> calling it a bot; it really isn't... it still requires a great deal of
> manual interaction, and doesn't wildly do stuff automatically.)
Rather than building yet another set of editing tools from scratch, you
may want to see:
http://sourceforge.net/projects/pywikipediabot/
-- brion vibber (brion @ pobox.com)