[Mediawiki-l] robots.txt

Roger Chrisman roger at rogerchrisman.com
Mon Sep 25 21:53:15 UTC 2006


Hi All,

Wikipedia's robots.txt file (http://www.wikipedia.org/robots.txt) 
excludes robots from action pages (edit, history, etc.) with this:

User-agent: *
Disallow: /w/

But in the interest of short URLs, I serve my MediaWiki directly from 
the site root (/), without any /wiki/ or /w/ directory. So the method 
above would not work on my installation.
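I believe all the action pages on my setup still go through 
index.php, while articles are served from short paths, e.g.:

http://Wikigogy.org/Main_Page
http://Wikigogy.org/index.php?title=Main_Page&action=edit
http://Wikigogy.org/index.php?title=Main_Page&action=history

though I'd have to double-check that nothing else bypasses index.php.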

Any ideas on how I can exclude robots from crawling all my wiki's 
edit, history, talk, etc., pages *without* excluding its article 
pages?
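If it's true that every non-article view goes through index.php, I 
suppose a single prefix rule might be enough, since robots.txt rules 
match by URL prefix:

User-agent: *
Disallow: /index.php

But I'm not certain that catches everything, and I gather wildcard 
rules like "Disallow: /*action=" are a non-standard extension that 
only some crawlers honor, so I'd rather not rely on those.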

Thanks,

Roger Chrisman
http://Wikigogy.org (MediaWiki 1.6.7)
