To avoid this, the page should carry the <meta name="robots"
content="noindex,follow" /> tag in its <head>.
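For reference, a minimal sketch of where that tag would sit in the
generated HTML (the page title here is just an example):

```html
<html>
  <head>
    <title>Wikipedia:Example protected deleted page</title>
    <!-- tells crawlers not to index this page, but still follow its links -->
    <meta name="robots" content="noindex,follow" />
  </head>
  ...
</html>
```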
However, I don't think there is a way to generate meta tags from wiki
markup. Maybe we should introduce a new "non-indexable" page status to keep
Google from indexing such pages?
Another approach would be to remove the links to that page so Google never
finds it, but it is linked only from pages such as Wikipedia:Articles for
deletion/Log/* and Wikipedia:List of pages protected against re-creation,
and those links have to stay. We could mark those pages so that their links
are not followed... but that would require the mechanism described above,
so again, no way.
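For completeness, the "don't follow the links" idea corresponds to standard
HTML that already exists; the problem is only that wiki markup gives us no
way to emit it. The link target below is purely illustrative:

```html
<!-- page-wide variant: this page is indexed, but none of its links are followed -->
<meta name="robots" content="index,nofollow" />

<!-- per-link variant: crawlers ignore just this one link -->
<a href="/wiki/Wikipedia:Articles_for_deletion/Example" rel="nofollow">...</a>
```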
<homey2005(a)sympatico.ca> wrote in message
news:20060211132825.UMBW15582.tomts46-srv.bellnexxia.net@[209.226.175.82]...
Can something be done to prevent "protected deleted" pages from
being indexed by Google? See, for instance,
http://www.google.ca/search?hl=en&q=shane+ruttle&btnG=Google+Search…
Thanks,
homey