[Foundation-l] Fwd: Fwd: [foundation-l] Bot policy on bots operating interwiki

White Cat wikipedia.kawaii.neko at gmail.com
Fri Sep 7 08:23:32 UTC 2007


Bugzilla has lots and lots of bugs awaiting developer attention, so a fix may
not necessarily happen in a "timely" fashion. Code-wise, interwiki links aren't
critical, unlike many other bugs that need to be fixed as soon as possible.
The wikis can survive without them, but they benefit the project greatly.

Yes, the three challenges you mention are the root of the problem. Having
more interwiki bots operate on all wikis, rather than just a few, would be one
solution. Say I noticed a mistakenly linked language... correcting it across
every affected wiki would be quite a challenge.

We are talking about tens of thousands of pages on hundreds of wikis.
Surely you aren't suggesting that edits like this should be done manually.
For the most part interwiki bots operate without a problem; when they do
make improper edits, it is because of errors made locally. The problem itself
could be solved if we had one bot per wiki scanning local pages. Sadly, few
people have the courage to take on the 'insane' workload of seeking
permission from each individual wiki.
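
To make that concrete, here is a minimal sketch of the kind of link-graph
walk an interwiki bot performs (hypothetical helper names, not the actual
pywikipedia code); it also shows why one wrong local link ends up copied
everywhere:

    # Sketch only: the page objects and get_langlinks() are stand-ins for
    # whatever the real bot framework provides.
    def collect_cluster(start_page, get_langlinks):
        """Follow interlanguage links transitively from one page."""
        cluster = {start_page.lang: start_page}
        queue = [start_page]
        while queue:
            page = queue.pop()
            for linked in get_langlinks(page):
                if linked.lang not in cluster:
                    cluster[linked.lang] = linked
                    queue.append(linked)
        # The bot then updates every page in the cluster to point at every
        # other page, so a wrong link on one wiki is copied to all of them.
        return cluster

A local bot per wiki, watching its own pages, would catch that kind of local
error before the next interwiki run spreads it.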

I really wish you did not turn this into a technical issue, as it isn't one.
Or at least there are a lot of dependencies that haven't been addressed, such
as interwiki templates; only once those are dealt with can we have a serious
discussion of your suggestion, which may itself have problems that require
bot edits.

   - White Cat

On 9/7/07, Peter van Londen <londenp at gmail.com> wrote:
>
> Hi,
>
> I would like to also turn this into a technical issue. Interwiki/interlanguage
> (there is a difference between interwiki and interlanguage; most people mean
> interlanguage when talking about interwiki), organized as it is
> (decentralized), is becoming more and more of a problem, because:
> * the number of edits needed grows rapidly with the number of languages
> (roughly quadratically; see the sketch after this list)
> * mistakes (wrong interlanguage links) get multiplied by bot actions
> * bots are set to run automatically, which means that only the easy
> interwikis are done. The difficult interwikis, which require manual
> changes, language knowledge and investigation, are not done.
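>
> A rough illustration of that scaling, assuming n fully interlinked
> language editions (illustrative numbers, not measured figures):
>
>     def cluster_links(n):
>         # each of the n pages in a cluster links to the other n - 1 pages,
>         # so a fully linked cluster carries n * (n - 1) interlanguage
>         # links; adding one more language touches all n existing pages
>         return n * (n - 1)
>
>     print(cluster_links(10))   # 90
>     print(cluster_links(253))  # 63756, for 253 Wikipedia language editions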
>
> There is only one real solution imho: organize it centrally, which means
> something like a central database hosted by commons.
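>
> A central store could be as simple as one shared record per concept,
> mapping language codes to page titles (a rough sketch with made-up names,
> not an actual schema):
>
>     # one shared record per concept; each wiki reads and writes only
>     # its own language's entry
>     interlanguage = {
>         "concept/00042": {"en": "Cat", "de": "Hauskatze", "nl": "Kat"},
>     }
>     # adding a new language edition touches one record, not every
>     # already-linked page on every wiki
>     interlanguage["concept/00042"]["fr"] = "Chat"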
>
> Over the years there have been several proposals about that, also on this
> list, but until now it was apparently not seen as a huge problem. Maybe
> that is still the case, or maybe it is time to plan for a solution?
>
> Talking about interlanguage, some feature requests come to mind:
> * a possibility to limit which interwikis are shown, set in the preferences
> * a possibility to set the order of the interwikis, also set in the
> preferences.
>
> I would be interested in a comment from the devs on whether they see this as
> a potential problem and whether they see some solutions for interlanguage.
>
> Kind regards, Peter van Londen/Londenp
>
> 2007/9/7, teun spaans <teun.spaans at gmail.com>:
> >
> > Interwiki bots occasionally need serious attention: they spread
> > interwiki links, but not always in the right fashion. When one wiki has a
> > link to the wrong article, interwiki bots tend to spread this error to
> > all wikis.
> >
> >
> > On 9/6/07, White Cat <wikipedia.kawaii.neko at gmail.com> wrote:
> > >
> > > I think we have a serious problem with this. When the interwiki bot
> > > issue was last discussed there was only a handful of wikis. I think it
> > > is time to bring some attention to this.
> > >
> > > http://meta.wikimedia.org/wiki/Special:SiteMatrix displays quite a
> > > large number of wikis (I was told this is around 700). Wikipedia alone
> > > has 253 language editions according to
> > > http://meta.wikimedia.org/wiki/List_of_Wikipedias
> > >
> > > I was told only 60 of these 700ish wikis have an actual local bot
> > > policy, of which most are just translations or mis-translations of
> > > en.wiki.
> > >
> > > Why is this a problem? Well, if a user decides to operate an interwiki
> > > bot on all wikis, he or she (or it?) would have to make about 700 edits
> > > on the individual wikis. Aside from the 60, most of these wikis do not
> > > even have a bot request page IIRC. Those individual 700 edits would have
> > > to be listed on [[m:Requests for bot status]]. A steward will have to
> > > process these 700, minus the wikis with active bcrats. That's just one
> > > person. As we are a growing community, now imagine just 10 people who
> > > seek such interwiki bot operation: that's a workload of 7000. Wikimedia
> > > is a growing community. There are far more than 700 languages on earth -
> > > 7000 according to
> > > http://en.wikipedia.org/wiki/Natural_language#Native_language_learning -
> > > that's ultimately 7000 * (number of sister projects) wikis per
> > > individual bot. With the calculation of ten bots that's 70,000 requests.
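> > >
> > > Putting that arithmetic in one place (the numbers are the rough ones
> > > from above, not exact counts):
> > >
> > >     wikis_today = 700   # roughly how many Wikimedia wikis exist now
> > >     operators = 10      # people wanting to run an interwiki bot
> > >     print(wikis_today * operators)  # 7000 separate approval requests
> > >
> > >     languages = 7000    # natural languages, per the link above
> > >     print(languages * operators)    # 70,000, before sister projects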
> > >
> > > There are a couple of CPU-demanding but mindless bot tasks. All of
> > > these tasks are handled by the same code. Tasks that come to mind are:
> > >
> > >    * Commons delinking
> > >    * Double redirect fixes
> > >    * Interwiki linking
> > >    * Perhaps even anti-spam bots
> > >
> > >
> > > Currently we already have people, such as mediawiki developers, who
> > > make bot-like alterations to individual wikis without even considering
> > > the opinions of the local wikis. I do not believe anyone finds this
> > > problematic. Also, we elect stewards from a central location; we do not
> > > ask the opinion of individual wikis. The range of actions a steward has
> > > access to is vast, but the permission they have is quite limited. So the
> > > concept of centralized decisions isn't a new one. If mediawiki is a very
> > > large family, we should be able to make certain decisions family-wide.
> > >
> > > I think the process for bots operating inter-wiki should be
> > > fundamentally simplified. Asking every wiki for permission may seem like
> > > the nice thing to do, but it is a serious waste of time for the bot
> > > operator, the stewards and the local communities alike. There is no real
> > > reason to repeatedly approve "different" bots running the same code.
> > >
> > > My suggestion for a solution to the problem is as follows:
> > >
> > > A foundation/meta bot policy should be drafted, establishing a
> > > centralized bot request for a number of very specific tasks (not
> > > everything). All of these need to be mindless activities such as
> > > interwiki linking or double-redirect fixing. The foundation would not be
> > > interfering with "local" affairs, but instead regulating inter-wiki
> > > affairs. All policies on wikis that have a bot policy should be
> > > compatible, or should be made compatible, with this foundation policy.
> > > Bot requests of this nature would be processed on meta alone, saving
> > > everyone time. The idea, fundamentally, is "one nom per bot" rather than
> > > "one nom per wiki".
> > >
> > > If a bot breaks, it can simply be blocked; otherwise the community
> > > should not have any problem with it. How much supervision do interwiki
> > > bots really need anyway?
> > >
> > > Perhaps an interface update is needed to allow stewards to grant bot
> > > flags in bulk rather than individually, if this hasn't been implemented
> > > already.
> > >
> > >
> > >   - White Cat
> > > _______________________________________________
> > > foundation-l mailing list
> > > foundation-l at lists.wikimedia.org
> > > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> > >
> > _______________________________________________
> > foundation-l mailing list
> > foundation-l at lists.wikimedia.org
> > http://lists.wikimedia.org/mailman/listinfo/foundation-l
> >
> _______________________________________________
> foundation-l mailing list
> foundation-l at lists.wikimedia.org
> http://lists.wikimedia.org/mailman/listinfo/foundation-l
>

