Just to answer these questions for my _Java_ interwiki bot MerlIwBot:
On 12.10.2012 16:45, Amir E. Aharoni wrote:
> Will the bots be smart enough not to do anything to articles that are
> already listed in the repository and have the correct links displayed?
> Will the bots be smart enough not to do anything with articles that
> have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?
Yes. For my Java bot these checks are independent of Wikidata and worked
already before, so running my bot is always safe on these wikis.
Non-1-to-1 langlinks can be moved partly to Wikidata, because mixing of
Wikidata and local langlinks is possible. But please note that since
change https://gerrit.wikimedia.org/r/#/c/25232/, which is live with
1.21.wmf2, multiple langlinks are not displayed anymore.
> Will the bots be smart enough to update the repo in the transition
> period, when some Wikipedias have Wikidata and some don't?
My bot can update the repository, but doing so causes a lot of load on
the local wikis. That's because there is currently no way to tell
whether a langlink is stored locally or at Wikidata. Local langlinks can
be stored on the main page or on any included page, so the whole source
code of every included page must be checked first.
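To illustrate the kind of check this requires (a minimal sketch, not
MerlIwBot's actual code): the bot has to fetch the raw wikitext of the
page and of every transcluded page and scan each one for local langlinks
of the form [[xx:Title]]. The language-code pattern below is a
simplifying assumption; a real bot would match against the sitematrix.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LanglinkScanner {
    // Hypothetical simplification: match two- or three-letter codes with
    // an optional variant suffix. Real language codes (e.g. "simple")
    // need a lookup against the wiki family's sitematrix instead.
    private static final Pattern LANGLINK =
        Pattern.compile("\\[\\[([a-z]{2,3}(?:-[a-z]+)?):([^\\]|]+)\\]\\]");

    // Scan one page's raw wikitext for local langlinks. To be complete,
    // this would have to be run over the main page AND every included
    // page, which is exactly the source of the extra load.
    public static List<String> findLanglinks(String wikitext) {
        List<String> links = new ArrayList<>();
        Matcher m = LANGLINK.matcher(wikitext);
        while (m.find()) {
            links.add(m.group(1) + ":" + m.group(2));
        }
        return links;
    }

    public static void main(String[] args) {
        String src = "Some text [[de:Beispiel]] more [[he:Test]] end.";
        System.out.println(findLanglinks(src));
    }
}
```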
I created a feature request which could solve this problem:
<https://bugzilla.wikimedia.org/show_bug.cgi?id=41345>. I hope this
will be added before Wikidata goes live.
To update the repository my bot needs to know the corresponding
repository script URL for a local wiki. Currently
http://wikidata-test-repo.wikimedia.de/w/api.php is hard-coded in my bot
framework, but this repository URL will change for hewiki. There is
currently no way to request this info via the API, so I also created a
feature request for this:
<https://bugzilla.wikimedia.org/show_bug.cgi?id=41347>
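A sketch of the situation described above (the class and method names
are hypothetical, not my framework's actual API): the repository URL is
a hard-coded constant, so the lookup returns the same answer for every
local wiki until an API exists to ask a wiki which repository it uses.

```java
public class RepoConfig {
    // Hard-coded today in the bot framework; this URL will be wrong
    // once hewiki points at a different (production) repository.
    public static final String REPO_API =
        "http://wikidata-test-repo.wikimedia.de/w/api.php";

    // Hypothetical lookup that a siteinfo-style API request could
    // replace: until then it ignores its argument and returns the
    // single hard-coded repository URL for every local wiki.
    public static String repoApiFor(String localWikiDbName) {
        return REPO_API;
    }

    public static void main(String[] args) {
        System.out.println(repoApiFor("hewiki"));
    }
}
```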
Merlissimo