[Wikibots-l] Plan regarding merging and splitting of warnfiles

Quistnix quistnix at kabelfoon.net
Tue Jun 7 05:13:08 UTC 2005


Dear list members,

Last week, a plan to handle the interwiki links more efficiently started 
to take shape, and I am posting it here to see whether any of you can 
find a flaw in it or want to add more ideas to it.

Here's my plan:

In the current situation, a warnfile has to be produced for each 
language and run against every other language. This requires n * (n - 1) 
steps for n languages for one complete interwiki update.

If all warnfiles were merged before splitting, n * 2 steps would be 
enough to do the job, and each run would be shorter than before.
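
Just to make the difference concrete, here is a quick back-of-the-envelope 
check in Python; the numbers of languages are only examples:

    def runs_pairwise(n):
        # current situation: every language's warnfile run on every other language
        return n * (n - 1)

    def runs_merged(n):
        # proposed: n warnfiles produced and merged, then n merged warnfiles processed
        return 2 * n

    for n in (10, 50, 200):
        print(n, runs_pairwise(n), runs_merged(n))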

Merging all warnfiles gathers the required and broken links for all 
languages. After removing inconsistencies and double entries, the merged 
file is ready to be split into n warnfiles. Each warnfile will then 
contain practically *all* needed changes, instead of just a subset of 
them. Thanks to the merge, every page has to be opened only once, even 
if its changes were collected from different warnfiles.
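
To give an idea of what I have in mind, here is a rough sketch of the 
merge/split step. The file names and the line format (a target language 
code followed by the suggested change) are made up for illustration; the 
real warnfile format produced by interwiki.py differs, and resolving 
contradictory entries is not shown here, only the removal of double ones:

    import glob
    import os
    from collections import defaultdict

    def merge_and_split(input_dir, output_dir):
        entries = defaultdict(set)      # language code -> set of warnfile lines
        for path in glob.glob(os.path.join(input_dir, "warnfile_*.dat")):
            with open(path, encoding="utf-8") as f:
                for line in f:
                    line = line.strip()
                    if not line:
                        continue
                    # assumed format: the target language code comes first
                    lang = line.split(":", 1)[0]
                    # a set drops exact double entries automatically
                    entries[lang].add(line)
        # split: one merged warnfile per language, containing every change
        # that has to be made on that wiki
        os.makedirs(output_dir, exist_ok=True)
        for lang, lines in entries.items():
            out_path = os.path.join(output_dir, "warnfile_%s.dat" % lang)
            with open(out_path, "w", encoding="utf-8") as f:
                f.write("\n".join(sorted(lines)) + "\n")

    merge_and_split("uploaded_warnfiles", "merged_warnfiles")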

If the bot operators for each language upload their most recent, 
complete warnfile to a fixed location for that language, overwriting the 
previous one, that location will always hold the latest warnfile for 
that language. Suppose they do this between the 1st and the 15th of each 
month, and announce on the interwiki bot page when they have done so.

One of the operators downloads all warnfiles, merges them, splits them, 
and uploads the results to a different location as soon as all warnfiles 
for the month are ready, or on the 15th, whichever comes first. This 
operator sends an e-mail and makes a note on the interwiki bot page on 
Commons as soon as the new warnfiles are ready. After that, all 
operators have until the 1st of the next month to process the warnfile 
for their own language. Whenever a warnfile has been processed, the bot 
operator reports it on the interwiki bot page. At the end of the month, 
other bot operators have time to pick up any warnfiles that have not yet 
been processed for their language.

This ensures maximum effectiveness of interwiki bot operation.

I'll try to write the merger script (if an experienced bot writer wants 
to help, please contact me, as I haven't programmed in Python for more 
than a year now), and I'll include a filter to eliminate all no/nb 
entries. Those will be sorted out as soon as the affected pages have to 
be updated anyway.
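
The no/nb filter could be as simple as the sketch below; again, the line 
format (target language code first) is only assumed for illustration:

    SKIP_CODES = ("no", "nb")

    def drop_no_nb(lines):
        # skip every entry whose target language code is no or nb;
        # those links get sorted out when the page is edited anyway
        for line in lines:
            code = line.split(":", 1)[0].strip().lower()
            if code not in SKIP_CODES:
                yield line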

Regards
Anton





