Hello,
Wikilabels [1] is the system for labeling edits for ORES. Until now, users
had to visit a page on a Wikipedia, for example WP:Labels [2], install a
gadget, and then label edits for ORES. With the new version (0.4.0)
deployed today, you can go directly to the Wikilabels home page, for
example https://labels.wmflabs.org/ui/enwiki, and label edits from there.
If you installed the gadget, you can remove it now. We have also added
minification and bundling to improve performance.
Labeling edits helps ORES work more accurately, and if the ORES review
tool is not enabled on your wiki, you can provide this data to us through
Wikilabels so that we can enable it for your wiki as well!
[1] https://meta.wikimedia.org/wiki/Wiki_labels
[2] https://en.wikipedia.org/wiki/Wikipedia:Labels
Best
--
Amir Sarabadani Tafreshi
Software Engineer (contractor)
-------------------------------------
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
http://wikimedia.de
Wikimedia Deutschland - Society for the Promotion of Free Knowledge e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Question: is there a Gerrit repo that has "global" localisation/language
(l10n) files? I'm interested in providing localisation and grammar/spelling
fixes for English for many extensions and such for MediaWiki.
Thanks,
Zppix
Userpage: www.enwp.org/User:Zppix
Hello!
*tl;dr: add srenablerewrites=yes to your API search queries to enable
search results from different language projects*
The Search Team
<https://www.mediawiki.org/wiki/Wikimedia_Discovery#Search:_Backend> is
thrilled to announce that secondary search results are now available over
the API. This means that automated language detection (provided by TextCat
<https://www.mediawiki.org/wiki/TextCat>) and query forwarding can now be
used by API consumers.
Here's the explanation. The Search Team's analysis of common search queries
<https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/Survey_of_Zero-Resul…>
showed that there are quite a few search queries that aren't in the
language of the wiki the user is on. To help alleviate this problem, and
give users useful results, we added language detection and query
forwarding; for example, Луковичная глава
<https://en.wikipedia.org/w/index.php?title=Special:Search&profile=default&s…>
now
gives the user results from the Russian Wikipedia. This is the
functionality that's now available over the API, as you can see if you perform
the same search over the API
<https://en.wikipedia.org/w/api.php?action=query&list=search&srsearch=%D0%9B…>
with the srenablerewrites parameter enabled.
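To make the request concrete, here is a minimal sketch of building that API call in Python. The endpoint and the srsearch/srenablerewrites parameters are as described above; the query string is the example from this post, and format=json is added for illustration (actually consuming the rewritten-query results is not shown).

```python
from urllib.parse import urlencode

# Build the search API request described above. srenablerewrites opts in
# to secondary (cross-language) results; without it, behavior is unchanged.
params = {
    "action": "query",
    "list": "search",
    "srsearch": "Луковичная глава",
    "srenablerewrites": 1,
    "format": "json",
}
url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

Fetching that URL with any HTTP client should return the same result shape as before this change, plus the secondary results when a rewrite applies.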
The secondary results functionality was added to MediaWiki core
<https://gerrit.wikimedia.org/r/#/c/324652/> and is extensible, so that in
the future, if we (or someone else!) provide secondary results from other
sources, the same mechanism can be used. For backwards compatibility, if
you don't add the srenablerewrites parameter, you'll continue to get the
same results in the same format as before this change.
Happy querying!
Thanks,
Dan
--
Dan Garry
Lead Product Manager, Discovery
Wikimedia Foundation
Hi,
On the second day of the Wikimedia Developer Summit (January 10) there will
be a Q&A session with Victoria Coleman (Wikimedia Foundation CTO) and Wes
Moran (VP of Product). It is a plenary session and it will be
video-streamed.
The questions for this session are being crowdsourced at
http://www.allourideas.org/wikidev17-product-technology-questions. Anyone
can propose questions and vote, anonymously, as many times as you want. At
the moment, we have 25 questions and 451 votes.
An important technical detail: questions posted later also have a good
chance of making it to the top of the list, as long as new voters select
them. The ranking is based on pairwise comparisons between questions, not
on accumulated votes. For instance, the current top question is in fact
one of the most recently submitted.
Why post or vote for a good question? One obvious reason is to encourage
the Foundation's top Technology and Product managers to give a good answer
in a public session with minutes taken and video recording. :) Beyond
that, if the ranking of questions makes sense and is backed by
participation numbers, it has a serious chance of influencing plans and
discussions beyond the Summit.
The current ranking does make sense, but maybe you could help cover more
areas and other perspectives?
1. How do we deal with the lack of maintainers for all Wikimedia
deployed code?
2. Do we have a plan to bring our developer documentation to the level
of a top Internet website, a major free software project?
3. For WMF dev teams, what is the right balance between pushing own work
versus seeking and supporting volunteer contributors?
4. During the next year or so, what balance do you think we should
strike between new projects and technical debt?
5. When are we going to work on a modern talk pages system for good?
6. Whose responsibility is it to ensure that all MediaWiki core components
and the extensions deployed on Wikimedia wikis have active maintainers?
7. How important is it to have a well-maintained and well-promoted catalog
of tools, apps, gadgets, bots, templates, extensions...?
8. Will MediaWiki ever become easier to install and manage (e.g. a plugin
manager à la WordPress)? How much do we care about enterprise users?
9. What should be the role of the Architecture Committee in WMF planning
(priorities, goals, resources...) and are we there yet?
10. In addition to Community Tech, should the other WMF Product teams
prioritize their work taking into account the Community Wishlist results?
The full list:
http://www.allourideas.org/wikidev17-product-technology-questions/results
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
TLDR: Templates can now use datasets on Commons for all their localizable
messages, making it possible to copy/paste templates between wikis without
any changes.
If you look at the Graph:Lines template (links below), there is a small
link under each graph that points to the data source. That link is
automatically localized on enwiki and ruwiki because the actual messages
are stored in a Commons dataset, while the template itself is identical on
both wikis.
The magic happens in Module:TNT (hoping it won't blow up):
*{{#invoke:TNT | msg | DatasetDict.tab | message-key | param1 | param2 |
... }}*
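To illustrate the idea behind that invocation, here is a small Python sketch of dictionary-backed message lookup with a language fallback and $1/$2-style parameter substitution. This is illustrative only: the data layout, fallback behavior, and key names below are assumptions, not TNT's actual Lua implementation or the real .tab dataset format.

```python
def lookup_message(dataset, key, lang, params=(), fallback="en"):
    """Pick a localized message for `key` in `lang`, falling back to
    `fallback`, then substitute positional parameters $1, $2, ..."""
    translations = dataset.get(key, {})
    text = translations.get(lang) or translations.get(fallback, key)
    for i, value in enumerate(params, start=1):
        text = text.replace(f"${i}", str(value))
    return text

# A toy dictionary standing in for a shared Commons dataset:
messages = {
    "source-link": {
        "en": "Data source: $1",
        "ru": "Источник данных: $1",
    }
}

print(lookup_message(messages, "source-link", "ru", ["Graph:Lines"]))
```

Because the dictionary lives in one shared place, every wiki invoking the template resolves the same keys but gets text in its own language, with English as a fallback when a translation is missing.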
I would love to support template doc pages too, but I'm not sure how well
VE and other tools will work with the <templatedata> blocks.
This is an experimental thing that I'm just hacking on in my free time,
and I would love to get your feedback on the viability of this approach.
Thanks!
Links:
enwiki: https://en.wikipedia.org/wiki/Template:Graph:Lines
ruwiki: https://ru.wikipedia.org/wiki/Template:Graph:Lines
TNT: https://commons.wikimedia.org/wiki/Module:TNT
Dictionary:
https://commons.wikimedia.org/wiki/Data:Original/Template:Graph:Lines.tab
Hi all,
I'm doing some renovations on recitation-bot and running into trouble when
the time comes for pywikibot to upload article data to Wikisource and
Commons. The thread doing so hangs without any informative error. Per
Max's advice, I made sure that the unix user running the web service that
uses pywikibot is logged into each wiki, but I still have the problem. I'm
going to try to get more information about what's going on, but I would
also appreciate pointers about what might be wrong. In particular, the web
service is now running under Kubernetes rather than Sun Grid Engine, so I
suspect that the login state might not be making it into the container.
Can anyone advise on where the login state is maintained and whether it
will be transferred into the Kubernetes container?
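One first diagnostic step, sketched below under assumptions: pywikibot persists login cookies to files on disk in its configuration directory, so checking whether those files are visible inside the running pod can tell you whether the login state made it into the container. The file names listed here are assumptions for illustration; check the cookie file names your pywikibot version actually writes.

```python
import os

# Assumed cookie file names; verify against your pywikibot version.
COOKIE_FILE_CANDIDATES = ("pywikibot.lwp", "apicookies")

def find_login_state(config_dir):
    """Return which candidate cookie files exist under config_dir.
    An empty list suggests the persisted login state was not mounted
    or copied into the container."""
    return [
        name for name in COOKIE_FILE_CANDIDATES
        if os.path.isfile(os.path.join(config_dir, name))
    ]
```

Running this inside the Kubernetes pod against the directory pywikibot is configured to use (and comparing with the grid-engine host) would show whether the cookies are simply missing from the container filesystem.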
Thanks,
Anthony
Team Practices Group Clinic
TPG will be offering 15-minute consultations on team practices and
collaboration topics of your choosing, for example: task tracking,
improving communication between or within teams, or communicating across
geographically distributed teams. See TPG_Clinic
<https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/2017/TPG_Clinic>
for more information.
This is open to anyone attending the Summit.
*-- Joel Aufrecht*
Team Practices Group
Wikimedia Foundation