Hi,
Is the UserMerge extension actually used on Wikimedia sites?
As far as I can see, it is only installed on Wikivoyage, and the last log
entries of its usage are from 2015. However, it's possible that I'm missing
something.
The reason I'm asking is that I'd love to know how to configure it best in
translatewiki. If it's not actually used on Wikimedia sites or developed
significantly, then it should be filed under "Legacy".
Thanks! :)
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
📘 Read this post on Phabricator at
https://phabricator.wikimedia.org/phame/post/view/130/
-------
How’d we do in our pursuit of operational excellence last month? Read on to
find out!
- Month in numbers.
- Lightning round.
- Current problems.
## 📊 *Month in numbers*
* 4 documented incidents. [1]
* 20 Wikimedia-prod-error tasks closed. [2]
* 18 Wikimedia-prod-error tasks created. [3]
* 172 currently open Wikimedia-prod-error tasks (as of 16 January 2019).
Terminology:
* An *Exception* (or fatal) prevents a user action. For example, a page
would display “Exception: Unable to render page” instead of the article
content.
* An *Error* (or non-fatal, warning) can produce a page on which the
software is technically unaware of a problem, but which may show corrupt,
incorrect, or incomplete information. For example, a user may receive a
notification that says “You have (null) new messages”.
For December, I haven’t prepared any stories or conducted interviews.
Instead, I’ve got a lightning round of errors in various areas that were
found and fixed this past month.
## ⚡️ *Contributions view fixed*
MarcoAurelio reported that Special:Contributions failed to load for certain
user names on meta.wikimedia.org (PHP Fatal error, due to a faulty database
record). Brad Jorsch investigated and traced it to database maintenance
from March 2018. He corrected the faulty records, which resolved the
problem. Thanks! — https://phabricator.wikimedia.org/T210985
## ⚡️ *Undefined talk space now defined*
The newly created Cantonese Wiktionary (yue.wiktionary.org) was
encountering errors from the Siteinfo API. We found this was due to invalid
site configuration. Urbanecm patched the issue, and also created a new unit
test for wmf-config that will prevent this issue from happening on other
wikis in the future. Thanks! — https://phabricator.wikimedia.org/T211529
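The kind of configuration check Urbanecm added can be sketched in isolation. wmf-config is PHP, so the Python below is only an illustrative sketch; the `SITE_CONFIG` shape and wiki names are hypothetical, not the real settings format:

```python
# Hypothetical, simplified shape of per-wiki settings. The real
# wmf-config is PHP; this sketch only illustrates the kind of
# invariant such a unit test can enforce across all wikis.
SITE_CONFIG = {
    "yuewiktionary": {"namespaces": {0: "(Main)", 1: "Talk"}},
    "enwiki": {"namespaces": {0: "(Main)", 1: "Talk"}},
}

def wikis_missing_talk_namespace(config):
    """Return the wikis whose settings do not define namespace 1 (Talk)."""
    return [wiki for wiki, conf in config.items()
            if 1 not in conf.get("namespaces", {})]

# An empty result means every configured wiki defines its talk namespace,
# so the Siteinfo API has nothing undefined to trip over.
assert wikis_missing_talk_namespace(SITE_CONFIG) == []
```

Running a check like this in CI means a newly created wiki with an incomplete namespace definition fails the build instead of erroring in production.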
## ⚡️ *The undefined error status... error*
After deploying the 1.33.0-wmf.8 train to all wikis, we found a regression
in the HTTP library for MediaWiki. When MediaWiki requested an HTTP
resource from another service and that resource was unavailable, MediaWiki
failed to correctly determine the HTTP status code of the error, which then
caused another error! This happened, for example, when
Special:Collection was unable to reach the PediaPress.com backend in some
cases. Fixed by Bill Pirkle. Thanks! —
https://phabricator.wikimedia.org/T212005
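The class of bug is easy to sketch in isolation. This is a hypothetical Python illustration, not MediaWiki’s PHP code: without an explicit fallback, asking a failed request for its status code is itself an error.

```python
def status_of(response):
    """Return a well-defined HTTP status for a possibly-failed request.

    `response` is a dict like {"status": 503} on success, or None when
    the remote service could not be reached at all (hypothetical shape,
    for illustration only).
    """
    if response is None or "status" not in response:
        # Without this fallback, callers reading the status of a failed
        # request would themselves fail: the "undefined error status" error.
        return 0  # conventional "no HTTP response received" sentinel
    return response["status"]

print(status_of({"status": 503}))  # upstream answered with an error status
print(status_of(None))             # upstream was unreachable
```

The fix is to guarantee a defined (if sentinel) status on every code path, so error handling never has to handle its own errors.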
## ⚡️ *Fatal error: Call to undefined function in Kartographer API*
When the 1.33.0-wmf.9 train reached the canary phase on Tue 18 December
(aka group0 [4]), Željko spotted a new fatal error in the logs. The fatal
originated in the Kartographer extension and would have affected various
users of the MediaWiki API. Patched the same day by Michael Holloway,
reviewed by James Forrester, and deployed by Željko. Thanks! —
https://phabricator.wikimedia.org/T212218
## 📉 *Current problems*
Take a look at the workboard for tasks that might need your help.
The workboard lists known issues, grouped by the week in which they were
first observed.
→ https://phabricator.wikimedia.org/tag/wikimedia-production-error/
December’s theme will continue for now, as I imagine lots of you were on
vacation during that time! I’d like to draw attention to a subset of PHP
fatal errors: specifically, those that are publicly exposed (i.e., they
don’t require elevated user rights) and that emit an HTTP 500 error code.
* Wikibase: Clicking “undo” for certain revisions fatals with a
PatcherException. — https://phabricator.wikimedia.org/T97146
* Flow: Unable to view certain talk pages due to workflow
InvalidDataException. — https://phabricator.wikimedia.org/T70526
* Translate: Certain Special:Translate urls fatal. —
https://phabricator.wikimedia.org/T204833
* MediaWiki (Special-pages): SpecialDoubleRedirects unavailable on
tt.wikipedia.org. — https://phabricator.wikimedia.org/T204800
* MediaWiki (Parser): Parse API exposes fatal content model error. —
https://phabricator.wikimedia.org/T206253
* CentralNotice: Certain SpecialCentralNoticeBanners urls fatal. —
https://phabricator.wikimedia.org/T149240
* PageViewInfo: Certain “mostviewed” API queries fail. —
https://phabricator.wikimedia.org/T208691
Public user requests resulting in fatals can cause (and have caused) alerts
to fire that notify SRE of wikis potentially being less available or down.
💡*ProTip*: Use “Report Error” on
https://phabricator.wikimedia.org/tag/wikimedia-production-error/ to create
a task with a helpful template. This template is also available as “Report
Application Error”, from the “Create Task” dropdown menu, on any task
creation form.
## 🎉 *Thanks!*
Thank you to everyone who has helped by reporting, investigating, or
resolving problems in Wikimedia production, including MarcoAurelio, Anomie,
Urbanecm, BPirkle, zeljkofilipin, Mholloway, Esanders, Jdforrester-WMF, and
hashar.
Until next time,
— Timo Tijhof
-------
Footnotes:
[1] Incidents. —
https://wikitech.wikimedia.org/wiki/Special:AllPages?from=Incident+document…
[2] Tasks closed. —
https://phabricator.wikimedia.org/maniphest/query/Pe2KaRZhJJ.H/#R
[3] Tasks opened. —
https://phabricator.wikimedia.org/maniphest/query/aqbDey80TU02/#R
[4] What is group0? —
https://wikitech.wikimedia.org/wiki/Deployments/One_week#Three_groups
Hey all!
I'm reviving an old project to embed sandboxed HTML/JavaScript "widgets"
into wiki pages as a click-to-play media type, using modern browsers'
<iframe> sandbox and Content-Security-Policy restrictions.
Intro and detail notes which I'll keep updating:
https://www.mediawiki.org/wiki/User:Brion_VIBBER/EmbedScript_2019
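The sandboxing this relies on can be sketched roughly. The helper below is a minimal illustration in Python of the kind of embed markup a server might emit; the sandbox tokens and CSP are illustrative defaults I chose, not EmbedScript’s actual policy:

```python
def embed_widget_html(src_url: str) -> str:
    """Build a click-to-play <iframe> embed with a restrictive sandbox.

    Illustrative only: `allow-scripts` without `allow-same-origin` lets the
    widget run code while denying it the embedding wiki's cookies and DOM,
    and the (experimental) `csp` attribute further restricts what the
    framed document may load.
    """
    csp = "default-src 'none'; script-src 'unsafe-inline'"
    return (
        f'<iframe src="{src_url}" sandbox="allow-scripts" '
        f'csp="{csp}"></iframe>'
    )

print(embed_widget_html("https://example.org/widget"))
```

The key design point is what is *omitted*: no `allow-same-origin` means the widget is treated as a unique origin, which is what makes less-trusted user code safe to run next to trusted host pages.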
I hope to extend it with a headless "plugin" mode which allows less-trusted
user-written code to interact safely with fully-trusted host APIs, and a
dependency system to let common library modules, string localizations,
image files from Commons, and data from Wikidata be bundled up and used
safely, without cross-site data exposure.
I'm hoping to solicit some more feedback while I'm in the prototyping
stage, with an eye towards issues we'll need to resolve before it reaches a
productizable stage we could seriously deploy.
Open questions include:
* Can we really replace some user scripts and gadgets with a split-trust
model, and which ones are good ones to start experimenting with?
* What should a user-permissions UX look like for plugins? What threat
models are not examined yet?
* What kind of comment / code review system is needed?
* What about patches, forks, copies, and centralization? What’s the best
Commons-centric or alternate model that will prevent fragmentation of code?
* How should libraries / dependencies work?
* How should localization work?
* How much coupling to MediaWiki is desired/required?
* How to implement mobile app and offline support?
Feel free to poke me directly or on the wiki talk page with
questions/comments/ideas. Love it? Hate it? Great! Let me know. :)
-- brion
Using an abstract language as a basis for translation has been tried
before, and is almost as hard as translating between two natural
languages.
There are two really hard problems: implied references and cultural
context. An artificial language can get rid of the implied references, but
it tends to create very weird and unnatural expressions. If the cultural
context is removed, then it can be extremely hard to put it back in, and
without any cultural context it can be hard to explain anything.
But yes, you can make an abstract language; it just won't give you any
high-quality prose.
On Mon, Jan 14, 2019 at 8:09 AM Felipe Schenone <schenonef(a)gmail.com> wrote:
>
> This is quite an awesome idea. But thinking about it, wouldn't it be possible to use structured data in wikidata to generate articles? Can't we skip the need of learning an abstract language by using wikidata?
>
> Also, is there discussion about this idea anywhere in the Wikimedia wikis? I haven't found any...
>
> On Sat, Sep 29, 2018 at 3:44 PM Pine W <wiki.pine(a)gmail.com> wrote:
>>
>> Forwarding because this (ambitious!) proposal may be of interest to people
>> on other lists. I'm not endorsing the proposal at this time, but I'm
>> curious about it.
>>
>> Pine
>> ( https://meta.wikimedia.org/wiki/User:Pine )
>>
>>
>> ---------- Forwarded message ---------
>> From: Denny Vrandečić <vrandecic(a)gmail.com>
>> Date: Sat, Sep 29, 2018 at 6:32 PM
>> Subject: [Wikimedia-l] Wikipedia in an abstract language
>> To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
>>
>>
>> Semantic Web languages allow one to express ontologies and knowledge bases in a
>> way meant to be particularly amenable to the Web. Ontologies formalize the
>> shared understanding of a domain. But the most expressive and widespread
>> languages that we know of are human natural languages, and the largest
>> knowledge base we have is the wealth of text written in human languages.
>>
>> We look for a path to bridge the gap between knowledge representation
>> languages such as OWL and human natural languages such as English. We
>> propose a project to simultaneously expose that gap, allow collaboration
>> on closing it, make progress widely visible, and be highly attractive and
>> valuable in its own right: a Wikipedia written in an abstract language to
>> be rendered into any natural language on request. This would make current
>> Wikipedia editors about 100x more productive, and increase the content of
>> Wikipedia by 10x. For billions of users this will unlock knowledge they
>> currently do not have access to.
>>
>> My first talk on this topic will be on October 10, 2018, 16:45-17:00, at
>> the Asilomar in Monterey, CA during the Blue Sky track of ISWC. My second,
>> longer talk on the topic will be at the DL workshop in Tempe, AZ, October
>> 27-29. Comments are very welcome as I prepare the slides and the talk.
>>
>> Link to the paper: http://simia.net/download/abstractwikipedia.pdf
>>
>> Cheers,
>> Denny
>> _______________________________________________
>> Wikimedia-l mailing list, guidelines at:
>> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
>> https://meta.wikimedia.org/wiki/Wikimedia-l
>> New messages to: Wikimedia-l(a)lists.wikimedia.org
>> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
>> <mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
Official site: https://ocs.letras.up.pt/index.php/IWSC2019/IWSC2019
The *International Wiki Scientific Conference 2019* will be held on *March
11, 12 and 13, 2019* in Porto (Portugal), at the Faculty of Arts of the
University of Porto.
*ORGANIZATION:*
*FLUP - Faculdade de Letras da Universidade do Porto (PT)*
*CIC.DIGITAL Porto (PT)*
*Wiki Educação Brasil (BR)*
*PPGCI - Universidade Federal Fluminense (RJ-BR)*
CALL FOR PAPERS
https://ocs.letras.up.pt/index.php/iwsc2019/IWSC_2019/author/submit?require…
In 2018, after the European Year of Cultural Heritage was celebrated for
the first time at the initiative of the European Union, the *IWSC 2019*
promotes *Heritage*, both material and intangible, to draw attention to the
presence of culture and heritage on wiki platforms and to the contribution
this digital resource could make to social and economic development in
Europe and worldwide.
Therefore, works will be considered provided they focus on wiki culture as
a topic, a source, a means of spreading scientific knowledge, or a teaching
tool, particularly in the scope of heritage. Texts should be written in
English, Spanish or Portuguese and will be assessed by members of the
Scientific Commission.
The submission period starts on 15 October 2018 and ends on *30 January
2019*. Authors will be notified whether their works have been accepted by
*10 February 2019*.
*The papers should be submitted in the following thematic groups:*
- Wiki Culture
- Information accreditation on Wikipedia
- Spreading and promoting Science on Wikipedia
- Dissemination and promotion of material and intangible heritage
- Educational Projects on Wiki Platforms
Submissions may be posters (up to 3 pages, including references) or
articles (8 to 10 pages, including references).
The papers will be published in the digital proceedings of the event. For
this reason, at least one of the authors must register for the event and
give an oral presentation.
If the evaluators request changes, the authors have until *February 20,
2019* to submit their final versions.
*Opening of Call for Papers: *October 15, 2018
*Call for papers closes:* January 30, 2019
Best regards
Rodrigo Padula
Hi all,
I am an undergraduate student in computer science, and I believe I have
the skills required to contribute to your organization's projects.
I will be applying as a GSoC student for the organization by developing
features.
How can I get started? Also, can you confirm whether this organization is
going to participate in the upcoming GSoC? Please let me know where I can
find all the details to get started.
Thank you!
Hello,
Due to the recent issues with the PDUs on both A2 (T213748) and A3
(T213859), we need to swap the s3 primary database master (T213858), as it
lives on A2.
We are going to do this on Thursday the 17th at 07:00 AM UTC, and we have
requested a 30-minute window (T213864), as we have to go read-only for all
the wikis living on s3 (P7994).
Impact: writes will be blocked; reads will remain unaffected.
Time: 07:00-07:30 AM UTC (we do not expect to use the full 30-minute
window).
Communication will happen in #wikimedia-operations.
If you are around at that time and want to help with the monitoring, please
join us!
Thanks