Hi all,
Recently, the Web team closed a large number of tasks on Phabricator within
the Readers-Web-Backlog <https://phabricator.wikimedia.org/project/view/67/>.
These are tasks that the team will not have capacity for in the near future
(roughly the next six months) or that team representatives deemed unactionable. Most
of these tasks were created by members of the team. We opted for a silent
edit to avoid noise for all subscribers. If you feel a task has been
closed prematurely or needs additional attention, feel free to reopen it or
leave a comment on the task. Note that these tasks may be reopened at a later
date if the team has expanded capacity.
Here is the full list of declined tasks:
https://phabricator.wikimedia.org/maniphest/query/oyaVe_xbp_Gn/#R
Thank you,
- Olga and the Web Team
--
*Olga Vasileva* // Lead Product Manager // Web
https://wikimediafoundation.org/
*Imagine a world in which every single human being can freely share in
the sum of all knowledge. That's our commitment. Donate
<http://donate.wikimedia.org/>. *
This is a summary of this week's deployment of the 1.37.0-wmf.14
branch of MediaWiki and its extensions (also known as "the train").
The primary person in charge this week is Ahmon Dancy, with Brennen
Bearnes as backup, both from the Wikimedia Foundation Release
Engineering team.
The summary/blocker task for this week is
https://phabricator.wikimedia.org/T281155
The new version is running on all sites: https://versions.toolforge.org/
== Stats and blockers ==
* 1 new WMF branch of MediaWiki
* 2 intrepid train crew folk, one operator and one conductor
* 2 new and unnecessary train metaphors
* 4 useless statistics
* 1 risky patch identified
* 2 tasks momentarily deemed blockers, then deemed blockers no longer:
** Deadlock found when trying to get lock (UserOptionsManager::saveOptionsQuery)
*** https://phabricator.wikimedia.org/T286521
** Some interface messages are in Welsh when British English is
selected as the interface language
*** https://phabricator.wikimedia.org/T286679
== 🚂🌈 ==
As usual, lots of people helped find, triage, and solve problems with
this train. Thanks to all involved (listed by Phabricator username):
* Daimona
* Krinkle
* Ladsgroup
* LucasWerkmeister
* Nikerabbit
* Nikki
* Pchelolo
* Reedy
* RhinosF1
* TimStarling
* Urbanecm
* abi_
...and anyone else we're forgetting. Without your help we wouldn't be
able to deploy the train.
Thanks!
-- Your WMF train crew
Hi all,
A few questions to provoke discussion and share knowledge:
* Why does the train run Tuesday-Thursday rather than Monday-Wednesday?
* Why do we only have two group 1 Wikipedias (Catalan and Hebrew)?
* Should there be a backport window on Friday mornings for certain changes?
Longer spiel:
A few weeks ago a change I made led to a small but noticeable UI
regression. The site was perfectly usable, but looked noticeably off. It
was in a more obscure part of the UI so we missed it during QA/code review.
Late Wednesday a ticket was reported against Wikimedia Commons, but I only
became aware of it late Thursday when the regression rolled out to English
Wikipedia. A village pump discussion was started and several duplicate
tickets were created. While the site could still be used it didn't look
great and upset the experience of many editors.
Once we were aware of the problem, the issue was easy to fix. A patch was
written on Friday.
I understand Friday backports are possible, but my team tends to use them
as a last resort, for fear of creating more work for fellow maintainers
over the weekend. As a result, given the site was still usable, the fix
wasn't backported until the first available backport window on Monday. This
is unfortunately a regular pattern, particularly for small UI regressions.
We addressed the issue on Monday, but I got feedback from several users
that this particular issue took too long to get backported. I mentioned the
no Friday deploy policy. One user asked me why we don't run the train
Monday-Wednesday and to be honest I wasn't sure. I couldn't find anything
on https://wikitech.wikimedia.org/wiki/Deployments/Train.
My team tries to avoid big changes on Mondays, since patches merged on
Monday are more likely to have issues: they don't always get time to go
through QA by our dedicated QA engineer during the week.
So... why don't we run the train Monday-Wednesday? Having a Thursday
buffer during which we can more comfortably backport any issues not caught
in testing, particularly UI bugs, would be extremely helpful to my team,
and I don't think we'd lose much by giving up Monday as a day for rushing
last-minute changes.
Assuming there are good reasons for the Tuesday-Thursday train, I think
there is another problem with our deploy process: the size of group 1.
Given the complexity of our interfaces (several skins, gadgets, multiple
special pages, user preferences, multiple extensions, and different user
rights), many obscure UI bugs are missed in QA by people who don't use the
software every day and lack a clear mental model of how it looks and
behaves. My team mostly works on visible user interface changes, and we
rely heavily on users of Catalan and Hebrew Wikipedia, our only group 1
wikis, to notice UI errors before they go out to a wider audience. Given
the size of those audiences, that often doesn't work, and it's often
group 2 wikis that make us aware of issues. If we are going to
keep the existing train of Tue-Thur, I think it's essential we have at
least one larger Wikipedia in our group 1 deploy to give us better
protection against UI regressions living over the weekend. My understanding
is that, for some reason, this is not a decision Release Engineering can
make, but one that requires an on-wiki RFC by the editors themselves. Is
that correct? While I can understand editors' reluctance to experience
bugs, I'd argue that it's better to have a bug for a day than for an entire
weekend; this is definitely something we need to think about more deeply.
How'd we do in our pursuit of operational excellence last month?
Incidents
3 documented incidents. That's lower than June in each of the previous
five years, when the month saw 5-9 incidents. I've added a new panel ⭐️ to
the Incident
statistics <https://codepen.io/Krinkle/full/wbYMZK> tool. This one plots
monthly statistics on top of previous years, to more easily compare them.
Learn more from the Incident documents
<https://wikitech.wikimedia.org/wiki/Incident_status> on Wikitech, and
remember to review and schedule Incident Follow-up
<https://phabricator.wikimedia.org/project/view/4758/> in Phabricator,
which are preventive measures and other action items filed after an
incident.
-------
Trends
In June, work on production errors appears to have stagnated a bit. Or more
precisely, the work only resulted in relatively few tasks being resolved.
15 of the 26 new tasks are still open as of writing.
Of the tasks from previous months, only 11 were resolved, leaving most
columns unchanged. See the table further down for a more detailed breakdown
and links to Phabricator queries for the tasks in question.
With the 15 new tasks that remain open, and the 11 tasks resolved from our
backlog, the chart rises from 150 to 154 tasks.
Figure 1, Figure 2: Unresolved error reports stacked by month.
<https://phabricator.wikimedia.org/phame/post/view/240/production_excellence…>
Month-over-month plots based on spreadsheet data
<https://docs.google.com/spreadsheets/d/e/2PACX-1vTrUCAI10hIroYDU-i5_8s7pony…>.
-------
Outstanding errors
Take a look at the workboard and look for tasks that could use your help.
→ https://phabricator.wikimedia.org/tag/wikimedia-production-error/
Summary over recent months:
Jan 2020 (1 of 7 issues) ⚠️ Unchanged (over one year old).
Mar 2020 (2 of 2 issues) ⚠️ Unchanged (over one year old).
Apr 2020 (4 of 14 issues) ⚠️ Unchanged (over one year old).
May 2020 (5 of 14 issues) ⚠️ Unchanged (over one year old).
Jun 2020 (5 of 14 issues) ⚠️ Unchanged (over one year old).
Jul 2020 (4 of 24 issues) ⚠️ Unchanged (over one year old).
Aug 2020 (11 of 53 issues) ⬇️ One task resolved. -1
Sep 2020 (7 of 33 issues) ⚠️ Unchanged (over one year old).
Oct 2020 (19 of 69 issues) ⚠️ Unchanged (over one year old).
Nov 2020 (8 of 38 issues) ⚠️ Unchanged (over one year old).
Dec 2020 (7 of 33 issues) ⚠️ Unchanged (over one year old).
Jan 2021 (3 of 50 issues
<https://phabricator.wikimedia.org/maniphest/query/WIP9W8q54HB6/#R>) ⚠️
Unchanged (over one year old).
Feb 2021 (6 of 20 issues
<https://phabricator.wikimedia.org/maniphest/query/5MzPJAb5oJgv/#R>) ⬇️ One
task resolved. -1
Mar 2021 (13 of 48 issues
<https://phabricator.wikimedia.org/maniphest/query/RsVPep46KRY4/#R>) ⬇️ One
task resolved. -1
Apr 2021 (19 of 42 issues
<https://phabricator.wikimedia.org/maniphest/query/rYyMt_gYYymb/#R>) ⬇️
Four tasks resolved. -4
May 2021 (25 of 54 issues
<https://phabricator.wikimedia.org/maniphest/query/tmkGqt0C93YG/#R>) ⬇️
Four tasks resolved. -4
June 2021 (15 of 26 issues
<https://phabricator.wikimedia.org/maniphest/query/roL0TaxtcaLQ/#R>) 📌 26
new issues, of which 11 were closed. +26, -11
-------
Tally
150 issues open, as of Excellence #32 (May 2021)
<https://phabricator.wikimedia.org/phame/post/view/236/production_excellence…>
.
-11 issues closed, of the previous 150 open issues.
+15 new issues that survived June 2021.
154 issues open as of yesterday.
-------
Thanks!
Thank you to everyone who helped by reporting, investigating, or resolving
problems in Wikimedia production. Thanks!
Until next time,
– Timo Tijhof
🔗 Share or read later via
https://phabricator.wikimedia.org/phame/post/view/240/
Hello all,
We'd like to inform you of a change coming in how media is structured in the parser's HTML output. It has been [in the works for quite some time][1]. The new structure was prototyped in Parsoid's output since its inception and outlined in [its specification][2].
The proposed change has gone through the [RFC process][3] and an implementation to output this new structure in MediaWiki's core parser was [recently merged][4], gated behind a flag. So far, it has been enabled on testwiki and testwiki2.
There are [a number of known issues][5] but we don't expect to see many rendering differences since we've done some [extensive visual diff testing][6]. Templates won't be impacted; the old CSS styles will remain, for now.
However, where we do expect work to be needed is in code that interacts with the page, be it user scripts, gadgets, extensions, bots, or other tools.
If you'd like to help us out and get ahead of the changes before they have the potential to interfere with your workflow, please visit these wikis and test them out. You can file tasks in Phabricator with the Parsoid-Media-Structure project tag.
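To give a feel for what "code interacting with the page" may need to adjust to, here is a rough sketch in Python. The markup strings are abridged and purely illustrative (the exact tags and attributes are defined in the linked spec, and may differ per wiki configuration), and `uses_new_media_structure` is a hypothetical helper of ours, not part of any MediaWiki library:

```python
# Illustrative, abridged markup only -- see the Parsoid media
# structure spec for the authoritative definition.
OLD = '<a href="/wiki/File:X.jpg" class="image"><img src="//upload.example/X.jpg"></a>'
NEW = ('<figure typeof="mw:File/Thumb">'
       '<a href="/wiki/File:X.jpg"><img src="//upload.example/X.jpg"></a>'
       '<figcaption>A caption</figcaption></figure>')

def uses_new_media_structure(html: str) -> bool:
    """Crude detector: the new output wraps media in <figure>
    elements carrying a typeof attribute in the mw:File family,
    instead of a bare <a class="image"> wrapper."""
    return "<figure" in html and 'typeof="mw:File' in html
```

The practical takeaway is that selectors or regexes keyed to the old wrapper (for example, anything assuming `a.image > img`) would match nothing in the new structure, which is the kind of breakage worth testing for on the test wikis now.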
Thanks,
The Parsing Team
[1]: https://www.mediawiki.org/wiki/Parsing/Media_structure
[2]: https://www.mediawiki.org/wiki/Specs/HTML/2.2.0#Media
[3]: https://phabricator.wikimedia.org/T118517
[4]: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/507512
[5]: https://phabricator.wikimedia.org/project/board/5428/
[6]: https://phabricator.wikimedia.org/T266149
The Search Platform Team
<https://www.mediawiki.org/wiki/Wikimedia_Search_Platform> usually holds
office hours the first Wednesday of each month. Come talk to us about
anything related to Wikimedia search, Wikidata Query Service, Wikimedia
Commons Query Service, etc.!
Feel free to add your items to the Etherpad Agenda for the next meeting.
Details for our next meeting:
Date: Wednesday, July 14th, 2021 (not July 7 due to WMF holiday)
Time: 15:00-16:00 GMT / 08:00-09:00 PDT / 11:00-12:00 EDT / 17:00-18:00 CEST
Etherpad: https://etherpad.wikimedia.org/p/Search_Platform_Office_Hours
Google Meet link: https://meet.google.com/vyc-jvgq-dww
Join by phone in the US: +1 786-701-6904 PIN: 262 122 849#
Hope to talk to you in a week!
Trey Jones
Sr. Computational Linguist, Search Platform
Wikimedia Foundation
UTC–4 / EDT
Dear all,
We have developed a tool that is (in some cases) capable of checking
whether formulae in <math/> tags, in the context of a wikitext fragment,
are likely to be correct. We would like to test the tool on recent changes.
From
https://www.mediawiki.org/wiki/API:Recent_changes_stream
we can get the stream of recent changes. However, I did not find a way to
get the diff (either in HTML or wikitext) to figure out how the content was
changed. The only option I see is to additionally request the revision text
manually. That would mean many unnecessary requests, since most changes do
not touch <math/> tags. I assume that others, e.g. ORES
https://www.mediawiki.org/wiki/ORES,
compute the diffs anyway, and I wonder if there is an easier way to get the
diffs from the recent changes stream without additional requests.
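For what it's worth, a minimal sketch of the filter-then-fetch approach in Python, assuming the public EventStreams recentchange feed and the action=compare API (which returns just the diff between two revisions rather than full revision text); the helper names here are ours and purely illustrative:

```python
import json

# Public SSE feed of recent changes across Wikimedia wikis.
STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"

def parse_sse_data(line: str):
    """Extract the JSON payload from one 'data:' line of the SSE
    stream; other SSE lines (comments, event names) yield None."""
    if line.startswith("data: "):
        return json.loads(line[len("data: "):])
    return None

def is_edit(event: dict) -> bool:
    """Only 'edit' events carry old and new revision ids; log
    entries and page creations have no usable diff."""
    return event.get("type") == "edit" and "revision" in event

def compare_params(event: dict) -> dict:
    """Build the query for the action=compare API, which returns
    the diff between two revisions in a single request."""
    return {
        "action": "compare",
        "fromrev": event["revision"]["old"],
        "torev": event["revision"]["new"],
        "format": "json",
    }
```

In the main loop one would iterate over the SSE stream (e.g. with an SSE client or an HTTP library in streaming mode), skip non-edit events, and issue a compare request against the wiki's api.php only for candidates; the returned diff can then be scanned for `<math` before doing anything heavier. This still costs one request per inspected edit, so it reduces, rather than eliminates, the extra requests described above.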
All the best
Physikerwelt (Moritz Schubotz)
Hi everyone,
I'm very happy to announce:
OpenRefine [1] has two Junior Developer job openings (paid contractor positions; part-time, fully remote) for building Structured Data on Wikimedia Commons [2] functionalities.
Needless to say, we would love to receive applications from Wikimedians :-)
* Junior Developer - Wikimedia Development [3] (6 months, from September 2021 till February 2022)
* Junior Developer - OpenRefine Development [4] (8 months, from November 2021 till June 2022)
All the best!
Sandra (User:Spinster / User:SFauconnier)
[1] https://openrefine.org
[2] https://w.wiki/UR
[3] https://openrefine.org/blog/2021/07/07/Wikimedia-Commons-reconciliation-bat…
[4] https://openrefine.org/blog/2021/07/07/OpenRefine-SDC-developer.html