Hi all,
Here are the minutes from this week's TechCom meeting:
== RFC: Drop support for database upgrades older than two LTS releases ==
<https://phabricator.wikimedia.org/T259771>
* New RFC
* Moved from P1: Define to P2: Resource
* TT: The “UPGRADE” documentation file has a warning that upgrading
from certain older versions is unsupported. Maybe formalizing that in
some way would be useful. Interesting that the install is mentioned as
slow, since it takes less than a second in CI as far as I know.
* DK: It’s hard to reason about database updates, so there’s mental load
involved. It’s unclear what to do with patches when changing database
fields. There was an idea that maintenance scripts would run after the
schema changes, but it doesn’t work in all cases, so now there are scripts
that run at different points in the process, which introduces additional
complexity. (See the sketch at the end of this section.)
* TT: Right, running it mid-migration makes the state easier to reason
about, but falls apart if it runs dynamic MW code that can’t/shouldn’t
support running on an old MW schema.
* DK: This is a question of MediaWiki as a product, which doesn’t have a
clear owner. Who has the authority to make this decision?
* TS: The people maintaining it are the most impacted, so it makes sense for
them to make the decision since it is related to developer productivity.
* TT: The RFC can help bring together different points of view on this, have
teams weigh in on the costs from their perspectives.
* TS: Two LTS releases sounds like a pretty short time compared to the
current policy.
* DK: I think two is reasonable (four years). One cost is testability;
it’s hard to test this update logic.
== Stable Interface Policy: Timeline for removing obsolete code ==
<https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v>
* Regarding a proposal to change wording on the talk page
* DK: The stable interface policy didn’t change the deprecation policy.
Things that aren’t used within the MediaWiki ecosystem don’t have to use
the deprecation policy. The intention was to make it easy to remove
obsolete code that isn’t used. So the question is if you make code
obsolete and then remove usages, can you remove it?
* TT: Hard deprecation is fine; that’s not the same thing as removal.
(See the hard-deprecation sketch at the end of this section.)
* DK: Depending on the interpretation of the policy, we could remove
anything if we remove all usages of it.
* TT: It’s meant to include a requirement to email wikitech-l with a notice
and justification. I think this made sense when there was an assumption
of stable by default. If we intentionally create an extension for public
use, I’m not sure why we’d need to bypass deprecation.
* DK: There should be a clear, easy path for getting something into the
ecosystem.
* TT: Codesearch goes through external repositories as well. The process
for adding repositories to it isn’t easy or transparent, which needs to be
improved. The policy incentivizes the Wikimedia Foundation to choose
between helping third-party users migrate or continuing support for some
time. As phrased now, the policy requires the code to no longer be used in
the default branch, allowing the code manager to reject the change.
* TT: We can resolve within TechCom, but we should send a message to
wikitech-l.
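For reference, hard deprecation (as opposed to removal) looks roughly like
this in MediaWiki code. The class and method here are hypothetical;
wfDeprecated() is the real mechanism that surfaces remaining callers in
logs before removal.

class SomeStableClass {
	/**
	 * @deprecated since 1.35, use NewFormatter::format() instead
	 */
	public function formatTitle( $title ) {
		// Hard deprecation: keep working, but emit a runtime notice
		// so remaining callers show up in logs.
		wfDeprecated( __METHOD__, '1.35' );
		return (string)$title;
	}
}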
== Introduce Authority objects to represent the user performing a given action ==
<https://phabricator.wikimedia.org/T231930>
* DK: Strawman design for managing permission checks with an experimental
patch. <https://gerrit.wikimedia.org/r/c/mediawiki/core/+/582889/> There may
be resourcing available to start working on this in the next few weeks. We
could try to come up with a comprehensive design and submit an RFC or do
it the experimental way and use it in one corner of the codebase.
* TT: This is about making sure we can’t accidentally bypass permission
checks for non-current users. Experimenting seems fine, if there’s a
commitment to actively limit it to a narrow scope and not allow other
features to adopt it.
* DK: The Authority is something that actually encapsulates the checks
themselves, as opposed to a user. We’re missing a clean place to store
state for IP addresses when checking for IP-address blocks. This also
applies to OAuth sessions. This new idea partially supersedes
PermissionManager, so there’s some sunk cost there. (A hypothetical
sketch of the idea follows at the end of this section.)
* TT: You might also want to talk to the AHT team about their experience
with Block Manager. Similar issues.
* DK: The tradeoff is more flexibility now and a commitment to clean it up
later. Think of this as example-driven development: trying to think through
everything in advance tends to bring up important edge cases and problems,
but it tends to change again during implementation. The goal is to provide
more room for exploration and experimentation without losing the feedback
process.
* TT: This needs an RFC to be used outside of the narrowly defined,
experimental scope.
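To picture the strawman, here is a hypothetical sketch of what an Authority
interface could look like. This is illustrative only, not the design from
the experimental patch; all names here are made up.

interface Authority {
	// Identity of the actor; could be an IP-only or OAuth-limited session.
	public function getActor(): UserIdentity;

	// Cheap check, e.g. for deciding whether to render a UI control.
	public function probablyCan( string $action, PageIdentity $page ): bool;

	// Authoritative check right before performing the action, including
	// block state (IP-address blocks, OAuth grants, etc.).
	public function definitelyCan( string $action, PageIdentity $page ): bool;
}

// Callers would take an Authority instead of a User plus PermissionManager:
function moveIfAllowed( Authority $performer, PageIdentity $page ) {
	if ( !$performer->definitelyCan( 'move', $page ) ) {
		throw new PermissionsError( 'move' );
	}
	// ... perform the move ...
}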
== Next week’s IRC office hours ==
No IRC discussion scheduled for next week
You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>
See also the TechCom RFC board
<https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.
If you prefer, you can subscribe to our newsletter here
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>
--
Alex Paskulin
Technical Writer
Wikimedia Foundation
Hello,
This email contains updates for the last two weeks - August 5 and 12, 2020.
For the HTML versions, see:
https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-08-05
https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-08-12
Cheers,
Deb
------------------------
*= 2020-08-05 =*
== Callouts ==
* FYI: We are removing our old Helm charts endpoint (<nowiki>
https://releases.wikimedia.org/charts</nowiki>) and moving to a new one,
powered by ChartMuseum: <nowiki>https://helm-charts.wikimedia.org/</nowiki>.
If you are using Helm locally with the old repo, you will have to switch to
the new one. On the plus side, the new endpoint is integrated with our CI
and no longer requires manually creating Helm chart artifacts. Docs at
<nowiki>https://wikitech.wikimedia.org/wiki/ChartMuseum</nowiki>
== Product ==
=== iOS native app ===
* Blocked by:
* Blocking:
* Updates:
** Continuing development on [[phab:project/view/4661/|6.7 release]].
*** "Article as a living document" experiment
*** Event Platform Client MVP
*** iOS14 widgets & bug fixes
=== Web ===
* Updates:
** '''Summary''': Desktop Improvements Project (DIP) deployment continues;
WVUI integration into Vector is ongoing, and the network client now has a
MediaWiki REST API implementation in review.
** [[Reading/Web/Desktop_Improvements|Desktop Improvements Project (Vector
/ DIP)]]:
*** [[phab:T250968|[ShoutWikiAds] Replace use of deprecated hook
VectorBeforeFooter]]
*** [[phab:T254227|Switch test wikis to new version of vector by default]]
*** [[phab:T253842|Fix the printable versions of modern Vector]]
*** [[phab:T250851|Allow skins to override mediawiki.page.ready
initialisation to enable search JavaScript to be swapped]]
*** [[phab:T249363|Move the existing search to the header in preparation
for Vue.js search development]]
*** [[phab:T259372|Refactor: Move PHP logic into JS for collapsing tabs
under more menu]]
*** [[phab:T257647|Integrate WVUI into Vector for Vue.js search]]
*** [[phab:T251212|[Dev] Drop VectorTemplate usage in Vector]]
*** [[phab:T248399|Document Skin API and their stability, if any (including
Vector)]]
*** [[phab:T247790|wgLogos follow up work]]
*** [[phab:T255319|Eventually deprecate SkinTemplateNavigation::SpecialPage
and SkinTemplateNavigation hooks in favor of
SkinTemplateNavigation::Universal]]
*** [[phab:T244392|Vue.js search case study]]:
**** See [[Reading/Web/Desktop Improvements/Vue.js case study/Status
log|weekly status updates]].
** Mobile website (MinervaNeue / MobileFrontend):
*** [[phab:T259080|Minerva bundle checks are broken]]
*** [[phab:T257872|Uncaught Error: Set map center and zoom first on mobile
domain Android]]
*** [[phab:T258096|Regression: Nested references do not open if user clicks
on [ or ] (which are wrapped in span)]]
** Standardization
*** [[phab:T250762|UsersMultiselectWidget not announcing status message]]
*** [[phab:T248062|Deprecate and remove `.background-image-svg()` mixin
from 'mediawiki.mixins.less']]
*** [[phab:T259086|Remove obsolete 'set-graphics'/'svg2png'/'imagemin'
tasks and Less background-image raster fallback for former Grade C browser
support]]
*** [[phab:T254195|Implement a core 'clearfix' mixin in mediawiki.mixin and
evaluate deprecation/removal of 'visualClear' class]]
*** [[phab:T258752|Unify `line-height` to `20px` in widgets to simplify
code and better i18n]]
*** [[phab:T257279|Standardize 'mediawiki.ui' variables to CSS variables
naming scheme in preparation for WikimediaUI Base variables takeover]]
*** [[phab:T247033|Add 'i18n-directionality.less' file to core and extract
overarching theme styles from legacy.less]]
** Portals
*** [[phab:T128546|[Recurring Task] Update Wikipedia and sister projects
portals statistics]]
** Miscellaneous
*** [[phab:T258256|<nowiki>OOUI window management broken on pages with
additional frames, due to cross document access</nowiki>]]
*** [[phab:T255913|Document tagline and icon options of $wgLogos by putting
them in the installer]]
*** [[phab:T253047|TypeError: undefined is not an object (evaluating
'mw.config.get('wgFormattedNamespaces')[namespace].replace')]]
*** [[phab:T257877|MediaWiki installer appears unstyled]]
== Technology ==
=== Site Reliability Engineering ===
* Blocked by:
** None
* Blocking:
** None
* Updates:
** We are removing our old Helm charts endpoint (<nowiki>
https://releases.wikimedia.org/charts</nowiki>) and moving to a new one,
powered by ChartMuseum: <nowiki>https://helm-charts.wikimedia.org/</nowiki>.
If you are using Helm locally with the old repo, you will have to switch to
the new one. On the plus side, the new endpoint is integrated with our CI
and no longer requires manually creating Helm chart artifacts. Docs at
<nowiki>https://wikitech.wikimedia.org/wiki/ChartMuseum</nowiki>
------------------------
*= 2020-08-12 =*
== Product ==
=== Web ===
* Updates:
** '''Summary''': WVUI integration into Vector continues; a
Vue.js-focused week is starting.
** [[Reading/Web/Desktop Improvements|Desktop Improvements Project (Vector
/ DIP)]]:
*** [[phab:T258493|[Spike 8hrs] "Use Legacy Vector" is not working as a
global preference]]
*** [[phab:T254227|Switch test wikis to new version of vector by default]]
*** [[phab:T250851|Allow skins to override mediawiki.page.ready
initialisation to enable search JavaScript to be swapped]]
*** [[phab:T249363|Move the existing search to the header in preparation
for Vue.js search development]]
*** [[phab:T248399|Document Skin API and their stability, if any (including
Vector)]]
*** [[phab:T244392|Vue.js search case study]]:
**** See [[Reading/Web/Desktop Improvements/Vue.js case study/Status
log|weekly status updates]].
** Mobile website (MinervaNeue / MobileFrontend):
*** [[phab:T240622|[Technical debt payoff] Remove InlineDiffFormatter and
InlineDifferenceEngine from MobileFrontend]]
** Standardization
*** [[phab:T248062|Deprecate and remove `.background-image-svg()` mixin
from 'mediawiki.mixins.less']]
*** [[phab:T259086|Remove obsolete 'set-graphics'/'svg2png'/'imagemin'
tasks and Less background-image raster fallback for former Grade C browser
support]]
*** [[phab:T257279|Standardize 'mediawiki.ui' variables to CSS variables
naming scheme in preparation for WikimediaUI Base variables takeover]]
** Portals
*** [[phab:T128546|[Recurring Task] Update Wikipedia and sister projects
portals statistics]]
** Miscellaneous
*** [[phab:T259955|Skin hooks should not have unexpected side effects to
OutputPage]]
*** [[phab:T259630|TypeError: results.error is undefined]]
*** [[phab:T259400|Drop MonoBookAfterContent hook]]
*** [[phab:T259193|[betalabs] FF only - Notifications - vertical scrolling
in the header]]
*** [[phab:T258488|MenuTagMultiselectWidget removes selected tag if invalid
input is entered]]
*** [[phab:T258420|MenuTagMultiselect adds items twice if specified as
options and selected options]]
*** [[phab:T258256|<nowiki>OOUI window management broken on pages with
additional frames, due to cross document access</nowiki>]]
*** [[phab:T250968|[ShoutWikiAds] Replace use of deprecated hook
VectorBeforeFooter]]
*** [[phab:T203023|skins.monobook.mobile.uls dependency doesn't do mobile?]]
*** [[phab:T259354|ActionField + dropdown styling issues]]
*** [[phab:T259316|Drop Monobook/resources/screen-desktop.css from
Monobook]]
*** [[phab:T258752|Unify `line-height` to `20px` in widgets to simplify
code and better i18n]]
*** [[phab:T257015|Redeploy quicksurvey on enwiki (for a Growth study)]]
Hi Everyone,
Mark your calendars! Wikimedia Tech Talks 2020 Episode 6 will take place
on Wednesday, 12 August 2020, at 17:00 UTC.
Title: Retargeting extensions to work with Parsoid
Speaker: Subramanya Sastry
Summary:
The Parsing team is aiming to replace the core wikitext parser with Parsoid
for Wikimedia wikis sometime late next year. Parsoid models and processes
wikitext quite differently from the core parser (all that Parsoid
guarantees is that the rendering is largely identical, not the specific
process of generating the rendering). That means extensions that extend
the behavior of the parser will need to adapt to work with Parsoid to
provide similar functionality [1]. With that in mind, we have been working
to more clearly specify how extensions need to adapt to the Parsoid regime.
At a high level, here are the questions we needed to answer:
1) How do extensions "hook" into Parsoid?
2) When the registered hook listeners are invoked by Parsoid, how do they
process any wikitext they need to process?
3) How is the extension's output assimilated into the page output?
Broadly, the (highly simplified) answers are as follows:
1) Extensions now need to think in terms of transformations (convert this
to that) instead of events (at this point in the pipeline, call this
listener). So: more transformation hooks, fewer parsing-event hooks.
2) Parsoid provides all registered listeners with a ParsoidExtensionAPI
object, which extensions can use to interact with Parsoid and process any
wikitext they need to.
3) The output is treated as a "fully-processed" page/DOM fragment. It is
appropriately decorated with additional markup and slotted into place in
the page. Extensions need not make any special efforts (e.g., strip state)
to protect it from the parsing pipeline.
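To make this concrete, here is a sketch of a tag extension under the draft
Parsoid extension API [2]. Class and method names follow the draft and may
still change; the tag name and the markup produced are hypothetical.

use Wikimedia\Parsoid\Ext\ExtensionModule;
use Wikimedia\Parsoid\Ext\ExtensionTagHandler;
use Wikimedia\Parsoid\Ext\ParsoidExtensionAPI;

class MyTagHandler extends ExtensionTagHandler {
	// A transformation, not an event: turn the tag's source into a
	// fully processed DOM fragment that Parsoid slots into the page.
	public function sourceToDom( ParsoidExtensionAPI $extApi, string $src, array $extArgs ) {
		// Wikitext/HTML processing goes through the provided API object
		// rather than through global parser state.
		return $extApi->htmlToDom(
			'<span class="mytag">' . htmlspecialchars( $src ) . '</span>'
		);
	}
}

class MyExtensionModule implements ExtensionModule {
	public function getConfig(): array {
		return [
			'name' => 'MyExtension',
			'tags' => [
				[ 'name' => 'mytag', 'handler' => MyTagHandler::class ],
			],
		];
	}
}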
In this talk, we will go over the draft Parsoid API for extensions [2] and
the kinds of changes that would need to be made. While in this initial
stage we are primarily targeting extensions that are deployed on the
Wikimedia wikis, eventually all MediaWiki extensions that use parser hooks
or use the "parser API" to process wikitext will need to change. We hope to
use this talk to reach out to MediaWiki extension developers and get
feedback about the draft API so we can refine it appropriately.
[1] https://phabricator.wikimedia.org/T258838
[2] https://www.mediawiki.org/wiki/Parsoid/Extension_API
The link to the YouTube livestream can be found here:
<https://www.youtube.com/watch?v=jNNy8ALGjaE>
https://www.youtube.com/watch?v=lS1xPkERWCM
During the live talk, you are invited to join the discussion on IRC at
#wikimedia-office
You can browse past Tech Talks here:
https://www.mediawiki.org/wiki/Tech_talks
If you are interested in giving your own tech talk, you can learn more here:
https://www.mediawiki.org/wiki/Project:Calendar/How_to_schedule_an_event#Te…
Kindly,
Sarah R. Rodlund
Senior Technical Writer, Developer Advocacy
<https://meta.wikimedia.org/wiki/Developer_Advocacy>
srodlund(a)wikimedia.org
Hi,
we've released v0.40.0 and the follow-up hotfix v0.40.1 of the OOUI
library last Friday.
It will roll out with the normal train tomorrow, Tuesday, 11 August.
Highlights in this release:
- Removal of Internet Explorer 8 specific code and fallbacks, largely
reducing the size of the CSS (~800+ lines across all icons) due to the
removal of PNG fallback images, relying only on SVG icons from here on.
This is going to have a positive performance impact for all other users
of the library.
This is in alignment with the approved RFC on removing IE 8 Basic support
from MediaWiki core [0]. Hence, this is a breaking release. If you need
continued IE 8 support in your environment, you need to stick with
v0.39.3.
You can find details on additional new features, code-level, styling
and interaction design amendments, and all improvements since v0.38.0
in the full changelog[1].
If you have any further queries or need help dealing with breaking
changes, please let me know.
As always, interactive demos [2] and library documentation are available
on mediawiki.org [3], and comprehensive generated code-level documentation,
interactive demos, and tutorials are hosted on doc.wikimedia.org [4].
OOUI version: 0.40.1
MediaWiki version: 1.36.0-wmf.4
Date of deployment to production: Regular train, starting Tuesday 11 August
[0] - https://phabricator.wikimedia.org/T248061
[1] - https://gerrit.wikimedia.org/g/oojs/ui/+/v0.40.1/History.md
[2] - https://doc.wikimedia.org/oojs-ui/master/demos/#widgets-mediawiki-vector-ltr
[3] - https://www.mediawiki.org/wiki/OOUI
[4] - https://doc.wikimedia.org/oojs-ui/master/
Best,
Volker
Hi All
tl;dr: Don't deploy on Thursday except for emergencies; the deployment
calendar on Wikitech
<https://wikitech.wikimedia.org/wiki/Deployments#Week_of_August_10> is
up-to-date.
Friday of next week (2020-08-14) is a wmf holiday. As such, Thursday should
be treated as Friday for the purposes of deployment; that is, no
deployments on Thursday except for emergencies
<https://wikitech.wikimedia.org/wiki/Deployments/Emergencies>.
This has implications for the (1.36.0-wmf.4) train—we'll be going to all
wikis on Wednesday evening UTC rather than Thursday evening UTC:
- Tue, 11 Aug, EU Train Window, 13:00 UTC: Group0
- Wed, 12 Aug, EU Train Window, 13:00 UTC: Group1
- Wed 12 Aug, US Train Window, 19:00 UTC: All Wikis
Hopefully, this will allow time for manual testing on Group0 wikis, and
allow us to complete the train on time.
<3
-- Tyler
Hey,
I have an ethical question that I haven't been able to answer yet. I've
been asking around without a definite answer, so I'm asking it of a larger
audience in the hope of a solution.
For almost a year now, I have been developing an NLP-based AI system to
catch sock puppets (two users pretending to be different people but
actually the same person). It's based on the way they speak. The way we
speak is like a fingerprint: it's unique to us, and it's really hard to
forge or change on demand (unlike IP/UA). As a result, if you apply some
basic AI techniques to Wikipedia discussions (which can be really lengthy,
trust me), the sock puppets shine in the datasets.
Here's an example; I highly recommend looking at these graphs. I compared
two pairs of users: one pair that are not sock puppets, and the other a
pair of known socks (a user who got banned indefinitely but came back
hidden under another username). [1][2] These graphs are based on one of
several aspects of this AI system.
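To give a rough feel for the general idea, here is a toy sketch (not my
actual system) that compares two users' word-frequency distributions with
cosine similarity:

// Toy illustration only; real stylometry uses far richer features
// than raw word counts.
function wordDistribution( string $text ): array {
	preg_match_all( '/\p{L}+/u', mb_strtolower( $text ), $m );
	$counts = array_count_values( $m[0] );
	$total = array_sum( $counts );
	if ( !$total ) {
		return [];
	}
	return array_map( fn ( $c ) => $c / $total, $counts );
}

function cosineSimilarity( array $a, array $b ): float {
	$dot = 0.0;
	foreach ( $a as $word => $p ) {
		$dot += $p * ( $b[$word] ?? 0.0 );
	}
	$normA = sqrt( array_sum( array_map( fn ( $p ) => $p * $p, $a ) ) );
	$normB = sqrt( array_sum( array_map( fn ( $p ) => $p * $p, $b ) ) );
	return ( $normA && $normB ) ? $dot / ( $normA * $normB ) : 0.0;
}

// Hypothetical samples; the real input is many long discussion comments.
$userAComments = 'Honestly, I reckon the proposal is fine as it stands.';
$userBComments = 'Honestly, I reckon the previous close was fine as it stands.';

// A similarity close to 1.0 across many long discussions would be a
// signal worth a closer look; one short sample proves nothing.
$sim = cosineSimilarity(
	wordDistribution( $userAComments ),
	wordDistribution( $userBComments )
);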
I have talked about this with the WMF and other CUs about building this to
help us understand and catch socks, especially the ones that have enough
resources to change their IP/UA regularly (like sock farms and/or UPEs).
Also, with the increase of mobile internet providers and the horrible way
they assign IPs to their users, this can get really handy in some SPI
("sock puppet investigation") [3] cases.
The problem is that this tool, while built only on public information,
actually has the power to expose legitimate sock puppets: people who live
under oppressive governments and edit on sensitive topics. Disclosing such
connections between two accounts can cost people their lives.
So, this code is not going to be public, period. But we need to have this
code in Wikimedia Cloud Services so that people like CUs on other wikis
are able to use it as a web-based tool instead of me running it for them
upon request. But the WMCS terms of use explicitly say code should never
be closed-source, and this is our principle. What should we do? Do I pay a
corporate cloud provider for this and put such important code and data
there? Do we amend the terms of use to have some exceptions like this one?
The most plausible solution suggested so far (thanks, Huji) is to have a
shell of the code that would be useless without the data, keep the code
that produces the data (out of dumps) closed (which is fine; running that
code is not too hard, even on enwiki), and update the data myself. This
might be doable (I'm around 30% sure; it still might expose too much), but
it wouldn't cover future cases similar to mine, and I think a more
long-term solution is needed here. Also, it would reduce the bus factor to
1, and maintenance would be complicated.
What should we do?
Thanks
[1]
https://commons.wikimedia.org/wiki/File:Word_distributions_of_two_users_in_…
[2]
https://commons.wikimedia.org/wiki/File:Word_distributions_of_two_users_in_…
[3] https://en.wikipedia.org/wiki/Wikipedia:SPI
--
Amir (he/him)