Your feedback and guidance would be welcome:
== in "mediawiki/tools/mwdumper": ==
since 2016-09-08:
Major refactoring
https://gerrit.wikimedia.org/r/#/c/309314/
Thanks in advance for your reviews.
Of last weeks' 5 listed patches, 4 got merged & 1 got reviewed. Thanks
to Amire80, Daniel, FlorianSW, Jdlrobson, MatmaRex, Smalyshev, Tjones!
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
Hi everyone:
Currently, the CSRF token for anonymous users is very predictable.
This potentially allows someone to mount CSRF attacks against
non-logged-in users. I would like to propose we change that. Since
this is a fairly major change, I'd appreciate everyone's feedback.
There are multiple proposals on the bug -
https://phabricator.wikimedia.org/T40417#2612673 . It's unclear which
solution we should choose.
Personally, my preferred solution [0] [I might be biased in evaluating
them] would be to base the CSRF token on a session cookie if one
exists. If one does not exist, use an HMAC of the user's IP address,
keyed using a server-side secret (the only state an anon has is their
IP address, so this should be safe). This way it will work for
users without cookies (maybe none exist, but I like the idea that you
can edit Wikipedia without cookies) and for users with rapidly
changing IPs. It will also cause minimal breakage, as you won't have to
adjust any existing usages of tokens (for example, on special pages).
It also ensures that users are not forced to skip the Varnish cache
(from a session cookie) unless they really need to.
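For concreteness, here is a minimal sketch of what that scheme could look like. The names and the secret handling are purely illustrative, not MediaWiki's actual token code:

```python
# Sketch of the proposed anon CSRF scheme (illustrative names only):
# prefer the session cookie when one exists; otherwise derive the token
# from an HMAC of the client IP, keyed with a server-side secret.
import hmac
import hashlib

SERVER_SECRET = b"replace-with-a-real-server-side-secret"  # assumption

def anon_csrf_token(session_id, ip_address):
    """Return a CSRF token for a (possibly anonymous) user."""
    if session_id:
        # A session exists: bind the token to it, as is done today.
        basis = session_id.encode()
    else:
        # No cookies at all: the only state an anon has is their IP,
        # so key an HMAC of it with the server secret.
        basis = ip_address.encode()
    return hmac.new(SERVER_SECRET, basis, hashlib.sha256).hexdigest()
```

The token is deterministic for a given IP (so a cookie-less anon can round-trip it within one edit), but unforgeable without the server secret.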
Anyways, I'd appreciate everyone's comments. I would really like to
settle on a specific implementation, and then go do it.
[0] https://phabricator.wikimedia.org/T40417#2034118
--
Bawolff
Hi, the call for participation for the Wikimedia Developer Summit 2017 is
now open:
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/Call_for_particip…
We welcome especially proposals related to these main topics:
* A plan for the Community Wishlist 2016 top results
* Handling wiki content beyond plaintext
* A unified vision for editorial collaboration
* Building a sustainable user experience together
* Useful, consistent, and well documented APIs
* How to manage our technical debt
* Artificial Intelligence to build and navigate content
* How to grow our technical community
If you want to propose an activity pre-scheduled in the Summit program, you
have time until Monday, October 31. There is no deadline to propose
Unconference sessions.
ABOUT
The Wikimedia Developer Summit is the annual meeting to push the evolution
of MediaWiki and other technologies supporting the Wikimedia movement. We
welcome all Wikimedia technical contributors and third party developers
using the Wikimedia APIs or MediaWiki.
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit
(This information wants to be forwarded!)
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hello,
I am not sure where to ask, so I am writing here. Sorry if this is the wrong place!
I noticed that a Code of Conduct for Phabricator is being developed. Cool to see that people are creating such a policy; it is already standard in other big projects. :-)
Just a few thoughts:
Unfortunately, the whole Code of Conduct voting hasn't been widely announced (just on Phabricator). There are a lot of users who use Phabricator just once a year. Such an important decision should be widely announced, imho. I mainly see participation from a very closed group.
Last but not least: I am not happy at all that my comment was struck from https://www.mediawiki.org/w/index.php?title=Talk:Code_of_Conduct/Draft&diff… by Matt Flaschen (WMF). It looks like it is no longer allowed to comment over there, so I am writing here.
:-)
Regards,
Steinsplitter
----
http://meta.wikimedia.org/wiki/User:Steinsplitter
With the advent of Wikidata-based infoboxes, the page contents can
change without the local text being changed, so without a new
revision. Is there any way to find out when this happens from the
API? I know I can always do 2 API calls, one for the page and one for
the item, but that's time-consuming.
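The two-call approach I mean looks roughly like this (a sketch only; the helper names are made up, but the parameters are the standard action API ones - prop=pageprops with ppprop=wikibase_item returns the item ID alongside the page revision):

```python
# Sketch of the two-call workaround: fetch the local page's latest
# revision (plus its Wikidata item ID), then the item's latest revision,
# and take the newer of the two timestamps as "last effective change".
from urllib.parse import urlencode

API = "https://ro.wikipedia.org/w/api.php"          # any wiki works here
WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def page_revision_query(title):
    """Build the query URL for a page's latest revision and its item ID."""
    return API + "?" + urlencode({
        "action": "query",
        "titles": title,
        "prop": "revisions|pageprops",
        "rvprop": "timestamp",
        "ppprop": "wikibase_item",
        "format": "json",
    })

def item_revision_query(qid):
    """Build the query URL for the Wikidata item's latest revision."""
    return WIKIDATA_API + "?" + urlencode({
        "action": "query",
        "titles": qid,
        "prop": "revisions",
        "rvprop": "timestamp",
        "format": "json",
    })

def latest_change(page_json, item_json):
    """Return the newer of the two last-revision timestamps (ISO 8601)."""
    def ts(data):
        page = next(iter(data["query"]["pages"].values()))
        return page["revisions"][0]["timestamp"]
    # ISO 8601 timestamps compare correctly as strings.
    return max(ts(page_json), ts(item_json))
```

It still costs two round trips, which is exactly the overhead I'd like to avoid.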
Thanks,
Strainu
Hi everyone,
Abstract
-----------
This mail doubles as an invitation to come to this week's ArchCom
office hour (Phab:E285), and attempts to answer questions
about the agenda of the Wikimedia Developer Summit
(WikiDev17). I'm hoping we find a way to work with people who can't
travel, and noting we're also working to make remote participation
more rewarding. This emphasizes the importance of WikiDev17 being a
better event for online attendance this year, and the primacy of
online conversations in our community's decision making.
Table of contents for the rest:
* ArchCom office hour E285
* Previous Dev Summits (2014-16)
* The dangers of prior RFC requirement
* Fewer spectators, more participants
ArchCom office hour
----------------------------
This week's ArchCom IRC office hour will be this coming Wednesday,
2016-09-28, 21:00 UTC (2pm PDT, 23:00 CEST) on #wikimedia-office:
<https://phabricator.wikimedia.org/E285>
This is a continuation of the many conversations we've had on this
mailing list (e.g., the "Wikimedia Developer Summit 2017: Registration
Open" thread last week) and elsewhere about the summit.
Previous Dev Summits (2014-16)
----------------------------------------------
This is an admittedly biased version of history, based on my
involvement in the program committee that is still forming. Quim is
chairing the program committee, and I'm one of the members, but my
understanding from Quim is that we're still waiting for some invitees
to respond.
In previous years, we had a more explicit emphasis on "architecture"
(e.g. even calling our 2014 event the "Architecture Summit"[1]). The
ties between "architecture"<->MediaWiki<->wikitech-l are very strong.
Additionally, the 2014 and 2016 events had very explicit instructions
insisting on submission of MediaWiki RFCs[2].
The benefit of requiring submission of RFCs was that it caused many
people to write down "this is what I want to talk about". There were
many conversations leading up to last year's summit that might not
have happened without such an explicit prompt. Many of the
unconference sessions last year were discussions that were submitted
as RFCs, but were turned down for plenary session time. Those
unconference sessions benefited from the prep work.
The dangers of prior RFC requirement
--------------------------------------------------------------
We don't intend to make the requirement so tough this year. A
challenge we face is that many topics don't fit well into RFC form.
"RFC" and "conversation" are not interchangeable terms. We hope that
all RFCs are indeed conversations, but certainly not all conversations
belong in RFCs. Last year, the RFC requirement also meant that all of
the scheduled topics had Phab IDs associated with them. Is there some
other short identifier we can use as a standard conversation
identifier? Maybe Wikidata QIDs? ;-)
The hope is that WikiDev17 enriches conversations that are well
underway *before* everyone shows up in San Francisco. Complicated
conversations require a shared context, but humans have traditionally
had a difficult time building shared context without physically
putting everyone in the same room at the same time. Developers
frequently want to cram "context building" into their discussion time,
spending 70 minutes out of a 90 minute conversation time bringing
attendees up-to-speed, so that we can have a "really good" 20 minute
conversation.
A big fear: we fail to connect WMF staff developers with the larger
community of Wikimedia developers. Meetings split up, unconference-style,
into two camps: WMF-staff-led discussions where participation relies
on having the kind of knowledge that one needs "insider" access to
stay abreast of, and non-WMF-staff-led discussions where participants
try to solve problems that WMF staff doesn't seem interested in
solving.
A huge challenge: we can't *know* in September 2016 what conversations
will be important in January 2017, but based on our experience with
past Dev Summits, it's worth creating the opportunity for important
conversations to happen. We have plenty of conversations that are
still ongoing, and plenty of conversations many of you all likely know
need to happen in January. Let's start the conversations we know
about now, and *hope* that they're already done before the summit.
Fewer spectators, more participants
----------------------------------------------
One thing we know from all of our experience: y'all don't want to make
a point of coming to San Francisco to be talked at by someone. As in
past years, we're working to bring up to 200 people together to have
great conversations about the collective hopes of the Wikimedia
development community. It's happening the same week as the Wikimedia
Foundation All-Staff meeting, so the attendance will be heavily biased
toward WMF staff, but we hope this isn't just us talking to ourselves.
We're working hard to provide a framework for good conversations to
happen; not for us to talk at you, or for you to talk at us, but for
all of us to learn from each other. We're really happy with the
satisfaction numbers from last year[3], and in particular, we hope
that the 75 respondents (out of 84 responses) who agreed with "I would
like to attend this event again next year" still believe that now.
Let's use the IRC meeting this week to prepare for the January event.
Rob
[1]: https://www.mediawiki.org/wiki/Architecture_Summit_2014
[2]: https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2016/Program
[3]: https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2016/Lessons_Lear…
(satisfaction numbers)
Hi Quim,
> Although I have used MediaWiki as a mechanism to submit conference
> proposals (here and in my previous job), I don't think I know any MediaWiki
> user or any event organizer out of Wikimedia using MediaWiki to handle a
> call for participation. Wikimania and some WikiCons do, but I don't know
> the reasons why they do it, neither how happy are the organizers and the
> participants using those MediaWiki-based processes for an event.
There's SMWCon (the Semantic MediaWiki Conference), although you could say
that this too is an example of "dogfooding". But the annual Chaos
Communication Congress [0] also uses MediaWiki for its talk submissions,
and that's a real conference, which had over 13,000 attendees last year.
Most relevantly, the Chaos Communication Congress wiki uses the Semantic
Forms [1] extension to handle submissions - speakers use a form to enter
their talk proposals. I don't know how exactly talks are approved, or
whether the form is used for approving/rejecting talks too - or, for that
matter, whether they have any sort of real screening process. But the basic
mechanics are that forms are used for entering all the relevant information
about each talk, including tags (among many other fields).
Here's an example of one such page for a session:
https://events.ccc.de/congress/2015/wiki/Session:A_New_Business_Model_for_t…
...and here's the form used to create/edit it:
https://events.ccc.de/congress/2015/wiki/index.php?title=Session:A_New_Busi…
Why have Wikimedia events never used MediaWiki + Semantic Forms to manage
their talk proposals? I don't know - it might be a case of "not invented
here" syndrome, ironically, since Semantic Forms is a non-Wikimedia
extension. But it's certainly an option.
[0] https://en.wikipedia.org/wiki/Chaos_Communication_Congress
[1] https://www.mediawiki.org/wiki/Extension:Semantic_Forms
-Yaron
--
WikiWorks · MediaWiki Consulting · http://wikiworks.com
(Posted at https://www.mediawiki.org/wiki/Topic:Tcg07knzz72wixak and pasted
here for convenience)
The Wikimedia Developer Summit 2017 aims to focus the call for
participation around these main topics:
* A plan for the Community Wishlist 2016 top results
* Handling wiki content beyond plaintext
* A unified vision for editorial collaboration
* Building a sustainable user experience together
* Useful, consistent, and well documented APIs
* How to manage our technical debt
* How to grow our technical community
You can find more information about these topics and how we selected them at
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit and
https://www.mediawiki.org/wiki/Topic:Tb6bztglijowk8x3
The next steps are:
* Each main topic needs at least one owner, who will ensure that the topic
is well promoted in the right places, that the right activities are proposed,
and that the right people participate in them with common goals. These owners
will join the Program committee:
https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit/About
* Each main topic needs good quality and quantity of proposals submitted,
so a good selection can be made for the pre-scheduled Summit program.
* Each main topic needs its own wiki page for information, coordination,
discussion, and documentation.
If by the end of October a main topic is still missing an owner, a critical
mass of proposals, and/or a decent wiki page, then chances are that the
main topic will be dissolved. If that happens, the proposed activities can
still be pushed by their promoters in the context of the Unconference, just
like the rest of the proposals that don't fit under the main topics.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-09-28
= 2016-09-28 =
== Product ==
=== Reading ===
==== Mobile Content Service (MCS) ====
* Fixed video anchors
* No deploys this week due to ops offsite
* Starting discussions on versioning the sections endpoint
==== Android native app ====
* Current sprint board: https://phabricator.wikimedia.org/project/view/2238/
* Next sprint board: TBA
* Navigation overhaul work for beta release is complete. Release will go
out the door today (9/28).
** We'll likely promote to stable quickly if no significant issues emerge.
==== Reading Web ====
* Current sprint board:
**Begin rolling out Wikidata descriptions to the top 6 wikis
**Move Related Pages to stable mobile web Wikipedias (all but top 6)
**Investigate hovercards instrumentation issues around 9 Sep -
https://phabricator.wikimedia.org/T146620
*** Big drop in EL events in a few extensions -
https://phabricator.wikimedia.org/T146840
==== iOS native app ====
* Current release board:
https://phabricator.wikimedia.org/project/board/2220/
** 5.3 is in development
** Release date is expected to be before end of October
** Will include migration to the MCS feed endpoint
==== Reading Infrastructure ====
* blocking: TemporaryPasswordPrimaryAuthenticationProviderTest is flaky
https://phabricator.wikimedia.org/T146498
* blocked: nothing
* updates: last call for comments on the proposed api.php integration for
ORES and pageviews: https://phabricator.wikimedia.org/T143895 and
https://phabricator.wikimedia.org/T144865
=== Community Tech ===
* No blockers
* Moving CopyPatrol to French Wikipedia
https://phabricator.wikimedia.org/T145431
* Adding account creation to Striker
https://phabricator.wikimedia.org/T144710#2674663
* Updating CentralAuth tables for cross-wiki watchlist
https://phabricator.wikimedia.org/T142507 (Gerrit review needed:
https://gerrit.wikimedia.org/r/#/c/309553/ )
* Adding start/end times to WikiEd Programs Dashboard
https://phabricator.wikimedia.org/T125546
* 11-year old task to send a cookie with each block
https://phabricator.wikimedia.org/T5233
* Slowly deploying PageAssessments to English Wikipedia
https://phabricator.wikimedia.org/T146679
=== Editing ===
==== Language (not present, only notes) ====
* Blocked
** Would appreciate comments from ops or a puppet-knowledgeable person on
how to override the RESTBase URL in puppet for a service in labs
*** https://phabricator.wikimedia.org/T129284#2674533
*** The goal is to be able to test content translations on deployment-prep
with real articles from different Wikipedias.
* Blocking: not aware
==== Parsing ====
* Tim working on updates to the parser tests infrastructure (
https://gerrit.wikimedia.org/r/#/c/312969/3 and related )
* Scott continuing work on having Parsoid parse language variant markup
(close to moving out of WIP status)
* Planning to deploy Parsoid's native <gallery> implementation in a
backward compatible way and have VE pick it up when they are ready to use
it -- if it doesn't happen next week, it will only happen in 3 weeks time
(since we have parsing team and editing dept offsites starting oct 10).
==== Collaboration ====
* Blocked
** Continuing collaboration with Services team on ReviewStream
* Blocking
* Updates
** Will have a meeting with the Wiki Labels team on Friday to discuss how
we can collaborate
** Continuing work on Flow back-end, including caching refactor and bug
fixes for opt-in on user talk
** Continuing Echo work, including instrumentation to track the number of
page views before viewing notifications.
=== UI Standardization ===
* Working on
** Align Minerva (MobileFrontend) to overhauled color palette:
https://phabricator.wikimedia.org/T146799
** Review current style and integrate messages and message boxes as
MediaWiki UI component
https://phabricator.wikimedia.org/T127405
** MediaWiki theme: Use `color-progressive` for switched-on binary inputs
https://phabricator.wikimedia.org/T145629
* Finished
** Published new color palette https://phabricator.wikimedia.org/M82 with
WCAG 2.0 level AA compliant colors
** Button styles differ in mediawiki.UI from design templates and OOjs UI
https://phabricator.wikimedia.org/T146823
** Align Wikimedia Portal UI elements to improved color palette
https://phabricator.wikimedia.org/T146231
== Technology ==
=== Analytics ===
* Deployed new throttling limits to the Pageview API; it is doing really
well with super low latencies. We will add capacity to double the requests
served next week:
https://grafana.wikimedia.org/dashboard/db/aqs-elukey?from=1474479314693&to…
* Erik Z is vetting data from the edit reconstruction project on simplewiki;
after edit reconstruction, calculating metrics is much easier. We can
calculate metrics from the very beginning:
https://analytics.wikimedia.org/dashboards/standard-metrics/#simplewiki
* Load testing Druid and about to put Pivot in production, accessible
internally to WMF. This is a UI to some of our data in Hive, mostly
pageviews, with no need to use SQL.
=== Security ===
* Intake and interviews for security roles continue
* Investigating appearance of 400 errors in logs:
https://phabricator.wikimedia.org/T144100
* Lego, Timo: https://gerrit.wikimedia.org/r/#/c/310257/
=== Services ===
* Blocking: none
* Blocked: none
* Updates:
** Working on ServiceWorker page composition service
** Working on supporting Change Propagation and RESTBase for private
wikis
-
=== ArchCom ===
* Full status: https://www.mediawiki.org/wiki/ArchComStatus
* Last week: [[Phab:E273]] - Multi-Content Revisions ([[Phab:T107595]])
* This week [[Phab:E285]] - [[WikiDev17]]
** discussion about use of Phab for topics is happening now on wikitech-l
=== Discovery ===
* No blockers
* Analyzing BM25 A/B test, if results are positive deployment will be
somewhere early October
* Working on multiwiki indexes
* Working on file properties indexing (size, type, dimensions, etc.)
* ICU folding analysis:
https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/Upgrading_ASCII_Fold…
** TLDR: positive effect, implementation in progress
* Completion with DEFAULTSORT demo:
http://mw-sug-subpages-relforge.wmflabs.org/w/default_sort_demo.html
=== RelEng ===
* Blocking
** none
* Blocked
** none
* Updates
** New scap (3.3.0)
*** scap caches local config for its deployment (machines don't have to
reach back to tin)
** Reminder: no deploys week of Sept 26th
** Reminder: no train week of Oct 17th (but SWATs OK)
=== Wikidata ===
* wiktionary - automated interwiki links
* structured commons - federation, multiple repos
=== Fundraising Tech ===
* Need help with potential MessageCache issue relating to CentralNotice
** https://phabricator.wikimedia.org/T144952
** See Adam's investigation notes pointing to i18n / cache flakiness
** Andrew Green thinks it might even have something to do with
TranslationExtension
* Still trying to kill stubborn remaining ActiveMQ queues
* More CiviCRM data cleanup and tweaks to deduplication process
* Migrating old PayPal IPN listener to SmashPig framework