Hi,
For the HTML version, see
https://www.mediawiki.org/wiki/Scrum_of_scrums/2020-03-18
This was the last in-person Scrum of scrums meeting. Going forward we will
just update the etherpad.
Željko
--
= 2020-03-18 =
== Callouts ==
* Release Engineering
** Deployments/Covid-19 [[wikitech:Deployments/Covid-19]]
** [All] MediaWiki 1.35.0 will get cut on 7 April 2020. If your team has
any proposed blockers/deadlines for that, please get them done:
[[phab:tag/MW-1.35-release]]
* Structured Data needs review from Security on proposed new data dump for
MachineVision: [[phab:T236431#5900726]]
== SoS Meeting Bookkeeping ==
* Updates:
** This was the last in-person SoS meeting. Going forward we will just
update the etherpad.
== Product ==
=== Editing ===
* Updates:
** No updates; we are shifting our focus, but hoping to turn on Replying 1.0
on Beta
=== iOS native app ===
* Updates:
** Wrapping up development on the 6.6 release (mobile-html integration)...
Beta soon: [[phab:project/view/4273]]
=== Android native app ===
* Updates:
** Release of mobile_html ready for Beta, pending final testing.
** Release of Suggested Edits v4 (image tagging) pending user testing /
polishing updates.
=== Web ===
* Updates:
** Working on Logo and Header for Desktop Refresh
=== Structured Data ===
* Blocked by:
** Security on review of proposed new data dump: [[phab:T236431#5900726]]
* Updates:
** will miss the meeting because of a clash - back next week
** units for quantities, improved geo-coordinates, date, and monolingual
text input types all live
** Constraint violations code merged; should be live when the train rolls
** some CAT improvements on the way
== Technology ==
=== Fundraising Tech ===
* Updates:
** Deployed NL bank transfer capability for backup processor, working on
making it possible to accept recurring bank donations there
** Investigating new ways to manage PayPal recurring donations
** Testing migrating recurring donations from main card processor's old API
integration to their new integration
** Getting started making an autocomplete dropdown of employers offering
matching gifts
=== Core Platform ===
* Blocking:
** Wikimedia DE/Wikidata: would appreciate someone advising on
[[phab:T225814]].
** Search Platform: MW Job consumers sometimes pause for several minutes
[[phab:T224425]]
* Updates:
** Continued work on jobrunner, changeprops
** Core REST API next version coming soon
** API Gateway
** Developer Portal prototype
[[Core_Platform_Team/Initiatives/API_Gateway/Documentation_Plan]]
** Process of adopting or co-parenting CentralNotice
=== Engineering Productivity ===
==== Release Engineering ====
* Blocking:
** Wikimedia DE/Wikidata: would appreciate someone chiming in on
[[phab:T245826]]
* Updates:
** [All] MediaWiki 1.35.0 will get cut on 7 April 2020. If your team has
any proposed blockers/deadlines for that, please get them done:
[[phab:tag/MW-1.35-release]]
** Train Health
*** This week: 1.35.0-wmf.24 - [[phab:T233872]]
*** Next week: 1.35.0-wmf.25 - [[phab:T233873]]
=== Scoring Platform ===
* Updates:
** Pile of volunteers: chtnnh, haksoat, clemons, nikhil, (+2 more on the
way to IRC/phab)
** Jade
*** Did first round of user-testing on Beta.
*** We got the Jade Diff view to work.
https://deployment.wikimedia.beta.wmflabs.org/wiki/Jade:Diff/4
**** Considered native OOUI element for Diff. Needs PHP. Crusty.
**** Struggling to get date format strings to the client side
** Released revscoring 2.6.9 which includes section-based features (e.g.
text complexity measures)
** Implemented improved feature extraction for idioms and images (re.
articlequality)
=== Search Platform ===
* Blocked by:
** Core: MW Job consumers sometimes pause for several minutes
[[phab:T224425]]
* Updates:
** Copy English Wikipedia drafttopic scores to other wikis somewhere in the
CirrusSearch pipeline [[phab:T241015]]
** Once the ORES articletopic - ElasticSearch pipeline is set up, update
data about all articles [[phab:T243357]]
=== Security ===
* Blocking:
** Structured Data: review of proposed new data dump:
[[phab:T236431#5900726]]
=== Site Reliability Engineering ===
* Blocking:
** Product Infrastructure on creation of k8s namespaces/tokens for proton,
mobileapps. Working on it.
** Research on creation of k8s namespaces/tokens for recommendation-api.
Working on it.
== Wikimedia DE ==
=== Wikidata ===
* Blocked by:
** Release Engineering: We would appreciate someone chiming in on
[[phab:T245826]]
** Not sure who - Core Platform: We would appreciate someone advising on
[[phab:T225814]].
* Updates:
** wb_terms table (the old term store) is not being read or updated
anymore. [[phab:T208425]]
Hi All,
After installing the MobileFrontend extension, I am unable to run it due to
the following error:
*Fatal error: Uncaught Error: Call to undefined method
MediaWiki\MediaWikiServices::getContentHandlerFactory() in
/Library/WebServer/Documents/myweb/otherprojects/mw/core/extensions/MobileFrontend/includes/MobileFrontendEditorHooks.php*
I am not sure what is missing or how I can make this run.
Thanks
Hi all,
We're still working on a project with the MediaWiki API, and we've run into
a different issue regarding page moves/redirects.
We're trying to pull revision and redirect data from the "Killing/Death of
Luo Changqing" page and its talk page. Unfortunately, this page wasn't
found when we pulled it through the MediaWiki API filtered by our date
range of 2009-2019. Either "Death" or "Killing" worked prior to the page
move, but now we find that we can no longer access the revisions that
occurred during the old time frame.
For pages that have been moved/redirected, what would you recommend we do
to pull this data that was previously available?
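For illustration, here is a minimal sketch (the title and timestamps below are ours, not verified against the live API) of the `action=query&prop=revisions` request shape involved. One thing worth knowing: after a page move, MediaWiki keeps the full revision history with the page under its *new* title, so querying the current title with a date range should return the older revisions as well; `redirects=1` makes the API follow the redirect left behind at the old title.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def revisions_query(title, start_iso, end_iso):
    """Build an action=query request URL for a page's revision history.

    With rvdir=newer, rvstart is the *older* timestamp and rvend the
    newer one. redirects=1 resolves the redirect left at an old title.
    """
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment",
        "rvlimit": "max",
        "rvdir": "newer",
        "rvstart": start_iso,
        "rvend": end_iso,
        "redirects": 1,
    }
    return API + "?" + urlencode(params)

url = revisions_query("Killing of Luo Changqing",
                      "2009-01-01T00:00:00Z", "2019-12-31T23:59:59Z")
print(url)
```

Paging through more than `rvlimit` revisions would additionally require following the `continue` parameter the API returns.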
Thanks,
Jackie, James, Junyi, Kirby
Hello all,
For almost a year, the Wikidata development team has been working on the
task of redesigning and migrating the wb_terms table, which had become too
big and unsustainable over the years.
You can read the tale of our journey in this blog post: Coming to Terms
with Changes
<https://phabricator.wikimedia.org/phame/post/view/195/coming_to_terms_with_…>
If you’re a tool maintainer and your tool queries the Labs database
replicas directly, you can read more details
<https://lists.wikimedia.org/pipermail/wikidata/2020-March/013901.html>
about the next steps and how to update your code.
Congratulations to all the developers involved in this big project!
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Yo! I've written a short post on improving JSDocs with typing. It's
called "the best documentation automation can buy." If you write
JavaScript, you may find it informative. If you're already using this
new tooling in your projects, I'd be interested to hear your thoughts as
well.
In Grant's words, "Write the docs!"
https://phabricator.wikimedia.org/phame/post/view/194/the_best_documentatio…
Stephen
Hi Emufarmers!
Thanks for your attention! :) Yes, the tool might detect sentences that are
already referenced, because Citation Detective actually feeds *every single
sentence* in an article to the Citation Need model and extracts sentences
with high scores.
Highlighting a sourced sentence doesn't mean the source used is unreliable,
as Citation Detective has no idea whether a sentence has a reference or
what the source says. I would say it's more like a double confirmation that
the sentence needs a citation, and yes, there is a citation already.
You might wonder why the tool doesn't exclude sentences that already have a
reference. The reason is that a reference doesn't necessarily apply just to
the sentence right before it. It could apply to more than one sentence or
to a whole paragraph, and there's no way to determine that from the
wikitext. That's why the tool was designed (at least for the initial
version) to feed every sentence to the model and compute a citation need
score.
The same applies to { citation needed } tags, so you will also find
sentences with a {cn} tag highlighted in the prototype. That means both a
human and the machine think the statement needs a citation to a reliable
source.
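The design described above (feed every sentence to the model and keep the high scorers, with no attempt to check for existing references) can be sketched roughly like this; `citation_need_score` is a hypothetical stand-in for the real Citation Need classifier, and the threshold is illustrative:

```python
def citation_need_score(sentence: str) -> float:
    """Hypothetical stand-in: the real model is an ML classifier.
    Toy heuristic for illustration only: claims with numbers 'need'
    citations."""
    return 0.9 if any(ch.isdigit() for ch in sentence) else 0.1

def flag_sentences(sentences, threshold=0.5):
    """Feed every sentence to the model; keep those scoring above
    the threshold, regardless of whether a reference already follows."""
    return [(s, score) for s in sentences
            if (score := citation_need_score(s)) >= threshold]

article = [
    "The bridge opened in 1932 and carries 160,000 vehicles a day.",
    "It is widely considered an engineering landmark.",
]
for sentence, score in flag_sentences(article):
    print(f"{score:.2f}  {sentence}")
```

This is why a sentence followed by a perfectly good reference can still be highlighted: the scorer never sees the reference at all.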
I hope this clarifies things for you. :)
Aiko
Dear all,
I hope this mail finds you well.
First of all, I would like to introduce myself: I'm Mahmoud Ahmed, studying
computer science at Cairo University.
I am currently trying to join GSoC 2020 (Add 'Reverted' filter to RC
Filters), but I have some questions.
Could you give me more details about this project? I am very excited about
it.
Finally, thank you very much in advance.
Regards,
Mahmoud Ahmed.
--
Hi all,
I’m happy to announce the outcome of an Outreachy internship
<https://phabricator.wikimedia.org/T233707> that I’m finishing up: a new
tool and public dataset named Citation Detective, which tool developers
and researchers can now use for their projects.
Citation Detective <https://meta.wikimedia.org/wiki/Citation_Detective>
contains sentences that have been identified as needing a citation using a
machine learning-based classifier published early last year
<https://arxiv.org/pdf/1902.11116.pdf> by WMF researchers and
collaborators. As part of Outreachy, I developed a tool
<https://github.com/AikoChou/citationdetective> (hosted on Toolforge
<https://tools.wmflabs.org>) to run through Wikipedia and extract
high-scoring sentences along with contextual information.
As an example use case for this data, I also created a proof of concept for
integrating Citation Detective and Citation Hunt
<https://tools.wmflabs.org/citationhunt>. Check out my prototype Citation
Hunt <https://tools.wmflabs.org/aiko-citationhunt>, which uses Citation
Detective to import sentences that would not normally be featured in
Citation Hunt. The repository for that is here
<https://github.com/AikoChou/citationhunt>.
This dataset currently includes sentences from ~120,000 randomly selected
articles from the English Wikipedia. In future work, we hope to expand this
to more Wikipedia language editions and a greater number of articles. The
database could also gain more fields in a future version, based on feedback
from tool developers and researchers. More
use cases for this type of data were identified in a design research project
<https://meta.wikimedia.org/wiki/Research:Identification_of_Unsourced_Statem…>
conducted last year by Jonathan Morgan.
You can find more information in our Wiki Workshop submission
<https://commons.wikimedia.org/wiki/File:Citation_Detective_WikiWorkshop2020…>
and in my blog <https://rollingmist.home.blog/> which documented the whole
journey.
Thank you very much!
Kind regards,
Aiko
Hi!
I'm a student currently pursuing an MSc in Data Science, and I've been
thinking of applying to GSoC with Wikimedia this year. For over a year now
I've been a system admin of a medium-sized wiki; I've written a couple of
extensions (you can find them here:
https://www.mediawiki.org/wiki/User:Ostrzyciel) and some patches to core.
As the sysadmin of a wiki, I watch its performance closely, and over time
I've discovered that the single thing slowing the wiki down the most was
InstantCommons. It turns out the ForeignApiRepo code is fine for a few
pages with few images, but once your wiki starts using Commons imagery a
lot, things get ugly, quick. Like parsing-a-page-takes-2-minutes ugly. Or
the whole wiki can collapse if Commons isn't responding for some reason.
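For context, InstantCommons is normally enabled either with the one-line shortcut or via the expanded `$wgForeignFileRepos` form, which is where the caching knobs most relevant to this performance discussion live. A sketch, with illustrative values rather than recommendations:

```php
// LocalSettings.php — InstantCommons, as discussed above.
// Simple form:
$wgUseInstantCommons = true;

// Expanded form (illustrative values), which exposes the caching
// settings that bound how often the wiki re-queries Commons:
$wgForeignFileRepos[] = [
    'class' => ForeignAPIRepo::class,
    'name' => 'commonswiki',
    'apibase' => 'https://commons.wikimedia.org/w/api.php',
    'hashLevels' => 2,
    'fetchDescription' => true,
    'descriptionCacheExpiry' => 43200, // 12 hours
    'apiThumbCacheExpiry' => 86400,    // 24 hours
];
```

Even with generous cache expiries, a cold cache still means one or more Commons API round trips per image, which is consistent with the slow-parse behaviour described.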
I think improving this would kind of correlate with Wikimedia's mission of
hosting the most accessible free media repository in the world :) I really
wish more people could use Commons extensively, and that would certainly
help it.
I did some research into the topic and came up with a few solutions, but I
am by no means an expert in MW architecture, so I would be grateful for
help from people familiar with Parsoid and the action API.
You can find a more detailed explanation here:
https://phabricator.wikimedia.org/T247406
I am also looking for mentors for this project :)
Thank you!
Ostrzyciel
Hello,
I'm forwarding an invitation to the Wikimedia Café meetup for this month.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Lane Rasberry <lane(a)bluerasberry.com>
Date: Sun, Mar 15, 2020 at 10:43 PM
Subject: [Wikimedia-l] Wikimedia Café - Sat 28 March 2020...
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Hello,
I am writing to invite anyone to join the next online meeting of Wikimedia
Café on Saturday 28 March 2020 at 4:30 PM UTC. Details for joining are at
https://meta.wikimedia.org/wiki/Wikimedia_Café
----> (video room open at that time) https://virginia.zoom.us/my/wikilgbt
The agenda for this month includes discussing COVID-19 and Wikipedia and
how Wikimedia community members feel about WMF/community relations.
Wikimedia Café is a modest, one-hour, monthly online meeting which for the
past few months has had fewer than 10 attendees. At these meetings anyone
can propose to discuss any topic of broad Wikimedia community interest, as
if we all were able to meet in person over coffee. The meetings themselves
are an experiment in small group Wikimedia community conversation with
video chat, phone access options, and online shared notetaking. Please see
WikiProject Remote Event Participation for more information about this
general style of online event.
https://meta.wikimedia.org/wiki/WikiProject_remote_event_participation
- Anyone interested in joining may do so.
- Anyone interested in reading notes of past meetings can find them on
the meta page.
- If there is anyone who wants to get their ideas published in the wiki
world, consider looking at how this Café works, because voice chat with
notetaking could be a way to organize your own wiki community.
Thanks to Pine for hosting this, and thanks to anyone who submits topics
for discussion or is able to join.
--
Lane Rasberry
user:bluerasberry on Wikipedia
206.801.0814
lane(a)bluerasberry.com
_______________________________________________
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l(a)lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
<mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>