I'd like to offer a big thank you to the Cloud Services team for resolving
issues with VPS and toolsdb. It would take several weeks of "Thank You
Tuesdays" to adequately express my gratitude for your hard work during this
ordeal. Thank you Brooke, Andrew, Bryan, Arturo, Giovanni, our fine DBAs,
and everyone else involved!
As I said on IRC, I think the toolsdb outage is in a way indicative of
success. Maybe increased usage wasn't the true culprit, but the fallout of
the outage clearly shows how much we rely on Cloud Services. Also, the
tools I maintain on VPS have had roughly 99% uptime since they launched
(XTools going back 1.5 years), as evidenced by a monitoring service I
use. That's phenomenal. All things considered, a week or so of partial
downtime related to VPS hardware and toolsdb is not that significant.
Your prompt, tireless efforts to restore stability over the weekend have
not gone unnoticed.
Y'all are amazing.
~ MusikAnimal
Hey folks,
We've had a request to change the schedule on which the various Wikidata
entity dumps are run. Right now they run once a week on set days of the
week; we've been asked about pegging them to specific days of the month
instead, much as the xml/sql dumps are run. See
https://phabricator.wikimedia.org/T216160 for more info.
Is this going to cause problems for anyone? Do you ingest these dumps on a
schedule, and what works for you? Please weigh in here or on the
phabricator task; thanks!
Ariel
The Wikimedia REST API's /page/summary response contains content_urls and
api_urls keys that provide convenience lists of various URLs of potential
interest to the consumer. These lists appear in other endpoint responses
as well, as the page summary response is transcluded in various places
throughout the REST API.
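For example, here is a minimal sketch (assuming the browser fetch API;
"Earth" is just an example title) of reading one of these convenience
URLs from the summary response:

    // Fetch a page summary and log one of its convenience URLs.
    async function showSummaryUrl(): Promise<void> {
      const res = await fetch(
        'https://en.wikipedia.org/api/rest_v1/page/summary/Earth'
      );
      const summary = await res.json();
      // content_urls groups links by platform (desktop, mobile).
      console.log(summary.content_urls.desktop.page);
      // e.g. https://en.wikipedia.org/wiki/Earth
    }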
Currently, these URL strings are erroneously constructed from unencoded
page title strings. A proposed patch (
https://gerrit.wikimedia.org/r/#/c/mediawiki/services/mobileapps/+/489329/)
applies encodeURIComponent to encode the titles before including them in
URLs.
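To illustrate (a simplified sketch, not the actual patch; the title here
is a hypothetical example):

    // A raw '?' in a title makes everything after it parse as a query
    // string, so the unencoded URL is wrong.
    const title = 'What_If?';
    const broken = 'https://en.wikipedia.org/wiki/' + title;

    // encodeURIComponent percent-encodes reserved characters first.
    const fixed =
      'https://en.wikipedia.org/wiki/' + encodeURIComponent(title);
    // => https://en.wikipedia.org/wiki/What_If%3F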
Since this endpoint is advertised as stable[1], I'm announcing the change
here in advance. Barring any objections, the change will be deployed late
next week.
[1]
https://en.wikipedia.org/api/rest_v1/#!/Page_content/get_page_summary_title
--
Michael Holloway
Software Engineer, Reading Infrastructure
Hello,
At this year's WMF All Hands there was an unconference session about
MediaWiki and Docker: what people are currently doing/using and what
their plans are for the future.
One need identified during the session was a single forum where those
working with this tooling can collaborate more efficiently. To that end,
it was proposed to create a Special Interest Group[0] (a "SIG").
I'd like to announce that we now have the basics of such a group:
* Wiki: https://www.mediawiki.org/wiki/Docker/SIG
* Mailing list: https://lists.wikimedia.org/mailman/listinfo/docker-sig
* IRC: #wikimedia-pipeline (reusing an existing topical channel for
efficiency)
Please join!
I will be setting up a first meeting Soon (TM) and will announce it on
the SIG's mailing list (and will probably cross-post it here the first
time).
Best,
Greg
[0] https://www.mediawiki.org/wiki/Special_Interest_Groups
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| Release Team Manager A18D 1138 8E47 FAC8 1C7D |
This sounds like an interesting potential approach to dealing with
spambots, and hopefully to deterring the people who make them.
https://techcrunch.com/2019/02/05/kasada-bots/
I don't know how practical it would be to implement an approach like this
in the Wikiverse, and whether licensing proprietary technology would be
required.
I would be interested in decreasing the quantity and effectiveness of
spambots that misuse WMF infrastructure, damage the quality of Wikimedia
content, and drain significant cumulative time from the limited supply of
good faith contributors.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
Hi team,
This is Kaushik Reddy.
After much work, I would like to introduce my idea proposal for
(Wikimedia) GSoC '19.
Here it is:
1) Building an animation that dynamically creates popups overlaid on a
geographical map, using a real-time API from Wikimedia.
I found the basis for this here (I have gone through it a bit):
https://meta.m.wikimedia.org/wiki/Research:Data
I would like to work on this idea this summer. I hope a mentor will join
to guide me. Also, feel free to comment on this.
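As one possible starting point, here is a minimal sketch that subscribes
to Wikimedia's public EventStreams recentchange feed (server-sent
events) and hands each edit to a map layer. The plotOnMap helper is
hypothetical, and recentchange events carry no coordinates, so
geolocating each edit would be a separate step:

    // Hypothetical map hook; replace with a real map library call.
    function plotOnMap(wiki: string, title: string): void {
      console.log(`edit on ${wiki}: ${title}`);
    }

    // Subscribe to the public stream of recent changes across wikis.
    const source = new EventSource(
      'https://stream.wikimedia.org/v2/stream/recentchange'
    );

    source.onmessage = (event: MessageEvent) => {
      const change = JSON.parse(event.data);
      // Each event includes the wiki, page title, user, and more.
      plotOnMap(change.wiki, change.title);
    };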
With regards,
Kaushik.