Hi,
Anomie has written a new structure test[1] that checks whether API
modules have all the proper i18n messages needed for their documentation.
It's possible that your extension's tests will start failing due to
this; if you need help figuring it out, please ping one of us and we can
help out.
[1] https://gerrit.wikimedia.org/r/#/c/260598/
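For context, MediaWiki API documentation is built from i18n messages following the "apihelp-*" naming convention (e.g. apihelp-<module>-summary and apihelp-<module>-param-<name>). A rough sketch of the kind of check such a structure test performs — the module and message data below are illustrative, not the actual test code:

```python
import json

def missing_api_messages(modules, messages):
    """Return the apihelp-* message keys that an API module's
    documentation needs but that are absent from the catalogue."""
    missing = []
    for module, params in modules.items():
        keys = [f"apihelp-{module}-summary"]
        keys += [f"apihelp-{module}-param-{p}" for p in params]
        missing += [k for k in keys if k not in messages]
    return missing

# Example: a hypothetical extension's en.json is missing one
# parameter message, which is what would make the test fail.
modules = {"myquery": ["title", "limit"]}
messages = json.loads(
    '{"apihelp-myquery-summary": "Query stuff.",'
    ' "apihelp-myquery-param-title": "Title to query."}'
)
print(missing_api_messages(modules, messages))
# ['apihelp-myquery-param-limit']
```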
-- Legoktm
Hi,
At least twice I found AbuseFilters that prevented people from testing
extensions that I develop or posting feedback about them on MediaWiki.org
and the beta testing sites (such as
http://en.wikipedia.beta.wmflabs.org/wiki/Main_Page ).
By their nature, the extensions I am working on have a lot to do with
different (human) languages, and these AbuseFilters relied on the
assumption that on these sites everything is supposed to be in English, and
whatever is not English is probably spam.
One of them filtered out the whole Russian alphabet, so the extension
couldn't be tested by Russian-speaking users. Another one blocked users
with fewer than six edits from submitting Flow posts that don't contain
enough English words, so users who have many edits on their home wikis, but
none on mediawiki.org, and who want to post in languages other than English
were blocked. Luckily, one of those people
pinged me directly, but I don't know how much useful feedback was filtered
out and lost.
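The filters described above roughly combine an edit-count threshold with a crude "is it English?" heuristic. A sketch of why that combination over-blocks — this is an illustration, not the actual AbuseFilter rules, and the thresholds are made up:

```python
def looks_like_spam(text, user_editcount, min_edits=6, min_ascii_ratio=0.5):
    """Crude heuristic of the kind described above: treat mostly
    non-Latin text from low-local-edit-count users as spam."""
    letters = [c for c in text if c.isalpha()]
    if not letters:
        return False
    ascii_ratio = sum(c.isascii() for c in letters) / len(letters)
    return user_editcount < min_edits and ascii_ratio < min_ascii_ratio

# A legitimate Russian-language feedback post from an experienced
# editor with zero *local* edits gets flagged:
print(looks_like_spam("Отличное расширение, спасибо!", user_editcount=0))
# True
```

The problem is that "edits" here means local edits, so cross-wiki visitors posting in their own language are exactly the users the heuristic catches.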
I acknowledge that spam is a problem and that there may be a better
solution for multilingual testing and feedback than doing it all on one
English-language site, but until there are solutions for this, please
consider non-English languages when defining AbuseFilters on our tech sites.
Thanks for understanding :)
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hi everyone,
We've posted the Community Tech team's first status report on our progress
with the Community Wishlist Survey, and you're invited to come and check it
out:
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Status_repor…
In November and December, we invited active contributors to Wikimedia
projects to propose, discuss and vote on the features and fixes that they
most want to see. 634 people participated in the survey, voting on 107
proposals.
Our team has committed to investigating and responding to the top 10
wishes. In many cases, our team will be designing and building tools
ourselves, or collaborating with other teams and volunteers who are working
in that area. For the wishes that we can't build this year -- because
they're too big for our team, or there's a problem that we can't solve -- we
can at least offer open discussion of the problem, and detailed
documentation explaining what we've learned, so the information can be used
by other developers in the future.
We've done a preliminary assessment of the top 10, which is described in
the status report. As of right now (mid-January), the two items that we're
actively working on are #1) Migrate dead links to the Wayback Machine, and
#7) Pageview Stats tool. Why are we working on those two and not the
others? Check out the status report for all the answers.
I'm going to post the quick overview of the top 10 wishes here. Each of
these wishes is discussed in detail on the status report page.
1. Migrate dead links to the Wayback Machine: Currently in progress,
working with a community developer and the Internet Archive. This is one of
the two projects we're actively working on now (mid-January).
2. Improved diff compare screen: Needs investigation and community
discussion to define the problems that we want to solve.
3. Central repository for templates, gadgets and Lua modules: Needs
underlying technical work that's currently under discussion by another team.
4. Cross-wiki watchlist: Needs technical investigation on the existing
Crosswatch tool, and the Collaboration team's cross-wiki notifications.
5. Numerical sorting in categories: Investigation is underway. There are a
couple of potential solutions that we need to figure out.
6. Allow categories in Commons in all languages: Currently talking with
Wikidata about using structured metadata to solve the underlying problem.
7. Pageview Stats tool: Currently talking with the Analytics team about
their new pageview API. Needs some community discussion to define the
front-end spec. This is one of the two projects we're actively working on
now (mid-January), because the Analytics team is eager to use the new API
that they've developed.
8. Global cross-wiki talk page: Needs community discussion to define the
product.
9. Improve copy and paste detection bot: Needs work with volunteer
developers to define the scope of improvements to the existing Plagiabot.
10. Add a user watchlist: We've heard significant pushback about the
vandal-fighting use case, because of the risk of enabling harassment.
Currently investigating an opt-in version that would be useful for mentors,
classes, editathons and WikiProjects.
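As an aside on wish #5: the underlying issue is that category sorting is lexicographic, so "Page 10" sorts before "Page 2". The ordering users are asking for is the classic natural sort. A minimal sketch of the idea (illustration only; an actual fix would apply to MediaWiki's category sortkeys server-side):

```python
import re

def natural_key(title):
    """Split a title into text and number runs so that the numeric
    parts compare as integers: 'Page 2' sorts before 'Page 10'."""
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", title)]

titles = ["Page 10", "Page 2", "Page 1"]
print(sorted(titles))                    # ['Page 1', 'Page 10', 'Page 2']
print(sorted(titles, key=natural_key))   # ['Page 1', 'Page 2', 'Page 10']
```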
Here's the status report link again, for lots more information:
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Status_repor…
Our team is really excited about the work that we'll get to do this year,
and we're looking forward to talking and working with you as we go along.
Thanks,
Danny Horn
Product Manager
WMF - Community Tech
User:DannyH (WMF)
A vulnerability has been found in RESTBase v0.9.1 and earlier that
allowed attackers to read arbitrary files on the host system by
passing a specially crafted URL. This vulnerability has been fixed in
[1].
All RESTBase users are strongly encouraged to upgrade to v0.9.2
immediately. Files readable by the RESTBase service user might have
been accessed by third parties, so appropriate measures should be
taken.
mediawiki-containers [2] users with automatic updates enabled have
already been upgraded to v0.9.2.
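The advisory doesn't publish exploit details, but the bug class -- path traversal via a specially crafted URL -- is well known. A generic sketch of the kind of check that prevents it (illustrative only, not RESTBase's actual fix):

```python
import os.path

def safe_join(base_dir, user_path):
    """Join a user-supplied path onto base_dir, refusing any result
    that escapes base_dir (e.g. via '../' sequences in the URL)."""
    candidate = os.path.normpath(os.path.join(base_dir, user_path))
    if candidate != base_dir and not candidate.startswith(base_dir + os.sep):
        raise ValueError("path escapes base directory")
    return candidate

print(safe_join("/srv/static", "css/site.css"))   # /srv/static/css/site.css
try:
    safe_join("/srv/static", "../../etc/passwd")  # traversal attempt
except ValueError as e:
    print(e)                                      # path escapes base directory
```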
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
[1]: https://github.com/wikimedia/restbase/commit/1ea649306ae4e85ab2cee5a36318e9…
[2]: https://github.com/wikimedia/mediawiki-containers
= 2015-01-20 =
== Reading ==
=== Android ===
* v2.1.137 promoted to production.
* Reviewing Gather API re: multiple lists on multiple languages
=== Web ===
* Planning
=== iOS ===
* Beta has been rolled out to over 1000 testers; there have been 422 installs to date
** Is there a way to measure beta API usage (i.e. CPU time)?
* Investigating using pageviews/top API as part of iOS "Explore" feature. Main blocker is filtering out things like non-main namespace, the main page, Special:Search, etc.
** https://phabricator.wikimedia.org/T123442 (Dan: Joseph commented on this)
** https://phabricator.wikimedia.org/T118841 ? (Dan: this might be a misunderstanding, this sanitization is internal for privacy protection)
** https://phabricator.wikimedia.org/T121912 (Dan: redirects are quite confusing, I have some reading to do on this and it's hard to carve out time right now. But this is a priority)
** https://phabricator.wikimedia.org/T124082 (Dan: I'll comment on the ticket)
=== Reading Infrastructure ===
* Block: Could use a quick security look at https://gerrit.wikimedia.org/r/#/c/264309/.
* Hey Mobile people: AuthManager is replacing API action=login with action=clientlogin, and action=createaccount is going to change completely. We're currently aiming at end of February for the deployment.
** See https://gerrit.wikimedia.org/r/#/c/265201/ for details, then talk to Brad if you have questions.
== Technology ==
=== Research ===
* ORES serving ~100 external requests per minute
* ORES moving to new meso-level (Labs < Meso < Prod) support -- blocked on Ops time to set up machines
* See https://phabricator.wikimedia.org/T106867 -- working with Yuvi to flesh out sub-tasks -- need to know who to CC
=== Security ===
* kartographer, Ex:ORES, TextCat reviews in progress
* Patch to parser/stripmarkers (T110143) deploying today. Contact Security if you see an issue with Xml::escapeTagsOnly.
=== Services ===
* Mobile Content Service pre-generation enabled
:* running enwiki dump to populate it
* /page/definition/ end point for wiktionaries
:* functional only on en.wiktionary
:* pre-generated
* Move to Jessie and Node 4.2
:* Graphoid and Citoid - tomorrow
:* CXServer - please check!
::* Apertium pkgs?
:* Parsoid - needs testing
=== Release Engineering ===
* Blocking: none
* Blocked: none
* Updates:
** scap 3.0 will be tagged soon
*** finalizing changes to simplify configuration deployment
*** also working on puppet provider
=== Technical Operations ===
* Blocking: Research on ORES
* Blocked by: none
* Updates:
* RESTBase/Kernel security incident
* OTRS upgrade on Jan 28th to be rescheduled for Feb 2nd
* HHVM 3.11 packaging moving on, almost done
=== Analytics ===
* Event Logging replication to analytics-store has been slow for a long time. Right now it's behind by about 3 days for a lot of major schemas. The problem is not obvious but we're on it.
* Geowiki stopped updating on December 18th because one of the servers it was using was read-only; that's now fixed
* Piwik has been optimized and is working reasonably well with the 15.wikipedia.org website; other **low** traffic apps/sites can use Piwik now if they wish (we're already talking to Reading about doing this with their iOS client)
== Editing ==
=== Parsing ===
* No blockers
* Not blocking anyone (as far as I know)
* Updates:
** QR today (done)
** RT test server (ruthenium) being updated to jessie and node 4.2
*** Needs puppetization fixes to migrate upstart files to systemd
** Parsoid node 4.2 testing with a focus on memory usage / GC behavior
=== Language ===
* Delay in adding parallel corpora tables for Content Translation
** seems to be resolving now such that we are going to add them ourselves
=== Collaboration ===
* Cross-wiki notification Beta Feature now on test and test2.
* Coordinating on Flow dumps from production. Ariel is working on this - https://phabricator.wikimedia.org/T119511
=== Fundraising Tech ===
* Investigating queue outages during December campaign
* Updating DonationInterface session handling to work with 1.27
** Trying to figure out how to run CI tests against the core branch deployed in payments, with 1.27 as non-voting
* More CiviCRM enhancements
== Discovery ==
* Portal A/B test is running
* Working on Dallas cluster load tests; so far it doesn't appear to hold the peak load, so more work is needed
* WDQS Blazegraph 2.0 testing successful, will upgrade when GA is released. Started to work on geospatial search implementation.
* Quarterly review this Thursday
* Waiting for security review on TextCat
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hi folks,
One thing we'd love to get better at is taking and publishing useful
notes from our meetings. One example is ArchCom meetings, for
which we have been taking notes for a while, but I haven't gotten
around to publishing them. Another example is our weekly Engineering/Tech
managers meetings, which are increasing in importance as we head
into budget-setting season.
Is there anyone available to be a volunteer scribe for ArchCom and/or
Tech/Eng management meetings? For logistical reasons, I would prefer
existing WMF employee offers over non-WMF offers, but beggars can't be
choosers ;-)
The next meeting opportunity is in 15 hours: the ArchCom is having
its weekly 1pm PST (21:00 UTC) meeting just prior to the 2pm PST
public IRC meeting. If you're interested, please let me know! The
best way to express your volunteer interest is by adding your sig on
the bottom of this page:
<https://www.mediawiki.org/wiki/User:RobLa-WMF/Scribes>
...and discussion is welcomed on the associated talk page.
Rob
---------- Forwarded message ----------
From: Chen Davidi <chen(a)wikimedia.org.il>
Date: Wed, Dec 23, 2015 at 9:02 AM
Subject: [Wikitech-ambassadors] Wikimedia Hackathon Jerusalem 2016:
registration is now open!
To: wikitech-ambassadors(a)lists.wikimedia.org, wikitech-l(a)lists.wikimedia.org,
wikitech-announce(a)lists.wikimedia.org, engineering(a)lists.wikimedia.org,
wikimedia-l(a)lists.wikimedia.org
Hi everyone,
I'm thrilled to tell you all that the registration for Wikimedia Hackathon
2016 is now open!
The Hackathon will be held in Jerusalem, between March 31st to April 3rd,
2016, by Wikimedia Israel.
Scholarship applications are open until January 22nd.
You know the drill!
Registration here -
https://docs.google.com/forms/d/17WFRHTCX5_dnCD5hFk1gHEQUz6pTrKp85QsX01rJGa…
More info and updates on
https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2016
If you have any questions, please contact us at
hackathon2016(a)wikimedia.org.il
Hope to see you all in Jerusalem!
Chen Davidi-Almog,
Activity & Resources Coordinator.
Wikimedia Israel
An Outreachy candidate for http://mediawiki.org/wiki/Accuracy_review who
went ahead and started unpaid has been making good progress, and is about
to land the central guts of the project on github. It's a new way to
transition from creating to maintaining Wikipedia articles, with an
emphasis on detecting outdated statistics, fighting bias including paid
advocacy of all kinds, and proofreading WEP student work. It's been going
slowly, mostly because the original trial-run architecture was too dependent
on email.
However, before she gets there, could one or two people who are
beginner or intermediate with Python but advanced with MediaWiki or PHP
please test her user authentication and login framework?
https://github.com/priyankamandikal/wikireview/
<https://github.com/priyankamandikal/wikireview/issues>
It's built for PythonAnywhere because it shouldn't run on Wikimedia
servers, due to the DMCA safe-harbor provisions precluding editorial
control by web hosts. Please report any issues on GitHub and note your
results on the Phabricator task to prevent duplication of effort.
Thanks in advance!
Best regards,
Jim Salsman
Hi folks,
Over the past few weeks (including WikiDev '16) we've had several
conversations about the Wikimedia software development governance model.
For a lot of people, the most important aspect of this is MediaWiki.
More generally, though, the scope is everything we deploy to Wikimedia
sites that we intend to maintain, enhance, and further leverage.
On Wednesday, January 20, we intend to continue this conversation, and
welcome your participation:
<https://phabricator.wikimedia.org/E135>
In particular, we plan to discuss the work that Gabriel Wicke started:
<https://phabricator.wikimedia.org/T123606>
"WIP RFC: Improving and scaling our technical decision making process"
...which is mainly a pointer to:
<https://www.mediawiki.org/wiki/Requests_for_comment/Governance>
I've quoted the current text as of this writing below, and you're
welcome to reply on list or on the talk page (though let's try to
ensure a summary of important comments gets captured on T123606).
Based on the conversations I've been part of (and Gabriel's initial
writeup), Rust's governance model seems to be the leading candidate to
iterate toward. This doesn't necessarily mean making one big change
with a fanfare-laden launch, but let's discuss.
Wait until Wednesday's IRC session if you have to (22:00 UTC, 14:00
PST), but we'd love to hear your thoughts sooner.
Rob
p.s. Here's the link to the RFC
<https://www.mediawiki.org/wiki/Requests_for_comment/Governance>
...and below is the plain text from [mw:Requests for comment/Governance].
Problem statement

[Goal: Describe the problems we are seeing with the current process,
and why it is important to solve them.]

* Difficulty of making clear and accepted decisions on important and broad topics.
* Scaling the decision making process.
* Stakeholder involvement & legitimacy.
* Clarity and transparency of decision making process.

Prior art

[Goal: Give a brief summary and pointers to options we looked at.]

* IETF process
* Python PEP process
* Rust
* Debian
* W3C

See also this prior etherpad discussion.
Strawman proposal

[Goal: Summarize key ideas that we consider worth adopting, and point
to prior art. Provide rationale by explaining how each addresses
specific issues.]

* More structured RFC decision process
** Based on the Rust decision making process.
** Nominate a shepherd from a (sub)team to guide an RFC through the process. The shepherd:
*** Makes sure that stakeholders are informed.
*** Guides the discussion.
*** Once the discussion plateaus or stalls & in coordination with the RFC author(s), announces and widely publicizes a "Final Comment Period", which is one week.
** At the end of the "Final Comment Period", the (sub)team decides based on the points made in the RFC discussion, and justifies its decision based on the overall project principles and priorities. If any new facts or aspects are surfaced in this discussion, a new Final Comment Period needs to be started before making a decision.
* Scaling the decision making process with sub-teams
** Based on Rust subteams.
** The core team is responsible for creating sub-teams, with a member of the core team as its team leader. Initial membership is determined by the leader; later changes are by consensus within the team.
** Each sub-team has a specific vision and problem set to work on, and the team leader is responsible for keeping the team on topic.
** Sub-teams are empowered to decide on RFCs within their scope. The team leader is responsible for elevating RFCs with unclear or broader scope to the core team.
Henning,
If we're going to solve the problem of dead links, it needs to involve
automation, at least for the heavy lifting. Obviously, if a human
contributor can add a better source, that's great. But there are more dead
links than people willing to replace them.
On English Wikipedia, there's Category:All articles with dead external
links, and it contains more than 134,000 articles[1] -- and those are just
the pages where somebody's added the Dead link template. There are a lot of
missing references -- not just on English WP, but on all the projects --
and connecting those links to a live archive makes them useful again.
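For reference, the Internet Archive exposes a simple availability endpoint that a bot could use to look up the closest archived snapshot of a dead link. A minimal sketch using that documented API (error handling and rate limiting elided; the helper names are my own):

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def availability_url(dead_link, timestamp=None):
    """Build a Wayback availability query for a dead external link.
    With a timestamp (YYYYMMDD...), the API returns the snapshot
    closest to that date."""
    params = {"url": dead_link}
    if timestamp:
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urllib.parse.urlencode(params)

def closest_snapshot(dead_link, timestamp=None):
    """Return the archived URL for the closest snapshot, or None."""
    with urllib.request.urlopen(availability_url(dead_link, timestamp)) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None

# e.g. closest_snapshot("http://example.com/dead-page", timestamp="20150101")
```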
For links that were moved, we may be able to collect and use that
information -- I know that we're looking into what kind of metadata we can
collect when a new link is added to the page. But I think finding
alternative sources has to come from human contributors, and that's hard to
scale.
Danny
PM, Community Tech
[1]:
https://en.wikipedia.org/wiki/Category:All_articles_with_dead_external_links
On Mon, Dec 28, 2015 at 9:51 AM, Henning Schlottmann <h.schlottmann(a)gmx.net>
wrote:
> On 16.12.2015 21:12, Danny Horn wrote:
>
> > #1. Migrate dead links to the Wayback Machine (111 support votes)
>
> I really hope you don't follow that wish, as it is detrimental to the
> quality of Wikipedia.
>
> Switching dead links to the archive is a move to a dead end, instead of
> looking for
>
> a) the new correct URL, as many links were just moved.
> b) alternative sources for the same fact.
>
> Ciao Henning
>
>
>
> _______________________________________________
> Wikimedia-l mailing list, guidelines at:
> https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
> New messages to: Wikimedia-l(a)lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> <mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
>