Minutes and slides from Wednesday's quarterly review of the
Foundation's VisualEditor team are now available at
https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
(A separate but related quarterly review meeting of the Parsoid team
took place today; those minutes should be up on Monday.)
On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller <erik(a)wikimedia.org> wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We'll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We're slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>
> _______________________________________________
> Wikimedia-l mailing list
> Wikimedia-l(a)lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB
As part of the migration of labs from pmtpa to eqiad, I'll be
redirecting wikitech to a new host in eqiad on Tuesday. This will
require some downtime in order to make sure that the wikis are in sync
between the two boxes.
It should only take a few minutes, but don't be alarmed if things are
off for an hour or so.
-Andrew
Hello,
I have changed the DNS entries for the beta cluster to point to the
instances hosted in our EQIAD datacenter.
It seems most of the functions are working now, aside from SSL support.
If you see anything that looks wrong, please file bugs in Bugzilla
against Wikimedia Labs > deployment-prep (beta).
The DNS might take some time to propagate. The IP addresses are:
$ dig +short bits.beta.wmflabs.org
208.80.155.137
$ dig +short upload.beta.wmflabs.org
208.80.155.136
$ dig +short en.wikipedia.beta.wmflabs.org
208.80.155.135
$ dig +short en.m.wikipedia.beta.wmflabs.org
208.80.155.139
$
Thanks to everyone who helped reconfigure the beta cluster from scratch!
--
Antoine "hashar" Musso
+wikitech-l/qa
I presume this is a byproduct of the migration. This is urgent.
On Mon, Mar 31, 2014 at 11:25 AM, Ryan Kaldari <rkaldari(a)wikimedia.org> wrote:
> http://en.m.wikipedia.beta.wmflabs.org/ has been giving me a 503 all
> morning. I really need to be able to use it since we are deploying a change
> this week that is specifically related to the cluster configuration
> (specifically overriding the licensing messaging to support dual licensing
> for the WMF projects).
>
> Bug filed:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=63315
>
> Ryan Kaldari
>
> _______________________________________________
> Mobile-l mailing list
> Mobile-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mobile-l
>
>
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
You can now submit tutorial or talk proposals for WikiConference USA, and
you can register and ask for a scholarship for your travel expenses (more
information below and at http://wikiconferenceusa.org/wiki/Scholarships ).
If it'll be hard for you to get to the Zurich or London hackathons this
year, consider meeting up at WikiConference USA.
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Date: Tue, 28 Jan 2014 16:56:55 -0500
From: Pharos <pharosofalexandria(a)gmail.com>
To: wikimediaannounce-l(a)lists.wikimedia.org
Subject: [Wikimedia Announcements] WikiConference USA Announcement
Message-ID:
<CAJcrdm5+HquZyAYERJd_4sF4Zj2oKtq8PmYR78LJ23DzykJyHQ(a)mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"
I am very pleased to announce that Wikimedia NYC and Wikimedia DC are
working in collaboration to host the first national Wikimedia conference in
the United States!
Here are the details for the conference:
Dates: Friday, May 30, 2014 - Sunday, June 1, 2014
Location: New York Law School (185 West Broadway, New York, NY 10013)
Website: http://wikiconferenceusa.org
Email: wikicon(a)wikimedianyc.org
Registration: http://wikiconusa.eventbrite.org/
For more information, please review our official press release below! We
hope you will join us and help us spread the word!
https://commons.wikimedia.org/wiki/File:WikiCon_USA_2014_Press_Release_v1.p…
Thanks,
Richard (User:Pharos)
Wikimedia NYC
Hello,
I wrote a PHP-based JSON linter which bails out whenever a JSON
file can't be understood by PHP's json_decode().
The linter will be added to the jslint Jenkins jobs which are already
running jshint. It would be nice to run it on your repositories to
avoid failures when the job is enabled, which I plan to do next Monday,
March 17th.
The lint script is in integration/jenkins.git bin/json-lint.php which
you can fetch with:
curl 'http://git.wikimedia.org/raw/integration%2Fjenkins.git/master/bin%2Fjson-li…' > /tmp/json-lint.php
Usage:
json-lint.php .
Example output:
./node_modules/es6-shim/bower.json: Syntax error
./node_modules/es6-shim/component.json: Syntax error
Those files are missing commas in dictionary structures; they are in
the git repository mediawiki/services/parsoid/deploy.
I have run the script against mediawiki/extensions/* and mediawiki/core,
but other repositories that have a jslint job running would need to be
verified.
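For illustration, the core idea can be sketched in a few lines of
Python (this is a standalone sketch, not the actual
integration/jenkins script, which is written in PHP):

```python
import json
from pathlib import Path

def lint_json_files(root):
    """Walk a tree and return (path, error) pairs for .json files that fail to parse."""
    failures = []
    for path in sorted(Path(root).rglob("*.json")):
        try:
            json.loads(path.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, UnicodeDecodeError) as err:
            failures.append((str(path), str(err)))
    return failures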
Thanks!
--
Antoine "hashar" Musso
Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2014.03. This bundle is compatible with MediaWiki 1.22.4 and
MediaWiki 1.21.7 releases.
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2014.03.tar…
* sha256sum: f3a253e05f6b7c4f451882a1a78a138a1dcaecd2777237d0c2b8af7c3ecced70
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://bugzilla.wikimedia.org
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Babel, CleanChanges ==
* Only localisation updates.
== CLDR ==
* Updated to CLDR 25.
* Localisation updates.
== LocalisationUpdate ==
LocalisationUpdate has been rewritten to support the JSON file format
for extensions. It is compatible with MediaWiki 1.19 and above.
* Extensions using JSON are fully supported, and LocalisationUpdate
will pick up new languages as they appear.
* LocalisationUpdate will store translations using the JSON file
format. If an extension has not yet migrated to JSON, messages will be
stored as a PHP file.
=== Configuration changes ===
==== Global Variables ====
===== $wgLocalisationUpdateRepository =====
The default repository source from which to fetch translations.
MediaWiki's GitHub mirror is the default repository. This was added in
version 1.1.
===== $wgLocalisationUpdateRepositories =====
An array of repository URLs from which to retrieve localisations for
core MediaWiki and extensions. The default is GitHub's MediaWiki Git
repositories; you should not change this unless you are using this
performance improvement or otherwise know what you are doing. This was
added in version 1.1.
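As a rough illustration only (the exact array shape and URL patterns
should be checked against the extension documentation; the values
below are placeholders, not a tested configuration), a LocalSettings.php
snippet might look like:

```php
// Illustrative placeholders only; consult the LocalisationUpdate
// documentation for the exact expected structure.
$wgLocalisationUpdateRepository = 'github';
$wgLocalisationUpdateRepositories['github'] = array(
    // The URL pattern here is a hypothetical example.
    'mediawiki' => 'https://raw.github.com/wikimedia/%s/master/%s',
);
```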
==== Script parameters changes ====
With version 1.1:
--repoid: Fetch translations only from the repositories identified by
this ID.
==== Notes ====
* Once core MediaWiki switches to the JSON format, all users must
update LocalisationUpdate to keep receiving updates. They will need to
update anyway, because older versions of LocalisationUpdate do not
support JSON at all.
* If an extension switches to JSON, we will not be able to provide
updates until its local checkout is updated to such a version.
LocalisationUpdate does not know about the switch and will try to
fetch the PHP shim, which does not have translations any longer. This
is usually okay, as the PHP shims support old MediaWiki versions;
problems might appear if the extension has other incompatible changes
which prevent updating.
* For more details, also see:
https://www.mediawiki.org/wiki/Extension:LocalisationUpdate
== Translate ==
=== Noteworthy changes ===
* Added support for insertables with numbers at the end on translatable pages.
* Improvements in TUX shortcuts: Shortcut indicators are now more
visible. Fixed display in RTL. Added indicators for up/down arrows.
* characterEditStats.php: Mention that the max age of recent changes
($wgRCMaxAge) limits the results.
* Added a notice about the ULS extension dependency. Special:Translate
will fail with a MediaWiki error page if ULS is not installed.
== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Compacting the interlanguage links with ULS is available as a new
Beta feature. It displays a shorter version of the language list, with
the languages that are more relevant to you. More info at:
https://www.mediawiki.org/wiki/Universal_Language_Selector/Design/Interlang…
* Bug 56081: Reset webfonts where an inline CSS style is found upon
reset. This fixes broken live preview for the content font.
=== Fonts ===
* Updated the Lohit Oriya font to the new upstream version and renamed it to Lohit Odia.
--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
Hi Gryllida,
I am extremely sorry I took so long to reply to your mail, I was down with
typhoid and I am slowly recovering.
I made corrections as you suggested. Thanks a lot for sparing your time.
As you instructed, I included my full open-source contribution
history, and just for the record I am stating it here as well.
I have four bug patches merged from last year: bug 45580
<https://bugzilla.wikimedia.org/show_bug.cgi?id=45580>, bug 43504
<https://bugzilla.wikimedia.org/show_bug.cgi?id=43504>, bug 33438
<https://bugzilla.wikimedia.org/show_bug.cgi?id=33438> & bug 48197
<https://bugzilla.wikimedia.org/show_bug.cgi?id=48197>.
This year due to my ill health I could only get two in.
https://gerrit.wikimedia.org/r/117854 and
https://gerrit.wikimedia.org/r/121975
The last patch is relevant to my GSoC project, as it deals with forms,
which are part of the project I applied to.
I will see to it that these two get merged as well, and if possible I
will send one more over the next weekend.
The only other open-source project that I have contributed to is
phpMyAdmin, which has very high standards for patch acceptance and
stringent code reviews.
I have four patches merged in there as well. One of those is the
280-line patch that I talked about in my proposal.
The details of the rest of the patches can be found here:
https://github.com/phpmyadmin/phpmyadmin/commits?author=ganeshaditya1
Regards,
Aditya
Hi,
As part of our extension testing, we've set up Varnish in accordance
with http://www.mediawiki.org/wiki/Manual:Varnish_caching
One of the things we've noticed is that our oldid URIs are cached,
whereas Wikipedia doesn't seem to cache those pages.
Is there a reason why Wikipedia doesn't do this? Is there some
threshold that Wikipedia uses for caching?
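For context, the behavior we're seeing would follow from a default
setup that caches everything. A vcl_recv rule like the one below (a
sketch of what we'd expect to need in our own configuration, not
Wikipedia's actual setup) is how one would bypass the cache for oldid
permalinks:

```vcl
sub vcl_recv {
    # Sketch: pass revision permalinks straight to the backend
    # instead of caching them. Wikipedia's real policy may differ.
    if (req.url ~ "[?&]oldid=") {
        return (pass);
    }
}
```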
Thanks in advance,
Shawn M. Jones
Graduate Research Assistant
Department of Computer Science
Old Dominion University