Welcome to three new extensions committers:
Alexander (grafzahl) expects to work on: "Extension:Score, possibly
Extension:JHilbert should it ever go into SVN."
Adam White (adamw) is working on Extension:Offline -- more info at
http://code.google.com/p/wikipedia-offline-patch/ .
Isaac "Ike" Hecht (tosfos) works on the AdManager extension and will be
working on Semantic MediaWiki-related extensions.
Welcome, Ike, Adam, and Alexander!
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi, everyone. I'm looking for some clarification about the process whereby
code gets deployed on the Wikimedia cluster -- not so much the technical
side of things; in fact, I'd love to keep this conversation
VCS-agnostic. We'll be moving to git soon and things will change, but
these questions apply regardless.
I've been working to review a new extension that Petr Bena wrote,
InteractiveBlockMessage. It's a simple little extension that adds some
template magic for blocked users, to facilitate the creation of less
confusing talk page templates. This bug has links to the relevant code and
on-wiki discussions: https://bugzilla.wikimedia.org/show_bug.cgi?id=32819
Here's my present understanding of the process for community-contributed
code (call it process A):
1. Discussion happens, a need arises, etc.
2. Someone writes some code, probably in the form of an extension.
3. That extension is checked in and makes its way through code review. All
revisions must be ok'd to deploy.
4. On-wiki consensus is reached regarding deployment, and the discussion
closes.
5. A ticket is created to request deployment on a specific wiki.
6. Someone Who's Been Around A While (Tim, Roan, maybe others) looks over
the code and decides it's okay.
7. (Maybe) A discussion regarding the extension happens here on wikitech-l.
8. The ticket gets picked up, extension gets deployed, ticket gets closed.
(Of course, I'm probably getting this wrong and I'm sure the steps are out
of order. Please correct.)
This differs rather significantly from the deployment process we've been
following for new code here in the office (call it process B). That's more like:
1. Someone decides we need the code.
2. The code gets written.
3. Code is checked in, makes its way through CR, and must be ok'd before
deployment.
4. A deployer schedules a window and deploys it.
Note that I'm leaving out the technical parts (creating a patch, pushing to
prototype, testing, etc.). That's unnecessary detail here; this message
concerns only the social part of the procedure.
My specific questions:
Process B is what I'm used to, but it seems that for this extension, it's
process A. When do we pick one or the other? Is process A for
community-contributed code, whereas B is for stuff from WMF? Do things
change when we're deploying a whole new extension? I understand that this
process is informal, and I'm not necessarily pushing to formalize it, but
maybe a few general guidelines or a minimum standard would help make
things clearer.
The step where Someone Who's Been Around A While reviews the code makes
sense, but it seems to be applied inconsistently, or perhaps the conditions
for that just aren't clear. When does code need to be run past one of
these very busy people, and when is it okay to push it without? Is there a
way to indicate which code needs this review, and that it has been done,
aside from the existing 'OK' status in CR? Secondly, who *are* these
people? I've
heard Roan and Tim so far, but I bet there are more.
Is a discussion on wikitech-l generally part of the process in addition to
the on-wiki discussion? Is that only for new extensions, or for other
types of deployment as well?
I would love to get this worked out and document it, so community
developers can better understand how to get their code deployed. Of
course, if this is already documented somewhere please send me a link. :)
-Ian
Cross-posting.
---------- Forwarded message ----------
From: Steven Walling <swalling@wikimedia.org>
Date: Tue, Dec 20, 2011 at 11:58 AM
Subject: [Foundation-l] IRC office hours with the WMF features team, Jan.
4th 2012
To: Wikimedia Foundation Mailing List <foundation-l@lists.wikimedia.org>
Hey everyone,
Since folks have been asking about it, I wanted to announce that the
features development team at the Wikimedia Foundation will be holding
office hours (in #wikimedia-office) about the general past, present, and
future of MediaWiki features being worked on here at the WMF.
This will be on January 4th, 2012 at 23:00 UTC. Documentation is on Meta
for time conversion and IRC how-tos.[1]
--
Steven Walling
Community Organizer at Wikimedia Foundation
wikimediafoundation.org
1. https://meta.wikimedia.org/wiki/IRC_office_hours
I'm running a Firefox plugin called CertPatrol which alerts me to unusual
SSL cert changes.
The existing cert signed by GeoTrust, Inc. wasn't set to expire until
2016-07-19 02:17:12.
The new cert is signed by DigiCert Inc.
I just want to make sure this is an intentional change and not a fake cert.
I took a screenshot of the CertPatrol warning at
http://img204.imageshack.us/img204/8463/screenshot20111220at953.png
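
For anyone who wants to check independently, here's a minimal Python
sketch (assuming Python 3; the hostname is just an example) that pulls
the certificate the server is actually presenting and prints its issuer
and validity dates:

# Minimal sketch: print the issuer and validity dates of the
# certificate the server presents, for comparison with the
# CertPatrol warning.
import socket
import ssl

def show_cert(host, port=443):
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            # 'issuer' is a tuple of relative distinguished names.
            issuer = dict(rdn[0] for rdn in cert["issuer"])
            print("Issuer:    ", issuer.get("organizationName"))
            print("Not before:", cert["notBefore"])
            print("Not after: ", cert["notAfter"])

show_cert("en.wikipedia.org")

If the issuer printed there is DigiCert too, the change is server-side
and not something injected on your connection alone.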
Hi,
I am looking for some bug tracking thing where I could keep track of
how the various caching issues on Wikipedia are being addressed. Issues
like Wikipedia showing the "view source" tab rather than the "edit" tab,
or claiming there are pending revisions when there aren't. Some time ago
I'd have added broken images due to pending revisions, but that at least
seems to have been addressed. I also had, about two months ago, an issue
on Commons where a page would constantly reload automatically unless you
did something like clicking Edit and then Read again; if you cleared
your cache, the issue would reappear. I also had an issue where some
thread on the German Wikipedia would not be updated unless you forced a
reload, and the problem would reappear if you happened to clear your
cache and reconnect to your ISP (meaning you get a new IP address).
(This is all for users who are not logged into any account.)
I've tried a couple of times to find something on Bugzilla, but in over
a year and a half or so I could not find much, so either I suck at
searching or these problems have poor visibility.
(I'd post links, but the issues are not quite as reproducible as you
might like; a recent case I noticed was de:WP:K, which some time after
it was locked showed only "Show Source" rather than "Edit", but if you
clicked "Show Source" that would turn into "Edit". That was last week.
I also saw the "pending revisions even though there are none" issue
after that, but I don't recall which page it was on.)
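
One thing that would make such reports easier to act on is capturing
the caching headers at the moment the stale content is served. A quick
Python sketch (assuming Python 3; the header list reflects what
Wikimedia's Squid layer typically sends, which is an assumption on my
part):

# Dump cache-related response headers for a URL, as evidence to
# attach to a caching bug report. The header names are assumptions
# about what the cache layer sets; adjust as needed.
import urllib.request

def dump_cache_headers(url):
    req = urllib.request.Request(
        url, headers={"User-Agent": "cache-debug/0.1"})  # placeholder UA
    with urllib.request.urlopen(req) as resp:
        for name in ("Date", "Age", "X-Cache", "Last-Modified", "Vary"):
            print("%s: %s" % (name, resp.headers.get(name)))

dump_cache_headers("https://de.wikipedia.org/wiki/WP:K")

A large Age value on a page that should have been purged is exactly the
sort of detail that helps narrow these down.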
regards,
--
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/
Facebook have announced that a JIT compiler for PHP called HipHop
Virtual Machine (hhvm) is now at a fairly advanced stage of
development, and that they expect its performance to eventually exceed
that of the compiled binaries produced with hphpc:
<https://www.facebook.com/notes/facebook-engineering/the-hiphop-virtual-mach…>
We had planned on compiling MediaWiki with hphpc, with project
completion and deployment to Wikimedia around April 2012, at least for
parsing. But the idea of a slow compile/test cycle (a few
processor-hours) was daunting. Slow compilation is a problem in
practice for Facebook, and one of the main reasons for this new hhvm
project.
So we've decided to defer our HipHop deployment until hhvm is at a
suitable level of maturity. We don't know exactly when that will be,
but Jason Evans says in the note linked above that "the first 90% is
done; now we're on to the second 90% as we make it really shine."
We still want to do something about parser performance in the first
half of 2012, so we're going to bring forward our other performance
project: server-side scripting embedded in wikitext. That project is
still at an early stage of planning. We will need to
define its scope, and to bite the bullet and make some tough design
choices (such as Lua versus JavaScript), if it's going to progress
from pipe dream to reality.
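
To make the idea concrete: no language or syntax has been decided, but
the general shape is a parser hook that passes arguments from the page
to a sandboxed server-side module and splices the result back into the
output. A toy Python sketch, where the {{#script:...}} marker and the
module name are invented purely for illustration:

# Toy illustration only: no wikitext scripting syntax has been
# chosen. A parser hook hands arguments from the page to a
# registered server-side module and substitutes the result.
import re

# Hypothetical registry of script modules (names invented here).
MODULES = {
    "format_citation": lambda author, year: "%s (%s)" % (author, year),
}

def expand_scripts(wikitext):
    """Replace {{#script:name|args...}} markers with module output."""
    def run(match):
        name, _, args = match.group(1).partition("|")
        if name not in MODULES:
            return match.group(0)  # leave unknown calls untouched
        return MODULES[name](*args.split("|"))
    return re.sub(r"\{\{#script:([^}]*)\}\}", run, wikitext)

print(expand_scripts("See {{#script:format_citation|Knuth|1984}}."))
# -> See Knuth (1984).

The real design questions (sandboxing, resource limits, Lua versus
JavaScript) are exactly what still has to be worked out.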
An extension specific to citations (like TemplateAdventures) would be
an alternative if scripting proves to be too hard.
Deferring HipHop will affect our project priorities in other ways:
* We kicked around the idea of splitting out parsing into a separate
cluster of servers, accessible via an HTTP API. The main motivation
for doing that is gone now (it would have allowed us to use HipHop for
parsing and Zend for everything else).
* We shelved a project plan for better source file distribution (a
scap replacement), on the basis that it would be obsolete with HipHop.
And our system for having different versions of MediaWiki on different
wikis (HetDeploy) was implemented in a quick and dirty way, for the
same reason. These two projects may have to be revisited.
-- Tim Starling
Greetings all,
The Mobile and Special Projects department is pleased to announce the
addition of two new contractors to the team: Yuvaraj Pandian and Max
Semenik.
Yuvaraj Pandian, or Yuvi as we've gotten to know him, was one of our
2011 Google Summer of Code students. During GSoC he worked with Arthur
Richards on porting the Wikipedia 1.0 bot to a PHP extension named
Selection Sifter. This was an important offline project as it allowed
us to better support and extend our offline tool chains. Previous to
this he had been active in the busroutes.in community where users
@logic and @planemad motivated him to come on and help the Wikimedia
community. One of his early Wikimedia project memories includes the
tawiki community discussing the ShortUrl extension. He found the
entire channel so lively and helpful that he applied for GSoC.
Yuvi will be working for us remotely while on a six month sabbatical
from KCG College of Technology, India. He'll be extending the Android
app, enhancing MobileFrontend, and working with our various offline
efforts to further the reach of the projects. You can find him on IRC
under the nick 'yuvipanda'.
Max Semenik has been an active MediaWiki contributor since 2009. He's
recently been most involved in developing the API Sandbox for MediaWiki,
an extension that allows any user to easily create complex API queries
through a visual interface. When not hacking on MediaWiki,
Max has been busy developing C++ software for embedded systems in the
railroad industry. While train switching has captured the majority of
his recent developer cycles, he's super eager to change direction and
focus his full-time efforts on MediaWiki.
Max will be continuing his interest in the API by helping us improve
and extend its use within mobile. He'll be adding a proper API to
MobileFrontend, extending support for GPS, and in general moving the
API forward for our mobile clients. Max will be working for us
remotely and can be found under the nick 'MaxSem'.
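
For those who haven't used the API much, the kind of request involved
looks like this; a minimal Python sketch against the standard api.php
endpoint (assuming Python 3; the User-Agent string is a placeholder):

# Minimal sketch of a MediaWiki API query, the kind of request the
# API Sandbox helps users compose through its visual interface.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "action": "query",
    "prop": "info",
    "titles": "Main Page",
    "format": "json",
})
req = urllib.request.Request(
    "https://en.wikipedia.org/w/api.php?" + params,
    headers={"User-Agent": "api-example/0.1"},  # placeholder UA
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)
print(data["query"]["pages"])

A proper mobile-facing API means clients can fetch exactly this kind of
structured data instead of scraping rendered pages.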
It's great to see two active community members join us full time.
Please join me in (re)-welcoming Yuvi and Max!
--tomasz
Last week Robla and I worked on incorporating the Code Review goals
into his CRStats page. You can see the updated chart with
the goal trend line here: http://toolserver.org/~robla/crstats/
Here's hoping we can continue to get the actual data to meet the planned
goals.
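
For the curious, a chart like that is just an actual series plotted
against a goal trend line. A toy matplotlib sketch (every number below
is invented for illustration, not real CR data):

# Toy "actual vs. goal" chart in the spirit of the CRStats page.
# All data points here are invented for illustration.
import matplotlib.pyplot as plt

weeks = list(range(6))
actual = [320, 310, 315, 290, 270, 260]          # hypothetical backlog
goal = [320 * (1 - w / 9.0) for w in range(10)]  # straight line to zero

plt.plot(weeks, actual, marker="o", label="actual unreviewed revisions")
plt.plot(range(10), goal, linestyle="--", label="goal")
plt.xlabel("week")
plt.ylabel("unreviewed revisions")
plt.legend()
plt.show()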
Mark.