I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
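To illustrate the class of bug, here is a minimal Python sketch of the token check such a fix enforces. This is not MediaWiki's actual implementation; the function names and token scheme are illustrative only:

```python
import hashlib
import hmac
import secrets

SESSION_SECRET = secrets.token_bytes(32)  # per-session secret (illustrative)

def csrf_token(user):
    # Derive a per-user token from the session secret.
    return hmac.new(SESSION_SECRET, user.encode(), hashlib.sha256).hexdigest()

def api_block(user, target, token):
    # A state-changing module must reject requests without a valid token.
    # A forged cross-site request cannot read the token, so it fails here.
    if token is None or not hmac.compare_digest(token, csrf_token(user)):
        return "error: missing or bad token"
    return "blocked " + target

print(api_block("admin", "vandal", None))                 # forged: rejected
print(api_block("admin", "vandal", csrf_token("admin")))  # legitimate
```

The point of the fix is exactly this: without the token requirement, any external site could trigger the block/unblock action on behalf of a logged-in admin.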
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of
CSRF protection tokens. This allows compromise of the wiki's user accounts,
say by changing the user's email address and then requesting a password
reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
further compromise, especially on private wikis where the set of allowed
file types is broader than on public wikis. Note that CSRF allows compromise
of a wiki from an external website even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. Instead we use various
more secure random number generators, depending on what is available on the
platform. Windows users are strongly advised to install either the openssl
extension or the mcrypt extension for PHP so that MediaWiki can take
advantage of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
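The underlying principle can be illustrated in Python (MWCryptRand itself is PHP; this sketch uses Python's standard library as a stand-in, and the variable names are purely illustrative):

```python
import random
import secrets

# A Mersenne Twister PRNG (what PHP's mt_rand also uses) is predictable:
# after observing enough outputs, an attacker can reconstruct the internal
# state and forecast future "random" values such as password-reset tokens.
predictable = ''.join('%08x' % random.getrandbits(32) for _ in range(4))

# A CSPRNG draws from the OS entropy source (os.urandom, which uses the
# Windows cryptographic facility on that platform), which is the kind of
# source MWCryptRand wraps on the PHP side.
safe_token = secrets.token_hex(16)  # 32 hex characters

print(len(predictable), len(safe_token))  # → 32 32
```

Both tokens look equally random to the eye; the difference is entirely in whether the generator's future output can be predicted from its past output.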
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible with
other extensions, or perhaps even with the MediaWiki core alone, although
this is not confirmed at this time. A denial-of-service attack (infinite
loop) is also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
*********************************************************************
What's new?
*********************************************************************
MediaWiki 1.19 brings the usual host of bugfixes and new features.
A comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disabled the partial HTML and MathML rendering options for Math,
and render as PNG by default.
* MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy
piles of generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the
user link.
* Searching for blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed even when the interwiki
cache is in use (used in the API and the Interwiki extension).
Internationalization
--------------------
* More gender support (for instance in user lists).
* Added language: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Release notes
-------------
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the relevant
commits for these releases will not be appearing in our SVN repository.
If you use SVN checkouts of MediaWiki for version control, you need to
migrate these to Git. If you are using tarballs, there should be no change
in the process for you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may
not work immediately, but they should later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list
at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz
Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
I normally don't chime in on these threads, but I feel compelled.
Rob made a significant contribution to VisualEditor and this change is a
well deserved nod to his growth as an engineer as well as a step in the
right direction to spread knowledge around the organization.
Congrats on the new position, man.
- Trevor
On Thu, May 8, 2014 at 10:24 AM, Moriel Schottlender <
mschottlender(a)wikimedia.org> wrote:
> Woohoo, congrats Rob!!
> +1 to James and Roan's sentiments, and best of luck on the growth team!
>
>
> On Thu, May 8, 2014 at 7:28 AM, James Forrester <jforrester(a)wikimedia.org>wrote:
>
>> On Wednesday, May 7, 2014, Terry Chay <tchay(a)wikimedia.org> wrote:
>>
>>> Hello everyone,
>>>
>>> I’m pleased to announce Rob Moen is moving from the VisualEditor team to
>>> the Growth team.
>>>
>>
>> Rob,
>>
>> It's been a pleasure working with you. Thank you for everything;
>> using VisualEditor is so much better because of your work. Best wishes with
>> Growth – they're lucky to get you. :-)
>>
>> J.
>>
>>
>> --
>> James D. Forrester
>> Product Manager, VisualEditor
>> Wikimedia Foundation, Inc.
>>
>> jforrester(a)wikimedia.org | @jdforrester
>>
>> _______________________________________________
>> Engineering mailing list
>> Engineering(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/engineering
>>
>>
>
> _______________________________________________
> Wmfall mailing list
> Wmfall(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wmfall
>
>
We've recently used OOUI for Media Viewer and had to make an effort to
"keep it in its own corner" due to its size. At the moment it's loaded on
demand when you click on a button that opens the dialog where all our
OOUI-dependent UI elements happen to be.
I was wondering if there was any plan to make OOUI more modular. Or if
there's somehow an option like that which we entirely missed. It's already
relatively large (41kb gzipped on beta right now) and likely to keep
growing. While I imagine that VE needs all of it, if the goal is for it to
be reused across all projects, it seems to me like the cold-cache
experience could be improved if it were available in separate parts that
can be loaded on demand.
Is something like that in the works? If not, what's the reasoning for
keeping it all under one JS file in production?
Maybe it will become so ubiquitous in MediaWiki code that at some point in
the future it will make sense to load all of it as part of every page. But
to me it seems like OOUI is at a turning point at the moment, where teams
like ours are judging whether or not they should use it for their project.
Which creates an entirely different situation, because project X has to
live with the consequences of likely being the way through which users
will load OOUI for the first time, just because it has more traffic than
the existing places where OOUI is currently deployed. And you don't want
to be the project that people feel is slow to open the first time they
use it, due to a dependency that is quite large compared to the small
portion that is really being used in the context of that project.
With that in mind, OOUI modularity is an important factor, in my opinion,
in the decision we're soon going to have to make as a team regarding
UploadWizard.
Hi all,
Some of you might have heard of it, some of you probably know it or even
regularly use it: Huggle is a super-fast diff browser for MediaWiki,
intended for dealing with vandalism on Wikimedia projects (but it can
be used for any installation), written in C++.
It is being used on a number of Wikimedia projects, and it helps to
revert hundreds of instances of vandalism every day; you can check
this chart for an overview:
http://tools.wmflabs.org/huggle/toolstat/daily.php
It may not look like it, but Huggle is very effective: compared to
Twinkle or other tools which are used by thousands of users, Huggle is
typically used by fewer than ten users daily on English Wikipedia, yet
it reverts almost the same amount of vandalism as Twinkle and such (some
days even more). That makes its users hundreds of times more effective
than users of other tools.
Huggle is currently being developed primarily by me, Adam Shorland
(addshore), and a few other devs:
https://github.com/huggle/huggle3-qt-lx/graphs/contributors
If any of you are interested in helping us, want to contribute, or just
want to find out more, Adam and I will be at the Hackathon, so don't
hesitate to contact us there! Huggle 3 is not just about C++: it contains
an embedded Python interpreter and can be extended with Python and C++/C
extensions.
Wikimedia Hackathon participants, please have a look at
https://www.mediawiki.org/wiki/Zürich_Hackathon_2014/Schedule
It is an almost empty grid with very few exceptions. It is based on the
idea of beginning the day with 1h slots for planning, and then
defaulting to 1h30 slots, following the advice of many participants in
Amsterdam last year.
There are some basic principles proposed, everything debatable. If you
have special requirements for your session (e.g. local or remote guests
that need to know the time in advance), then you can start discussing them
on the Talk page. The idea is to do most of the scheduling on Friday
morning, then fine-tune as we go.
On Friday morning we will use the /Topics page to organize the scheduling,
starting with the topics that have raised the most interest:
https://www.mediawiki.org/wiki/Zürich_Hackathon_2014/Topics
You are welcome to create wiki subpages and/or etherpads (
https://etherpad.wikimedia.org/ ) for your activities. We will have a
process for session coordinators to create hangouts from the MediaWiki
Google+ page, in order to have low-tech instant streaming and videos
archived, all from your laptops.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hey all,
Andrew Bogott and I will be travelling to Zürich for the Hackathon[1] in
the coming days; this means limited online availability during actual
travel time, but vastly increased physical availability in Zürich
itself. :-)
I will be available for help in porting tools to the Tool Labs, helping
maintainers debug or improve their tools, or for any other labs-related
task for the duration of the Hackathon. Don't hesitate to drop by and
say hi, even if you don't need my help with anything!
After the Hackathon, I'm taking a few days' worth of vacation, and will
return to full availability starting May 16th.
-- Marc
[1] https://www.mediawiki.org/wiki/Zürich_Hackathon_2014
Minutes and slides from the quarterly review meeting with the
Wikimedia Foundation's Release Engineering and QA team on April 30
have been posted here:
https://www.mediawiki.org/wiki/Wikimedia_Release_and_QA_Team/Quarterly_revi…
--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB
Copying from #wikimedia-dev IRC (sorry) and cc'ing wikitech-l as requested:
[James_F] cscott: I think you're confused about fab.wmflabs.org. It's
not meant for use. It's meant for evaluation.
[cscott] James_F: and i'm saying that unless i can use it for
something i can't easily evaluate it
[greg-g] cscott: James_F crazy idea here: can some teams use it for
real (I think growth is, kinda?) and export/import to a future real
instance?
[cscott] greg-g: i think that's more or less the plan, but it doesn't
really mean that the results will be applicable to the groups who
*aren't* using it.
that is, i could, for example, use the existing instance for work on
the PDF backend, which is a 1-2 person project. but that doesn't mean
that the results apply to how the parsoid team works.
[greg-g] cscott: of course, but it can't be on anyone else other than
those people to try it and report bugs, no one can impersonate you
effectively other than you.
[cscott] greg-g: i totally agree, which is why i'm arguing for a soft
transition.
[greg-g] cscott: I think it's mostly semantics at this point,
honestly. The difference between a soft transition and the RFC closing
with "Yes, pending those blockers are addressed" are the same thing,
especially since Platform (the usual suspects saddled with this kind
of stuff) won't have time to actually do anything production-like any
time soon with Phab.
[James_F] greg-g, cscott: I think that ultimately if we can't work out
how we use gerrit and boil it down to 10 bullet points we've failed.
It's a simple tool.
[cscott] greg-g: what i'm saying is i don't think it's worth closing
the RFC with a "yes, but" since those bugs are all show-stoppers to
Real Work right now.
[James_F] greg-g: A "soft transition" means "go make a production
Phabricator instance".
Which is pretty bad if we never use it properly, and don't shut down
other systems.
[gwicke] we could agree to write a new RFC after actually trying it
;)
[YuviPanda] I tried setting our fab.wmflabs.org instance up for doing
CR with its own hosted repositories, so we could put a few small
projects there
spent 3-4 hours, and got close before giving up.
I can add other people to the project if they want to give it a shot
:) Note that none of this is puppetized so need to be slightly extra
careful
[cscott] James_F: if you want to phrase it that way, a "yes but"
means, "we'll commit to transitioning to phabricator without ever
having seen an instance which will actually work for us"
[greg-g] cscott: how is that different from any development problem/goal?
[cscott] James_F: i think all the extremes are bad. i don't think we
should give up on phabricator. and i think it's too early to
definitively commit to it. and i think we shouldn't spent 100% of the
resources to make a production instance before making a decision, and
I don't think we should make the decision without spending *any* of
the resources.
i don't think we can do a transition without some sort of integration
of new and old systems, and i also agree that maintaining the old and
new systems together indefinitely defeats the point.
[gwicke] fwiw, I tend to agree with cscott that it's hard to make any
informed decision without actually testing it in practice
[James_F] cscott: You're throwing around dramatic terms like
"blockers" without explaining what you mean. Again, please take this
to wikitech-l and let's have a proper discussion.
[cscott] James_F: I'm arguing for a middle path. devote *some*
resources, implement *some* interoperability, decide at *some later*
point when we have a more functional instance.
--
[http://cscott.net]
Hi,
== Closing the Phabricator RFC ==
As previously announced [1], we've been facilitating an RFC proposing to
replace Wikimedia's current product management tools and development
toolchain by a tool called Phabricator:
https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator
We'd like to thank everybody for discussing and testing Phabricator and
providing very helpful feedback for the last three weeks! In order to
move forward and avoid letting the RFC be forgotten in a dusty corner of
the wiki, it's now time to close and summarize it.
The goal of the RFC was to gauge interest in simplifying our development
toolchain and consolidating our tools (gitblit, Gerrit, Jenkins,
Bugzilla, RT, Trello, and Mingle) into Phabricator.
At first glance, it seems that there is support for this proposal. The
consensus is also that there are blockers that must be addressed before
any migration is considered, and that any migration must be carefully
planned and as carefully executed.
To be clear: It's not yet been decided to move to Phabricator. The RFC
has shown that there is interest and enthusiasm about Phabricator, and
this means resources could now be devoted to work more specifically on
the blockers and the migration plan. We expect that there will be
another (shorter) discussion down the road to serve as a reality check.
Its format will be lighter than that of the RFC, since its goal will
mostly be to check that blockers have been resolved and the migration
plan makes sense.
== Plan for blockers and migration ==
A first phase of the migration would focus on migrating all the Bugzilla
data to Phabricator, and merging the project management work being done
in Trello and Mingle.
A second phase (which could be worked on in parallel) would focus on
replacing Gerrit for code review, as well as RT. There is also the
possibility of deprecating Jenkins as a continuous integration tool, but
this option is out of scope for now.
A few blockers have been identified in these areas, and we will
collaborate with the Phabricator community to fix them.
The schedule for this migration depends on resolving those issues which
are blockers for Wikimedia moving to Phabricator.
The Engineering Platform team at the Wikimedia Foundation would lead
this project, allocating the resources necessary to define a detailed
plan, proceed with the migration, and maintain the new infrastructure.
A longer version, the requirements still to be sorted out, and the
concerns raised have been summarized by Quim at
https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator/Plan
== Join the next discussions ==
There will be a session about Phabricator at the Wikimedia hackathon in
Zürich this weekend (see [2]), as well as another IRC discussion next
week (in #wikimedia-office on Wednesday, May 14, at 18:00 UTC:
http://www.timeanddate.com/worldclock/fixedtime.html?msg=Phabricator+migrat… )
Cheers,
Guillaume and Andre (and Quim)
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2014-April/075993.html
[2] https://www.mediawiki.org/wiki/Z%C3%BCrich_Hackathon_2014/Topics#Future_of_…
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/