Hi,
For Wikimedia Incubator, a list of ISO 639-1/3 language codes would be
useful. I generated a PHP array based on
<http://sil.org/iso639-3/iso-639-3_20110525.tab>, and I wanted to turn
that into an extension (which would be enabled on Incubator). FYI: the
array contains 7706 ISO 639-3 codes in addition to the ISO 639-1 codes.
Now, such a list already ships with the Babel extension, and I don't
like duplication, so I was thinking: maybe it could be added to core?
Besides these two extensions, it could provide language names for
{{#language}} which only works for languages known in MediaWiki's
Names.php (unless CLDR extension is enabled and the second parameter
is used).
Actually, CLDR itself is incomplete, and our CLDR extension has extra
language names that we added ourselves (which makes a third
extension that could benefit from this).
And maybe there are more use cases...
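The idea above amounts to a simple code-to-name map. A minimal sketch (illustrative Python, not the actual PHP array; the handful of codes below are well-known entries, not the generated 7706-entry list):

```python
# Illustrative sketch only: a tiny subset of a code -> language-name map.
# The real generated array has 7706 ISO 639-3 entries and lives in a PHP file.
ISO_639 = {
    "en": "English",   # ISO 639-1
    "eng": "English",  # ISO 639-3
    "fr": "French",
    "fra": "French",
    "nv": "Navajo",
    "nav": "Navajo",
}

def language_name(code, default=None):
    # The kind of lookup {{#language}} could fall back to when a code is
    # missing from MediaWiki's Names.php.
    return ISO_639.get(code.lower(), default)
```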
What do you think?
Regards,
SPQRobin
Roan sent out a new set of HTTPS fixes today, which made us confident
enough to enable protocol-relative URLs and HTTPS on commonswiki and
foundationwiki. We haven't purged the cache yet for these wikis, so
it's very likely some pages will point you back to HTTP. We'll be
purging caches some time soon, but please don't hesitate to try it
now. Please file bug reports or let Roan or me know of any issues you
find.
Note: there is likely a bunch of site CSS, JS, and templates that will
need to be changed to use protocol-relative URLs everywhere. HTTPS has
a massive long tail :). If you feel like helping out with that, please
be bold.
Another *important* note: "Log me in globally" is still actually
insecure, even when using HTTPS. It loads the images from each wiki
over HTTP, which is what sets your cookies (which are then also sent
over HTTP). If you use this option, people can still steal your
cookies; they cannot, however, steal your password.
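For anyone cleaning up site CSS/JS/templates, the change is mechanical: drop the scheme so the URL inherits the page's protocol. A rough sketch (illustrative Python, function name is mine; a blanket regex like this is a starting point, not a safe batch edit):

```python
import re

def to_protocol_relative(text):
    # Rewrite absolute http:// or https:// references as protocol-relative
    # "//" so they inherit the protocol of the page that loads them.
    # Illustrative only: real site CSS/JS cleanup needs case-by-case review,
    # since some consumers still require an explicit scheme.
    return re.sub(r"https?://", "//", text)

css = "background: url(http://upload.wikimedia.org/logo.png);"
# to_protocol_relative(css) -> "background: url(//upload.wikimedia.org/logo.png);"
```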
- Ryan
Hi everyone,
Thank you to everyone involved for getting the review queue down as low
as it is. As it stands, we have 82 new revisions to review, and 57
fixmes:
http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.18/Revision_report
Back on August 18, we had 171 new revisions to review and 59 fixmes.
If you start there and plot linearly to release, that means reviewing
7-8 new revisions per day, and fixing an average of 2 fixmes per day,
to just barely be done on September 16. Needless to say, we're doing
great on the new revisions, but the fixmes are a bit of a problem.
Even accounting for the fact that fixmes are being added as a result
of the rapid rate of new revision fixing, we're still not fixing the
fixmes fast enough. Between August 18 and now, we've netted a
reduction of only 2 fixmes, by fixing 9 and adding 7 more. That's
fixing the fixmes at a rate of only 1.5 a day, which won't get us
there even if we don't add any more fixmes. Given that reviewing 89 revisions
yielded 7 new fixmes, it's reasonable to expect that 5-10% of the
remaining new revisions will become new fixmes, which will add 4-8 new
fixmes before we're done.
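The arithmetic above is a plain burn-down calculation. A sketch (the six-day window, August 18 to now, is my assumption; it is what makes 9 fixes come out to the quoted 1.5 per day):

```python
def burn_down(start_count, fixed, added, days):
    # Gross fix rate, net rate after new arrivals, and items remaining.
    remaining = start_count - fixed + added
    return fixed / days, (fixed - added) / days, remaining

# Fixme numbers from above: 59 on August 18, 9 fixed and 7 added since then.
gross_rate, net_rate, remaining = burn_down(59, 9, 7, 6)
# gross_rate = 1.5 fixed per day, net_rate = 1/3 per day, remaining = 57
```

At that net rate the fixme count barely moves, which is the whole point of the mail.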
So, please take a look at the fixme list (especially if you are one of
the committers involved) and do what you can to reduce our load.
Thanks!
Rob
Hi everyone,
I’m extremely pleased to welcome Aaron Schulz to Wikimedia Foundation
as a full-time developer in Platform Engineering. Aaron is a
long-time MediaWiki developer, starting as a volunteer in 2007. He
quickly proved adept at working with the FlaggedRevs extension, so WMF
hired him as a student contractor to take on its development, and he
is still the primary developer of the FlaggedRevs extension, in use on
many of our larger wikis today. More recently, he volunteered a lot
of the initial work on IPv6 compliance, and has made many small fixes
and improvements in the MediaWiki core. Aaron has been so active in
MediaWiki in the past few years that 5,527 of the 95,732 revisions in
our main source code repository have his name on them.[1]
This summer, Aaron continued his student contractor work on our
software deployment infrastructure, working on the Heterogeneous
Deployment project, which we plan to use to roll out MediaWiki 1.18 in
a controlled manner in September. He's one of many developers who is
slogging away at reviewing code commits in anticipation of 1.18, and
will generally be working on operational and performance-oriented
projects.
Welcome, Aaron!
Rob
[1] See http://www.mediawiki.org/wiki/Special:Code/MediaWiki/author/aaron
It looks like we've finally moved ahead of the curve with regard to
FIXMEs. Today, Sam Reedy (and others) plowed through some of them and
brought us up to where we need to be in order to deploy 1.18 in 3
weeks. We still need to keep up the pace, but we're in a better
position now.
There is one more area where we need work done, though, for the 1.18
deployment, and that is Bug #29068 ("Bugs to be fixed for 1.18 WMF
deployment"): https://bugzilla.wikimedia.org/29068
In order to close this tracking bug we need to fix the following:
https://bugzilla.wikimedia.org/29246 -- API errors occasionally with
unknown error 231
https://bugzilla.wikimedia.org/30192 -- Thumbnails of archived images
don't get deleted
https://bugzilla.wikimedia.org/30352 -- jQuery.makeCollapsible.js should
support 'autocollapse', 'innercollapse' and 'outercollapse' options
https://bugzilla.wikimedia.org/30384 -- extra newlines in nested
templates in tables [parser difference between 1.17 and 1.18]
Thanks for any help you can give on these bugs,
Mark.
Hi Jeroen,
I have seen that you released new versions of Maps/Semantic Maps, great,
thanks!
Could you please give me some information about the current status of KML
visualisation in Maps/Semantic Maps?
With the current version, I haven't been able to visualise KML files,
although they are mentioned at [1].
[1] http://mapping.referata.com/wiki/Google_Maps_v3
For 7.x I had added support for automatic centering and external/internal
KML files. Do they still work?
It would be great if you could give me some pointers. Or maybe I can
help you improve the KML support.
Best,
Benedikt
--
AIFB, Karlsruhe Institute of Technology (KIT)
Phone: +49 721 608-47946
Email: benedikt.kaempgen(a)kit.edu
Web: http://www.aifb.kit.edu/web/Hauptseite/en
I'd like an accurate, visually pleasing way to count how many unreviewed
revisions there are in trunk, for when I'm encouraging volunteer code
review. TL;DR version: it would be great if someone fixed up RobLa's
chart, corrected its errors, and put a chart generator in the CodeReview
MediaWiki extension.
The JavaScript behind RobLa's CRStats chart
http://toolserver.org/~robla/crstats/ gives me some current numbers. By
inspecting
http://toolserver.org/~robla/crstats/data/trunkall/crstatsdata.js I
believe I see we have 279 revisions left to review in trunk.
But the CodeReview statistics page at
https://secure.wikimedia.org/wikipedia/mediawiki/wiki/Special:Code/MediaWik…
mentions that there are 344 NEW revisions in /trunk . That's more than
the 279 that Rob's script counts; which should I believe?
Rob believes the problem is with his script. He has checked to see
whether the problem is that his report, run every night at midnight UTC,
falls out of sync with the CodeReview statistics page; that's not it.
Rob says:
> Bummer, it looks like my script.... I probably won't get
> around to fixing up the script, but I've got a couple ideas about how
> to fix it...
> the details for those who care about what it would take to fix my
> script or just implement it correctly in PHP in MediaWiki. What
> throws my script off is that I'm calculating the revision history
> starting way back in history and moving forward, piecing together the
> history of code review from our revision log plus status changes. It
> used to be safe to assume that all revisions started life as "new",
> and then use the status changes to step each revision through the rest
> of the workflow. However, one new-ish feature of the CodeReview
> extension is the ability to automatically set the start state to
> "deferred" based on the path of the checkin, which means those changes
> never show up in the status change log. My initial hack was to step
> through twice: first time to determine the initial state (assuming
> that if initial state != final state, then final state was the actual
> initial state). A more correct approach is to just start at the end
> (where we have a correct accounting of state) and reconstruct states
> backwards.
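Rob's backward approach is straightforward to sketch (illustrative Python, not the actual toolserver script; names are mine): start from the known final state of every revision and replay the status-change log in reverse, so each change's "old" value becomes the state before it. Revisions that never appear in the log keep their final state, which correctly covers the auto-"deferred" ones.

```python
def reconstruct_initial_states(final_states, change_log):
    # final_states: {rev_id: status} as CodeReview reports them today.
    # change_log: [(rev_id, old_status, new_status)] in chronological order.
    #
    # Walking the log backwards, the state before each change is its
    # old_status; revisions absent from the log (e.g. auto-"deferred"
    # ones) keep their final state as their initial state.
    states = dict(final_states)
    for rev_id, old_status, _new_status in reversed(change_log):
        states[rev_id] = old_status
    return states

# Rev 1 went new -> fixme -> ok; rev 2 was auto-deferred and never logged.
initial = reconstruct_initial_states(
    {1: "ok", 2: "deferred"},
    [(1, "new", "fixme"), (1, "fixme", "ok")],
)
# initial == {1: "new", 2: "deferred"}
```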
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
What: i18n bug triage
When: Wednesday, August 31, 15:00 UTC
Time zone conversion: http://hexm.de/6h
Where: #wikimedia-dev on freenode
Use http://webchat.freenode.net/ if you don't have an IRC
client
Siebrand and Niklas and I spent time yesterday to prepare an i18n triage
for Wednesday. The following bugs will be discussed. I would
specifically call for any Arabic-knowledgeable developers to come to
this triage for #21429 ("Arabic double diacritics presentation"). If
you know someone who could help us with this bug, please invite them to
come to the triage or, at least, look at #21429.
https://bugzilla.wikimedia.org/164 -- Support collation by a certain
locale (sorting order of characters)
https://bugzilla.wikimedia.org/24156 -- Messages of log entries should
support GENDER
https://bugzilla.wikimedia.org/2361 -- Support dynamic fonts (CSS 3
@font-face, ttf/otf/eot, web fonts, WOFF) (done?)
https://bugzilla.wikimedia.org/29000 -- Allow font selection by language
https://bugzilla.wikimedia.org/29005 -- Unnecessary unicode Char code
change
https://bugzilla.wikimedia.org/29318 -- Pack the list of available fonts
https://bugzilla.wikimedia.org/4030 -- EasyTimeline reversed text in RTL
languages
https://bugzilla.wikimedia.org/29495 -- Numbering system grouping for
Indian languages
https://bugzilla.wikimedia.org/21429 -- Arabic double diacritics
presentation
I hope to see a lot of you there!
Mark.
--
Mark A. Hershberger
Bugmeister
Wikimedia Foundation
mhershberger(a)wikimedia.org
717.271.1084
In my spare time I have been coding a MediaWiki extension to generate
quizzes automatically from Wikipedia content.
I would be *very* interested in your feedback, so I have put a basic
prototype online, available at http://wikilearner.net/wiki
It's rough around the edges, but I hope you get the basic idea.
To be quizzed on content of your choice, simply create a new article
on the wiki, copy-pasting your favorite Wikipedia content. I suggest
you remove <ref> tags, templates, and the like, which are not yet
supported on my wiki.
Cheers,
Justin