Right now we are about to implement some kind of basic template engine
to share HTML between the server side and the client side in
JavaScript. Basically this will be a bunch of HTML snippets which we
will put into a ResourceLoader module and send to the client. The
snippets will need some kind of placeholder which can then be replaced
with different content. The replacement would be done by a basic
parser which would have to be implemented in both PHP and JS.
One thought was to simply use the MW message system for this. The
templates would of course get their own store, but the message parser
could perhaps be reused. $1 etc. could be used as placeholders and
even nice-to-haves such as PLURAL or {{int:}} would work out of the
box.
From a first look, it could be as easy as overriding
Message::fetchMessage in a subclass.
Of course, the JavaScript side would have to be taken care of as well;
mw.Message doesn't seem like it would be a problem.
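For illustration, the client-side replacement could be as simple as the
following sketch. The function name and behavior here are hypothetical,
not the actual mw.Message API — just a minimal version of $1-style
parameter substitution:

```javascript
// Minimal sketch of $1-style placeholder replacement on the client.
// "replaceParams" is an illustrative name, not a MediaWiki function.
function replaceParams( template, params ) {
	// Replace $1, $2, ... with the corresponding parameter;
	// leave placeholders without a matching parameter untouched.
	return template.replace( /\$(\d+)/g, function ( match, index ) {
		var value = params[ index - 1 ];
		return value !== undefined ? value : match;
	} );
}

console.log( replaceParams( '<li class="item">$1 ($2)</li>', [ 'Berlin', 'DE' ] ) );
// → <li class="item">Berlin (DE)</li>
```

Anything beyond plain substitution (PLURAL, {{int:}}) is exactly what
reusing the existing message parser would buy us for free.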
Any thoughts on this or does anyone know about some similar
implementation in any extensions?
Cheers,
Daniel
--
Daniel Werner
Software Engineer
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hi all,
I'm excited to see that Max has made a lot of great progress in adding Solr
support to the GeoData extension so that we don't have to use MySQL for
spatial search - https://gerrit.wikimedia.org/r/#/c/27610/
GeoData makes use of the Solarium PHP client, which is currently included
as part of the extension. GeoData will be our second use of Solr, after
the TranslationMemory extension which is already deployed -
https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memorie…
The Wikidata team is working on using Solr in their extensions as
well.
TranslationMemory also uses Solarium, a copy of which is also bundled with
and loaded from the extension. For a loading and config example -
https://gerrit.wikimedia.org/r/gitweb?p=operations/mediawiki-config.git;a=b…
I think Solr is the right direction for us to go in. Current efforts can
pave the way for a complete refresh of WMF's article full text search as
well as how our developers approach information retrieval. We just need to
make sure that these efforts are unified, with commonality around the
client api, configuration, indexing (preferably with updates asynchronously
pushed to Solr in near real-time), and schema definition. This is
important from an operational aspect as well, where it would be ideal to
have a single distributed and redundant cluster.
It would be great to see the i18n, mobile tech, wikidata, and any other
interested parties collaborate and agree on a path forward, with a quick
sprint around common code that all can use.
-Asher
Hi folks,
we're experimenting with Hangout on Air + IRC as a meeting technology
that could potentially be used for various WMF gatherings to further
open up our communication both with remote staff and the world at
large. (For those who don't know, Hangout on Air is a nifty new
feature of Google Hangout that lets you broadcast a live YouTube
stream of a meeting you're organizing.)
As a pilot, we're starting what could potentially become a weekly
engineering chat. The first one will be on October 18:
https://www.mediawiki.org/wiki/Meetings/2012-10-18
Anyone will be able to join the meeting via IRC to participate and
watch the YouTube stream. Sign up if you want, or just join the
meeting when it happens.
I'd prefer to avoid bikeshedding at this point about the specific
technologies, their proprietary/evil nature, etc. - we're giving this
combination a try for the first run, and will iterate. The development
of open solutions like Apache OpenMeetings and new standards like
WebRTC will hopefully lead us to a fully open stack eventually. But
this seems like a workable start.
If this works well for the tech meeting, we'll likely also use it for
the next monthly metrics meeting (those have been recorded on video
and can be found here:
https://commons.wikimedia.org/wiki/Category:Wikimedia_Foundation_Metrics_an…
- but participation has only been semi-open due to WebEx limitations.)
Cheers,
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Dear all,
let me say it like this:
WOHOOOO!!!
All of our patchsets have been merged into core. You are awesome!
Thank you so much!
We won't leave you without new work, though. Three points:
First, the merge of the ContentHandler branch is, now that it is being
deployed, revealing issues in several places. If you discover
something, please file it in Bugzilla under the ContentHandler
component. The list of currently open bugs is here, and if you can
help us, this would be great.
<https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&query_format=adva…>
Second, we have created a document describing how data from Wikidata
is being synched (or percolated, or propagated, or moved, or whatever
the word shall be) to the Wikipedias. This is the core technical heart
of Wikidata's inner magic, and we really would benefit from peer
review and scrutiny on this topic. Comment. Suggest ideas. Help us
out. We do not have the one perfect solution here, and it can make
quite an impact.
<https://meta.wikimedia.org/wiki/Wikidata/Notes/Change_propagation>
Third, there is an oldish bug in MW core, filed in May, which we have
a bad feeling about. We expect that it will bite us more often on
Wikidata. The trouble is, there is no fix, and actually, we do not
know what is going on there, and several others have also tried to
crack this nut. If you want a challenge, squash this beast!
<https://bugzilla.wikimedia.org/show_bug.cgi?id=37209>
Enjoy all,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
I've made a modest initial stab at MobileFrontend support for using
ResourceLoader directly, using a 'target' filtering technique that we
discussed with Trevor, Roan, and Timo. This is another step in integrating
MobileFrontend/SkinMobile into the core MediaWiki ecosystem.
Once we're happy with this and merge it, this'll let both core code and
extensions add appropriate JS and CSS by whitelisting their modules for
mobile -- or including a separate mobile module if necessary -- without
having to special-case JS and CSS loading into MobileFrontend.
Core changes: https://gerrit.wikimedia.org/r/#/c/28433/
MobileFrontend: https://gerrit.wikimedia.org/r/#/c/28434/
In mobile beta mode, if jQuery is listed as supported for the device
profile, instead of loading jQuery directly we load up the ResourceLoader
startup module using the 'mobile' target instead of the default ('desktop').
This filters the module list and dependency graph, including only modules
that whitelist 'mobile' in their 'targets' attribute. The startup
module then loads jQuery and the stub 'mediawiki' module for us... as well
as anything that's been specified via OutputPage->addModules(), if it's in
the whitelist.
I haven't yet done much refactoring to make SkinMobile/MobileFrontend use
ResourceLoader fully for its own stuff, except for removing a special case
for the $wgResponsiveImages code that loads high-resolution pictures. By
marking the core JS modules used to implement this with:
'targets' => array( 'desktop', 'mobile' ),
the existing core code that uses OutputPage->addModules() "just works" and
RL loads those modules. Other modules that have not been whitelisted for
mobile don't get included, so we're not randomly loading code that's going
to expect a different environment and just break.
Mobile-specific modules can also specify:
'targets' => 'mobile'
to avoid cluttering up the module list for the desktop, if they're never
going to be used there.
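The actual filtering happens server-side when ResourceLoader builds the
startup module, but the rule itself is simple enough to sketch. This is
an illustrative JavaScript version only, with hypothetical names; the
default of 'desktop' for modules without a 'targets' attribute matches
the behavior described above:

```javascript
// Illustrative sketch of target-based module filtering.
// "filterModulesForTarget" and the registry shape are hypothetical.
function filterModulesForTarget( registry, target ) {
	return Object.keys( registry ).filter( function ( name ) {
		// Modules without an explicit 'targets' default to desktop only.
		var targets = registry[ name ].targets || [ 'desktop' ];
		if ( typeof targets === 'string' ) {
			targets = [ targets ];
		}
		return targets.indexOf( target ) !== -1;
	} );
}

var registry = {
	'mediawiki': { targets: [ 'desktop', 'mobile' ] },
	'mobile.only': { targets: 'mobile' },
	'desktop.gadget': {} // no 'targets': desktop only
};

console.log( filterModulesForTarget( registry, 'mobile' ) );
// → [ 'mediawiki', 'mobile.only' ]
```

Modules not whitelisted for the requested target simply never enter the
startup module's registry, so the client can't even try to load them.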
Currently this adds one HTTP request to MobileFrontend output (replacing
one load of jquery with two loads of [RL startup, jquery+mediawiki]), but
we should be able to cut that back down by loading more mobile modules via
RL.
Please feel free to test, give feedback, and/or demand the whole thing be
rewritten (but only with good reason ;).
-- brion
Hey all,
Unfortunately the combination of using commit summaries in release notes
and the mass-merger of Wikidata code means that we now have several
hundred somewhat cryptic commit summaries in the release notes [1].
IMHO it might be a good idea for everyone to get into the habit of using
clear commit summaries regardless of whether or not they're directly
committing to core/master, if not overly time-consuming. This would make it
a lot easier for people like me who wade through the automated release notes.
Thanks!
Harry
--
Harry Burt (User:Jarry1250)
[1] https://www.mediawiki.org/w/index.php?title=MediaWiki_1.21/wmf2
Hello everyone,
You're invited to the IRC office hours with the Language Engineering team[1] at
the Wikimedia Foundation.
Date: 2012-10-17
Time: 16:30 UTC
Venue: #wikimedia-office
Agenda:
1. ULS & Project Milkshake updates.
2. Translation UI/UX design findings
3. Q & A
For more logistical info and time conversion links check the page on
Meta[2]. Thanks, and we hope to talk to you soon!
[1] http://wikimediafoundation.org/wiki/Language_Engineering_team
[2] https://meta.wikimedia.org/wiki/IRC_office_hours
--
Srikanth L
Wikimedia Language Engineering Team
(Re-sending to list)
On 10/17/2012 05:52 AM, Antoine Musso wrote:
> Le 16/10/12 23:42, Quim Gil a écrit :
>> Hi, what about having Wikimedia featured as an organization in Ohloh?
>>
>> The proposal is interesting considering the current state of things:
>> MediaWiki seems to be stalled with the SVN to Git migration, and it is
>> close to impossible to find out what other projects come from this community.
>>
>> This would help our quest on community metrics, so here goes my humble
>> +1. I also volunteer with some work, basically following the steps of
>> https://github.com/wikimedia and pinging Sumana / here for anything else.
>
> Brion, Siebrand and I are admins of the MediaWiki project on ohloh.net.
Good to know. :) Do you want to contact Rich Sands from Ohloh? Or do you
want me to contact him, CCing you...?
> Seems Ohloh has trouble fetching from our git repositories :-/
http://code.ohloh.net/project?pid=&ipid=306442 shows
Code Location: https://gerrit.wikimedia.org/r/p/mediawiki/core.gitmaster
Shouldn't it be something like
https://gerrit.wikimedia.org/r/p/mediawiki/core.git ?
--
Quim