I think we should migrate MediaWiki to target HipHop [1] as its
primary high-performance platform. I think we should continue to
support Zend, for the benefit of small installations. But we should
additionally support HipHop, use it on Wikimedia, and optimise our
algorithms for it.
In cases where an algorithm optimised for HipHop would be excessively
slow when running under Zend, we can split the implementations by
subclassing.
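As a rough sketch of what that split could look like (all class names here are hypothetical, and the runtime-detection check is an assumption, not an established MediaWiki convention):

```php
<?php
// Hypothetical example of splitting an implementation by platform.
// The HPHP_VERSION check is an assumed way to detect HipHop at runtime;
// the actual detection mechanism may differ.

abstract class StringScanner {
	// Pick the implementation suited to the current runtime.
	public static function factory() {
		if ( defined( 'HPHP_VERSION' ) ) {
			return new HipHopStringScanner();
		}
		return new ZendStringScanner();
	}

	abstract public function countWords( $text );
}

// Tight PHP loops are cheap once compiled by HipHop.
class HipHopStringScanner extends StringScanner {
	public function countWords( $text ) {
		$count = 0;
		$inWord = false;
		$len = strlen( $text );
		for ( $i = 0; $i < $len; $i++ ) {
			$isSpace = ( $text[$i] === ' ' || $text[$i] === "\n" || $text[$i] === "\t" );
			if ( !$isSpace && !$inWord ) {
				$count++;
			}
			$inWord = !$isSpace;
		}
		return $count;
	}
}

// Under the Zend interpreter, defer to a C-implemented builtin instead.
class ZendStringScanner extends StringScanner {
	public function countWords( $text ) {
		return str_word_count( $text );
	}
}

$scanner = StringScanner::factory();
echo $scanner->countWords( "hello hiphop world" ), "\n"; // 3
```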
I was skeptical about HipHop at first, since the road is littered with
the bodies of dead PHP compilers. But it looks like Facebook is pretty
well committed to this one, and they have the resources to maintain
it. I waited and watched for a while, but I think the time has come to
make a decision on this.
Facebook now write their PHP code to target HipHop exclusively, so by
trying to write code that works on both platforms, we'll be in new
territory, to some degree. Maybe that's scary, but I think it can work.
Who's with me?
-- Tim Starling
[1] https://github.com/facebook/hiphop-php/wiki/
Hi!
I am looking for static HTML Dumps of the English Wikipedia, but I have
found only this page: http://static.wikipedia.org/

This page contains dumps that are almost three years old!
Do you know of other pages where I could download recent HTML dumps, or
do you have any idea how I could get them?
Thank you for your help,
Samat
Hello,
I was encouraged (hello, Sumana) to post a Google Summer of Code idea
and solicit comments. So here goes.
Perhaps entirely outside of the scope of GSOC, I think it would be a
fun project to make wikipedia.org and wikimedia.org accessible over
IPv6.
Also, while creating an account to post to
http://www.mediawiki.org/wiki/Summer_of_Code_2011
I noticed that my login was insecure, and that there was no SSL
option. Adding a secure login feature would also be a cool project.
Cheers,
mc
(Breaking this out from 'Focus on sister projects' thread.)
I just want to give an official shout out, as Lead Software Architect for
the Wikimedia Foundation, that Wikipedia's sister projects are important and
that they need more love. Not only are they directly useful and interesting
in various ways, but their explorations of additional subject areas,
media, and organization styles deserve more attention as Wikimedia
looks to its future beyond Wikipedia's first decade. The next decade can't
just be ten years of doing the same things...
Historically it's been difficult to get enough attention for review &
deployment of special-purpose extensions for those projects, so I'm highly
interested in proposals that will help put more flexibility and power into
the hands of their own motivated contributors: from easier
self-configuration of sites to new plugin architectures that allow the use
of new creative media tools without the explicit intervention of Wikimedia
staff.
In particular, I think there's some low-hanging fruit in the Gadgets system.
Right now it's honestly pretty awkward to create a Gadget in the first
place, and sharing code modules between wikis requires a lot of
cut-and-pasting (which leads to divergent code bases, which makes
maintenance nigh-impossible in the face of MediaWiki framework changes).
Some things that would be very spiffy, and probably not that hard:
* Provide a nice interface rather than forcing manual editing of MediaWiki:
namespace message pages
* Self-publishing: be able to do your user JS/CSS in modular Gadget-form,
with easy sharing to other users. Don't require manual work to move
something from 'my own custom JS' to 'a user .js file that other people can
use' to 'a Gadget that anyone can select in preferences'.
* Cross-wiki gadget sharing: if we can avoid fragmenting common scripts,
they'll be easier to maintain.
Harder, but very interesting in the medium to long-term:
* Security model for safe code sharing: we could devise an explicitly
limited interface between the wiki page JS and gadgets hosted in an offsite
iframe. A foreign gadget could add certain UI elements (tabs, toolbar
buttons) to be triggered, and could open as an embedded view to provide
things like image editing, drag-n-drop category rearrangement, custom
visualizations or interactive diagrams.
-- brion
I've made some changes promoting better coding patterns in some contexts.
These are for MediaWiki 1.18; extensions can keep their old patterns
until they drop support for pre-1.18.
I'd like to consider dropping the rewriting of $wgTitle and $wgOut
inside of SpecialPage::capturePath around MediaWiki 1.20. This means
that after that release includable special pages in extensions will have
to use the new patterns instead of relying on the $wg replacement hack.
When working on special pages (ESPECIALLY includable special pages):
- Use $this->getOutput() instead of the $wgOut global.
- Use $this->getUser() instead of the $wgUser global.
- Use $this->getSkin() instead of $wgUser->getSkin().
In contexts where you are working with $wgOut or another OutputPage
instance and using $wgUser and skin instances fetched from it to render
stuff being passed back to output:
- Use $out->getUser() instead of the $wgUser global.
- Use $out->getSkin() instead of $wgUser->getSkin().
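To illustrate the new pattern, here is a minimal self-contained sketch. The stub classes below only mimic the relevant shape of the 1.18 context accessors so the example runs on its own; they are not the real MediaWiki implementations:

```php
<?php
// Minimal stubs standing in for the MediaWiki 1.18 context classes,
// purely to illustrate the accessor pattern (not real MediaWiki code).
class User {
	public function getName() { return 'Example'; }
}

class OutputPage {
	private $user;
	private $html = '';
	public function __construct( User $user ) { $this->user = $user; }
	public function getUser() { return $this->user; }
	public function addHTML( $html ) { $this->html .= $html; }
	public function getHTML() { return $this->html; }
}

class SpecialPage {
	private $out;
	public function __construct( OutputPage $out ) { $this->out = $out; }
	protected function getOutput() { return $this->out; }
	protected function getUser() { return $this->out->getUser(); }
}

// New-style special page: no globals, everything comes from the context.
class SpecialHello extends SpecialPage {
	public function execute() {
		// Old pattern (pre-1.18): global $wgOut, $wgUser; $wgOut->addHTML( ... );
		$out = $this->getOutput();
		$out->addHTML( 'Hello, ' . $this->getUser()->getName() );
	}
}

$out = new OutputPage( new User() );
$page = new SpecialHello( $out );
$page->execute();
echo $out->getHTML(), "\n"; // Hello, Example
```

Because the special page never touches a global, the same code works unchanged when it is included inside another page rather than run standalone.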
---- Extra stuff ----
I still have some thoughts to work through on what to do about the Linker
in some contexts, but here are the plans I'm considering for Skin:
- I'm thinking of making $out->getSkin() the primary point for getting
a skin instead of $wgUser->getSkin() (I'll still keep BC), making
OutputPage the focal point for things related to the output of the page.
Having OutputPage track and manage skins also avoids some special-case
bugs; when you examine various branches of code, it's actually more sane
for OutputPage to manage the skin.
- Along with making OutputPage track the skin, I'm planning on dropping
support for getSkin( $t ); examining the code, it's not actually even
used, and the only code using it shouldn't be playing with a full skin
at all anyway. I also plan to make Skin defer to OutputPage for most of
its title information, instead of having three near-global copies of the
Title floating around ($wgTitle, $wgOut->mTitle,
$wgUser->getSkin()->mTitle).
I haven't decided what to do about the Linker yet. But I'm considering
making the Parser get its linker via $po->getLinker() (either from
ParserOutput or ParserOptions, I need another look), returning a plain
Linker instance instead of a skin. Skin will define a getLinker() which
will replace its subclassing of Linker (there will be some BC). To allow
parts of the linker to be overloaded, which was sort of possible already
but will now be more reliable, getLinker would be overridable by a skin
that wanted to make linker changes, so it could return a subclass of
Linker. The getLinker in the Parser context will take that into account:
if a subclass of Linker is being used instead of the normal linker, the
parser cache key will change. As a result we make overriding linker
methods reliable and usable, while also avoiding the unnecessary parser
cache fragmentation that would happen if we included the skin name in
the pcache key.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
I had a bit of a documentation challenge approaching the problem of
writing PHPUnit tests for extensions, mostly because many of the
extensions do this very differently and the manual did not have any
recommendations.
It appears many extensions have custom bootstrapping code (somewhat
hacky path discovery and manual loading of core MediaWiki files) and
don't necessarily register their tests in a consistent way.
I wrote up a short paragraph of what I would recommend here:
http://www.mediawiki.org/wiki/Manual:Unit_testing#Writing_Unit_Test_for_Ext…
If that makes sense, I will try to open some bugs on the extensions
with custom bootstrapping code, and I would recommend we commit an
example tests/phpunit/suite.extension.xml file for exclusively
running extension tests.
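A minimal suite.extension.xml along those lines might look like the sketch below; the bootstrap filename, directory layout, and extension name are assumptions for illustration, not an established convention:

```xml
<phpunit bootstrap="bootstrap.php">
	<testsuites>
		<testsuite name="extensions">
			<!-- Pull in only extension test directories, skipping core tests -->
			<directory>../../extensions/ExampleExtension/tests/phpunit</directory>
		</testsuite>
	</testsuites>
</phpunit>
```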
Eventually it would be ideal to be able to 'just test your extension'
from the core bootstrapper (i.e. dynamically generate our suite.xml and
namespace the registration of extension tests)... but for now, at least
not having to wait for all the core tests while you write your extension
tests, plus some basic documentation on how to do so, seems like a step
forward.
--michael
Hello!
What is the current consensus on HTML5?
Are we going to fully support it and use as many features as we can, or
are we going to keep just using JavaScript alternatives?
I would assume that we would continue to use javascript in the case
that the client does not support HTML5, though that may mean we have
to write things twice.
Also, if we are indeed going to use HTML5, are we going to use XHTML5?
I would like to work on integrating MediaWiki with HTML5 if at all
possible; I am just unsure which parts of the new standard we wish to
use.
TIA - Joseph Roberts
> Date: Fri, 01 Apr 2011 18:40:00 -0700
> From: Ryan Kaldari <rkaldari(a)wikimedia.org>
> Subject: Re: [Wikitech-l] Focus on sister projects
> To: Conrad Irwin <conrad.irwin(a)gmail.com>
> Cc: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
[...]
>
> As long as we're on the subject of wiktionary, I notice that there's a
> lot of custom Javascript there for handling specialized editing tasks
> like editing glosses, managing translations, etc. It seems like some of
> this functionality could be improved further and developed into
> full-fledged extensions (making it easy for other wiktionaries to use as
> well). Would you have any interest in working up a couple Wiktionary
> project proposals for the upcoming Hackathon in Berlin?
>
> Ryan Kaldari
This isn't just true of Wiktionary. Lots of sister projects have
specialized workflow tools in JS. Wikinews has review-related tools in
JS and a hack that adds a second "talk" namespace; Wikisource has the
ProofreadPage extension, but much of their workflow is still written in
JS. I'm sure many other projects have specialized stuff that should be
in PHP extensions.
The issue is that, at the end of the day, it is _significantly_ easier
to write a JS hack than to get a PHP extension written, reviewed, and
deployed.
-bawolff
(from [Wikitech-l] Future: Love for the sister projects!)
What are the differences between
mw.loader.load( 'http://centralwiki.org/w/load.php?modules=ext.gadget.foo');
and
mw.loader.load( '
http://centralwiki.org/w/load.php?title=MediaWiki:Gadget-foo.js&action=raw&…'
);
which is currently the recommended way in the migration guide[2]?
Helder
On Sun, Apr 3, 2011 at 18:12, Krinkle <krinklemail(a)gmail.com> wrote:
> = New way =
> This way is promoted in the migration here [2] to start centralizing
> gadgets and
> avoid stuff from getting out of date. (navPopups, HotCat, UTCLiveClock,
> WikiMiniAtlas etc.) :
>
> == Central wiki ==
> * MediaWiki:Gadget-definitions
> * foo[ResourceLoader]|foo.js|foo.css
> * MediaWiki:Gadget-foo
> This is Foo description.
> * MediaWiki:Gadget-foo.js
> alert('foo');
> * MediaWiki:Gadget-foo.css
> body { background: orange; }
>
> == Other wiki ==
> * MediaWiki:Gadget-definitions
> * foo|foo.js
> * MediaWiki:Gadget-foo
> This is Foo description.
> * MediaWiki:Gadget-foo.js
> mw.loader.load( '
> http://centralwiki.org/w/load.php?modules=ext.gadget.foo'
> );
>
> This new way not only centralizes stuff (like the old way did) but
> also does all resource loader stuff
> (1 request, minified, combined, cached)
>
> This is already a huge improvement and can and is being done today.
> Once global preferences are up and RL 2.0 is in the air, it would be
> even easier.
> ie. extra section in Special:Preferences#Gadgets with "global gadgets"
> and
> MediaWiki:Gadgets-globaldefinition is used and meta.wikimedia/load.php
>
> --
> Krinkle
>
> [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=14950
> [2] http://www.mediawiki.org/wiki/RL/MGU#Keep_gadgets_central
I'm going to be making a change to trunk soon that removes
$skin->mTitle; any extension directly accessing this (supposedly
private) member will break in 1.18. Please update code to use the
getTitle() method that was added in 1.16.
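For illustration, a self-contained sketch of the change (the Skin stub below only mimics the relevant accessor; it is not the real class):

```php
<?php
// Stub mimicking the relevant part of Skin: mTitle becomes private in
// 1.18, so callers must go through the accessor added in 1.16.
class Skin {
	private $mTitle;
	public function __construct( $title ) { $this->mTitle = $title; }
	public function getTitle() { return $this->mTitle; }
}

$skin = new Skin( 'Main_Page' );

// Old (breaks in 1.18): $title = $skin->mTitle;
// New:
$title = $skin->getTitle();
echo $title, "\n"; // Main_Page
```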
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]