New Subversion committers:
From OmniTI, which is working on the Article Feedback Tool, four people got
core commit access:
* Yoni Shostak (project manager) - yonishostak
* Greg Chiasson (developer) - gregchiasson
* Reha Sterbin (developer) - rsterbin
* Sean Heavey (UI designer/front-end developer) - seanheavey
Also, Simon Bachenberg (sbachenberg) got extensions-only access to work
on Semantic MediaWiki and a Solr Extension.
Welcome! :-)
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
I don't know whether you have read about Etherpad Lite:
https://github.com/Pita/etherpad-lite/wiki
It needs only a fraction of the resources of Etherpad.
Etherpad vs Etherpad Lite:

                                           Etherpad                          Etherpad Lite
Size of the folder (without git history)   30 MB                             1.5 MB
Languages used server side                 JavaScript (Rhino), Java, Scala   JavaScript (node.js)
Lines of server-side JavaScript code       ~101k                             ~9k
RAM usage immediately after start          257 MB (grows to ~1 GB)           16 MB (grows to ~30 MB)
I like this:
"The Etherpad Lite jQuery plugin
<https://github.com/johnyma22/etherpad-lite-jquery-plugin> easily allows
you to add a pad from Etherpad in a web page. It injects the pad
contents into a div using iframes."
Tom
I've posted a summary here:
https://blog.wikimedia.org/2011/10/28/tech-meetup-moves-wm-infrastructure-f…
Basically, in New Orleans we made progress on the SwiftMedia extension,
Wikimedia Labs, continuous integration, ArchiveLinks, user scripts,
Max's API Query Sandbox, Puppetization, Git migration, and more. That
blog entry's chock full of links to notes in case you want to know more.
Thanks to Ryan Lane and Dana Isokawa for organizing the event with me,
and thanks to Launch Pad New Orleans for providing the venue! And
thanks to Ben Hartshorne for the photos.
Our next developers’ event is a hackathon in Mumbai, November 18-20,
concentrating on internationalization, localization, and mobile/offline
work. To find out about other upcoming Wikimedia technical events,
check the meetings wiki page
https://www.mediawiki.org/wiki/MediaWiki_developer_meetings
and follow @MediaWikiMeet on Identi.ca or Twitter.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
> Hello all, sorry if this is a noobish question, but I get strange
> problems when trying to upload a newly generated image from within an
> extension. It seems the title of the calling page gets corrupted
> (getting the name of the uploaded file), and sometimes the resulting
> image, despite being called File:XXX, ends up in the main namespace. At
> times there are also a lot of UNIQ/QINU strings in the page. These
> problems seem to appear both when doing the upload from the job queue
> and when uploading at once. I am, by the way, not using any globals
> like $wgOut.
> These are the actual upload lines (it's inside a Semantic MediaWiki
> Result Printer class, but I've had the same problems when trying it in
> an ordinary MediaWiki parser function):
> $ulUser = User::newFromName( $myUserName );
> $mUpload = new UploadFromUrl();
> $mUpload->initializePathInfo( $filename, $tempdir, false );
> $status = $mUpload->getLocalFile()->upload(
>     $tempdir . '/' . $filename, $tempdir, $description, 0, false, false, $ulUser );
> if ( $status->isGood() ) {
>     unlink( $tempdir . '/' . $filename );
>     return true;
> } else { ... }
> I'm not a programmer, so the code is probably quite ugly. I'm not even
> sure this is the proper way to do a file upload, but it's the only way
> I found that works. Any help would be greatly appreciated (e.g.
> examples of extensions that do file uploads and make it work)!
> Best regards,
> Leo Wallentin
> Leonard Wallentin
> leo_wallentin at hotmail.com
> +46 (0)735-933 543
> http://säsongsmat.nu
>
> http://nairobikoll.se/
> http://twitter.com/leo_wallentin
> Skype: leo_wallentin
This type of issue (lots of UNIQ_... strings in the output) is usually
caused by calling the $wgParser->parse() method while already parsing.
In this case you're probably already parsing because the code is being
executed from an SMW printer (I'm not familiar with SMW, but I assume
that's called from some parser hook or something). I would guess (but
haven't checked) that the upload method calls $wgParser->parse
somewhere along the line, probably when it's creating the image
description page. The Babel extension had a very similar issue with
auto-creating categories; see the BabelAutoCreate.class.php file in
the Babel extension.
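One common workaround for that kind of re-entrancy (a hedged sketch, not necessarily what Babel does; assumes 1.18-era MediaWiki globals and classes such as $wgParser and ParserOptions, with $wikitext, $title, and $user as placeholder names) is to parse with a fresh clone instead of re-entering the running parser:

```php
<?php
// Sketch: avoid re-entrant parsing from inside a parser hook. Calling
// $wgParser->parse() on the parser that is already running clobbers the
// UNIQ/QINU strip markers of the outer parse; a cloned parser keeps its
// own state. Names here are illustrative placeholders.
function parseSafely( $wikitext, Title $title, User $user ) {
	global $wgParser;
	$parser = clone $wgParser;
	$options = new ParserOptions( $user );
	return $parser->parse( $wikitext, $title, $options );
}
```

When the text is meant to end up inside the page currently being parsed anyway, the running parser's recursiveTagParse() is designed to be called from hooks and may be the simpler choice.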
Cheers,
-bawolff
A while ago, when setting up some private wikis using the new features in
1.17, I asked how one finds out which messages are modifiable, and where to
find them to change them. Specifically, I was trying to locate the messages
on the login page you're presented with when arriving, not logged in, at
such a wiki.
I think that this:
"Another help tool is the qqx language code: there is no associated MessagesQqx.php
file, and it isn't possible to select this fake language in the user's preferences.
But when used with the &uselang parameter to display a wiki page (e.g.
https://en.wikipedia.org/wiki/Special:RecentChanges?uselang=qqx), MediaWiki will
display the message keys instead of their values in the user interface: this is
very useful to identify which message to translate or change."
from Guillaume's technical writeup at:
https://www.mediawiki.org/wiki/MediaWiki_architecture_document/text
might be the answer I was looking for. I document it here for the archives.
Cheers,
-- jra
--
Jay R. Ashworth Baylink jra(a)baylink.com
Designer The Things I Think RFC 2100
Ashworth & Associates http://baylink.pitas.com 2000 Land Rover DII
St Petersburg FL USA http://photo.imageinc.us +1 727 647 1274
As I understand it, WMF has (a) hundreds of Apaches (b) running
Ubuntu. How do you set up the Apaches to a substantially-identical
configuration? I looked on the wikitech wiki and couldn't see it.
(I ask only for my own interest - I currently want to take the
Debian/Ubuntu apache2 out and have it shot.)
- d.
Hi,
As you may know from previous postings to this list and the Wikimedia
Blog, I've been working on a document presenting MediaWiki's
architecture. The document will be integrated as a chapter in the
second volume of the "Architecture of open-source applications" book.
The reason the Wikimedia Foundation agreed to do this, besides the book
itself, was that we thought such a document would also prove useful for
new developers who want to familiarize themselves with how MediaWiki
works. See [1] for more context.
I just finished most of the write-up, based on the input provided by
developers, on the documentation on mw.o, on the doxygen doc, and on
deep dives into the code.
We'll submit the final draft to the book's editors in about a week,
but before that I'd like to ask you guys to review the document.
Mostly for accuracy (we don't want to publish something that contains
factual errors), but other comments are encouraged as well.
Please try to centralize the feedback on the document's talk page to
avoid duplication between the mailing list and the talk page.
You don't have to review everything; if you want to focus on a
specific section, that's fine, and I'll be grateful for any help that
you provide.
Also, if you find the document useful, please say so on the talk page;
it's really difficult to assess the impact of this kind of work, so
any feedback will help us determine if we should attempt similar
endeavors in the future.
The document is at:
https://www.mediawiki.org/wiki/MediaWiki_architecture_document/text ;
be bold and feel free to edit the page directly, unless the changes
are likely to be very disruptive.
Many thanks in advance for your help; I'm available to answer any
question on- or offlist.
[1] https://www.mediawiki.org/wiki/MediaWiki_architecture_document
--
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation
http://donate.wikimedia.org
Hello all, sorry if this is a noobish question, but I get strange problems
when trying to upload a newly generated image from within an extension. It
seems the title of the calling page gets corrupted (getting the name of
the uploaded file), and sometimes the resulting image, despite being
called File:XXX, ends up in the main namespace. At times there are also a
lot of UNIQ/QINU strings in the page. These problems seem to appear both
when doing the upload from the job queue and when uploading at once. I am,
by the way, not using any globals like $wgOut.
These are the actual upload lines (it's inside a Semantic MediaWiki Result
Printer class, but I've had the same problems when trying it in an
ordinary MediaWiki parser function):
$ulUser = User::newFromName( $myUserName );
$mUpload = new UploadFromUrl();
$mUpload->initializePathInfo( $filename, $tempdir, false );
$status = $mUpload->getLocalFile()->upload(
    $tempdir . '/' . $filename, $tempdir, $description, 0, false, false, $ulUser );
if ( $status->isGood() ) {
    unlink( $tempdir . '/' . $filename );
    return true;
} else { ... }
I'm not a programmer, so the code is probably quite ugly. I'm not even
sure this is the proper way to do a file upload, but it's the only way I
found that works. Any help would be greatly appreciated (e.g. examples of
extensions that do file uploads and make it work)!
Best regards,
Leo Wallentin
Leonard Wallentin
leo_wallentin(a)hotmail.com
+46 (0)735-933 543
http://säsongsmat.nu
http://nairobikoll.se/
http://twitter.com/leo_wallentin
Skype: leo_wallentin
I was planning on working on the 1.18 release last week; however, most of
my time was absorbed by Contest-related work, so this didn't happen.
As of writing this email, there are 27 outstanding revisions tagged as
1.18 [1]: 26 are new, one is fixme. Not all revisions have to be merged to
the REL1_18 branch at this point. Any major bugs that have been fixed (and
are applicable to 1.18) should be backported if they haven't been already;
this helps limit the number of duplicate bug reports we may get.
On the bug front, there are numerous outstanding bugs, though not all of
them need to be fixed for 1.18 [2][3]. Please feel free to remove them
from being tarball blockers.
The intention is to push a beta out this week (today, maybe; certainly by
the end of the week).
Certainly, if anyone knows of anything that really "needs" to be in the
beta, please let me know ASAP.
Sam
[1] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/1.18
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=28425 - Tarball bugs
[3] https://bugzilla.wikimedia.org/show_bug.cgi?id=29876 - WMF Deployment
related bugs
Hey all --
I've got a stack of issues discussed during the last couple months'
conferences and hackathons to raise, so there may be a few more manifestos
on the way. But don't worry, this one will be short!
I had a great chat at the New Orleans hackathon with D J Bauch who's been
working on maintaining MediaWiki's MS SQL Server support, and also had a
very useful email chat a couple weeks back with Freakalowsky who's put a lot
of work into MediaWiki's Oracle support.
Long story short: though traditionally MediaWiki developers have put very
little work of our own into maintaining non-MySQL compatibility, there are
still lots of folks interested in running on those databases... AND THEY
ACTUALLY MOSTLY WORK!
At this point I think it's a bit crazy of us to keep marginalizing that
code; some folks running their own instances will need, want, or prefer
(or be forced by office IT) to run on some "funny" database, and we
shouldn't stand in their way. More importantly, keeping things working
with multiple DB backends helps us keep our code cleaner, and reduces our
own hard dependencies on a particular product line.
There are two main impediments to keeping code working on non-MySQL
platforms:
* lazy code breakages -- we MySQL natives accidentally use MySQL-isms that
break queries on other platforms
* lazy schema updates -- we MySQL natives add new schema updates into the
system but only implement them for MySQL
The first could often be helped simply by having automated tests run to
make sure that the relevant code paths get exercised. Often the culprit is
just a MySQL-specific function, or lazy use of LIMIT/OFFSET, or a GROUP BY
in a form that other databases are pickier about. Just flagging these from
the tests can often be enough to make them easy to fix.
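As an illustration (a sketch against the 1.18-era Database abstraction; the table and field names here are just examples), pagination expressed through select()'s options array lets each database subclass emit its own syntax instead of a hand-written MySQL-style LIMIT clause:

```php
<?php
// Sketch: use the Database abstraction's options array rather than raw
// "LIMIT 20, 10" SQL, so non-MySQL backends (Oracle, SQL Server,
// PostgreSQL, SQLite) can generate whatever pagination syntax they need.
// Table and field names are illustrative; assumes a 1.18-era MediaWiki.
$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select(
	'page',                               // table
	array( 'page_id', 'page_title' ),     // fields
	array( 'page_namespace' => NS_MAIN ), // conditions
	__METHOD__,                           // caller name, for profiling
	array( 'ORDER BY' => 'page_id', 'LIMIT' => 10, 'OFFSET' => 20 )
);
foreach ( $res as $row ) {
	// ... use $row->page_id and $row->page_title ...
}
```

The same goes for GROUP BY and friends: the more a query is expressed through the abstraction's structured arguments, the less there is for a pickier backend to choke on.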
The second is a bit harder, but again testing can help flag that something
needs to be updated, so it doesn't wait until it's too late for the next
release.
I'd like for everybody who's working on MediaWiki on non-MySQL databases to
make sure that the phpunit test suite can run on your favorite platform, so
we can see about getting them set up in our regular reports on
http://integration.mediawiki.org/ci/
Green bars are your friend! :)
-- brion