I have a log of what happens when the commands:
sudo apt-get install mediawiki2latex
mediawiki2latex -u https://en.wikipedia.org/wiki/Adam_Ries -o AdamRies.pdf
are entered on the command line of Ubuntu (13.10). Better than TV...
Happy to send it to anyone.
Fred
I have converted my email on using composer to manage a set of library
dependencies for MediaWiki-Core [0] into an RFC [1]. Work is
continuing on the implementation of this project, but there are still
debatable implementation details, and the RFC process is meant not
only to validate ideas but also to leave behind a record of the design
decisions that have been made and the trade-offs that were considered
in the process.
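For readers unfamiliar with Composer: a dependency set like the one under discussion is declared in a composer.json file at the project root. The fragment below is only an illustrative sketch (the version constraints are invented, and it is not the actual list proposed in the RFC), though psr/log is one real example of a small library of the kind being discussed:

```json
{
    "require": {
        "php": ">=5.3.2",
        "psr/log": "1.0.0"
    }
}
```

Running `composer install` against such a file resolves and downloads the listed libraries into a local vendor/ directory.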
In particular, the current draft RFC omits discussion of the concept
of library "ownership" for long term updates and security fixes and
could use more detail around the process of forking, patching and
subsequently maintaining an external library. I will attempt to fill in
some of these details as I see them over the next day or so, but now
would be a great time for people with strong ideas or opinions on
these aspects to comment on the talk page.
[0]: http://www.gossamer-threads.com/lists/wiki/wikitech/467520?page=last
[1]: https://www.mediawiki.org/wiki/Requests_for_comment/Composer_managed_librar…
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
The linux.conf.au conference, which I have presented at and love
attending, just opened its call for talks: http://linux.conf.au/cfp . They
want talks about all kinds of open source programming stuff, not just
Linux: Trevor and I presented about ResourceLoader in 2012, and James
and I presented about VisualEditor in 2014.
LCA 2015 will be 12-16 January in Auckland, New Zealand. That's summer
in the Southern Hemisphere, and the climate there is very moderate,
so no sweltering heat like in Australia :)
LCA provides travel funding to some speakers, and Wikimedia funds TPS
grants for people going to conferences to talk about
Wikimedia-related things: https://meta.wikimedia.org/wiki/Grants:TPS .
So you could even get to go for free.
LCA is basically my favorite conference - the talks are consistently
high quality, they treat speakers well, and their audience gets what
we're doing. This is a fun chance to show off your project. I
encourage you to suggest a talk, or tell someone else that they should
talk about their project.
Roan
P.S.: Thanks to Sumana for writing most of this email, and for talking
me into proposing a talk for LCA in the first place back in 2011.
Hello all,
We're in negotiations with one of the applicants for the MediaWiki release
management RFP and will make the official announcement ASAP, likely
early next week.
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hi,
Gerrit change Id819246a9 proposes an implementation for a recent changes
stream broadcast via socket.io, an abstraction layer over WebSockets that
also provides long polling as a fallback for older browsers. Comment on <
https://gerrit.wikimedia.org/r/#/c/131040/> or the mailing list.
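For those unfamiliar with the fallback mechanism: under long polling, the client repeatedly asks the server for "changes newer than cursor X", and the server holds each request open until something is available. The sketch below is a library-agnostic illustration of that client-side loop in Python; the function and field names are invented for the example and are not the API proposed in the Gerrit change:

```python
def poll_changes(fetch, cursor=0):
    """Yield recent-change events from a long-polling endpoint.

    fetch(cursor) should block until events newer than cursor are
    available, then return (events, new_cursor) -- or None when the
    demo stream ends (a real client would simply keep polling).
    """
    while True:
        result = fetch(cursor)
        if result is None:
            return
        events, cursor = result
        for event in events:
            yield event

# Demo against a stub "server" holding two batches of changes.
batches = {0: ([{"title": "Adam_Ries", "type": "edit"}], 1),
           1: ([{"title": "Main_Page", "type": "log"}], 2)}

def stub_fetch(cursor):
    return batches.get(cursor)

changes = list(poll_changes(stub_fetch))  # two events, in order
```

The cursor is what lets a client that falls back from WebSockets to polling resume without missing or duplicating events.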
Thanks,
Ori
I just stumbled across <https://github.com/wikimedia/mediawiki-core/pull/19>,
a small but useful contribution to core from an HHVM developer. It has gone
unnoticed for two months, which is a bit sad.
Is there a way to accept pull-requests from GitHub? According to <
https://github.com/wikimedia/mediawiki-core/settings/hooks> (may not be
visible to non-Wikimedians, sorry), the WebHook receiver <
http://tools.wmflabs.org/suchaserver/cgi-bin/receiver.py> is defunct.
Anyone know the story there?
It'd be good if some additional people were watching (that is, receiving
notifications for) <https://github.com/wikimedia/mediawiki-core/>.
I haven't responded yet, by the way, so feel free to if you know the
answers to these questions. I don't know what effect accepting the
pull-request will have on the code in master, and telling someone who has
already submitted a patch to go sign up for Gerrit seems impolite.
Ori
Just wanted to send out an update on the progress we made around MW-Vagrant
improvements at the Zürich Hackathon. Our primary goal was to make some key
production services available in MW-Vagrant in order to make local
development/testing easier/more reliable. We made some excellent headway,
focussing on a few key services: SSL, Varnish, CentralAuth/Multiwiki.
SSL:
I spent the majority of my time focussing on this and received a lot of
support/help from Ori. There is now an 'https' role in mw-vagrant which,
when enabled, allows you to access your devwiki on port 4430 (forwarded
to 443 in Vagrant). There is one outstanding patchset which will make it
possible to use $wgSecureLogin in MW-Vagrant:
https://gerrit.wikimedia.org/r/#/c/132799/
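As background for readers: $wgSecureLogin is an existing MediaWiki core configuration variable, and the snippet below is simply the standard way to enable it in LocalSettings.php; how exactly the Vagrant role wires this up may differ:

```php
// LocalSettings.php: require the login process to happen over HTTPS.
$wgSecureLogin = true;
```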
Varnish:
This is proving to be much more difficult than anticipated; however, some
progress was made and work is ongoing, spearheaded by Andrew Otto. The plan
is to set up varnish VCLs for mw-vagrant similar to what is set up for text
varnishes in production, with a frontend and backend instance running in
vagrant. Andrew is in the midst of refactoring the production varnish
module to make it usable in Vagrant.
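For context, a Varnish instance is pointed at its origin with a backend declaration in VCL. The fragment below is a hypothetical minimal example (the host and port are invented), not part of the production module being refactored:

```vcl
# Hypothetical frontend VCL: forward cache misses to a local
# MediaWiki/Apache backend on port 8080.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
```

The frontend/backend split mirrors production, where a frontend Varnish hashes requests across a tier of backend caches before anything reaches Apache.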
CentralAuth/Multiwiki:
Bryan Davis, Chris Steipp, and Reedy spent a lot of time hacking on this,
and we now have support for multiwiki/CentralAuth in Vagrant! There is
still some cleanup work being done for the role to remove kludge/hacks/etc
(see https://gerrit.wikimedia.org/r/#/c/132691/).
Also of significant note, Matt Flaschen created an mw-vagrant ISO which can
be copied onto USB thumb drives, making it possible to set up mw-vagrant
without a network connection. There is still some work to be done here to
create a one-click installer as well as updating documentation. Matt got
this done before the hackathon, and we brought a bunch of USB sticks imaged
with the ISO, which was instrumental in getting a bunch of folks new to
mw-vagrant up and running at the hackathon. This was particularly useful
during Bryan Davis's vagrant bootcamp sessions.
I believe Katie Filbert from Wikidata did some mw-vagrant work at the
hackathon as well, although I'm not clear on the current status. Katie, can
you let us know where things stand with what you were working on?
All in all it felt like a very fruitful hack session, and we're closer than
ever to having a ready-to-go developer instance that mimics our production
environment. Big thanks to everyone involved in making our work successful.
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
(CCing wikimedia-l as well, please send any replies to wikitech-l only)
The Wikimedia technical community wants to have another hackathon next year
in Europe. Who will organize it?
Interested parties, check https://www.mediawiki.org/wiki/Hackathons
We would like to confirm a host by Wikimania at the latest.
The same call goes for India and other locations with a good concentration
of Wikimedia contributors and software developers. Come on, step in. We
want to increase our geographical diversity of technical contributors.
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Thanks Sumana,
That's good info to have. I'll look through those links.
That diagram may make its way into the presentation that I'm drafting. The presentation has ballooned to an alarming length already, but I'm going to try to complete it in outline form before pruning.
If someone else makes presentation slides about infrastructure available under a license that allows reuse, I would greatly appreciate it.
By the way, I appreciated the overview of UX in your keynote [1].
Pine
[1] http://wiki.code4lib.org/index.php/2014_Keynote_by_Sumana_Harihareswara
> Date: Tue, 03 Jun 2014 09:41:34 -0400
> From: Sumana Harihareswara <sumanah(a)wikimedia.org>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: [Wikitech-l] learning Ops infrastructure (was: Re: 404
> errors)
> Message-ID: <538DD08E.1000801(a)wikimedia.org>
> Content-Type: text/plain; charset=UTF-8
>
> Hi, Pine.
>
> I, too, am interested in building our understanding of our TechOps
> infrastructure. https://www.mediawiki.org/wiki/Presentations has some
> explanations of some parts, as does http://wikitech.wikimedia.org/ . I
> welcome more links to guides/overviews.
>
> At the recent Zurich hackathon, other developers agreed that it would be
> good to have a guide to Wikimedia's digital infrastructure, especially
> how MediaWiki is used.
> https://www.mediawiki.org/wiki/Overview_of_Wikimedia_infrastructure is
> .... a homepage with approximately nothing on it right now except this
> diagram of our server architecture:
> https://commons.wikimedia.org/wiki/File:Wikimedia_Server_Architecture_%28si…
>
> You might find the Performance Guidelines illuminating
> https://www.mediawiki.org/wiki/Performance_guidelines and you might also
> like the recent tech talk about how we make Wikipedia fast, by Ori
> Livneh and Aaron Schulz - see
> http://www.youtube.com/watch?v=0PqJuZ1_B6w (I don't know when the video
> is going up on Commons).
>
> --
> Sumana Harihareswara
> Senior Technical Writer
> Wikimedia Foundation
>
>
> On 05/30/2014 06:30 PM, ENWP Pine wrote:
> >
> > Ori, thanks for following up.
> >
> > I think I saw somewhere that there is a list of postmortems for tech ops disruptions
> > that includes reports like this one. Do you know where the list is? I tried a web search
> > and couldn't find a copy of this report outside of this email list.
> >
> > I personally find this report interesting and concise, and I am interested in
> > understanding more about the tech ops infrastructure. Reports like this one
> > are useful in building that understanding. If there's an overview of tech ops
> > somewhere I'd be interested in reading that too. The information on English
> > Wikipedia about WMF's server configuration appears to be outdated.
> >
> > Thanks,
> >
> > Pine
> >
> >