"I'm totally cool with the idea of code review for Gadgets & so forth, just
not using Gerrit. We considered it for Scribunto (and heck, I wrote half of
a
proof of concept) but shot it down because the idea totally sucked."
Chad, can you expand on that statement? I've been toying for some time with
writing something that allows documentation to be synced both ways, e.g.
for hooks, variables, and whatnot. My simplistic toy example had a 1:1
link, but I've been trying to figure out how to make it more complex.
(Ideally this would also allow documentation to be linked to a branch, so
that we would then have versioned documentation.)
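To give a rough idea of what I mean by a 1:1 link, here is a purely
illustrative sketch (the symbol and page names below are made up):

    // Purely illustrative, hypothetical sketch of a 1:1 documentation link:
    // each code symbol maps to exactly one wiki page.
    $docLinks = array(
        'hook:ArticleSaveComplete' => 'Manual:Hooks/ArticleSaveComplete',
        'variable:$wgLanguageCode' => 'Manual:$wgLanguageCode',
    );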
~Matt Walker
Wikimedia Foundation
Fundraising Technology Team
On Wed, Dec 11, 2013 at 11:21 AM, Chad <innocentkiller(a)gmail.com> wrote:
> On Wed, Dec 11, 2013 at 11:04 AM, Jon Robson <jdlrobson(a)gmail.com> wrote:
> >
> > Many a time when I've talked about this, I've hit the argument that
> > Gerrit is confusing to some users and is a barrier for development, but
> > in my opinion that is a terrible, unacceptable attitude to have. Our end
> > users deserve a certain standard of code. I'm aware that using a code
> > review process can slow things down, but I feel this is really essential.
> > I for one greatly benefit from having every single piece of my code
> > scrutinized and perfected before being consumed by a wider audience. If
> > this is seen as a barrier, someone should investigate making it possible
> > to send wiki edits to Gerrit to simplify that process.
>
> Sending wiki edits to Gerrit for review? Absolutely not.
>
> I'm totally cool with the idea of code review for Gadgets & so forth, just
> not using Gerrit. We considered it for Scribunto (and heck, I wrote half of
> a proof of concept) but shot it down because the idea totally sucked.
>
> -Chad
Hello everyone,
It’s with great pleasure that I’m announcing that Andrew Green[1] has joined the Wikimedia Foundation as a contractor in Features Engineering, working on the Education Program.
Before joining us, Andy worked at the Instituto Mora[2], creating free software for social science research that uses images as primary sources. He received a B.A. in music composition from the Université de Montréal in 1995 and went on to study social anthropology at the National School of Anthropology and History in Mexico City.
He’s done a lot of work with the semantic web[3][4], which helps him navigate a lot of the pre-Wikidata Education Program[5] codebase. :-) As you’ve probably guessed from my usual tardiness, his first official day was actually on October 7th, so he’s already done a lot of work[6] for us. However, you’ll see him in the office this week and next (as well as at the holiday party), so be sure to say hi and introduce yourself (he’s the quiet guy with big ideas sitting at the north end of the 3rd floor).
Andrew lives and works in Mexico City, Mexico with his wife and two children. He knows Spanish, French, and English and loves to talk about the philosophy of mathematics, cognitive linguistics, and politics[7].
Please join me in welcoming Andrew to the Wikimedia Foundation. :-)
Take care,
Terry
[1]: [[User:AndyRussG]]
[2]: http://www.mora.edu.mx/inicio.aspx
[3]: http://ceur-ws.org/Vol-348
[4]: https://github.com/AndrewGreen/papersandtalks
[5]: https://www.mediawiki.org/wiki/Wikipedia_Education_Program
[6]: https://gerrit.wikimedia.org/r/#/q/owner:%22AndyRussG+%253Candrew.green.df%…
[7]: AndyRussG: “Politics? Yeah, sure politics is cool. Well maybe not. Hmmm.”
Heya,
The Architecture Summit will be upon us in less than two months. To make
sure that this Summit is going to be productive, it is important that we
discuss the right RfCs. Before deciding which RfCs should be discussed at
the Summit, I want to make sure that
https://www.mediawiki.org/wiki/Requests_for_comment contains all RfCs and
that all important topics have an RfC.
If you have a MediaWiki-related RfC in a personal notepad, on your user
page, or just in your mind, then this would be a great moment to write it up
or move it under https://www.mediawiki.org/wiki/Requests_for_comment and add
an entry to the table. If you don't have 'move' rights, then please let me
know and I can move it for you.
If you know of a topic that *should* have an RfC but does not yet have one,
then please reply to this list mentioning the topic. I will check with
Tim/Brion to see how these topics can get an RfC.
Once we have collected all relevant RfCs under
https://www.mediawiki.org/wiki/Requests_for_comment, I will make a page
where everybody can express their interest in which RfCs should be
discussed at the Summit.
Questions? Let me know!
Best,
Diederik
Hi everyone,
We're just over a week away from the Friday, December 20 deadline for RFCs
to be considered at the Architecture Summit.[1] That's not a hard and
fast rule (we've never done this before), but we should definitely have a
reasonable amount of time between the point an RFC is proposed and the point
it is discussed at the Architecture Summit. Ideally, we'll have taken care of
everything that's reasonable to take care of via mailing list/IRC/other
online meetings, and really be down to the things that require face-to-face
time to accomplish.
It's my hope that we're not just nibbling at the edges, but that we're
actually going to talk about things that will substantially modernize our
architecture and reduce our technical debt. I believe there are RFCs in
the list and in the works that do that, but I also know there are areas
we've discussed informally in the past that we've never set down in writing.
Many of the RFCs cover important areas of incremental improvement, but not
all of the changes we need to make come in such small increments.
Anyway, the RFC submission page is here:
https://www.mediawiki.org/wiki/Requests_for_comment
Rob
[1] https://www.mediawiki.org/wiki/Architecture_Summit_2014
Someone recently suggested to me (and someone else requested) that Huggle
should mark good edits as patrolled. However, after executing the patrol
API query I receive this error:
<?xml version="1.0"?><api servedby="mw1130"><error
code="patroldisabled" info="Patrolling is disabled on this wiki"
/></api>
Is that true? Is patrolling really disabled on the English Wikipedia? That
sounds very unlikely to me, so is this a bug, and should a different error
message have been displayed?
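For what it's worth, here is a minimal sketch (in PHP, purely illustrative;
Huggle itself is not PHP) of how a client might detect this specific error
code in the XML response shown above instead of treating it as a generic
failure:

    // Illustrative sketch: parse the API's XML error response and check for
    // the "patroldisabled" error code, assuming $response holds the raw XML.
    $xml = simplexml_load_string( $response );
    if ( $xml !== false && isset( $xml->error ) &&
        (string)$xml->error['code'] === 'patroldisabled'
    ) {
        // Patrolling is reported as disabled on this wiki; degrade gracefully.
        echo "Patrol not available: " . (string)$xml->error['info'] . "\n";
    }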
(re-sending from the "right" account for this list)
Hi.
I (rather urgently) need some input from someone who understands how parser
caching works. (Rob: please forward as appropriate).
tl;dr:
What is the intention behind the current implementation of
ParserCache::getOptionsKey()? It's based on the page ID only, not taking into
account any options. This seems to imply that all users share the same parser
cache key, ignoring all options that may impact the cached content. Is that
correct/intended? If so, why all the trouble with ParserOutput::recordOption, etc.?
Background:
We just tried to enable the use of the parser cache for Wikidata, and it failed,
resulting in page content being shown in random languages.
I tried to split the parser cache by user language, using
ParserOutput::recordOption to include userlang in the cache key. When tested
locally, and also on our test system, that seemed to work fine (which seems
strange now, looking at the code of getOptionsKey()).
On the live site, however, it failed.
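For reference, the splitting attempt boils down to something like this (a
simplified sketch, not the actual Wikibase code):

    // Simplified sketch of the splitting attempt: while generating the output,
    // record the user language as a "used option", so that the options hash
    // should vary the parser cache key by language.
    $parserOutput->recordOption( 'userlang' );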
Judging by its name, getOptionsKey should generate a key that includes all
options relevant to caching page content in the parser cache. But it seems it
forces the same parser cache entry for all users. Is this intended?
Possible fix:
ParserCache::getOptionsKey could delegate to ContentHandler::getOptionsKey,
which could then be used to override the default behavior. Would that be a
sensible approach?
And if so, would it be feasible to push out such a change before the holidays?
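Very roughly, the delegation I have in mind would look something like the
following (a sketch only; getParserCacheOptionsKey is a hypothetical method
that does not exist in core):

    // Hypothetical sketch: ParserCache asks the page's ContentHandler for the
    // key, and falls back to the current page-ID-based key if the handler
    // does not provide one.
    protected function getOptionsKey( $article ) {
        $handler = $article->getContentHandler();
        $key = $handler->getParserCacheOptionsKey( $article ); // hypothetical
        return $key !== null
            ? $key
            : wfMemcKey( 'pcache', 'idoptions', $article->getId() );
    }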
Thanks,
Daniel
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hello,
This is a reminder that the Wikimedia Language Engineering team will be
hosting an IRC office hour from 1700 to 1800 UTC later today on
#wikimedia-office (FreeNode). Please see below for the event details.
Thanks
Runa
=== Event Details ===
What: WMF Language Engineering Office hour
When: December 11, 2013 (Wednesday). 1700-1800 UTC
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20131211T1700
Where: IRC Channel #wikimedia-office on FreeNode
---------- Forwarded message ----------
From: Runa Bhattacharjee <rbhattacharjee(a)wikimedia.org>
Date: Fri, Dec 6, 2013 at 3:19 PM
Subject: Language Engineering IRC Office Hour on December 11, 2013
(Wednesday) at 1700 UTC
To: MediaWiki internationalisation
<mediawiki-i18n(a)lists.wikimedia.org>, Wikimedia Mailing List
<wikimedia-l(a)lists.wikimedia.org>, Wikimedia developers
<wikitech-l(a)lists.wikimedia.org>,
wikitech-ambassadors(a)lists.wikimedia.org
[x-posted]
Hello,
The Wikimedia Language Engineering team will be hosting an IRC office
hour on Wednesday, December 11, 2013, between 17:00 and 18:00 UTC on
#wikimedia-office. (See below for timezone conversion and other
details.) We will be talking about some of our recent and upcoming
projects and then taking questions for the remaining time.
Questions and any other concerns can also be sent to me directly
before the event. See you there!
Thanks
Runa
=== Event Details ===
What: WMF Language Engineering Office hour
When: December 11, 2013 (Wednesday). 1700-1800 UTC
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20131211T1700
Where: IRC Channel #wikimedia-office on FreeNode
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
Hi,
Huggle 3 is slowly getting near its first release, and I have already set
up some build environments for early beta versions: one for Windows, on one
of my own Windows boxes (using NSIS and MinGW), which I use to release beta
packages on SourceForge, and another one for Linux, using Launchpad.
That way, both Windows and Linux users can easily get and install Huggle
packages without needing to understand how compiling works or having to
resolve any dependencies themselves. [1]
Unfortunately, we have no such thing for Mac OS, not just because neither I
nor any other current Huggle dev owns a Mac, but also because there is no
free Launchpad-like service for Macs that I know of.
So, if any of you have enough experience with packaging software
for Macs and want to help with Huggle packaging for Mac OS, let us
know so that we can set up some build process for Mac OS users as well.
[1] More information on how to get Huggle packages is at
https://en.wikipedia.org/wiki/Wikipedia:Huggle/Huggle3_Beta#Prebuilt_packag…
Hello,
I'm trying to perform an API edit call in a maintenance script, following
this example for MW 1.19.9:
http://www.mediawiki.org/wiki/API:Calling_internally
$user = User::newFromId( 1 ); // Using WikiSysop
$page = WikiPage::newFromID( $id );
$titleText = $page->getTitle()->getPrefixedText();
$text = "...";
global $wgRequest;
$req = new DerivativeRequest(
    $wgRequest,
    array(
        'action' => 'edit',
        'title' => $titleText,
        'text' => $text,
        'token' => $user->editToken(),
    ),
    true
);
$api = new ApiMain( $req, true );
$api->execute();
However, I get this problem:
Unexpected non-MediaWiki exception encountered, of type "UsageException"
badtoken: Invalid token
Any idea what could be wrong?
P.S.: I already use WikiPage::doEdit() successfully.
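One thing that may be worth checking (a hedged guess, not a confirmed
diagnosis): the API validates the edit token against the user in the main
request context, and in a maintenance script that context user is the
anonymous user, not the one the token was generated for. A sketch of the
workaround, assuming MW 1.19's RequestContext and User::getEditToken(), and
reusing $titleText and $text from the snippet above:

    // Hedged sketch: make the context user match the user whose token we send,
    // so the API's token check compares against the same user/session.
    global $wgRequest, $wgUser;
    $user = User::newFromId( 1 );
    $wgUser = $user;                              // legacy global still consulted in 1.19
    RequestContext::getMain()->setUser( $user );  // keep the main context in sync
    $req = new DerivativeRequest(
        $wgRequest,
        array(
            'action' => 'edit',
            'title' => $titleText,
            'text' => $text,
            'token' => $user->getEditToken(),     // token for the same user
        ),
        true
    );
    $api = new ApiMain( $req, true );
    $api->execute();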
--
Toni Hermoso Pulido
http://www.cau.cat