On my MediaWiki 1.20alpha (r114161) I added some new Interwiki
definitions such as geocache =
http://www.geocaching.com/seek/cache_details.aspx?wp=$1
I found that links on my pages like [[geocache:GC3GAJZ|GC3GAJZ]], which
use the new interwiki definitions, were not rendered as
http://www.geocaching.com/seek/cache_details.aspx?wp=GC3GAJZ
not even after:
- &action=purge
- setting $wgCacheEpoch = '20030516000000';
- Ctrl+F5 in my browser
- dummy edits
Any idea what is wrong?
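For reference, interwiki prefixes like this are stored as rows in the interwiki table; a sketch of the corresponding row (column layout per MediaWiki's schema, values illustrative):

```sql
-- Sketch: how such a definition is stored in the interwiki table.
-- iw_api/iw_wikiid left empty; values illustrative.
INSERT INTO interwiki (iw_prefix, iw_url, iw_api, iw_wikiid, iw_local, iw_trans)
VALUES ('geocache',
        'http://www.geocaching.com/seek/cache_details.aspx?wp=$1',
        '', '', 0, 0);
```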
Hi all!
As you may know, I have been working on a core branch to introduce the concept
of content handlers <http://meta.wikimedia.org/wiki/Wikidata/Notes/ContentHandler>.
Wikidata relies on the features introduced in that branch, and I'd like to merge
them into core as soon as possible.
Now, I'd like your input on the best strategy to do this. Keep in mind that in
its current state, the branch is a code experiment based on a fork that is two
months old. So it needs to be rebased and cleaned up before it goes into review.
Anyway, here's what I think might be the best way to do this. Consider the
branch containing several feature sets, say A, B, C and D. These should go into
review one after the other, and they depend on each other.
So, my idea is this:
* Create a fresh feature branch for A.
* Merge A into that branch and clean it up, creating A'.
* Submit A' for review
Same goes for the other feature sets. So far, so simple.
But: development on Wikidata has to continue in the meantime, and it's based
on the old wikidata branch. So, on that branch, I'd need to:
* go from A to A'
* once A' is in master, I'd somehow need to rebranch and get B, C and D into the
wikidata branch
I'm unclear on how best to do this. Basically, there are two processes here:
1) clean up one feature set after the other, get them through review and into core.
2) keep developing the wikidata extensions based on the original branch, but
adapt to the cleaned-up version of each feature set (A, B and so on).
Ideally, the wikidata branch would be rebased every time a feature set like A'
is merged into master, so A'-in-master and A'-in-wikidata don't diverge.
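That rebase flow can be sketched end to end on a throwaway repository (branch and commit names are hypothetical stand-ins for the real ones):

```shell
# Self-contained sketch of the proposed flow; run in a temp directory.
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email dev@example.com
git config user.name dev
main=$(git symbolic-ref --short HEAD)   # 'master' or 'main' depending on git
echo base > file.txt && git add file.txt && git commit -qm "base"
git branch wikidata                     # the old wikidata branch forks here
# 1) clean up feature set A on a fresh branch; after review it becomes A':
git checkout -qb feature-A
echo A > a.txt && git add a.txt && git commit -qm "A' (cleaned up A)"
git checkout -q "$main"
git merge -q --no-edit feature-A        # A' lands in master
# 2) meanwhile development continued on the wikidata branch:
git checkout -q wikidata
echo B > b.txt && git add b.txt && git commit -qm "B (not yet reviewed)"
# rebase wikidata onto master so A'-in-master and A'-in-wikidata are identical:
git rebase -q "$main"
git log --oneline
```

After the rebase, the wikidata branch carries B directly on top of the reviewed A', so the two copies of A' cannot diverge.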
Any ideas on how to best go about doing this?
-- daniel
For test purposes I wanted to clone only the most recent commits by using the
--depth option, as in
/work/tmp/ # git clone --depth=10 https://gerrit.wikimedia.org/r/p/mediawiki/core.git
and got this error message:
Initialized empty Git repository in /work/tmp/core/.git/
error: RPC failed; result=22, HTTP code = 500
My git client version is 1.7.1
Any idea why?
Re-adding the list, as this is of public interest.
On 13 April 2012, 19:42, Niklas Laxström
<niklas.laxstrom(a)gmail.com> wrote:
> I do consider users (at least one, me!).
Glad to hear that, but you're not the main user of either
translatewiki or mediawiki in general.
> If we must initiate
> discussion (if there even is a community) and wait for consensus (that
> might never happen) for every change we do, we will never get anything
> done.
That's plain wrong, and Wikipedia is here to prove it :) Many user
scripts were developed after consultation with the community. They're
working great and do what users expect them to do, not what the
developers want them to do. I do agree that development would be much
slower, though.
>
> Like I mentioned in the email, I proposed that we notify affected
> Wikipedias *before* the changes are deployed [1] and executed that
> together with Gerard. They were given right to veto any change they
> didn't like or wanted to discuss. And some of them did use that
> possibility.
Search for "translation" on the Romanian village pump:
https://ro.wikipedia.org/w/index.php?title=Special%3AC%C4%83utare&profile=a…
The ro.wp embassy:
https://ro.wikipedia.org/wiki/Discu%C8%9Bie_Wikipedia:Ambasad%C4%83
I can see no such notification. No message from you, and only one from
Gerard, when the translation team was founded. If you chose only some
Wikipedias, then let me tell you this is even worse than no
notification at all. The thing is, the wiki environment is by
definition decentralized and hard to follow for small communities. If
you choose to notify only big wikis or the wikis that already receive
support from the WMF (e.g. Arabic, Indic languages), you're basically
having no effect at all.
As to solutions: at Wikimania 2010, I proposed to the person
coordinating the Translation hub on Meta that an automated
notification system be set up for when a new translation request appears.
This hasn't happened and translations in most languages are still
happening by chance, as different users go by the translation hub.
I suggest you do the same, at least for Special:AdvanceTranslation
strings. Some time before a scheduled deployment (ideally, 7 days, but
at least 72 hours), notify _all_ the village pumps and some users
(either all the users who translated the advanced strings, or all the
users who sign up on some dedicated page) of the deployment. This
way, you don't have to wait for the consensus, but you're giving
communities an opportunity to reach it.
The sign-up system might seem complicated, but it works: we use it on
ro.wp for unblocking requests and for responses to warnings, and we're
handling all unblocking requests in a matter of minutes, versus days
before the bot warnings existed.
Another way to go is to have regular deployments as a rule. Right now,
you guys are saying something like "oh, we'll do it about every 3-4
months if the review queue is not too big, and if something goes wrong
we'll push it back 2 more weeks". Having the deployment schedule a year
in advance, including the features you're targeting in each deployment,
and enforcing a maximum delay of 1 week used only for deployment bugs
and nothing else, would greatly enhance predictability and give
translators a target to aim for.
Sorry for the long email,
Strainu
P.S. My "accusation" was actually an observation based on my past
interactions with you on this list and on bugzilla. I don't mean any
disrespect, but I stand by it.
Hi,
Before posting a bug for this, I thought I should pop a question to
see if somebody has done this already.
Some of the content partners I'm in contact with are not really happy
about being attributed on the talk page and/or the history of the
article. I was thinking that we could have an extension that could
handle this. The solution I came up with was:
* have some <attribution></attribution> tags in the article; place all
the required attribution text between those tags in plain wikitext
* when parsing these tags, don't include the result in the main
article div (mw-body), but rather create a new div below mw-body,
which will contain the attributions. This new div should be visually
different from the article.
I am aware this solution is prone to vandalism, but it is also a
simple way to include attributions without any additional knowledge
from the editors.
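For what it's worth, a sketch of how such a tag could be wired up using MediaWiki's standard parser tag-hook API (function names, the CSS class, and the exact output wrapping are all hypothetical; the skin/CSS work of moving the div below mw-body is not shown):

```php
<?php
// Sketch only: hypothetical extension wiring for an <attribution> tag.
$wgHooks['ParserFirstCallInit'][] = 'efAttributionSetup';

function efAttributionSetup( Parser $parser ) {
	$parser->setHook( 'attribution', 'efRenderAttribution' );
	return true;
}

function efRenderAttribution( $input, array $args, Parser $parser, PPFrame $frame ) {
	// Parse the enclosed wikitext, then wrap it in its own div so the
	// skin/CSS can style and position it separately from the article body.
	$html = $parser->recursiveTagParse( $input, $frame );
	return Html::rawElement( 'div', array( 'class' => 'mw-attribution' ), $html );
}
```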
So, my questions are:
1. Has somebody done a similar extension before? If yes, is it stable
enough to deploy on WMF wikis?
2. If not, would somebody on this list be willing to make it? I don't
feel my PHP is strong enough for me to take on this, but I will help
in any way I can.
Thanks,
Strainu
I whipped together a PHP script this afternoon that allows you to make
arbitrary queries for Gerrit change sets, and then perform bulk actions on
the resulting change sets from the command line. Currently it only supports
doing a bulk 'submit' (with approve +2 and verify +1), but it won't take
much to add other things (like abandon, verify, approve, etc.). Take a
look and feel free to comment/make changes:
http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/gerrit-dippybird/dipp…
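For comparison, the per-change equivalent with Gerrit's plain ssh CLI looks roughly like this (requires ssh access to the server; the change,patchset identifier is hypothetical):

```shell
# Illustration only; not runnable without access to the Gerrit server.
# Query open change sets for a project:
ssh -p 29418 gerrit.wikimedia.org gerrit query --format=JSON \
    status:open project:mediawiki/core
# Approve, verify and submit a single change set:
ssh -p 29418 gerrit.wikimedia.org gerrit review \
    --code-review +2 --verified +1 --submit 1234,1
```

The script's value is automating that second command over every change set a query returns.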
--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
Hi everyone,
During the 1.19 cycle, there were some namespace names that were changed
and then reverted shortly before deployment. We're about to deploy
1.20wmf1 to a lot of international wikis on Wednesday of next week, and I'm
a little worried that we may need to do a speedy revert again.
Two concerns:
1. Do the new namespace names have community consensus? What I've heard
through the grapevine is that yes, there has been an effort to get
consensus, but confirmation of that would be nice.
2. For each namespace that was changed, was there a corresponding alias
set up? Rumor has it that there might be a few cases where there wasn't.
I did a quick grep for recent changes to NS_* messages in hopes of finding
a few examples. Below is a list of changes in 2012 to NS_ messages.
We should get this sorted out before we push on Wednesday.
Thanks
Rob
23ea50c3 (Antoine Musso 2012-02-15 15:29:22 +0000 47) NS_USER =>
array( 'male' => 'Përdoruesi', 'female' => 'Përdoruesja' ),
23ea50c3 (Antoine Musso 2012-02-15 15:29:22 +0000 48) NS_USER_TALK =>
array( 'male' => 'Përdoruesi_diskutim', 'female' => 'Përdoruesja_diskutim'
),
dde3821a (Niklas Laxström 2012-03-28 13:41:19 +0000 56) NS_USER => array(
'male' => 'Suradnik', 'female' => 'Suradnica' ),
dde3821a (Niklas Laxström 2012-03-28 13:41:19 +0000 57) NS_USER_TALK =>
array( 'male' => 'Razgovor_sa_suradnikom', 'female' =>
'Razgovor_sa_suradnicom' ),
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 26) NS_MEDIA
=> 'Медиум',
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 45) 'Медија'
=> NS_MEDIA,
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 46) 'Специјални'
=> NS_SPECIAL,
b3664209 (Translation updater bot 2012-04-06 15:34:29 +0000 47) 'Слика'
=> NS_FILE,
80f29d01 ( Reedy 2012-04-07 21:03:44 +0100 47) 'Imagem' => NS_FILE,
3bc64a9f (Translation updater bot 2012-04-03 21:11:13 +0000 440) 'articlepage'
=> "'Content page' is used for NS_MAIN and any other non-standard namespace
and this message is only used in skins Nostalgia, Cologneblue and Standard
in the bottomLinks part.
659f18cc (Tim Starling 2012-01-02 22:54:57 +0000 108) NS_USER =>
array( 'male' => 'Участник', 'female' => 'Участница' ),
659f18cc (Tim Starling 2012-01-02 22:54:57 +0000 109) NS_USER_TALK =>
array( 'male' => 'Обсуждение_участника', 'female' => 'Обсуждение_участницы'
),
764b3492 (Niklas Laxström 2012-02-13 19:36:49 +0000 61) NS_USER =>
array( 'male' => 'Përdoruesi', 'female' => 'Përdoruesja' ),
764b3492 (Niklas Laxström 2012-02-13 19:36:49 +0000 62) NS_USER_TALK =>
array( 'male' => 'Përdoruesi_diskutim', 'female' => 'Përdoruesja_diskutim'
),
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 38) NS_FILE
=> 'Датотека',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 39) NS_FILE_TALK
=> 'Разговор_о_датотеци',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 40) NS_MEDIAWIKI
=> 'Медијавики',
f0ecaf0a (Niklas Laxström 2012-04-10 16:36:41 +0000 41) NS_MEDIAWIKI_TALK
=> 'Разговор_о_Медијавикију',
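A search like the one described above can be reconstructed along these lines (the exact command Rob used is not stated; this demonstrates git's pickaxe search on a throwaway repo standing in for mediawiki/core):

```shell
# Hypothetical reconstruction of the NS_* grep, on a self-contained demo repo.
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email dev@example.com
git config user.name dev
printf "NS_USER => 'Benutzer',\n" > MessagesDe.php
git add MessagesDe.php && git commit -qm "namespace change"
# -S finds commits that added or removed occurrences of the given string:
git log -S "NS_USER" --since=2012-01-01 --oneline -- MessagesDe.php
```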
I have seen that there are a lot of wikis where people are concerned about
inactive sysops. They have set up a strange rule where sysop rights are
removed from inactive users to improve security. However, the sysops are
allowed to request that the flag be restored at any time. This doesn't
improve security even a bit, since an attacker who got into one of the
inactive accounts could just post a request and get the sysop rights back,
just as if they had hacked an active user's account.
For this reason I think we should create a new extension for automatic
sysop removal, which would remove the flag from all users who haven't
logged in to the system for some time; if they logged back in, a
confirmation code would be sent to their email so that they could
reactivate the sysop account. This would be much simpler, and it would
actually make hacking into sysop accounts much harder. I also believe it
would be nice if the system sent an email to the holder of an account when
someone makes more than 5 failed login attempts, so that they are warned
that someone is likely trying to compromise their account.
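The proposal boils down to two small checks; a minimal sketch (function names and the one-year inactivity threshold are assumptions; only the 5-attempt limit comes from the text):

```python
from datetime import datetime, timedelta

# Assumed threshold for illustration; the email doesn't specify a duration.
INACTIVITY_LIMIT = timedelta(days=365)
# From the email: warn after "more than 5 bad login attempts".
MAX_FAILED_LOGINS = 5

def should_remove_sysop(last_login: datetime, now: datetime) -> bool:
    """Proposed rule: drop the sysop flag after long inactivity."""
    return now - last_login > INACTIVITY_LIMIT

def should_warn_owner(failed_attempts: int) -> bool:
    """Proposed rule: email the account holder after repeated failed logins."""
    return failed_attempts > MAX_FAILED_LOGINS
```

The email-confirmation step on re-login would sit on top of these checks, replacing the current "just ask for the flag back" path.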
Hello,
Whenever an action is made in Gerrit, a notification is sent to some IRC
channels. Previously we had to map each project to an IRC channel,
falling back to #mediawiki.
I have enhanced the Python script to support wildcards when filtering on
the Gerrit project name. Leslie Carr deployed the changes some minutes
ago, and now notifications from projects under:
- analytics and integrations are sent to #wikimedia-dev
- operations are sent to #wikimedia-operations
- labs to #wikimedia-labs
- mediawiki and mediawiki extensions to #mediawiki
Default is still #mediawiki
If someone needs more specific rules, you can tweak the filenames
dictionary in templates/gerrit/hookconfig.py.erb .
As an example, we might want to send MobileFrontend notifications to
#wikimedia-mobile .
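The mapping can be pictured roughly like this (a generic reconstruction, not the actual contents of hookconfig.py.erb: the dictionary name, pattern keys, and fnmatch-based matching are all assumptions):

```python
# Sketch of wildcard project-to-channel routing, as described above.
from fnmatch import fnmatch

# Pattern keys are illustrative; the real file maps Wikimedia's projects.
channels = {
    "analytics/*": "#wikimedia-dev",
    "integration/*": "#wikimedia-dev",
    "operations/*": "#wikimedia-operations",
    "labs/*": "#wikimedia-labs",
    "mediawiki/*": "#mediawiki",
}

def channel_for(project: str) -> str:
    """Return the IRC channel for a Gerrit project name."""
    for pattern, channel in channels.items():
        if fnmatch(project, pattern):
            return channel
    return "#mediawiki"  # default fallback

print(channel_for("operations/puppet"))  # prints #wikimedia-operations
```

A MobileFrontend rule would just be one more pattern, e.g. "mediawiki/extensions/MobileFrontend*" ahead of the general "mediawiki/*" entry.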
--
Antoine "hashar" Musso