Hello,
PHP 5.3.0, released in June 2009, introduced namespaces, a feature we
have never used because we were still supporting 5.2.
Jeroen submitted https://gerrit.wikimedia.org/r/14181, which uses
namespaces. Since that is, to my knowledge, the first patch to
introduce them, I am opening this thread so we can discuss the
introduction of namespaces in MediaWiki.
PHP doc http://php.net/manual/en/language.namespaces.php
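For anyone who has not used the syntax yet, a minimal sketch (the
names below are made up for illustration, not taken from the patch):

  <?php
  // Declare that everything in this file lives under a namespace.
  // "MediaWiki\Storage" and "PageStore" are hypothetical names.
  namespace MediaWiki\Storage;

  class PageStore {
      public function getName() {
          return 'example';
      }
  }

  // A caller in another file imports the class with "use":
  // use MediaWiki\Storage\PageStore;
  // $store = new PageStore();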
Thoughts?
--
Antoine "hashar" Musso
Good morning,
I've seen that Phabricator has been tested and considered as a code
review solution. It also seems there is a consensus that it didn't
meet the need, but the page
http://www.mediawiki.org/wiki/Phabricator doesn't list the issues.
Could you give me the main reasons why Phabricator isn't a fit for a
pre-commit code review solution, so I can first update the page and
secondly reconsider my decision to recommend Phabricator to an open
source project willing to use a Wikimedia-like Git branches/Gerrit
review workflow?
Thank you.
--
Best Regards,
Sébastien Santoro aka Dereckson
http://www.dereckson.be/
For use in our monthly report, due to come out tomorrow
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/June
I'd like to know how many unique contributors ("owners") had commits
merged into the mediawiki & mediawiki/* Gerrit projects between June
1-30 inclusive. I've had luck using "age:4d -age:34d status:merged
project:^mediawiki.* -owner:L10n-bot" as a search on
https://gerrit.wikimedia.org to get a big paginated table of all the
commits (and then I figure I'd look for all the unique owner names and
count them), but when I try that on the command line as
ssh -p 29418 gerrit.wikimedia.org gerrit query 'age:4d -age:34d
status:merged project:^mediawiki.* -owner:L10n-bot'
I get the error "fatal: "-age:34d" is not a valid option".
I'll accept either help in running this query correctly, so I get the
giant table on the command line and can gin up the stats myself, or I
will simply accept a number if you want to do my homework for me. :-)
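For what it's worth, one possible approach, assuming the problem is
that the remote shell re-splits the query so gerrit sees "-age:34d" as
an option flag rather than a search term: an inner layer of quoting
keeps the query a single argument, and --format=JSON makes counting
owners easy. A sketch (untested, and gerrit may cap the number of
results per query):

  <?php
  // Run the Gerrit query over SSH; the inner single quotes survive the
  // remote shell, so "gerrit query" receives the query as one argument.
  $query = 'age:4d -age:34d status:merged project:^mediawiki.* -owner:L10n-bot';
  $cmd = 'ssh -p 29418 gerrit.wikimedia.org '
      . escapeshellarg( "gerrit query --format=JSON '$query'" );

  // --format=JSON emits one JSON object per change, plus a trailing
  // stats object that has no "owner" key and is skipped below.
  $owners = array();
  foreach ( explode( "\n", trim( shell_exec( $cmd ) ) ) as $line ) {
      $change = json_decode( $line, true );
      if ( isset( $change['owner']['name'] ) ) {
          $owners[$change['owner']['name']] = true;
      }
  }
  echo count( $owners ) . " unique owners\n";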
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi,
Just giving everyone a heads-up that we'll be upgrading Gerrit this evening
from 2.3 to 2.4.2.
We're not expecting any extensive downtime for Gerrit, but we've scheduled
a one-hour window from 23:00-00:00 UTC tonight (7-8pm EDT, 4-5pm PDT).
The big feature I'm excited about in 2.4 is the new "Rebase" button, which
will hopefully make it easier to rebase your changes against your branch
without having to download the change first.
If you're interested, the release notes for 2.4 and 2.4.1 are:
http://gerrit-documentation.googlecode.com/svn/ReleaseNotes/ReleaseNotes-2.…
http://gerrit-documentation.googlecode.com/svn/ReleaseNotes/ReleaseNotes-2.…
2.4.2 was a security release and only fixed one issue, but here's the notice
for that: https://groups.google.com/d/msg/repo-discuss/CTFM8KTIe34/otuuE74s__wJ
I'll send a reminder (and spam IRC) right before we begin the upgrade
process, and again when we're done. Please let me know if you have
any questions.
-Chad
Hey,
I am looking into generalizing how interwiki and interlanguage links
are stored. The motivation is that the current setup cannot hold the
information needed to make Wikidata work. Before I continue work on
this (i.e. figuring out how best to modify core code without breaking
existing features and tools), I'd like some feedback on the schema
changes I'm thinking of making.
They are described here:
https://www.mediawiki.org/wiki/User:Jeroen_De_Dauw/Wikibase_sites_and_site_…
Some important changes to interwiki/sites compared to the current
setup (a concrete sketch follows the list):
* Allows having "interlanguage links" that are not "interwiki links"
* Allows automatic synchronization of site definitions between all
clients (i.e. all Wikipedias).
* Site definitions can exist that are not used as "interlanguage link" and
not used as "interwiki link"
* Allows distinguishing between local and global identifiers
* Maps between global and local identifiers
* Holds information on the type of site. This way code can know that
bugzilla does not have a MW api.php :)
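To make that concrete, a site definition under this scheme might look
roughly like the following (the field names are illustrative only; see
the linked page for the actual proposal):

  <?php
  // Hypothetical site definition; field names are illustrative, not
  // the actual proposed schema.
  $site = array(
      'global_id' => 'enwiki',       // globally unique identifier
      'local_id'  => 'en',           // identifier as used on this wiki
      'type'      => 'mediawiki',    // so code knows api.php exists
      'url'       => 'https://en.wikipedia.org/wiki/$1',
      'interwiki'     => true,       // usable as an interwiki link
      'interlanguage' => true,       // usable as an interlanguage link
  );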
The change to the langlinks table is needed to distinguish between
links defined in local articles and links pulled from an external
source, so that the code rebuilding links for an article knows to
ditch the local links no longer in the article but keep the external
ones. This is a table with lots of rows. Does anyone have an estimate
of how long it would take to add a boolean field with initial value
true?
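The schema change itself could be wired up through the regular updater,
along these lines (a sketch; the column name "ll_local" and the patch
file are hypothetical, and the real question remains how long the
ALTER TABLE would run on a table this size):

  <?php
  // Sketch: registering the new langlinks column from an extension via
  // the existing LoadExtensionSchemaUpdates hook. Names are hypothetical.
  function efSitesSchemaUpdates( DatabaseUpdater $updater ) {
      $updater->addExtensionField( 'langlinks', 'll_local',
          __DIR__ . '/sql/patch-langlinks-ll_local.sql' );
      return true;
  }
  $wgHooks['LoadExtensionSchemaUpdates'][] = 'efSitesSchemaUpdates';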
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Hi all
PHPUnit tests that use the database should use @group Database. But that makes
them extremely slow. I'd like to alleviate this problem. Here's the situation:
@group Database does two things, as far as I understand:
* MediaWikiTestCase will notice this group and use temporary tables instead of
the wiki database's actual tables. The temporary tables are re-created for every
test. This protects the wiki from modifications by test cases, and
isolates tests. So far, so good.
* Jenkins will do one run for tests with @group Database, and a separate run for
tests *without* @group Database - the latter run happens without any
database connection. Any test that uses the database in any way, even if only
indirectly and for reading, will fail if it doesn't declare @group Database. Or
so it seems.
There are a number of test cases (and there could and should be many many more)
that use the database for reading only. It would be good to have a @group
DatabaseRead that does not enforce the slow and expensive creation of temporary
tables, but allows the tests to be run with a database connection in place. This
run could use a connection with a database user that has read-only rights to the
database.
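A test under the proposed group could then look like this (a sketch
only; "DatabaseRead" does not exist yet):

  <?php
  /**
   * Sketch of a read-only database test under the proposed group.
   * @group DatabaseRead
   */
  class ReadOnlyExampleTest extends MediaWikiTestCase {
      public function testMainPageIsKnown() {
          // Reads from the database but never writes, so no temporary
          // tables would need to be created for it.
          $title = Title::newFromText( 'Main Page' );
          $this->assertTrue( $title->isKnown() );
      }
  }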
If we decide to do this, there should be some way to define and re-create the
initial contents of the read-only database. Perhaps a maintenance script that
other parts of the code (and extensions) can register with could do the trick.
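For instance, extensions could register their fixtures through a new
hook, along these lines (entirely hypothetical; no such hook exists
today):

  <?php
  // Hypothetical hook for registering read-only fixture data; neither
  // the hook name nor the fixture file exists in MediaWiki today.
  $wgHooks['RegisterReadOnlyTestFixtures'][] = function( array &$fixtures ) {
      $fixtures[] = __DIR__ . '/tests/fixtures/my-extension-pages.sql';
      return true;
  };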
What do you think? Is this feasible?
-- daniel
All,
The Technical Operations team noticed abnormal network packet loss
sometime after yesterday's 'leap second' switch (midnight UTC). While it
does not seem to impact site availability at this moment, it is a
concern. We are still not sure whether it is even related to the 'leap
second' switch.
Leslie has opened a ticket with our network equipment vendor and,
together with Mark, has been working with them to pinpoint the problem
since this morning. It is possible that the troubleshooting process
will introduce some latency or other issues.
If you do experience anything abnormal, please let us know (email to
ops(a)wikimedia.org or find us at the #wikimedia-operations IRC channel).
Thanks,
CT
Hello Everyone,
Good day!
When we held our first official out-of-town outreach project in Naga
City, about 300 km south of Manila, we encountered a problem when 50+
students, librarians and professors attempted to create accounts on the
Bikol Wikipedia. Apparently it only allows 6 account creations per IP
in 24 hours.
Is there a creative way to avoid the same issue in our future outreach
projects, say when one school has one public IP address but many people
attempt to register?
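For context, that limit comes from MediaWiki's existing
$wgAccountCreationThrottle setting (Wikimedia wikis set it to 6; the
default is 0, meaning unlimited). A sysadmin could in principle relax
it for a known event IP, along these lines (a sketch; the IP is a
placeholder):

  <?php
  // In LocalSettings.php: raise the per-IP account creation limit for
  // a known outreach-event IP. 203.0.113.50 is a placeholder address.
  if ( $_SERVER['REMOTE_ADDR'] === '203.0.113.50' ) {
      $wgAccountCreationThrottle = 100;
  }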
Also, would it be possible to have client software for offline sandbox
editing, for those who want to learn to edit Wikipedia but have no
internet connection?
There are places we will visit that have computers but no stable
internet connection, yet people willing to contribute to Wikipedia.
Could they, say, edit offline, export the result as XML, and then
upload those articles to Wikipedia in bulk?
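On the bulk-upload idea: MediaWiki already has XML import machinery
that could, in principle, ingest offline edits. Roughly (a sketch, run
from a maintenance context on the target wiki; the file name is a
placeholder):

  <?php
  // Sketch: feeding an offline-produced XML dump into MediaWiki's
  // existing importer. "offline-edits.xml" is a placeholder file name.
  $status = ImportStreamSource::newFromFile( 'offline-edits.xml' );
  if ( $status->isOK() ) {
      $importer = new WikiImporter( $status->value );
      $importer->doImport();
  }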
Your inputs are valuable to us.
--
Roman "Butch" Bustria Jr.
Vice President (2012-2013)
Wikimedia Philippines Inc.