Surprised? Me too!
Please read / watch / discuss
https://www.mediawiki.org/wiki/Summer_of_Code_2013
*Nothing* about GSOC 2013 is confirmed at this point, but there is no
harm in starting to collect ideas and recruit participants.
Your feedback is welcome at the wiki page - or here if you are really
really lazy. Reason: potential participants visiting that page in the
near future will have an easier time following background discussions if
they take place there.
Thank you!
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
Hi Platonides
2013/1/15 Platonides <Platonides(a)gmail.com>:
>> I tried to get the content via getArticleID() ...
>> $titleObj=Title::newFromText("Existing page");
>> $articleID=$titleObj->getArticleID();
>> Article::newFromID($articleID)->fetchContent();
>> etc.
>> ... but it returns $articleID=0 although the page exists. With MW 1.18
>> this approach worked fine, but after upgrading to MW 1.20.2 it does not
>> work any more.
>
>
> It should be working, and it works for me on 1.20.2
> Can you provide more details on that $title->getArticleID(); which is
> not working?
On http://offene-naturfuehrer.de/web/Spezial:MobileKeyV1
I want to generate a MobileKey for "Lamium (Deutschland)" (the page
exists) and the PHP code wants to get the content of the page "Unterlippe
(Lamiaceae)" (that page exists too), but the PHP code above stops at the
step of retrieving the article ID. The strange thing is: when I want to
generate a MobileKey for "Lamium (Deutschland)" and just print out
(1) Title::newFromText("Unterlippe (Lamiaceae)") and
(2) Title::newFromText("Lamium (Deutschland)")
... I only get an articleID for (2), not for (1).
Andreas
--- printout of title object for page (1) "Unterlippe (Lamiaceae)"
Title Object
(
[mTextform] => Unterlippe (Lamiaceae)
[mUrlform] => Unterlippe_(Lamiaceae)
[mDbkeyform] => Unterlippe_(Lamiaceae)
[mUserCaseDBKey] => Unterlippe_(Lamiaceae)
[mNamespace] => 0
[mInterwiki] =>
[mFragment] =>
[mArticleID] => 0
[mLatestID] =>
[mEstimateRevisions:Title:private] =>
[mRestrictions] => Array
(
)
[mOldRestrictions] =>
[mCascadeRestriction] =>
[mCascadingRestrictions] =>
[mRestrictionsExpiry] => Array
(
)
[mHasCascadingRestrictions] =>
[mCascadeSources] =>
[mRestrictionsLoaded] =>
[mPrefixedText] =>
[mTitleProtection] =>
[mDefaultNamespace] => 0
[mWatched] =>
[mLength] => -1
[mRedirect] =>
[mNotificationTimestamp] => Array
(
)
[mHasSubpage] =>
)
--- printout of title object for page (2) "Lamium (Deutschland)"
Title Object
(
[mTextform] => Lamium (Deutschland)
[mUrlform] => Lamium_(Deutschland)
[mDbkeyform] => Lamium_(Deutschland)
[mUserCaseDBKey] => Lamium_(Deutschland)
[mNamespace] => 0
[mInterwiki] =>
[mFragment] =>
[mArticleID] => 36
[mLatestID] =>
[mEstimateRevisions:Title:private] =>
[mRestrictions] => Array
(
)
[mOldRestrictions] =>
[mCascadeRestriction] =>
[mCascadingRestrictions] =>
[mRestrictionsExpiry] => Array
(
)
[mHasCascadingRestrictions] =>
[mCascadeSources] =>
[mRestrictionsLoaded] =>
[mPrefixedText] =>
[mTitleProtection] =>
[mDefaultNamespace] => 0
[mWatched] =>
[mLength] => -1
[mRedirect] =>
[mNotificationTimestamp] => Array
(
)
[mHasSubpage] =>
)
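(Not part of Andreas's original message, but a minimal diagnostic sketch of what one might run, e.g. via maintenance/eval.php, to see whether the page row can be found at all for that exact title. The title string is taken from printout (1); the direct page-table lookup is only an illustrative way to bypass the Title/LinkCache layer.)

// Hypothetical diagnostic snippet, assuming a working MediaWiki 1.20 install.
$title = Title::newFromText( 'Unterlippe (Lamiaceae)' );
var_dump( $title->exists() );   // false whenever getArticleID() returns 0
var_dump( $title->getDBkey() ); // compare against page_title in the page table

// Look the row up directly in the page table, bypassing the caches:
$dbr = wfGetDB( DB_SLAVE );
$pageId = $dbr->selectField(
    'page',
    'page_id',
    array(
        'page_namespace' => $title->getNamespace(),
        'page_title'     => $title->getDBkey(),
    ),
    __METHOD__
);
var_dump( $pageId ); // a real id here, but 0 above, would point at caching rather than a missing row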
Hey,
I have a custom action in which I construct HTML to display by rendering
some user-provided wikitext and some programmatically built chunks of HTML. It
looks like this:
$html = 'some html';
$html .= $this->getOutput()->parse( 'wiki text' );
$html .= 'more html';
$html .= 'even more html';
These last two segments of HTML start with an h2, which I would like to show
up in the same TOC as the one rendered from the parsed wikitext. I tried some
things but did not find a way to make this work, so I'm looking for some pointers.
Note: this HTML construction is happening inside a function that should
return HTML and not have side effects (so its result can be cached). It is
thus not possible to do anything in the direction of
$this->getOutput()->addHTML().
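(A sketch of one possible direction, not from the original thread: the TOC is built from headings the parser sees, so raw <h2> markup appended after parse() is never registered in it. A workaround is to feed the extra headings to the parser as wikitext, with unique placeholder strings where the programmatically built HTML should go, and splice that HTML back in afterwards. The heading titles and placeholder names below are purely illustrative.)

$wikitext  = 'wiki text';
$wikitext .= "\n\n== More stuff ==\nMORE-HTML-PLACEHOLDER";
$wikitext .= "\n\n== Even more stuff ==\nEVEN-MORE-HTML-PLACEHOLDER";

$html  = 'some html';
$parsed = $this->getOutput()->parse( $wikitext );
// The parser will wrap the placeholders in <p> tags; account for that if it matters.
$parsed = str_replace( 'MORE-HTML-PLACEHOLDER', 'more html', $parsed );
$parsed = str_replace( 'EVEN-MORE-HTML-PLACEHOLDER', 'even more html', $parsed );
$html .= $parsed;

Since this only uses the parse() result plus plain string operations, there are no OutputPage side effects, so the result should stay cacheable.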
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
GNUnify[1] is a popular FOSS conference that happens in India every year.
Last year WMF had a pretty heavy presence at GNUnify. This year's CFP just
opened up. Are there existing plans for the WMF to participate?
[1]: http://gnunify.in/
--
Yuvi Panda T
http://yuvi.in/blog
All,
We will be proceeding with the datacenter switchover plan this coming
Tuesday (Jan 22, 2013), unless we discover some unexpected and
insurmountable issues in our tests between now and then.
During the 8-hour migration window on the 22nd, 23rd and 24th (from 17:00
UTC to 01:00 UTC / 9am to 5pm PST), there will be times (each lasting
about 30 minutes) when the site will be set to "read-only" mode, to
facilitate master database switchovers from one datacenter to another.
While the site should remain available to readers, no new content can be
created, edited or uploaded.
We are aware of the inconvenience and we have put together plans to
minimize such annoyances, e.g., automating much of the procedures,
mitigating known risks, and performing tests to identify issues prior to
deployment. Given the scale and complexity of this migration, we do realize
that not all operational impact is predictable. Unfortunately, some users could
experience intermittent site unavailability and/or performance issues.
You can follow the migration on chat.freenode.net
(and not irc.freenode.org as mentioned in the previous email) in the
#wikimedia-operations channel.
Thanks,
CT Woo
---------- Forwarded message ----------
From: Ct Woo <ctwoo(a)wikimedia.org>
Date: Fri, Jan 11, 2013 at 12:07 PM
Subject: Update on Ashburn data center switchover / migration – target date
is week of 1/22/13
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, Development and
Operations Engineers <engineering(a)lists.wikimedia.org>
All,
The Migration team is on the last lap of completing the remaining tasks to
ready our software stack and Ashburn infrastructure for the big switchover
day.
Per my last update
<http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/063668.html>,
with the Fundraising activity behind us now, the team has scheduled the *week
of 22nd January, 2013* to perform the switchover. We are going to block an
8-hour migration window on the *22nd, 23rd and 24th*. During those
periods, *17:00 UTC to 01:00 UTC (9am to 5pm PST)*, there will be
intermittent blackouts, and they will be treated as 'planned' outages. You
can follow the migration on irc.freenode.org in the #wikimedia-operations
channel.
The team is putting the finishing touches on the last few tasks, and we will
make the final Go/No-Go decision on 18th Jan, 2013. An update will be sent out
then. For those interested in tracking the progress, the meeting notes are
captured on this wikitech page
<http://wikitech.wikimedia.org/view/Eqiad_Migration_Planning#Improving_Switc…>.
*Please note that we will be restricting code deployments during that week,
allowing only emergency and critical ones.*
Thanks.
CT Woo
I realize this list has heard the news already, but I thought I'd share the
full announcement nonetheless.
---------- Forwarded message ----------
Posted today on the Wikimedia Tech Blog:
https://blog.wikimedia.org/2013/01/19/wikimedia-sites-move-to-primary-data-…
Wikimedia sites to move to primary data center in Ashburn, Virginia
Next week, the Wikimedia Foundation will transition its main technical
operations to a new data center in Ashburn, Virginia, USA. This is intended
to improve the technical performance and reliability of all Wikimedia
sites, including Wikipedia.
Engineering teams have been preparing for the migration to minimize
inconvenience to our users, but major service disruption is still expected
during the transition. Our sites will be in read-only mode for some time,
and may be intermittently inaccessible. Users are advised to be patient
during those interruptions, and share information
<https://meta.wikimedia.org/wiki/Wikimedia_maintenance_notice> in
case of continued outage or loss of functionality.
The current target windows for the migration are January 22nd, 23rd and
24th, 2013, from 17:00 to 01:00 UTC (see other timezones
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=Wikimedia+data+cen…>
on timeanddate.com).
Wikimedia sites have been hosted in our main data center in Tampa, Florida,
since 2004; before that, the couple of servers powering Wikipedia were in
San Diego, California. Ashburn is the third and newest primary data center
to host Wikimedia sites.
A major reason for choosing Tampa, Florida as the location of the primary
data center in 2004 was its proximity to founder Jimmy Wales’ home, at a
time when he was much more involved in the technical operations of the
site. In 2009, the Wikimedia Foundation’s Technical Operations team started
to look <https://blog.wikimedia.org/2009/04/07/wmf-needs-additional-datacenter-space/> for
other locations with better network connectivity and more clement
weather. Located in the Washington, D.C. metropolitan area, Ashburn offers
faster and more reliable connectivity than Tampa, and usually fewer
hurricanes.
The Operations team started to plan and prepare for the Virginia data
center in Summer 2010. The actual build-out and racking of servers at the
colocation facility started in February 2011, and was followed by a long
period of hardware, system and software configuration. Traffic started to
be served to users from the Ashburn data center in November 2011, in the
form of CSS and JavaScript assets (served from “bits.wikimedia.org”).
We reached a major milestone in February 2012, when caching servers were
set up to handle read-only requests for Wikipedia and Wikimedia content,
which represent most of the traffic to Wikipedia and its sister sites. In
April 2012, the Ashburn data center also started to serve media files (from
“upload.wikimedia.org”).
Cacheable requests represent about 90 percent of our traffic, leaving 10
percent that requires interaction with our web (Apache) and database
(MySQL) servers, which are still being hosted in Tampa. Until now, every
edit made to a Wikipedia page has been handled by the servers in Tampa.
This dependency on our Tampa data center was responsible for the site
outage in August
2012 <https://blog.wikimedia.org/2012/08/06/wikimedia-site-outage-6-august-2012/>,
when a fiber cut severed the connection between our two locations.
Starting next week, the new servers in Ashburn will take on that role as
well, and all our sites will be able to function fully without relying on
the servers in Florida. The legacy data center in Tampa will continue to be
maintained, and will serve as a secondary “hot failover” data center:
servers will be in standby mode to take over, should the primary site
experience an outage. Server configuration and data will be synchronized
between the two locations to ensure a transition as smooth as possible in
case of technical difficulties in Ashburn.
Besides just installing newer hardware, setting up the data center in
Ashburn has also been an opportunity for architecture overhauls, like
incremental improvements of the text storage
system <https://blog.wikimedia.org/2011/11/18/nobody-notices-when-its-not-broken-ne…>,
and the move to an entirely new media storage
system <https://blog.wikimedia.org/2012/02/09/scaling-media-storage-at-wikimedia-wi…> to
keep up with the growth of the content generated and curated by our
contributors.
Wikimedia’s technical infrastructure aims to be as open and collaborative
as the sites it powers. Most of the configuration of our
servers <https://blog.wikimedia.org/2011/09/19/ever-wondered-how-the-wikimedia-serve…> is
publicly accessible, and the Wikimedia
Labs <https://blog.wikimedia.org/2012/04/16/introduction-to-wikimedia-labs/> initiative
allows contributors to test and submit improvements to the
sites’ configuration files.
The Wikimedia Foundation currently operates a total of about 885 servers,
and serves about 20 billion page views a month, on a non-profit budget that
relies almost entirely on donations from readers.
--
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation
https://donate.wikimedia.org
Hey,
Who is the maintainer of the TorBlock extension (or rather, is anybody
interested in reviewing patchsets for it)? I have a few changes pending.
--
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com
| tylerromeo(a)gmail.com
Hi all,
This came up on the repo-discuss mailing list[0], and I thought
some people may find the material interesting. Please note
it's a work in progress, but I'm sure they'd love feedback.
-Chad
[0] https://groups.google.com/d/topic/repo-discuss/rdmzByl3_M4/discussion
---------- Forwarded message ----------
From: Edwin Kempin <edwin.kempin(a)gmail.com>
Date: Fri, Jan 18, 2013 at 11:21 AM
Subject: Gerrit Training Material
To: Repo and Gerrit Discussion <repo-discuss(a)googlegroups.com>
Hi,
At SAP we are doing a lot of training sessions for users who are new to Git
and Gerrit. We normally do a 1-day hands-on workshop where participants
learn how to work with EGit and Gerrit. There are exercises so that
everybody can get familiar with the tools and the workflows.
The basic Git and Gerrit concepts are explained in detail, and
additional background information about code review and best
practices is provided.
We have now made this training material and other Gerrit-related
presentations available in a new Gerrit project called
'training/gerrit' [1,2]. We hope that this is of value to others and
want to invite everyone who is interested to help us improve the
training material over time. Maybe you already have training
material of your own that you would like to share.
There is a second new project, 'training/sample' [3], which contains the
sample application that is used as the basis for the exercises in the
Git/Gerrit workshop.
Edwin
[1] https://gerrit-review.googlesource.com/#/admin/projects/training/gerrit
[2] https://gerrit-review.googlesource.com/41390
[3] https://gerrit-review.googlesource.com/#/admin/projects/training/sample
--
To unsubscribe, email repo-discuss+unsubscribe(a)googlegroups.com
More info at http://groups.google.com/group/repo-discuss?hl=en
Hi,
I would like to mention here as well that we have moved the source code to
GitHub this week.
In case anyone is interested in improving Huggle or joining the project,
you are welcome to do so: https://github.com/benapetr/huggle
Please note that the csharp branch contains the latest version, which is the
one you probably want to work on. Trunk is Huggle 2.x.
Have fun