An automated run of parserTests.php showed the following failures:
Running test TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test TODO: Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test TODO: Template with thumb image (with link in description)... FAILED!
Running test Template infinite loop... FAILED!
Running test TODO: message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test TODO: message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test TODO: HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test TODO: HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test TODO: Parsing optional HTML elements (Bug 6171)... FAILED!
Running test TODO: Inline HTML vs wiki block nesting... FAILED!
Running test TODO: Mixing markup for italics and bold... FAILED!
Running test TODO: 5 quotes, code coverage +1 line... FAILED!
Running test TODO: HTML Hex character encoding.... FAILED!
Running test TODO: dt/dd/dl test... FAILED!
Passed 412 of 429 tests (96.04%) FAILED!
Just a heads-up; we're planning to move the backend hosting for Japanese,
Korean, Malay, and Thai Wikipedias from our Korean server cluster (yaseo) back
to the main servers in Florida (pmtpa) later this week.
They were hosted on the Korean servers for about a year as an experiment in
multi-cluster hosting, but in the end we're going to move them back. There are
several reasons for this:
* More reliable management.
Experience has shown that software updates and other changes are more likely to
cause problems on the second cluster, which receives less attention from sysadmins.
These wikis will no longer be second-class citizens where math or timelines sit
broken for weeks after a change.
* More efficient use of resources.
As traffic increases, the limited number of servers we have in the yaseo cluster
is less able to handle the load. They now frequently overload during peak
access hours in Asia, and we can't add new servers there as easily as we can in Florida.
Sharing the larger pool of application servers in Florida, and reusing the
remaining Korean machines for additional proxy caches, should provide better
performance during peak hours in Asia.
* Single-login preparation.
Hosting all the wikis together will let our unified login system work for these
wikis along with the others when we finally start it up.
Expected changes and disruptions:
There will be some read-only time for the wikis being moved later this week
during the actual transition, but other wikis should not be affected. We will
try to do this during off-peak hours in Asia.
URLs for uploaded images will change to use upload.wikimedia.org as on other
wikis; old URLs will redirect for compatibility so this should not disrupt anything.
Performance for the moved wikis should be better overall, though off-peak access
for logged-in users may be slightly slower due to lightspeed delays between
Korea and Florida.
-- brion vibber (brion @ pobox.com)
I tried to load Tim Starling's ParserFunctions hooks in MediaWiki 1.7, but they don't work. I put the required line at the bottom of my LocalSettings.php file and copied the two necessary scripts to the extensions subdirectory, but when I load a page that uses these hooks (such as India), the {{#if statements are not processed. Does anybody know what I'm doing wrong?
http://meta.wikimedia.org/wiki/ParserFunctions
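For reference, extensions of that era are loaded with a require_once line in LocalSettings.php; a sketch follows, where the exact path and filename are assumptions that depend on where the ParserFunctions files were actually copied:

```php
<?php
# At the bottom of LocalSettings.php (before any closing ?> tag, if present).
# Path is an assumption - adjust to wherever the extension files were copied.
require_once( "$IP/extensions/ParserFunctions/ParserFunctions.php" );
```

If the line is present but {{#if: ...}} still renders as literal text, a cached copy of the page may be to blame; purging the page (action=purge) after enabling the extension is worth trying.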
Mike O
Sorry - here are the queries I was playing with. I had intended to send a
rather more detailed and thought-out analysis, but I haven't had time in the
last couple of days to play with it much more. These times are from running
the queries in phpMyAdmin on my home computer, which wasn't doing anything
else strenuous at the time. As you can see, I used count ... group by instead
of a join. They could be equivalent inside MySQL, but it doesn't seem like it.
I tried doing some large joins last night and they never returned any results,
but that could be an instability in my computer - I need to try it again.
Just to get this out of the way: these numbers are, of course, only meaningful
relative to one another, but I was thinking of comparing the times to return
intersections against the time it takes to do something whose cost is known on
Wikipedia (say, getting the first 200 results of the "Living_people" category)
and trying to extrapolate a meaningful estimate.
Showing rows 0 - 15 (16 total, Query took 0.3316 sec)
SQL query: SELECT cl_sortkey, count( * ) AS catcount
FROM `categorylinks`
WHERE `cl_to` = 'Fantasy_films'
OR `cl_to` = 'Disney_films'
GROUP BY cl_sortkey
HAVING catcount =2
LIMIT 0 , 30
Showing rows 0 - 9 (10 total, Query took 0.4407 sec)
SQL query: SELECT cl_sortkey, count( * ) AS catcount
FROM `categorylinks`
WHERE `cl_to` = 'United_States_Army_soldiers'
OR `cl_to` = 'German-Americans'
GROUP BY cl_sortkey
HAVING catcount =2
LIMIT 0 , 30
Showing rows 0 - 19 (20 total, Query took 0.8908 sec)
SQL query: SELECT cl_sortkey, count( * ) AS catcount
FROM `categorylinks`
WHERE `cl_to` = 'Drama_films'
OR `cl_to` = 'World_War_II_films'
GROUP BY cl_sortkey
HAVING catcount =2
LIMIT 0 , 30
Showing rows 0 - 29 (2,169 total, Query took 36.3109 sec)
SQL query: SELECT cl_sortkey, count( * ) AS catcount
FROM `categorylinks`
WHERE `cl_to` = 'Prisoners_of_war'
OR `cl_to` = 'Living_people'
GROUP BY cl_sortkey
HAVING catcount =2
LIMIT 0 , 30
Showing rows 0 - 1 (2 total, Query took 8.6470 sec)
SQL query: SELECT cl_sortkey, count( * ) AS catcount
FROM `categorylinks`
WHERE `cl_to` = 'Articles_with_unsourced_statements'
OR `cl_to` = 'American_World_War_II_veterans'
GROUP BY cl_sortkey
HAVING catcount =2
LIMIT 0 , 30
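To make the comparison with a join concrete, here is a minimal sketch of the two query shapes side by side. It uses SQLite and a few made-up rows rather than the real MySQL categorylinks table, so it only illustrates that the two formulations select the same pages, not how they perform at Wikipedia scale. Note that the HAVING catcount = 2 trick assumes each (page, category) pair appears at most once.

```python
import sqlite3

# Toy stand-in for the categorylinks table (the real one is MySQL and far
# larger, so none of the timings above transfer to this sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE categorylinks (cl_sortkey TEXT, cl_to TEXT)")
conn.executemany(
    "INSERT INTO categorylinks VALUES (?, ?)",
    [
        ("Aladdin", "Fantasy_films"),
        ("Aladdin", "Disney_films"),
        ("Willow", "Fantasy_films"),
        ("Bambi", "Disney_films"),
    ],
)

# Shape 1: count ... GROUP BY ... HAVING, as in the queries above.
having = conn.execute(
    """SELECT cl_sortkey, COUNT(*) AS catcount
       FROM categorylinks
       WHERE cl_to = 'Fantasy_films' OR cl_to = 'Disney_films'
       GROUP BY cl_sortkey
       HAVING catcount = 2"""
).fetchall()

# Shape 2: the equivalent self-join intersection.
join = conn.execute(
    """SELECT DISTINCT a.cl_sortkey
       FROM categorylinks a
       JOIN categorylinks b ON a.cl_sortkey = b.cl_sortkey
       WHERE a.cl_to = 'Fantasy_films' AND b.cl_to = 'Disney_films'"""
).fetchall()

print(having)  # [('Aladdin', 2)]
print(join)    # [('Aladdin',)]
```

The self-join form lets the optimizer drive the lookup from the smaller category, which is one reason the two shapes can behave very differently at scale even though they return the same rows.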
Whoops, wrong list - will resend to mediawiki-l.
------- Forwarded message follows -------
Date sent: Fri, 29 Sep 2006 18:18:16 +0200
From: Tsahi Asher <t_asher(a)netvision.net.il>
Subject: [Wikitech-l] can't save messages
To: wikitech-l(a)wikipedia.org
Send reply to: Wikimedia developers <wikitech-l(a)wikimedia.org>
Priority: normal
hi,
I've just upgraded from version 1.3.9 to 1.5.8, and since the upgrade I can't save
pages. When I click the Save button, I get back to the edit page. Most of the
time it only works after I click Save three to ten times, or even more.
If it's of any relevance, my host (sourceforge.net) doesn't allow web
applications to write to the web space, but that was the case with the previous
version too.
thanks,
tsahi
--
Tsahi Asher
Mozilla Hebrew L10n Team
http://www.mozilla.org.il
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
------- End of forwarded message -------
Hi,
I've written an extension for uploading multiple files at once, if
anyone is interested, the details can be found here:
http://www.wikihow.com/WikiHow:MultipleUpload-Extension
If anyone wants to add it to SVN, that'd be great.
Thanks,
Travis
Hoi,
On several projects, groups are fighting each other because the project was
envisioned to cover only one specific orthography or script. In my opinion it
is much better to resolve these issues first and to be sure that no such fight
will arise in the future.
Thanks,
GerardM
On 9/29/06, Tim Starling <tstarling(a)wikimedia.org> wrote:
>
> GerardM wrote:
> > Hoi,
> > Many people are in favour of adding Gilaki to the list of languages.. There
> > is one issue outstanding as far as I am concerned. At WiktionaryZ the Babel
> > templates are in an Arabic script.. The first Gilaki word however is in both
> > the Latin and this Arabic script.
> >
> > Another issue is that the language is called Gilaki and only Wikipedia calls
> > it Gileki ..
> >
> > It would be nice to have these issues resolved first..
>
> Couldn't we create it first and resolve the issues later? Unless you are
> suggesting that the existence of these issues implies that the proposers
> don't know anything about the language, I don't see any problem with
> changing either the default text direction or the language name after the
> wiki is created.
>
> -- Tim Starling
>
> _______________________________________________
> Wikipedia-l mailing list
> Wikipedia-l(a)Wikimedia.org
> http://mail.wikipedia.org/mailman/listinfo/wikipedia-l
>
48 supports, no opposition, 5 actively interested editors including a
native speaker. It certainly has the requisite # of interested
parties. Wikitech-l is perhaps more likely to grab the attention of a
developer who can set up the new wiki... where did you send previous
emails about this?
SJ
---------- Forwarded message ----------
From: M R <rastgaar(a)yahoo.com>
Date: Sep 27, 2006 10:53 PM
Subject: [Wikipedia-l] whom I have to ask for approval of Gileki
wikipedia?how find a developer?
To: "wikipedia-l wikimedia.org" <wikipedia-l(a)wikimedia.org>
hi,
The Gileki Wikipedia has many supporters who want to promote it. We need a
developer to set it up. Also, does a committee or something like that have to
approve it first?
http://meta.wikimedia.org/wiki/Requests_for_new_languages#Gileki
I appreciate your help. I have been sending emails like this around for over a
month now.
Rastgaar
--
++SJ