Attached. This time with the Wikipedia SVG files, for links back to
Wikipedia for GFDL compliance.
Jeff
Start request
GET /wiki/%E1%8E%BE%E1%8F%8D%E1%8E%A9%E1%8F%AF%E1%8E%A2:Wikiquote-logo.svg
Host: chr.wikigadugi.org
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.8)
Gecko/20050524 Fedora/1.0.4-4 Firefox/1.0.4
Accept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: UTF-8,*
Referer: http://chr.wikigadugi.org/wiki/Alabama
Via: 1.1 c-67-177-44-96.hsd1.ut.comcast.net:80 (squid/2.5.STABLE12)
X-Forwarded-For: 67.186.225.37
Cache-Control: max-age=259200
Connection: keep-alive
Main cache: FakeMemCachedClient
Message cache: MediaWikiBagOStuff
Parser cache: MediaWikiBagOStuff
Unstubbing $wgParser on call of $wgParser->setHook from chr2syl
Unstubbing $wgMessageCache on call of $wgMessageCache->addMessages from
wfCite
Unstubbing $wgContLang on call of $wgContLang->getMagic from MagicWord::load
Got localisation for chr from source
Got localisation for en from source
Fully initialised
Unstubbing $wgUser on call of $wgUser->isAllowed from Title::userCanRead
Unstubbing $wgLoadBalancer on call of $wgLoadBalancer->getConnection
from wfGetDB
LoadBalancer::getReaderIndex: Using reader #0: localhost...
Unstubbing $wgOut on call of $wgOut->setSquidMaxage from
MediaWiki::performAction
Unstubbing $wgLang on call of $wgLang->getNsText from ImagePage::showTOC
MessageCache::load(): got from global cache
Image::canRender: entered
Image::canRender: entered
Image::canRender: entered
OutputPage::sendCacheControl: private caching; Fri, 23 Feb 2007 09:51:28
GMT **
Request ended normally
IP: 127.0.0.1
OutputPage::sendCacheControl: private caching; **
Request ended normally
Start request
GET /wiki/%E1%8E%BE%E1%8F%8D%E1%8E%A9%E1%8F%AF%E1%8E%A2:Wikisource-logo.svg
Host: chr.wikigadugi.org
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.8)
Gecko/20050524 Fedora/1.0.4-4 Firefox/1.0.4
Accept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: UTF-8,*
Referer: http://chr.wikigadugi.org/wiki/Alabama
Via: 1.1 c-67-177-44-96.hsd1.ut.comcast.net:80 (squid/2.5.STABLE12)
X-Forwarded-For: 67.186.225.37
Cache-Control: max-age=259200
Connection: keep-alive
Main cache: FakeMemCachedClient
Message cache: MediaWikiBagOStuff
Parser cache: MediaWikiBagOStuff
Unstubbing $wgParser on call of $wgParser->setHook from chr2syl
Unstubbing $wgMessageCache on call of $wgMessageCache->addMessages from
wfCite
Unstubbing $wgContLang on call of $wgContLang->getMagic from MagicWord::load
Got localisation for chr from source
Got localisation for en from source
Fully initialised
Unstubbing $wgUser on call of $wgUser->isAllowed from Title::userCanRead
Unstubbing $wgLoadBalancer on call of $wgLoadBalancer->getConnection
from wfGetDB
LoadBalancer::getReaderIndex: Using reader #0: localhost...
Unstubbing $wgOut on call of $wgOut->setSquidMaxage from
MediaWiki::performAction
Unstubbing $wgLang on call of $wgLang->getNsText from ImagePage::showTOC
MessageCache::load(): got from global cache
Image::canRender: entered
Image::canRender: entered
Image::canRender: entered
IP: 127.0.0.1
OutputPage::sendCacheControl: private caching; **
Request ended normally
Start request
GET /wiki/%E1%8E%BE%E1%8F%8D%E1%8E%A9%E1%8F%AF%E1%8E%A2:Commons-logo.svg
Host: chr.wikigadugi.org
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.8)
Gecko/20050524 Fedora/1.0.4-4 Firefox/1.0.4
Accept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: UTF-8,*
Referer: http://chr.wikigadugi.org/wiki/Alabama
Via: 1.1 c-67-177-44-96.hsd1.ut.comcast.net:80 (squid/2.5.STABLE12)
X-Forwarded-For: 67.186.225.37
Cache-Control: max-age=259200
Connection: keep-alive
Main cache: FakeMemCachedClient
Message cache: MediaWikiBagOStuff
Parser cache: MediaWikiBagOStuff
Unstubbing $wgParser on call of $wgParser->setHook from chr2syl
Unstubbing $wgMessageCache on call of $wgMessageCache->addMessages from
wfCite
Unstubbing $wgContLang on call of $wgContLang->getMagic from MagicWord::load
Got localisation for chr from source
Got localisation for en from source
Fully initialised
Unstubbing $wgUser on call of $wgUser->isAllowed from Title::userCanRead
Unstubbing $wgLoadBalancer on call of $wgLoadBalancer->getConnection
from wfGetDB
LoadBalancer::getReaderIndex: Using reader #0: localhost...
Unstubbing $wgOut on call of $wgOut->setSquidMaxage from
MediaWiki::performAction
Unstubbing $wgLang on call of $wgLang->getNsText from ImagePage::showTOC
MessageCache::load(): got from global cache
Image::canRender: entered
Image::canRender: entered
Image::canRender: entered
IP: 127.0.0.1
OutputPage::sendCacheControl: private caching; **
Request ended normally
OutputPage::sendCacheControl: private caching; Fri, 23 Feb 2007 09:51:28
GMT **
Request ended normally
Hi,
I am seeing the null revision title error in importDump.php when I
try to import the latest Foundation dumps from en.
I have added the "+" character to $wgLegalTitleChars, but I keep getting
the same error. The last time importDump.php worked properly on
Fedora Core 5 was with MediaWiki 1.7; I am running the 1.8 MediaWiki
releases on wikigadugi.org.
I still have not gotten .svg files to render at all on the site,
whether with ImageMagick or another SVG converter. The trace files
show that the SVG files are not even being detected properly.
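For what it's worth, SVG rendering in MediaWiki of this era is driven by a handful of LocalSettings.php variables. A minimal sketch, where the converter paths are assumptions for a typical Fedora install and must match the binaries actually present; verify the variable names against your release's DefaultSettings.php:

```php
# LocalSettings.php -- sketch only, paths are assumptions.
$wgFileExtensions[] = 'svg';   // allow SVG uploads
$wgSVGConverter = 'rsvg';      // or 'ImageMagick'
$wgSVGConverters = array(
    'ImageMagick' => '"/usr/bin/convert" -background white -geometry $widthx$height $input PNG:$output',
    'rsvg'        => '/usr/bin/rsvg -w$width -h$height $input $output',
);
```

If the trace shows the files are not even detected as SVG, the MIME detection settings (e.g. $wgMimeDetectorCommand, or the built-in detector) are worth checking before the converter settings, since the converter is never invoked for a file whose type is misdetected.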
Here's the error from importDump.php:
2734300 (143.99045715519 pages/sec 143.99045715519 revs/sec)
2734400 (143.99159320383 pages/sec 143.99159320383 revs/sec)
WikiRevision given a null title in import. You may need to adjust
$wgLegalTitleChars.
Backtrace:
#0 /wikidump/en/includes/SpecialImport.php(632):
WikiRevision->setTitle(NULL)
#1 [internal function]: WikiImporter->in_page(Resource id #41,
'revision', Array)
#2 /wikidump/en/includes/SpecialImport.php(429): xml_parse(Resource id
#41, '89455135</id>? ...', 0)
#3 /wikidump/en/maintenance/importDump.php(110): WikiImporter->doImport()
#4 /wikidump/en/maintenance/importDump.php(97):
BackupReader->importFromHandle(Resource id #40)
#5 /wikidump/en/maintenance/importDump.php(132):
BackupReader->importFromStdin()
#6 {main}
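As an aside, the error message's suggestion amounts to widening the title character whitelist in LocalSettings.php. A hedged sketch of doing that with "+" included; this mirrors the default of roughly this era, but the exact default string varies between releases, so copy it from your own DefaultSettings.php and add "+" rather than pasting this verbatim:

```php
# LocalSettings.php -- sketch; base string taken from DefaultSettings.php
# of this era, with "+" added. Verify against your release.
$wgLegalTitleChars = " %!\"$&'()*,\\-.\\/0-9:;=?@A-Z\\\\^_`a-z~+\\x80-\\xFF";
```

Note this is a regex character class, so characters like "-" and "/" must stay escaped as in the default.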
Jeff
Hi all,
is there a way to export sections of articles via the Special:Export feature, please?
I.e., I would like to export the sections named ''Exit list'' from the 'Autobahn A(number)' articles on the English Wikipedia. (Of course I can manage to develop some kind of macro to do that if this feature is not available.)
Anyway, the obvious attempt to export
Autobahn A1#Exit list
via the Special:Export page yields the entire page as a result.
Many thanks,
Claudi
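One possible workaround for the macro route (not Special:Export itself): action=raw accepts a numeric section parameter, counting headings from the top of the page. A hedged sketch in PHP; the section index here is an assumption and would need to be located per article, e.g. by scanning the wikitext for the heading:

```php
<?php
# Fetch just one numbered section of a page via action=raw.
# Requires allow_url_fopen; section index 2 is a hypothetical
# position of the "Exit list" heading.
$title   = 'Autobahn_A1';
$section = 2;
$url = 'http://en.wikipedia.org/w/index.php?title='
     . urlencode( $title ) . '&action=raw&section=' . $section;
$wikitext = file_get_contents( $url );
echo $wikitext;
```

This returns wikitext rather than the XML wrapper that Special:Export produces, which may or may not suit the intended use.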
There are cases where it would be useful to run hooks from inside
extensions. For example, there's a hook in an extension that Jim
Wilson and I just wrote, and I wish there were hooks in Cite.php.
It seems to me that this should be accompanied by registering the hook
name with MediaWiki to avoid name collisions. Should extension
authors just go and add these to the hook documentation on
mediawiki.org and Meta? Or should there be some recommended naming
convention for hooks defined by extensions?
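Mechanically, running a hook from an extension looks the same as running one from core, assuming the usual 1.x hook API. A sketch of both sides; the hook name "MyExtensionAfterRender" is hypothetical, and prefixing it with the extension's name is one collision-avoidance convention:

```php
# In the extension that *provides* the hook:
wfRunHooks( 'MyExtensionAfterRender', array( &$output ) );

# In an extension (or LocalSettings.php) that *consumes* it:
$wgHooks['MyExtensionAfterRender'][] = 'myHandler';
function myHandler( &$output ) {
    # ... inspect or modify $output ...
    return true; // true lets any remaining handlers run
}
```

Documenting the name on mediawiki.org alongside the core hooks would at least make collisions discoverable, even without a formal registry.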
Jim
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi,
I'm trying to write an extension that makes an alternative category
page where the subcategories are split off from the articles. I'm
hooking at CategoryPageView, and I thought that the simplest approach
would be to extend the CategoryViewer class from CategoryPage.php.
But when I try that, I get:
[Tue Feb 20 07:26:33 2007] [error] PHP Fatal error: Class 'Article'
not found in /Library/WebServer/Documents/wiki/includes/
CategoryPage.php on line 15
This is after I require CategoryPage.php in the extension. Requiring
Article.php in the extension doesn't seem to help, but it works if I
hack in a
require_once("$IP/includes/Article.php");
in CategoryPage.php itself. Obviously, CategoryPage.php doesn't normally
need this, and I'd prefer not to hack the base code. I know that I
could get around this by just copying the CategoryViewer code into
the extension, but I'd like to do this by inheritance if I can.
I don't understand why this happens. The hook is called from
CategoryPage.php, so why does it have to be required again? Any
explanations or suggestions? Thanks!!
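One explanation: the extension file is included from LocalSettings.php, before Setup.php has loaded the core classes, so a top-level require of CategoryPage.php fails the moment PHP tries to compile "class CategoryPage extends Article". A common workaround is to defer the requires, and the subclass definition, until the hook actually fires, by which point core is fully loaded. A sketch, with all names (wfSplitCategoryView, SplitCategoryViewer.php) hypothetical:

```php
# Extension file -- only register the hook at load time.
$wgHooks['CategoryPageView'][] = 'wfSplitCategoryView';

function wfSplitCategoryView( &$categoryPage ) {
    global $IP;
    # By the time the hook fires, Article is already defined,
    # so CategoryPage.php (and a subclass of CategoryViewer kept
    # in its own file) can now be loaded safely.
    require_once( "$IP/includes/CategoryPage.php" );
    require_once( dirname( __FILE__ ) . '/SplitCategoryViewer.php' );
    # ... build and output the custom view here ...
    return false; // check your release's CategoryPage.php for
                  // whether false suppresses the default view
}
```

Keeping the subclass in a separate file loaded from inside the handler is the key point; it avoids forcing CategoryPage.php to resolve before core is up.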
Jim
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Some Wikipedias have special pages where users can report abusive live
mirrors. Unfortunately, the procedure for dealing with them is not really
clear; on the German Wikipedia, for example, there are lots of reports,
but nobody does anything about the mirrors because there is no suitable
place to request blocking them. The mailing list is quite a bad place
for it, and reporting them to the server admins individually is not very
nice either. Tim Starling proposed making a Meta page about this, and
I've created one; see http://meta.wikimedia.org/wiki/Live_mirrors. I
think it would be a bit easier for the responsible people if everyone
reported live mirrors there rather than on individual wikis.
Greetings,
Pill (wiki.pill @ gmail.com)
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.10alpha (r20027).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* TODO: Link containing double-single-quotes '' (bug 4598) [Has never passed]
* TODO: message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* TODO: message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* TODO: HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* TODO: HTML nested bullet list, open tags (bug 5497) [Has never passed]
* TODO: HTML nested ordered list, open tags (bug 5497) [Has never passed]
* TODO: Inline HTML vs wiki block nesting [Has never passed]
* TODO: Mixing markup for italics and bold [Has never passed]
* TODO: 5 quotes, code coverage +1 line [Has never passed]
* TODO: dt/dd/dl test [Has never passed]
* TODO: Images with the "|" character in the comment [Has never passed]
* TODO: Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* TODO: Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 493 of 511 tests (96.48%)... 18 tests failed!
Hi,
I've configured proxy_check and $wgBlockOpenProxies, which seem to be
working, but I am still able to edit our wiki anonymously using Tor. Is
there any way to tighten this up? The majority of edits coming
from Tor proxies seem to be from vandals.
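For context, the anti-proxy knobs available in MediaWiki of this vintage are roughly the following; a hedged sketch, with names per the DefaultSettings.php of this era (verify against your copy), and the file path purely hypothetical:

```php
# LocalSettings.php -- sketch only.
$wgBlockOpenProxies = true;            // run the proxy_check probe on edit
$wgEnableSorbs = true;                 // DNSBL lookup for anonymous editors
$wgSorbsUrl = 'http.dnsbl.sorbs.net.';
# Tor exit nodes are poorly covered by open-proxy DNSBLs, so a
# maintained list of exit-node IPs can be fed in directly:
$wgProxyList = '/etc/mediawiki/tor-exit-ips.txt'; // hypothetical path
```

The probe-based proxy_check only catches conventional open proxies; Tor exits are not open proxies in that sense, which would explain why anonymous Tor edits still get through.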
Thanks,
Travis
On 2/21/07, Steve Bennett <stevagewp(a)gmail.com> wrote:
>
>
> Could we increase the limit to 12/minute on a trial basis?
>
> Steve
I'd support up to 20/min with approval from one of the sysadmins.
--Mets501
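For reference, this kind of limit is expressed via $wgRateLimits in LocalSettings.php. A hedged sketch of the documented shape; the 'edit' action and 'ip' group are assumptions, since the thread does not say exactly which limit is being discussed:

```php
# LocalSettings.php -- sketch; each value is array( count, seconds ).
$wgRateLimits['edit']['ip'] = array( 12, 60 );  // 12 per minute, per IP
```

Bumping it further (e.g. to 20) would just change the first element of that array for the relevant group.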