We've been running on MW 1.4 for longer than we should have, and tried
upgrading today to 1.6.7. We're running a Squid server on one machine,
PHP 4.4.2 and Apache 2.0.46 with eAccelerator on another, and a third
server for MySQL 4.1; all machines run Linux.
I did a DB dump, imported it into a new database, ran config/index.php
to convert the new database, and pointed Apache at the new source
directory. The load on the Apache server went through the roof and
wouldn't stay down. I tried restarting Apache and rebooting the server,
anything to bring the load back down to zero, then started Apache and
Squid back up, and the load went through the roof again and stayed
there. The database shows a lot of open and sleeping connections, but
none are locked. Switching back to 1.4 seems to solve the load problem.
Any ideas? Did I miss something during the upgrade process?
Travis
Hi,
I've been getting an unknown SQL error for any function that uses the
DB_SLAVE database (I only have one database) when running a script from
the command line. The script is pretty basic; it just cleans up some
talk pages that have broken HTML in them. I've done what
maintenance/cleanUpSpam does with
$dbw =& wfGetDB( DB_MASTER );
$dbw->immediateBegin();
and then
$dbw->immediateCommit();
wfDoUpdates();
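The overall structure of the script is roughly this (a hypothetical
sketch only; fixBrokenHtml() and the page name are made up, and the
actual HTML repair logic is elided):

require_once 'commandLine.inc';

$dbw =& wfGetDB( DB_MASTER );
$dbw->immediateBegin();

# load the talk page, fix its text, and save it back
$title = Title::newFromText( 'Talk:Some page' );
$article = new Article( $title );
$text = fixBrokenHtml( $article->getContent() );  # repair logic elided
$article->updateArticle( $text, 'Cleaning up broken HTML', false, false );

$dbw->immediateCommit();
wfDoUpdates();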
When I run it, though, I get the error below. Any ideas?
PHP Warning: mysql_query(): Unable to save result set in
/var/www/html/wiki16/includes/Database.php on line 435
Warning: mysql_query(): Unable to save result set in
/var/www/html/wiki16/includes/Database.php on line 435
A database error has occurred
Query: SELECT old_text,old_flags FROM `text` WHERE old_id = '197785'
LIMIT 1
Function: Revision::loadText
Error: 2000 Unknown MySQL error (10.234.169.196)
Backtrace:
GlobalFunctions.php line 602 calls wfbacktrace()
Database.php line 473 calls wfdebugdiebacktrace()
Database.php line 419 calls databasemysql::reportqueryerror()
Database.php line 806 calls databasemysql::query()
Database.php line 825 calls databasemysql::select()
Revision.php line 667 calls databasemysql::selectrow()
Revision.php line 435 calls revision::loadtext()
Revision.php line 424 calls revision::getrawtext()
Parser.php line 2805 calls revision::gettext()
Parser.php line 2642 calls parser::fetchtemplate()
- line - calls parser::bracesubstitution()
Parser.php line 2231 calls call_user_func()
Parser.php line 2319 calls parser::replace_callback()
Parser.php line 2725 calls parser::replacevariables()
- line - calls parser::bracesubstitution()
Parser.php line 2231 calls call_user_func()
Parser.php line 2319 calls parser::replace_callback()
Parser.php line 824 calls parser::replacevariables()
Parser.php line 231 calls parser::internalparse()
Article.php line 2303 calls parser::parse()
Article.php line 1493 calls article::editupdates()
cleanupTalkPages.php line 18 calls article::updatearticle()
cleanupTalkPages.php line 35 calls cleanuparticle()
An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test Template with thumb image (wiht link in description)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Running test 5 quotes, code coverage +1 line... FAILED!
Running test HTML Hex character encoding.... FAILED!
Running test dt/dd/dl test... FAILED!
Passed 409 of 426 tests (96.01%) FAILED!
Hi, I am trying to run my bot on pms.wikipedia.org (User RobotSC).
I already have bot status on the pms Wikipedia.
For some reason it does not allow me to log in with login.py.
I just tried to do the same on nap and vec: there it works.
I checked user-config.py twice, so that should not be the problem.
Could it be that bots are not allowed on the pms Wikipedia?
Thank you for having a look.
Best, Sabine
Hi,
I just deleted about 10 images or so on English Wikipedia. Only three of
them show up in the deletion log -- maybe because these three had pages
associated with them, while the others may not have. I think this is a
significant bug that needs to be fixed, but I'm too tired to search
MediaZilla for it now. Someone please report it for me, or explain why
it's not a bug :-)
Timwi
I'm making some changes to MediaWiki's localisation files, to make them
easier to write and to improve loading speed at runtime. I've often
complained about the fact that MediaWiki's LanguageXx.php files force every
would-be translator to also be a programmer. What I'm doing with this change
is to make it so that in 90% of cases a LanguageXx.php file won't be
necessary; its function will be replaced by the simpler MessagesXx.php file.
The format of the MessagesXx.php file will look like this:
<?php
$fallback = 'en';
$rtl = false;
$timeBeforeDate = true;
$timeSeparator = ':';
$timeDateSeparator = ', ';
$digitTransformTable = null;
$separatorTransformTable = null;
$namespaceNames = array(
	NS_MEDIA => 'Media',
	...
);
$quickbarSettings = array( ... );
$skinNames = array( ... );
$mathNames = array( ... );
$dateFormats = array(
	MW_DATE_DEFAULT => 'No preference',
	...
);
$bookstoreList = array( ... );
$weekdayNames = array( ... );
$monthNames = array( ... );
$monthNamesGen = array( ... );
$monthAbbreviations = array( ... );
$magicWords = array(
	# ID            CASE  SYNONYMS
	'redirect' => array( 0, '#REDIRECT' ),
	...
);
$messages = array(
	...
);
?>
Note that there are no globals, no classes, and no language code variable
name suffixes. The language code is simply specified by the filename. The
$fallback variable specifies a fallback language which should be used if
anything in the present localisation file is missing. If $fallback is
missing, English will be used.
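For illustration, here is a minimal sketch of how such a fallback could
be resolved when loading messages (this is not the actual loader; the
function name, the single-level fallback chain, and the use of plain
require() are assumptions):

function loadAllMessages( $code ) {
	# English is always the final fallback
	require( 'MessagesEn.php' );
	$base = $messages;
	if ( $code === 'en' ) {
		return $base;
	}

	unset( $fallback );
	require( 'Messages' . ucfirst( $code ) . '.php' );
	$own = $messages;
	$fb = isset( $fallback ) ? $fallback : 'en';

	if ( $fb !== 'en' ) {
		require( 'Messages' . ucfirst( $fb ) . '.php' );
		# keys missing from the language come from its fallback first...
		$own = $own + $messages;
	}
	# ...and from English last (array union keeps the left-hand keys)
	return $own + $base;
}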
If the localisation is sufficiently specified by these variables (and any
others we think need to be added), then the LanguageXx.php file will not be
needed. In cases where the language file needs to be kept, due to special
code, all of the customised accessors for the above variables will be
deleted, leaving only the special code.
The contents of the MessagesXx.php files will be cached, if a cache is
available. The modification time of the file will be checked on every
request, so when the file is updated, the cache is immediately invalidated.
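As a sketch of that mtime check (the $cache object and its get/set
interface here are assumed for illustration, not the actual
implementation):

function getCachedMessages( $cache, $code ) {
	$file = 'Messages' . ucfirst( $code ) . '.php';
	$mtime = filemtime( $file );
	$cached = $cache->get( "messages-$code" );
	# reuse the cached copy only if the file is unchanged since caching
	if ( $cached && $cached['mtime'] === $mtime ) {
		return $cached['messages'];
	}
	require( $file );
	$cache->set( "messages-$code",
		array( 'mtime' => $mtime, 'messages' => $messages ) );
	return $messages;
}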
I've already done most of the technical work for this change, including
converting three languages as a pilot. Most of what remains is the rote work
of converting all of the language files. I might consider committing what
I've done to a branch, if someone else wants to help out with this.
Also in my working copy at the moment is a scheme to delay the
initialisation of most object global variables, but that's a topic for
another post.
-- Tim Starling
[This is the e-mail I am referring to in the message that now precedes
this; it bounced the first time round because I included a load of
attachments with the original.]
I read on the mailing list that AOL are turning on XFF on their proxies.
I have been poking around a bit to find out how the AOL proxy
information squares up with reports of AOL vandalism.
== The boring details ==
First, I did a DNS reverse lookup on the entire AOL Proxy IP range, as
defined in http://webmaster.info.aol.com/proxyinfo.html using dig in
batch mode. Out of the 56542 addresses in those ranges, only 6740 give a
valid reverse lookup.
I then went through the list of 116 pages flagged with the {{AOL}}
template, extracted their IP addresses, and compared them to the list
of reverse lookups generated above.
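The classification below can be reproduced with something like this
(a rough PHP sketch only; the actual lookups were done with dig, and
the function name is made up):

function classifyAolAddress( $ip ) {
	# gethostbyaddr() returns the hostname, or the IP unchanged on failure
	$host = gethostbyaddr( $ip );
	if ( $host === false || $host === $ip ) {
		return 'no reverse lookup';
	}
	if ( preg_match( '/\.proxy\.aol\.com$/i', $host ) ) {
		return 'official AOL proxy range';
	}
	if ( preg_match( '/\.ipt\.aol\.com$/i', $host ) ) {
		return 'AOL client range';
	}
	return "not AOL: $host";
}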
Out of 116 pages with addresses flagged with {{AOL}}:
* 76 were found in the list of reverse lookups generated above, and
every one of them had an address of the form *.proxy.aol.com
* 40 were not
I then ran the 40 remaining addresses through dig -x.
Of these, 32 had valid reverse lookups, most of which were of the form
[8 hex digits, starting with AC].ipt.aol.com (for example,
ACA40DC5.ipt.aol.com.) Every single one of these was in the address
range 172.128.0.0 - 172.216.255.255, assigned to AOL clients according
to the AOL proxy info page.
Five of the addresses flagged with {{AOL}} did not belong to AOL at
all, namely:
195.9.72.12.in-addr.arpa. 172768 IN PTR
195.los-angeles-19-20rs.ca.dial-access.att.net.
22.2.25.138.in-addr.arpa. 86369 IN PTR www2.itd.uts.edu.au.
176.72.250.134.in-addr.arpa. 86370 IN PTR elc214-176.lab.suu.edu.
10.219.196.205.in-addr.arpa. 272 IN PTR franc.dreamhost.com.
5.21.196.69.in-addr.arpa. 1773 IN PTR
CPE00609425bbe3-CM00080d7f2c84.cpe.net.cable.rogers.com.
The remaining five {{AOL}}-flagged pages, which appear to have no
reverse lookup at all, are (with whois lookups):
;163.130.157.152.in-addr.arpa. -> 152.157.130.163 -> Washington
School Information Processing Cooperative
;106.209.188.205.in-addr.arpa. -> 205.188.209.106 -> AOL
;76.164.174.149.in-addr.arpa. -> 149.174.164.76 -> Compuserve (ie AOL)
;234.96.12.64.in-addr.arpa. -> 64.12.96.234 -> AOL
;135.209.188.205.in-addr.arpa. -> 205.188.209.135 -> AOL
and of the last four given as AOL addresses, all of them were in the
AOL server IP address ranges given by AOL.
== Summary ==
Out of all of the 116 pages flagged as {{AOL}}:
* six are bogus non-AOL addresses, and are probably attempts by vandals
to confuse anti-vandalism efforts
* 32 are from the AOL client range, with *.ipt.aol.com reverse lookups
* 76 are from the official AOL proxy range, with *.proxy.aol.com reverse
lookups; they all appear to be either of the form
cache-XXX-XXXX.proxy.aol.com [74 of them], or
spider-XXX-XXXXX.proxy.aol.com. [2 of them]
* four have no reverse lookup, but are in the official AOL proxy range
Out of the 458 cache-*.proxy.aol.com servers, only 74 are flagged with
{{AOL}}
Out of the 2537 spider-*.proxy.aol.com servers, only 2 are flagged with
{{AOL}}
== Conclusions ==
* It seems safe to assume that *.proxy.aol.com servers are valid AOL
proxies; these account for about two-thirds of all {{AOL}} warnings
* It _might well_ be safe to assume that other servers from the AOL
server range without reverse lookups are also AOL proxies, but I'm not
sure that this is necessarily so; these account for < 5% of the valid
{{AOL}} warnings.
* But about a third of {{AOL}} warnings are about IPs with reverse
lookups of the form *.ipt.aol.com in the AOL client IP address range:
are these AOL proxies or not? They might, for example, be dynamically
assigned client addresses. If so, we should _definitely not_ be trusting
any XFF headers from these.
[On review: not all the figures sum to 100% so I may have dropped a
couple in my counting, but I think the overall conclusions still hold up]
-- Neil
Hi all!
I am trying to import ontologies into MediaWiki. I downloaded a couple of
ontologies from the web in .rdf format and imported them using the "Import
ontologies" Special page. However, every time I hit Import, an error message
appears (see below).
I am running MediaWiki 1.6.7 and SemanticMediaWiki extension 0.4.3.2. I also
added rdfapi-php to the SemanticMediaWiki "libs" folder. It looks like the
.php script is not recognized, but I do not really know what the problem is.
Does anybody have experience with importing ontologies and an idea of what
the cause of this problem might be?
Thank you very much!
Best,
Mei
------------------ error message ------------------
[The output is not a normal error message: it is the raw PHP source of the
RDF API's SparqlClient.php (the SparqlClient and ClientQuery classes of the
"sparql" package), dumped into the page instead of being executed. The
pasted dump is also mangled wherever the code contained angle brackets, so
it is not reproduced in full here.]
------------------ end error message ------------------
Hi,
Is it possible to have a custom left navigation for a custom namespace?
Or is 'MediaWiki:Sidebar' applied to all pages in a wiki, regardless of
the namespace?
Thanks,
~mm