Although there are thousands of reports of slowdowns, database errors, etc.,
from time to time there is good news :-) I got a few reports from France
where, after the new squids were installed, Wikipedia is faaaast.
Congratulations to the developers!
Alfio
As we were (OK: I am;-) running into trouble integrating HTML-to-XML
parsing into the Bison-based parser, I have written a specialized C++
class that can do this prior to the actual parsing. It will output only
correct XML *structure*, and (as far as I can tell) correct XHTML rules
(<tr> in <table> etc.) as well.
"Broken" HTML will be changed into < / > entities, so only valid
XML will reach the output. However, I took some care to automagically
fix the "usual suspects" (obligatory 21C3 reference) of HTML ugliness,
like not-closed <li> and various table chaos. Even a lonely <caption>
(not closed) somewhere in the text will generate a full table. It might
not be pretty, but it will be valid XML.
While this is primarily intended for the wiki-to-XML parser, it might
work for enforcing XML output for the current parser as well. We'd only
have to run the wiki source through it before actually parsing.
Source: CVS HEAD, Module "flexbisonparse", file "html2xml.cpp". (GPL, of
course)
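The escaping/auto-closing behaviour described above can be sketched roughly like this (a hypothetical Python toy, not the actual html2xml.cpp logic; the tag whitelist and regex are my assumptions):

```python
import re

# Hypothetical whitelist -- the real class knows the full XHTML rules.
KNOWN_TAGS = {"table", "tr", "td", "th", "caption", "ul", "ol", "li", "b", "i"}
TAG_RE = re.compile(r"<(/?)(\w+)(\s[^<>]*)?>")

def escape_text(s):
    return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")

def to_valid_xml(html):
    """Escape unknown/stray tags into entities and auto-close dangling
    ones, so the output is always well-formed XML."""
    out, stack, pos = [], [], 0
    for m in TAG_RE.finditer(html):
        out.append(escape_text(html[pos:m.start()]))
        pos = m.end()
        closing, name = m.group(1) == "/", m.group(2).lower()
        if name not in KNOWN_TAGS:
            # "broken" HTML reaches the output only as < / > entities
            out.append(m.group(0).replace("<", "&lt;").replace(">", "&gt;"))
        elif closing:
            if name in stack:
                while stack:           # auto-close anything left open inside
                    top = stack.pop()
                    out.append(f"</{top}>")
                    if top == name:
                        break
            else:
                out.append(escape_text(m.group(0)))  # stray close tag
        else:
            stack.append(name)
            out.append(m.group(0))
    out.append(escape_text(html[pos:]))
    while stack:                       # close whatever is still open at EOF
        out.append(f"</{stack.pop()}>")
    return "".join(out)
```

A dangling `<table><tr><td>a` then comes out as `<table><tr><td>a</td></tr></table>`.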
Magnus
Warning: mysql_query(): Unable to save result set
in /usr/local/apache/common-local/php-1.4/includes/Database.php on line 312
Database error
From Wikipedia, the free encyclopedia.
A database query syntax error has occurred. This may indicate a bug in the
software. The last attempted database query was:
SELECT cur_id,cur_namespace,cur_title FROM `cur`,`links` WHERE cur_id=l_to AND
l_from=1387711 FOR UPDATE
from within function "LinkCache::preFill". MySQL returned error "1213:
Deadlock found when trying to get lock; Try restarting transaction
(10.0.0.1)".
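For what it's worth, MySQL error 1213 is advisory: one of the two conflicting transactions was rolled back and can simply be rerun. The usual application-side fix is a retry loop around the transaction; a generic sketch (hypothetical, not MediaWiki's actual code):

```python
DEADLOCK = 1213  # ER_LOCK_DEADLOCK: "Deadlock found ... try restarting transaction"

class DBError(Exception):
    """Stand-in for a real driver's exception carrying the MySQL error code."""
    def __init__(self, code):
        super().__init__(f"MySQL error {code}")
        self.code = code

def run_transaction(txn, attempts=3):
    """Run txn(); retry only when the server reports a deadlock."""
    for i in range(attempts):
        try:
            return txn()
        except DBError as e:
            # re-raise non-deadlock errors, and give up after the last attempt
            if e.code != DEADLOCK or i == attempts - 1:
                raise
```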
--
NSK
Come to see the new wikiprojects at http://portal.wikinerds.org
Hi
I am new to Wiki, which we are using to prepare a manual collaboratively. We have downloaded all the pages (about 130) into an XML file that runs to just over 15,000 lines. I now wish to convert this to a PDF. Does anyone know if a utility already exists that might make this easier? The main problem appears to be the Wiki formatting embedded in the text blocks.
Any information or suggestions?
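In case it helps as a starting point: the page text can be pulled out of the export with any XML parser and the wiki markup stripped with a few regexes, before handing the plain text to a PDF typesetter. A rough Python sketch (the element names assume the usual <page>/<revision>/<text> export layout; real dumps also carry an XML namespace, which is ignored here):

```python
import re
import xml.etree.ElementTree as ET

# Tiny stand-in for an exported dump.
SAMPLE = """<mediawiki>
  <page>
    <title>Intro</title>
    <revision><text>'''Bold''' start of the [[style guide|manual]].</text></revision>
  </page>
</mediawiki>"""

def extract_pages(xml_text):
    """Yield (title, wikitext) pairs from a MediaWiki XML export."""
    root = ET.fromstring(xml_text)
    for page in root.iter("page"):
        yield page.findtext("title"), page.findtext("revision/text") or ""

def strip_markup(text):
    """Very rough wikitext-to-plain-text, enough for a first PDF pass."""
    text = re.sub(r"'{2,}", "", text)                              # '''bold''', ''italic''
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", text)  # [[target|label]] -> label
    return text
```

From there, the plain text (or generated LaTeX) can be fed to any typesetter to produce the PDF.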
Geoffrey
During the developers meeting at 21C3, we saw a pretty impressive
demonstration of map-generation software; alas, the underlying data
was (partially) contradicting itself, and the software consists of parts
in at least three programming languages, all in alpha/beta stages, AFAIK.
When I saw someone upload dozens of maps on commons the other day, all
identical except for a different county marked in red, I thought there
should be something simpler. Following the ideas I published on meta
a long time ago, I wrote a PHP script for a simple map generation
tool. Source: CVS HEAD, "phase3/tools/geo.php", hereby GPL.
*Disclaimer:* This is just proof-of-concept; it would need significant
work to become useful.
This little script reads data from indexed sets of texts (currently
hardcoded as an array key=>text, but would be title=>wikipage in a
"real" implementation) and generates SVG from it. In the included
example (just run it from command line) it generates a simplified ;-]
map of Germany, showing east and west Germany (no political implications
here, just testing!) as well as the Danube.
The drawing is based on poly-lines made from coordinate pairs in the
text entries. The poly-lines are then appended to each other as needed,
eventually forming polygons. This will save a great deal of work and
keep the data consistent: a border line between two states, for example,
is only defined once, then "included" into the polygons of both states.
To take this even further, if a river partially forms the border, the
very same polyline can be used as part of the river.
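The append-as-needed idea can be sketched like this (a hypothetical Python toy with made-up coordinates; the real script is phase3/tools/geo.php):

```python
# Named polylines: each a list of (x, y) points, shared between objects.
LINES = {
    "inner_border": [(5, 0), (5, 4), (4, 8)],
    "west_outline": [(4, 8), (0, 8), (0, 0), (5, 0)],
    "east_outline": [(5, 0), (9, 0), (9, 8), (4, 8)],
}

def polygon(line_names):
    """Chain the named polylines end-to-end into one closed point list,
    reversing a line when its start doesn't meet the previous end."""
    pts = []
    for name in line_names:
        line = list(LINES[name])
        if pts and pts[-1] != line[0]:
            line.reverse()          # shared lines may be stored either way round
        if pts and pts[-1] == line[0]:
            line = line[1:]         # drop the duplicated joint point
        pts += line
    return pts

def svg_path(pts, fill="none"):
    """Emit one SVG <path> element for a closed point list."""
    d = "M " + " L ".join(f"{x},{y}" for x, y in pts) + " Z"
    return f'<path d="{d}" fill="{fill}"/>'
```

Both "states" reference the same "inner_border" line, so the shared border is defined exactly once.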
Currently, it draws each polygon separately, so in my demo, "Germany" is
still divided into two states :-(
However, I plan on a "merging" method, which will fuse two polygons into
one by removing common borders, so East and West Germany could be drawn
individually, with borders, or as the entity "Germany", with one common
border around it.
Imagine this for the U.S.: If we had the border data for all counties,
and want to draw a world map with the U.S. as a single entity, the
county polygons would be fused into state polygons, which then would be
fused into a few country polygons (continental, Hawaii, Alaska).
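If a polygon is kept as a list of named border lines, the planned merge becomes almost trivial; a minimal sketch (hypothetical representation, not the script's actual data model):

```python
def merge(poly_a, poly_b):
    """Fuse two polygons (lists of named border lines) by dropping
    every line the two have in common."""
    shared = set(poly_a) & set(poly_b)
    return [n for n in poly_a if n not in shared] + \
           [n for n in poly_b if n not in shared]

west = ["inner_border", "west_outline"]
east = ["inner_border", "east_outline"]
germany = merge(west, east)  # the common inner border disappears
```

Repeating the merge bottom-up would fuse counties into states and states into countries, as in the U.S. example.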
In my implementation, all data sets can carry meta information as well.
The Danube in the example is painted in blue because it carries a
"type:river" marker. Different object groups can be defined for
different purposes (political, geographical, etc.) Also, objects can be
labeled in all languages in-place. I currently don't show the labels,
though.
In a "real" implementation, all data would be stored as wiki pages. A
query to render some complicated view might take some time, then. But,
as long as we don't need real-time refreshing (cache as cache can!), I
would deem such a solution preferable to uploading (and maintaining) a
zillion map images...
Magnus
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Hello,
Looks like one of the troubles with the apache servers is that some of
them can handle more clients at a given time because they have more memory.
I did some basic httpd configuration that would allow 512MB apaches to
run with MaxClients 40 and 1024MB apaches with 70.
The scripts ( /h/w/conf/ ):
httpd-define.conf : calls a custom conf depending on the variable
'HOST'.$hostname
httpd-(512|1024|3072)MB.conf : different MaxClient values
In the /h/w/b/apache-* scripts, we need to append at the end of the
apachectl command:
-D"HOST$($HOST_NAME)"
So for tingxi, apachectl will get a HOSTtingxi variable defined. The
httpd-define.conf will then call httpd-512MB.conf .
Et voilà, different memory capabilities = different MaxClients .
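For the record, the dispatch can be done with Apache's stock <IfDefine> mechanism; a sketch of how the files could look (contents are my guess, not the actual /h/w/conf files):

```apache
# httpd-define.conf (hypothetical): include a MaxClients file per host
<IfDefine HOSTtingxi>
    Include /h/w/conf/httpd-512MB.conf
</IfDefine>

# httpd-512MB.conf (hypothetical)
MaxClients 40
```

Starting apache with the -DHOSTtingxi flag then activates the matching <IfDefine> section.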
cheers,
- --
Ashar Voultoiz - WP++++
http://en.wikipedia.org/wiki/User:Hashar
Servers in trouble ? noc (at) wikimedia (dot) org
"This signature is a virus. Copy me in yours to spread it."
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.4 (GNU/Linux)
iD8DBQFB5Y5mpmyHQ2O4INERAlsIAJ4vth3wUrrWyKTK5faqEZNeo6N1gwCgwhvl
3uzrFugKe/5ScpfYKPMymN0=
=EM67
-----END PGP SIGNATURE-----
Hi again!
I urgently need the csv files that are generated by Erik Zachte's wikistat. They
were accessible via http://www.wikipedia.org/wikistats/csv/ but the directory is
not accessible anymore. Can anyone just make it readable by Apache or tell me
where it has gone, please?
Thanks a lot,
Jakob
Hi,
Is it within your roadmap to include the wikixml import capability in
MediaWiki 1.4?
thnx,
--
NSK
Come to see the new wikiprojects at http://portal.wikinerds.org
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Hello,
tingxi is a 512MB apache and I noticed it is swapping "a bit":
$ free -m | grep -i swap
Swap: 1019 280 738
$
I then looked at the RSS of the various processes. With an RSS of 715060, httpd
comes first, which is not surprising; the runners-up are:
python 20312
stream 24756
tail 4760
php 91036
- -------------
total 140864
All are related to the irc bots.
Maybe by putting the bots on another machine we could save some memory on
the tingxi apache server? Or is swapping fine?
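A quick way to reproduce per-command numbers like the ones above is to let ps/awk do the summing (a generic sketch, not tied to the tingxi setup):

```shell
# Sum resident set size (kB) per command name, largest consumers first.
ps -eo rss=,comm= |
awk '{ rss[$2] += $1 } END { for (c in rss) print rss[c], c }' |
sort -rn | head
```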
cheers,
- --
Ashar Voultoiz - WP++++
http://en.wikipedia.org/wiki/User:Hashar
Servers in trouble ? noc (at) wikimedia (dot) org
"This signature is a virus. Copy me in yours to spread it."
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.4 (GNU/Linux)
iD8DBQFB5KtwpmyHQ2O4INERAhhZAJ9rewqtQK2F693xD9zEKvERrZbfXACgtypY
0/xAJVAnrdcQbv69qQGaYv0=
=LZYw
-----END PGP SIGNATURE-----
hello,
I tried to install MediaWiki; it fails with the following error messages:
---------------
MediaWiki 1.3.9 installation
Please include all of the lines below when reporting installation problems.
Checking environment...
Warning: set_time_limit(): Cannot set time limit in safe mode in
/mnt/kw1/02/103/00000015/htdocs/install-utils.inc on line 27
* PHP 4.3.9: ok
* Warning: PHP's safe mode is active! You will likely have problems
caused by this. You may need to make the 'images' subdirectory writable or
specify a TMP environment variable pointing to a writable temporary
directory owned by you, since safe mode breaks the system temporary
directory.
* PHP server API is cgi; using ugly URLs (index.php?title=Page_Title)
* Have XML / Latin1-UTF-8 conversion support.
* PHP's memory_limit is 8M. If this is too low, installation may fail!
Attempting to raise limit to 20M... failed.
* Have zlib support; enabling output compression.
* Found GD graphics library built-in, image thumbnailing will be enabled
if you enable uploads.
* Installation directory: /mnt/kw1/02/103/00000015/htdocs
* Script URI path: /werkbankdeutschland.de
Warning: $wgProxyKey is insecure
---------------------
Can anybody tell me what to do?
thank you in advance,
florian (from munich, germany)