[Wikipedia-l] Re: Snapshot wikipedia installations in schools

Ashar Voultoiz thoane at altern.org
Wed Jun 9 07:51:09 UTC 2004


Andy Rabagliati wrote:
<snip xml export>
> For this to work for me, it has to be fully automateable, over UUCP. 
> 
> Is there a script that will take the XML export and incorporate it
> into the MySQL database so the Mediawiki front end will see it ?
> 
> Then a cronjob my end pulls the XML, and sends it to the school
> via UUCP, which stuffs it on stdin to a script, and the wiki is
> magically updated every day.

Hello,

I just discovered that the Special:Export page accepts a page name as a
parameter in the URL.
The XML for en:Main_Page can be retrieved at:
	http://en.wikipedia.org/wiki/Special:Export/Main_Page
That page doesn't change much anyway, so it would be better to fetch the 
sub-sections instead, like {{Did you know}} & {{In_the_news}}:

http://en.wikipedia.org/wiki/Special:Export/Template:Did_you_know
http://en.wikipedia.org/wiki/Special:Export/Template:In_the_news
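
A cron job on your side could pull those exports with something as small
as this (just a sketch in Python; the page list, the file naming and the
hand-off to UUCP are my assumptions, not anything MediaWiki provides):

#!/usr/bin/env python
# Fetch Special:Export XML for a handful of pages so a cron job can
# hand the files to UUCP for delivery to the school.
import urllib.request

BASE = "http://en.wikipedia.org/wiki/Special:Export/"
PAGES = ["Main_Page", "Template:Did_you_know", "Template:In_the_news"]

for page in PAGES:
    xml = urllib.request.urlopen(BASE + page).read()
    # One file per page; a uucp/uux job would then ship these to the
    # school's machine, where they get piped into the import script.
    with open(page.replace(":", "_") + ".xml", "wb") as out:
        out.write(xml)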

The Special:Import page isn't ready yet, although you can get its code 
through:
http://cvs.sourceforge.net/viewcvs.py/wikipedia/phase3/includes/SpecialImport.php
and maybe help the developers get it working correctly :o)
In your case, I think you should write a simple XML parsing script that 
updates the database.
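
For what it's worth, here is a minimal sketch of such a script in Python.
It reads the Special:Export XML on stdin and overwrites the wikitext in
the cur table; the table and column names are my guess at the 1.3-era
schema, the MySQLdb module and the connection details are assumptions,
and new pages (plus templates in other namespaces) would need extra
handling:

#!/usr/bin/env python
# Sketch: read Special:Export XML from stdin and update cur_text for
# each <page>. Assumes the MediaWiki 1.3 'cur' table, main-namespace
# titles only, and the MySQLdb module -- check against your install.
import sys
import xml.etree.ElementTree as ET
import MySQLdb

def localname(tag):
    # Drop the XML namespace, whatever export version wrote the dump.
    return tag.rsplit('}', 1)[-1]

db = MySQLdb.connect(host="localhost", user="wikiuser",
                     passwd="secret", db="wikidb")
c = db.cursor()

root = ET.parse(sys.stdin).getroot()
for page in root:
    if localname(page.tag) != "page":
        continue
    title, text = None, ""
    for elem in page.iter():
        if localname(elem.tag) == "title":
            title = elem.text
        elif localname(elem.tag) == "text":
            text = elem.text or ""
    if title is None:
        continue
    # Titles are stored with underscores; namespace 0 assumed, and a
    # page that does not exist yet would need an INSERT instead.
    c.execute("UPDATE cur SET cur_text=%s "
              "WHERE cur_namespace=0 AND cur_title=%s",
              (text, title.replace(" ", "_")))

db.commit()
c.close()
db.close()

The UUCP end of the pipeline then just pipes each received file into it,
e.g. "python import_cur.py < Main_Page.xml".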

-- 
Ashar Voultoiz



