Hi,
I am new here and just tried importing a page (following Rob's suggestions):
<mediawiki
  version="0.3"
  xml:lang="en">
  <page>
    <title>JUST A TEST</title>
    <id>1287</id>
    <revision>
      <id>1288</id>
      <timestamp>2005-12-09T12:42:38Z</timestamp>
      <contributor><ip>172.22.68.72</ip></contributor>
      <text xml:space="preserve">testing1
testing2
testing3
testing4
testing5
testing6
testing7
</text>
    </revision>
  </page>
</mediawiki>
I guess we need to take care of the ids...
It works very well using the importDump.php script.
You can search for the title in the search box but not for the text. Is
it possible to add the text to the search capabilities?
Thanks
Leon
HumanCell.org wrote:
3) go to
Special:Import while logged in as a sysop, and import your
new XML file. Ta-da! A whole lotta new articles!
That sounds pretty interesting, Rowan. So there only needs to be one
XML file containing all the rows, and with one call to Special:Import
multiple articles are generated. Neat.
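Building that single multi-page file could be scripted along these lines (a minimal Python sketch, assuming the 0.3 export format shown above; the page titles and texts here are placeholder data, and the `<id>` elements are deliberately left out so the wiki assigns fresh ids on import):

```python
# Sketch: assemble one MediaWiki 0.3 export file holding many pages,
# so a single Special:Import (or importDump.php run) creates them all.
from xml.sax.saxutils import escape

def build_export(pages):
    """pages: list of (title, wikitext) tuples -> export XML string."""
    out = ['<mediawiki version="0.3" xml:lang="en">']
    for title, text in pages:
        out.append(
            "<page>"
            f"<title>{escape(title)}</title>"
            "<revision>"
            "<timestamp>2005-12-09T12:42:38Z</timestamp>"
            "<contributor><ip>127.0.0.1</ip></contributor>"
            f'<text xml:space="preserve">{escape(text)}</text>'
            "</revision>"
            "</page>"
        )
    out.append("</mediawiki>")
    return "\n".join(out)

# Placeholder rows; in practice these would come from your database.
xml = build_export([("Page one", "text1"), ("Page two", "text2")])
```

Escaping the title and text with `escape()` matters if any row contains `<`, `>`, or `&`.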
Alternatively, of course, you could write
something that read from one
database and wrote to the other,
Want to avoid that if possible. So your first suggestion sounds better.
Rob's suggestion looks quite interesting too - I'm also looking for a
tool for data validation (to ensure infobox fields contain valid data),
so the bot might be quite useful for that as well.
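For the infobox check, a validation pass might look something like this (a minimal sketch, not an existing tool; the field names and rules below are invented for illustration):

```python
import re

# Sketch: pull "| field = value" lines out of an infobox template call
# and check each value against a per-field rule. The field names and
# rules here are made up for illustration.
FIELD_RE = re.compile(r"^\|\s*(\w+)\s*=\s*(.*?)\s*$", re.MULTILINE)

RULES = {
    "population": lambda v: v.isdigit(),                              # whole number
    "established": lambda v: re.fullmatch(r"\d{4}", v) is not None,   # 4-digit year
}

def validate_infobox(wikitext):
    """Return a list of (field, value) pairs that fail their rule."""
    errors = []
    for field, value in FIELD_RE.findall(wikitext):
        rule = RULES.get(field)
        if rule is not None and not rule(value):
            errors.append((field, value))
    return errors

sample = "{{Infobox city\n| population = 12a00\n| established = 1847\n}}"
```

A bot could run this over each article's wikitext before (or after) import and report the offending pages.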
Thanks for the suggestions... I'm going to try them out...
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l