I noticed that on the English Wikipedia the special pages have a saved
version. On the Spanish wiki they do not. Who is in charge of running the
queries and saving the results on those pages? Who can do it?
AstroNomer
Hello,
I'm interested in porting MediaWiki so that it also supports PostgreSQL.
Every article about open databases that I read says that PostgreSQL is a
more professional database than MySQL, so I wonder why there is no interest
in also supporting PostgreSQL. Or is there interest after all?
In any case, I have a running PostgreSQL database and I also want to run a
wiki. The question for me now is whether I should use PhpWiki, which has
PostgreSQL support, or spend some time enhancing MediaWiki. At the moment,
it seems that MediaWiki has more interesting features than PhpWiki.
So my first question: has there been demand for PostgreSQL in the past?
Second: is there anyone who can support me if I have questions about
MediaWiki's use of MySQL (its table structure)? I don't want to also
install MySQL on some of my computers.
Best regards,
Bernhard
Can someone make a small change to the JavaScript diffing function?
Specifically, changing this line in PageHistory.php/beginHistoryList() from
$url = wfLocalUrl( $wgTitle->getPrefixedText(), "dummy=1");
to
$url = wfFullUrl( $wgTitle->getPrefixedText(), "dummy=1");
should do the trick.
As it is, the JavaScript diff function no longer works in Firefox 0.8. It
errors out with:
Security Error: Content at
http://en.wikipedia.org/w/wiki.phtml?title=Space_Shuttle_program&curid=3845…
may not load or link to
chrome://browser/w/wiki.phtml?title=Space_Shuttle_program&dummy=1&diff=0&oldid=2304466.
So I suspect that using the full URL will fix the problem while remaining
compatible with other browsers.
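A speculative illustration of what is likely going wrong (this is my reading of the error message above, not a confirmed diagnosis): when the diff runs in a context whose base URL is a chrome:// page, a site-relative link such as the one wfLocalUrl produces gets resolved against that chrome base, yielding exactly the forbidden chrome:// URL Firefox reports. The helper below is hypothetical, written only to mimic how a browser resolves a path-only link:

```python
from urllib.parse import urlsplit

def resolve_site_relative(base, path):
    # Mimic browser resolution of a path-only link ("/w/...") against
    # the scheme and host of the page's effective base URL.
    parts = urlsplit(base)
    return f"{parts.scheme}://{parts.netloc}{path}"

# Against a normal http base, the link stays on-site:
print(resolve_site_relative("http://en.wikipedia.org/w/wiki.phtml",
                            "/w/wiki.phtml?dummy=1"))

# But against a chrome:// base (as in the Firefox error above) it
# becomes a chrome:// URL, which web content may not link to:
print(resolve_site_relative("chrome://browser/some-page",
                            "/w/wiki.phtml?dummy=1"))
```

Using wfFullUrl instead emits an absolute http:// URL, so no base-relative resolution happens and the security error should disappear.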
--
Audin Malmin - audin(a)okb-1.org
Faith is the great cop-out, the great excuse to evade the need to think and
evaluate evidence. Faith is belief in spite of, even perhaps because of, the
lack of evidence. -- Richard Dawkins
Hello everyone,
why does MediaWiki support only MySQL? Some alternatives are also interesting (think of PostgreSQL).
/Bernhard Naegele
Email: <mailto: bernhard.naegele(a)t-online.de>
Whoever has installed this... thank you very much!
I had been waiting for a Polish version of Wiktionary,
for the possibility of defining/describing words in Polish,
since 13 Dec 2002 (yes, since the very start of Wiktionary).
Now my small dream has come true.
Thanks folks :)
Youandme
Something for people looking for a fun project to hack on:
IMDB (the movie database) has a submission interface for signed-in users
which can be used to submit "external reviews" to IMDB. I was advised by
the IMDB staff that, in order to work towards a more comprehensive content
partnership with them, we should first submit links to the relevant
movies, so that they can gauge user interest.
So we need a bot that auto-submits links to all Wikipedia articles about
movies to the "external reviews" sections of the corresponding IMDB
entries.
There is a [[List of movies]]. An index could be generated from the linked
pages. The script should remember which movies it has already submitted,
so that we can easily add new articles to the index.
IMDB takes some time to process submissions, so you can only test whether
the script is working correctly by looking at the HTML sent back by their
server.
You will probably want to use some throwaway email address for the account
as IMDB sends a confirmation for each submission.
For implementing this I would use libwww-perl, an excellent web scripting
library for Perl, but I'm sure there are equivalents for all other major
scripting languages.
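The submission call itself would have to be reverse-engineered from IMDB's forms, so I won't guess at it here, but the bookkeeping half of the bot ("remember which movies it has already submitted") is straightforward. A minimal Python sketch, assuming a plain-text log file; all names here are illustrative, not part of any existing tool:

```python
# Sketch of the bookkeeping half of the proposed submission bot: keep a
# plain-text log of already-submitted titles, so re-running the script
# only processes movies added to the index since the last run.
import os

SUBMITTED_LOG = "submitted.txt"  # illustrative file name

def load_submitted(path=SUBMITTED_LOG):
    """Return the set of titles already submitted (empty on first run)."""
    if not os.path.exists(path):
        return set()
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

def pending_titles(index, path=SUBMITTED_LOG):
    """Titles from the movie index that have not been submitted yet."""
    done = load_submitted(path)
    return [t for t in index if t not in done]

def mark_submitted(title, path=SUBMITTED_LOG):
    """Append a title to the log once its submission has gone through."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(title + "\n")

# Demo; the real index would be scraped from [[List of movies]].
if os.path.exists(SUBMITTED_LOG):
    os.remove(SUBMITTED_LOG)  # start fresh for this demo
index = ["Alien", "Blade Runner", "Casablanca"]
mark_submitted("Alien")
print(pending_titles(index))  # the titles not yet logged
```

The actual HTTP work (logging in, filling IMDB's submission form) would sit between `pending_titles` and `mark_submitted`, whether done with libwww-perl or an equivalent library.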
So, does anyone want to take this project?
Regards,
Erik
I am mostly active on the Japanese Wikipedia, and I see a lot of the same
problems there, too. I, too, hope that this feature is implemented as soon
as possible, at least for the Japanese Wikipedia.
We talk about this feature a lot in Japanese Wikipedia, and I am quite sure
that many others from Japanese Wikipedia would agree that this is something
we need.
Earlier, this idea of deleting particular versions was discussed on the
list, and it turned out that some Wikipedias needed to reorganize their
deletion procedures and rules before adding such a feature, so that their
admins would not abuse this function.
[http://mail.wikipedia.org/pipermail/wikipedia-l/2003-June/010684.html]
Yet in the meantime, the Japanese Wikipedia has received/revealed plenty of
copyright violations, some allegedly intentional, which have forced us to
delete articles in many cases. Some such violations only arrive around the
100th edit (as on the Main Page, which indeed is listed at VfD right now),
after all the hard work. Others are in the first versions of articles but
remain unnoticed until the 5th edit or so, again making it very difficult
for us to decide to delete the whole thing.
Another issue I am aware of is that the English Wikipedia may be taking the
stance, based on a provision in US copyright law, that past versions are
not for public viewing but for internal record-keeping, and that copyright
violations in them therefore do not matter. That idea is so far considered
not applicable to the Japanese Wikipedia, because a Japanese copyright
holder can sue a Japanese Wikipedian in a Japanese court, where Japanese
law could be applied.
The only alternative we have had for coping with this problem is to ask
the developers to delete specific versions of articles. But because of the
server restructuring and other issues, no request has been handled yet.
(And let me quickly say that nobody is complaining about that, since we
all know they work hard. But it makes us wonder about the appropriateness,
and sometimes the legal risk, of waiting so long when there is a fairly
clear case of copyright violation. We could delete the article and start
from scratch, which, again, is not an easy choice at all.)
Currently, we have at least a few dozen pages whose past versions have
turned out to be problematic. Well, I really don't want to count, but a
great many were found during the last four to six weeks.
If anybody wants to know more about what the situation is like, I would be
more than happy to explain, but I will stop here since it is not a very
pleasant story...
Regards,
Tomos
Some weeks ago Thomas Luft wrote the code that is needed to delete
individual revisions (Request ID 908150 at SourceForge).
Will this be integrated into the MediaWiki software? We need this for the
de-WP, because we have now identified a lot of copyright violations in the
page histories that should be deleted as soon as possible.
de:Benutzer:El
I've gone ahead and released 1.2.0 on account of a major bug in
1.2.0rc4 that could delete an existing wiki instead of upgrading it
when using the web installer. At some point later, a 1.2.1 release should
clean up some other minor installation issues and bugs.
Download link:
http://prdownloads.sourceforge.net/wikipedia/mediawiki-1.2.0.tar.gz?download
See release notes for changes since 1.1.0:
https://sourceforge.net/project/shownotes.php?release_id=226003
Changes from 1.2.0rc4:
Web install:
* Serious upgrade bug fixed; an attempt to install over an existing
database should now either upgrade gracefully or fail immediately,
instead of wiping out your entire database (BUT BACK UP FIRST!)
* Configuration tosses in output compression if available
* Detects more image-related options (but uploads are still disabled by
default)
Localizations:
* Spanish localization has had most hard-coded "Wikipedia"s removed
* French localization has been changed to UTF-8
Maintenance scripts:
* rebuildrecentchanges.php works again
* importPhase2.php for importing from the very old (pre-May 2002) wiki
format
-- brion vibber (brion @ pobox.com)