I just noticed that searching our wiki returns no results.
We use
* MediaWiki: 1.11.0
* PHP: 5.2.0 (apache2handler)
* MySQL: 5.0.26
You can test it here: http://en.giswiki.net
Any hints?
Thx. HeinzJ
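One thing worth checking first: with the MySQL back-end, MediaWiki's
search reads from the searchindex table, so if that table is empty
(e.g. after an import) every search comes back blank; running
maintenance/rebuildtextindex.php repopulates it. A quick diagnostic,
as a sketch with placeholder connection details:

    import MySQLdb

    # Sketch: count rows in the search index; 0 rows would explain
    # why every search is empty. Credentials are placeholders, and
    # prepend your $wgDBprefix to the table name if you use one.
    conn = MySQLdb.connect(host='localhost', user='wikiuser',
                           passwd='secret', db='wikidb')
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM searchindex")
    print "searchindex rows:", cur.fetchone()[0]
    conn.close()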
Hi,
Remember the revision limit on page deletions that was
hacked in to prevent Wikimedia grinding to a halt every
time someone tried to delete a large page?
Well, give someone a feature and they will abuse it in ways
you never thought possible. Bots are now being used to add
thousands of revisions to pages to make them undeletable.
See the history of the English Wikipedia's Main Page for an
example.
Just thought you should know.
Thanks,
-Gurch
Hi,
Currently the dumps are provided in both XML and SQL format; would it
be possible to provide one more format, i.e. CSV?
MySQL provides native export and import for CSV files, which I
believe would be the fastest method.
It would also save quite a lot of storage and bandwidth, not to
mention the time it takes us to parse the XML every time.
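For example, the export side could be as simple as this sketch
(database name, credentials and the output path are placeholders):

    import MySQLdb

    # Sketch: dump the page table as CSV using MySQL's native export.
    # Note that INTO OUTFILE writes the file on the database server's
    # own host, which is part of why it is so fast.
    conn = MySQLdb.connect(host='localhost', user='dump',
                           passwd='secret', db='wikidb')
    cur = conn.cursor()
    cur.execute("""
        SELECT page_id, page_namespace, page_title
        FROM page
        INTO OUTFILE '/tmp/page.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\\n'
    """)
    conn.close()

The import back is the mirror-image LOAD DATA INFILE, and Python's
csv module covers anyone who just wants to read the files directly.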
Any comments?
Howard
Hi,
Our wiki has become prey to spammers, even after we added e-mail
confirmation for accounts. I am busy trying to clean up the mess, and
now I am looking for measures that will make life harder for spammers
without penalising legitimate users. The first thing is to find out
how to block the user accounts causing the trouble. Is there any way
to do this without having to open up the database?
The main anti-spam measure I am looking at is adding a captcha, but I
would also be interested in other solutions. Additionally, I want to
ensure that the spammers' text is removed from the database, so that
they don't get the benefit of having their links indexed by search
engines via our site. Any suggestions for this?
Andre
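On the clean-up side of Andre's question, one place to start is
enumerating what is still linked: MediaWiki records every external
link in the externallinks table, so a read-only scan like the sketch
below (stock schema; the credentials and the spam domain are
placeholders) lists the pages still carrying the links:

    import MySQLdb

    # Sketch: list pages that still link to a known spam domain.
    conn = MySQLdb.connect(host='localhost', user='wikiuser',
                           passwd='secret', db='wikidb')
    cur = conn.cursor()
    cur.execute("""
        SELECT page.page_title, externallinks.el_to
        FROM externallinks
        JOIN page ON externallinks.el_from = page.page_id
        WHERE externallinks.el_to LIKE %s
    """, ('%spam-domain.example%',))
    for title, url in cur.fetchall():
        print title, url
    conn.close()

Actually removing the offending revisions still has to happen through
normal deletion, but at least this shows where to look.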
Hi all.
I'm adding some tweaks to the WikiXRay parser of meta-history dumps. I now extract internal links, external links, and so on, but I'd also like to extract the plain text (without HTML code and, possibly, also filtering out wiki tags).
Does anyone know of a good Python library for that? I believe something must exist out there, as there are bots and crawlers that automate extracting data from one wiki to another.
Thanks in advance for your comments.
Felipe.
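Until a proper library turns up, a first approximation needs nothing
beyond the standard library's re module. A rough sketch that strips
the most common HTML and wiki constructs (it ignores template nesting
and plenty of other edge cases):

    import re

    def plain_text(wikitext):
        """Rough plain-text extraction from wikitext (not exhaustive)."""
        text = wikitext
        text = re.sub(r'(?s)<!--.*?-->', '', text)        # HTML comments
        text = re.sub(r'(?s)\{\{.*?\}\}', '', text)       # templates (non-nested)
        text = re.sub(r'\[\[[^|\]]*\|([^\]]*)\]\]', r'\1', text)  # piped links
        text = re.sub(r'\[\[([^\]]*)\]\]', r'\1', text)   # plain links
        text = re.sub(r'\[https?://\S+ ([^\]]*)\]', r'\1', text)  # external links
        text = re.sub(r'<[^>]+>', '', text)               # remaining HTML tags
        text = re.sub(r"'{2,}", '', text)                 # bold/italic quotes
        return text

    print plain_text("'''Bold''' text with a [[Main Page|link]] and <b>HTML</b>.")
    # -> Bold text with a link and HTML.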
Hello. I have a question: how can I encode text into the form that is used for section headings in URLs? E.g. the code for '$' is .24. Thanks for any suggestions. P.S. Sorry for my terrible English. -MGrabovsky
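If this is about the dot-escaped anchors MediaWiki generates for
section headings, the scheme is: spaces become underscores, the UTF-8
bytes are percent-encoded, and '%' is then swapped for '.', which is
how '$' ends up as .24. A Python sketch:

    import urllib

    def anchor_encode(heading):
        # Spaces -> underscores, percent-encode the UTF-8 byte string,
        # then '%' -> '.' (MediaWiki's section-anchor style).
        return urllib.quote(heading.replace(' ', '_')).replace('%', '.')

    print anchor_encode('$')      # .24
    print anchor_encode('A $ B')  # A_.24_B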
rainman(a)svn.wikimedia.org wrote:
> Revision: 30390
> Author: rainman
> Date: 2008-02-01 13:17:38 +0000 (Fri, 01 Feb 2008)
>
> Log Message:
> -----------
> A new branch for LuceneSearch extension for the new daemon:
> will add ajax search and make some minor interface improvements.
Just a note -- I would recommend strongly against doing continued
development on the old LuceneSearch front-end extension, as it's a
maintenance nightmare.
Instead, new front-end code should be in the Special:Search front-end in
core, with a back-end plugin to talk to the Lucene server (the MWSearch
extension, possibly a bit out of date).
-- brion vibber (brion @ wikimedia.org)
Hello,
A more feasible proposal (than global blocking) which I've put forth
before is crosswiki blocking. A Special:BlockCrosswiki page on Meta
could be used by stewards to block a user on any project, preferably
updating the log on that project. The interface would work in
precisely the same way as the current crosswiki Special:Userrights,
with a steward blocking "Pathoschild's_proposal_sucks!@enwiki" from
Meta.
This doesn't have the problems of global blocking, and it would be
extremely useful in stopping wiki-jumping vandals. Without crosswiki
blocking, a steward needs to navigate to each project, register an
account or log in, grant themselves admin access via
Special:Userrights on Meta, block the vandal via Special:Blockip on
the local project, and then switch back to Special:Userrights on Meta
to remove their admin access. By the time they're done, the vandal
has hit six more wikis. Obviously, the current way we do things is
ridiculous and not scalable in the least.
--
Yours cordially,
Jesse Martin (Pathoschild)