As an alternative to both our very own OK-but-slow search and the
fast-but-never-up-to-date Google search, how about
http://www.mnogosearch.org/
Its UNIX version is GPL. It consists of two parts: an indexer (crawler) and a
search module. The latter can be called from PHP, so we could even keep
our interface. The question is whether we can tune it to index every edit
quickly enough.
I was checking language names from
http://www.rtt.org/ISO/TC37/SC2/WG1/639/ISO-FDIS-639-1.pdf for the Dutch
Wikipedia, and there I noticed an error in the general file:
su: is given as "Sudanese", but this should be "Sundanese". I have corrected
it in nl:, but I am not sure whether I am supposed to do so in en: and the
other languages as well, so I will ask someone else to do that instead.
Andre Engels
Hi,
I found an interesting random feature on the German Wikipedia:
-- 8< --
[..] <nowiki>~~~~</nowiki> [..]
-- >8 --
is translated to
-- 8< --
[..] 3iyZiyA7iMwg5rhxP0Dcc9oTnj8qD1jm1Sfv4 [..]
-- >8 --
It seems that it just "works" on that single page. [1]
Regards,
Nils.
[1] http://de.wikipedia.org/wiki/Wikipedia:Artikel%2C_die_verschoben_werden_sol…
Hey, I just noticed that my email address is here on this
public web page. Is there any way to delete it? I don't
want spam!
(It's from a message I sent saying that the WP was down.)
http://mail.wikipedia.org/pipermail/wikitech-l/2003-July/004657.html
Dear Developers and fellow Wikipedians
I wrote this idea of Client Side Accelerated Wikifying up at
http://meta.wikipedia.org/wiki/Highly_Avantgarde_Client_Side_Accelerated_Wikifying
or [[HACSAW]] for short.
It would make wikifying articles easier, more intuitive, faster, and more
_accurate_ if implemented, and I believe it is in the spirit of Wikipedia.
It could also reduce the load on the server, because currently people
have to fetch fully rendered articles if they want to be sure they are
linking to the correct article. Context is not always what you presume, and
HACSAW would take some of this "fetch-and-glance" load off the servers.
One of the biggest obstacles I see in implementing HACSAW is that it
cannot be implemented as a browser plugin, because that would seriously
violate the egalitarian nature of Wikipedia by dividing people into two
groups: those who can get the plugin and those who cannot.
A Java applet could be the solution, but I have no knowledge of how to
make persistent client-side data structures available to an applet.
If you have know-how on building applets that use persistent client-side
data storage (persistence across browser and system shutdowns is needed),
please take a look at [[m:HACSAW]] and help us push the envelope further.
Kind regards and a big thanks for this great wiki implementation, Juxo
--
http://www.consumerium.org/wiki/wiki.phtml?title=User:Juxo
I've fixed a section editing bug that was corrupting the saving of the
previous revision to the old table when used by an anonymous user.
The code in updateArticle() called getContent() to get the page text to
do its section munging on. But, that function grabs the data through
loadContent(), which doesn't load the fields for cur_user_text,
cur_is_minor, or cur_comment (as these are not used when displaying a
rendered article); the later call to loadLastEdit() sees that data has
been loaded (as the data for cur_user _was_ loaded... but that isn't
used for page display either, is it?) and doesn't fill out the rest of
the data.
Thus when we save the last edit's data back to old, it's missing those
fields. (If still in recentchanges, the comment and minor status can be
recovered; user name can be recovered from user id for logged-in users,
but if any anon edits were corrupted that fell off the recentchanges
table, they may not be recoverable unless they happen to have been
caught in a backup.)
Now, this bug is kinda hidden because a logged-in user causes an earlier
call to loadLastEdit() to allow for overriding the edit conflict check
if the last edit was by the present user, and the only section editing
allowed to anons is 'post comment', which is used fairly rarely so far.
Argggh... I moved the call up earlier so all data is loaded, but the
functions should probably be changed so they don't stomp on each other so badly.
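The lazy-loading hazard described above can be sketched generically. This is an illustrative Python sketch, not the actual MediaWiki PHP; all class, method, and field names are made up to mirror the pattern of two loaders sharing one "loaded" flag:

```python
class Article:
    """Sketch of the pitfall: two loaders share one 'loaded' flag
    but each fills in only a subset of the fields."""

    def __init__(self):
        self.data_loaded = False
        self.text = None
        self.user = None
        self.user_text = None   # skipped by load_content()
        self.comment = None     # skipped by load_content()

    def load_content(self):
        # Loads only what rendering needs, but sets the shared flag.
        self.text = "page text"
        self.user = 42
        self.data_loaded = True

    def load_last_edit(self):
        # Sees the flag already set and returns early, leaving
        # user_text and comment unfilled.
        if self.data_loaded:
            return
        self.user_text = "SomeUser"
        self.comment = "edit summary"
        self.data_loaded = True

a = Article()
a.load_content()     # analogous to getContent() during a section edit
a.load_last_edit()   # believes everything is already loaded
print(a.user_text)   # None: fields saved back to 'old' go missing
```

Moving the full load earlier fixes the symptom, but the underlying problem is the single shared flag for two partial loaders.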
The emergency fix is applied to dev and stable branches and installed on
the servers.
-- brion vibber (brion @ pobox.com)
My admired developers!
There's an article about [[P.D.Q. Bach]] in the German Wikipedia which
has been deleted and restored again:
http://de.wikipedia.org/wiki/P.D.Q._Bach
The funny thing is that I can't find it when I search for "Bach" or
"Schickele". Are restored pages not added to the search index (or
whatever it's called)?
Kurt
Hello guys,
looking at the TeX rendering code in CVS, I see that each math fragment is
identified by its MD5 hash, similar to images.
But then I find that this is only the "input hash". The code SELECTs for
an "output hash", and this is used on the HTML page to retrieve the
rendered math as a PNG image.
How is the output hash computed? Maybe it's there in the code, but I can't
find it. If I could compute the hash offline, I could include TeX math
in the static version the same way as images: just download them :-)
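For the input hash, at least, an offline computation is straightforward. A minimal sketch, assuming the hash is the MD5 of the raw TeX source as the message describes (the fragment and variable names here are made up; the output hash remains the open question):

```python
import hashlib

# Hypothetical math fragment; the real input would be the TeX
# source of the <math> block as stored in the article.
tex_source = r"\int_0^\infty e^{-x^2}\,dx"

# MD5 of the raw bytes, rendered as 32 lowercase hex characters,
# analogous to how images are identified.
input_hash = hashlib.md5(tex_source.encode("utf-8")).hexdigest()
print(input_hash)
```

The same fragment always yields the same digest, which is what makes offline precomputation feasible once the output-hash derivation is known.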
Ciao,
Alfio
Oops, found it: 'register_globals' was off in the php.ini file. Turning it on did the trick.
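For anyone hitting the same symptom, the relevant php.ini setting is below, with the usual caveat that register_globals is a known security risk, so enabling it should be a last resort:

```ini
; php.ini -- this MediaWiki version expects request variables to be
; registered as globals; restart the web server after editing.
register_globals = On
```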
Aaron
-----Original Message-----
From: Aaron Oppenheimer
Sent: Friday, August 22, 2003 11:14 AM
To: wikitech-l(a)Wikipedia.org
Subject: [Wikitech-l] trouble setting up...
Hi,
I'm trying to set up an intranet using the wikipedia software. Everything seems set up and configured properly, but I can only access the main page - when I follow any link, I get the main page again. It's as if the arguments to the page are not being passed in, though I suppose there could be other causes. None of the logs show anything particularly weird (though the wiki log claims "We're confused" and that there's some caching trouble).
Anyone done this and know more about setting this up than I do?
Thanks,
Aaron Oppenheimer
aoppenheimer(a)dcontinuum.com
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l