Well, the enotif code isn't mature yet. It sends queries to the master that it shouldn't, uses odd integer values for timestamps, handles anonymous users where it shouldn't, and uses 0 as a timestamp value without going through any timestamp abstraction when talking to the database. At least my devel wiki is broken ;-) I can probably fix some parts here and there, but I would really ask for code review and fixes before making it live anywhere.
And last but not least: none of that code checks whether email notification is turned on or off. Please make it possible to turn experimental features on and off. Cheers :)
Domas
Ah, more thoughts on how topology should be set up.
French squids would have such cache-peer configuration:
cache_peer frsquid1 sibling 80 3130 proxy-only no-digest no-netdb-exchange
cache_peer frsquid2 sibling 80 3130 proxy-only no-digest no-netdb-exchange
...
cache_peer usasquid1 parent 80 3130 round-robin no-query no-digest no-netdb-exchange
cache_peer usasquid2 parent 80 3130 round-robin no-query no-digest no-netdb-exchange
cache_peer usasquid3 parent 80 3130 round-robin no-query no-digest no-netdb-exchange
...
The trouble with a .paris. (or .tier2., or whatever) subdomain is that it would require either adding rewrite rules on the Wikimedia servers or running an external rewrite helper on the squids. I guess rewriting on Apache is much easier at this point.
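For illustration only, a rewrite of this kind on the Apache side might look something like the following sketch; the `.paris.` hostname scheme and the proxy flag are assumptions, not the actual Wikimedia configuration:

```apache
# Hypothetical: strip a ".paris." cache-tier label from the requested
# hostname so the backend serves the canonical site. Requires
# mod_rewrite and mod_proxy; hostnames here are invented examples.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(.+)\.paris\.(wikipedia\.org)$ [NC]
RewriteRule ^/(.*)$ http://%1.%2/$1 [P,L]
```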
Moreover, the Florida setup should have a 'purge' service for recursive purges from all official squid caches (there may be more of those in the future). Volunteers for that? :) Ok ;-)
And sure, object expiry timeouts could be reviewed here.
Domas
Somebody needs to commit to finally setting up and testing these
things. Any volunteers? What exactly remains to be done with them?
-- brion vibber (brion @ pobox.com)
Hi,
today I started playing with the SQL dump from de.wikipedia and I got a
little confused.
It appears that some page ids share the same cur_title, such as
12706 and 117948 (both: A.D.)
37325 and 37320
20033 and 53747 (both: ACPI)
and so on; there seem to be quite a lot of them.
I guess this is not a wanted feature.
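Duplicates like these can be listed straight from the imported dump with a GROUP BY query over the `cur` table. A minimal sketch using an in-memory SQLite stand-in (the column names follow the MediaWiki schema; the sample rows are invented, not taken from the de.wikipedia dump):

```python
import sqlite3

# Tiny stand-in for MediaWiki's `cur` table with invented sample rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cur (cur_id INTEGER, cur_namespace INTEGER, cur_title TEXT)"
)
conn.executemany(
    "INSERT INTO cur VALUES (?, ?, ?)",
    [
        (12706, 0, "A.D."),
        (117948, 0, "A.D."),
        (20033, 0, "ACPI"),
        (53747, 0, "ACPI"),
        (1, 0, "Unique"),
    ],
)

# Titles that occur more than once within a namespace; cur_title is
# meant to be unique per namespace, so every row here is a problem.
dupes = conn.execute(
    "SELECT cur_namespace, cur_title, COUNT(*) AS n"
    " FROM cur GROUP BY cur_namespace, cur_title HAVING n > 1"
    " ORDER BY cur_title"
).fetchall()
print(dupes)  # -> [(0, 'A.D.', 2), (0, 'ACPI', 2)]
```

The same GROUP BY ... HAVING query run against the real MySQL import would enumerate every duplicated title.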
Mathias
Hi
I've added a MediaWiki system (Hebrew) to my site.
I found that the word "article" in the top menu is in English
instead of the system language I chose.
I've checked the Hebrew file, /languages/LanguageHe.php, and it looks fine;
the word "article" is translated there.
Does anyone have a clue how to solve this problem? (I know the PHP fundamentals.)
My wiki is at http://daniel.koala.co.il/wiki2
(Sorry about my English.)
Thanks for your time,
Daniel S
Hi,
I'm using MediaWiki on a site where I'm the only person who can edit articles.
I would like to know how to modify the code to allow JavaScript to be inserted.
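For a single-editor wiki, the usual approach (assuming a reasonably recent MediaWiki) is not to patch the sanitizer but to enable raw HTML in LocalSettings.php; note this is unsafe on any wiki where untrusted users can edit:

```php
# In LocalSettings.php. With $wgRawHtml enabled, content wrapped in
# <html>...</html> tags is passed through unsanitized, so <script>
# blocks survive. Never enable this on a publicly editable wiki.
$wgRawHtml = true;
```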
Thanks,
Giovanni Putignano
Larry Sanger believes that the solution for making
Wikipedia more credible lies with experts. You can see
a good article describing his criticisms here (
http://slashdot.org/article.pl?sid=05/01/03/144207&tid=95&tid=1
), posted on Jan 3, 2005.
I think the easiest way to make Wikipedia more
credible is the Fact and Reference Check project,
which the community has been developing for several
months now: (
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Fact_and_Reference_Check
).
The thing holding this project back, and ultimately
keeping Wikipedia from shedding its 'non-credible'
skin, is the lack of intelligent foot/endnotes. A way
to format an article with autonumbered endnotes for
cross-referencing is lacking. I am sure that with this
feature programmed in, the project can be on its way
to cross-referencing every fact on Wikipedia.
You can see some examples of foot/endnote formatting
templates here. JesseW has put much effort into
creating a formatting guide here (
http://en.wikipedia.org/wiki/Wikipedia:Footnotes ) and
another guide here (
http://en.wikipedia.org/wiki/Wikipedia:Cite_sources ).
Examples are here (
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Fact_and_Reference_Check…
).
How credible will Wikipedia be if each fact is
cross-referenced with 5, 10, or 20 external sources
such as academic journals, encyclopedias, and books? Very.
Please feel welcome to use
(http://en.wikipedia.org/wiki/Wikipedia_talk:WikiProject_Fact_and_Reference_…)
as an area for us to discuss this issue.
Shaun MacPherson
Hey,
I've written an ICP responder that should help a lot with load balancing in a squid/Apache web server cluster. It delays responses to the frontend caches under high load, answers immediately when there is idle CPU capacity, and can be remotely turned on or off in an emergency. Right now, due to the Linux kernel scheduler's granularity, it works in 0/10/20/... ms steps, so it does not yet achieve my design goal of millisecond-level delays; still, it works, offers superior functionality, and does not require hacked squids installed on the Apaches.
Source location
CVS:extensions/icpagent
wm:~midom/extensions/icpagent
Docs: http://wp.wikidev.net/ICP_agent
Cheers,
Domas