Probably nothing new, but I'd like to propose turning talk pages into
NNTP groups and articles, and archiving them as HTML pages at expiry
time (or earlier).
Talk pages are often bewildering, and it takes much too much time
maintaining them - shorten one of the pages and the next day it is
crowded with similar statements again.
--
| ,__o
| _-\_<,
http://www.gnu.franken.de/ke/ | (*)/'(*)
Here's an interesting bug (or feature?) in the block code:
If an IP address is blocked with expiry X, and the corresponding user
is blocked with expiry Y, and the user logs on, his IP will be
anonymously-blocked for 24 hours, but the date X on the original IP
block will also be set to 24 hours in the future, *even if this is a
reduction.*
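A sketch of the behaviour Allan describes, and the obvious fix: only ever extend an existing IP block's expiry, never shorten it. All names here are hypothetical; MediaWiki's actual block code is structured differently.

```python
from datetime import datetime, timedelta

AUTOBLOCK_DURATION = timedelta(hours=24)

def autoblock_ip(ip_blocks, ip, now):
    """When a blocked user logs in, autoblock their IP for 24 hours.

    The bug reported above: the existing IP block's expiry X is
    overwritten with now + 24h even when that is a *reduction*.
    The fix: only update the expiry when the new one is later.
    """
    new_expiry = now + AUTOBLOCK_DURATION
    existing = ip_blocks.get(ip)
    if existing is None or existing < new_expiry:
        ip_blocks[ip] = new_expiry   # create the block, or extend it
    # else: keep the longer original expiry X untouched

# Example: an IP already blocked for a week must not be shortened.
now = datetime(2004, 3, 1)
blocks = {"10.0.0.1": now + timedelta(days=7)}
autoblock_ip(blocks, "10.0.0.1", now)
```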
--
Allan Crossman - http://dogma.pwp.blueyonder.co.uk
PGP keys - 0x06C4BCCA (new) || 0xCEC9FAE1 (compatible)
I notice the webalizer stats for February do not exist. Now with the
press notice it would be nice if they worked.
--
Contact: walter AT wikipedia DOT be
Hi all -
Has anyone looked into or heard anything about a map-based complement to the
Wikipedia?
I am looking at putting something together that would be targeted at K-12,
but might have some nice tie-ins.
The plan is to deliver an easily searchable repository (content & searchable
clearinghouse) of quality map content to cover the entirety of human
history. Along with it would be a freely distributable, open source map
player. People could then link their existing maps, add modifications, build
new map content, etc.
There are a lot of reasons why a text-based search metaphor doesn't work
well for map-based data, and why a lot of the maps on the web fall short
of current technologies' potential for delivering better map-based
experiences.
I am hoping to base this loosely on what the folks at TimeMap and the ECAI
have already established, but their current tools, setup, and content would
need to be modified and expanded to be useful for K-12 education.
Naturally, it would be great to provide contextual links to related web
resources... like the Wikipedia.
Anyway, if anyone has heard of something similar, or has any suggestions,
please let me know.
Thanks,
Jeff
jeff(a)gwhat.org
www.gwhat.org
If Yahoo feels grateful for the increased traffic, and starts making
anonymous donations to WikiMedia, you're not going to tear up the
checks, are you?
Ed
To give some idea of what makes a difference, here are some of the things discovered over the last two weeks:
1. Storing the PHP files on NFS doubled the page load time, from around 180ms to around 360ms. So, that's no longer being done and effectively no complicated programming was needed to double performance.
2. Squid using the disk reached cache hit rates of 78% and still rising, compared to 60% without the disk, but...
3. Squid using synchronous I/O blocked on disk and would sometimes result in timeouts. So, disk was turned off for the last week.
4. The Apaches started slowing down at peak load times earlier this week, so more Squid investigation started, using asynchronous disk I/O this time. Given the past disk caching experience, this has the potential to cut the load the Apaches see by around 25-50% (they see 40% of load without disk, disk cut that to 22%).
5. There's a parameter in Squid which tells it when to ignore the disk, based on the number of file descriptors in use. If the limit is exceeded, Squid ignores the disk and just passes the request directly to the Apaches, or skips saving the page to the disk. Tuning of this parameter is currently ongoing. When set correctly it should let Squid deliver all it can from combined disk and RAM, but only up to the point where it doesn't start to block waiting for the disk.
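The parameter described in point 5 is presumably Squid's `max_open_disk_fds` directive, which makes Squid bypass the on-disk cache once too many disk file descriptors are open. A hedged squid.conf fragment; the values are purely illustrative, not the ones actually in use on the Wikipedia servers:

```
# squid.conf fragment -- illustrative values only
cache_dir aufs /var/spool/squid 10240 16 256   # aufs = asynchronous disk I/O
max_open_disk_fds 64   # above this many open disk FDs, skip the disk cache
                       # and serve from RAM or pass through to the Apaches
```

The point of the tuning is exactly as described: serve as much as possible from disk plus RAM, but never let requests block waiting on a saturated disk.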
So, there's no need to get too enthusiastic about tuning the code. The new server setup still isn't tuned fully yet... and it probably won't be before we get a nice fast database server, a second Squid and some more Apaches to spread the load around.
When discussing search engine placements with Yahoo, Google and AOL, do remember that they make money whenever we send a searcher to their search results page, because that searcher sees their ads. These deals can be expected to involve the search company paying the hosting company money.
Jimmy probably already knows this, but just in case...:)
This morning the whois information containing the name servers was still
unchanged:
whois wikipedia.org
Domain ID: D51687756-LROR
Domain Name: WIKIPEDIA.ORG
Created On: 13-Jan-2001 00:12:14 UTC
Last Updated On: 19-Oct-2003 02:52:44 UTC
Expiration Date: 13-Jan-2005 00:12:14 UTC
Sponsoring Registrar: R71-LROR
Status: OK
Registrant ID: C3819183-RCOM
Registrant Name: Jimmy Wales
Registrant Organization: Bomis, Inc.
Registrant Street1: 4455 Lamont St., Suite 3
Registrant City: San Diego
Registrant State/Province: CA
Registrant Postal Code: 92109
Registrant Country: US
Registrant Phone: +1.6192739361
Registrant FAX: +1.6192739363
Registrant Email: jwales@bomis.com
Admin ID: C11978190-RCOM
Admin Name: Jimmy Wales
Admin Organization: Bomis, Inc.
Admin Street1: 3585 Hancock St., Suite A
Admin City: San Diego
Admin State/Province: CA
Admin Postal Code: 92110
Admin Country: US
Admin Phone: +1.6192961732
Admin Email: jwales@bomis.com
Tech ID: C1-RCOM
Tech Name: Domain Registrar
Tech Organization: Register.Com
Tech Street1: 575 8th Avenue
Tech City: New York
Tech State/Province: NY
Tech Postal Code: 10018
Tech Country: US
Tech Phone: +1.9027492701
Tech Email: domain-registrar@register.com
Name Server: DNS33.REGISTER.COM
Name Server: DNS34.REGISTER.COM
Those DNS entries need to be changed to point at Zwinger & Joey.
--
Gabriel Wicke
How quickly can we add a "Yahoo" searchbox to the "Full text search
has been disabled temporarily" page?
Yahoo has an 'advanced search' capability that is exactly like Google's
for restricting a search to a domain:
http://search.yahoo.com/search?x=op&va=Thomas+Jefferson&va_vt=any&vs=en.wik…
...for example.
This is a request from Yahoo that I'd like to rush to accommodate, as
they are showing lots of WikiLove toward the project of late.
--Jimbo
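A minimal sketch of building such a site-restricted Yahoo search URL. The parameter meanings are inferred from the truncated example URL above (`va` = query terms, `vs` = site restriction), so treat them as assumptions about the 2004-era Yahoo advanced-search interface:

```python
from urllib.parse import urlencode

def yahoo_site_search_url(query, site="en.wikipedia.org"):
    """Build a Yahoo search URL restricted to one domain.

    Parameter names are copied from the example URL in the post;
    their meanings are assumptions, not documented behaviour.
    """
    params = {
        "x": "op",        # advanced-search mode (assumed)
        "va": query,      # "all of these words"
        "va_vt": "any",   # match anywhere in the page (assumed)
        "vs": site,       # restrict results to this domain
    }
    return "http://search.yahoo.com/search?" + urlencode(params)

print(yahoo_site_search_url("Thomas Jefferson"))
```

A "search disabled" page could simply embed this URL in a form pointing at search.yahoo.com with a hidden `vs` field.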
We need software that will allow throttling down the number of edits problem
users can make. Please code something so that we could restrict, should we
decide to, someone like this to a set number of edits per day.
The advantage of this is that it then leaves up to that user whether he
wants to get down to article writing or fool around.
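The throttle Fred describes could be sketched as a per-user daily counter. All names here are hypothetical; a real implementation would need to hook into MediaWiki's save path:

```python
from collections import defaultdict
from datetime import date

class EditThrottle:
    """Per-user daily edit limit -- a sketch, not MediaWiki code."""

    def __init__(self):
        self.limits = {}                # user -> max edits per day
        self.counts = defaultdict(int)  # (user, date) -> edits so far today

    def set_limit(self, user, edits_per_day):
        """Restrict a problem user to a set number of edits per day."""
        self.limits[user] = edits_per_day

    def allow_edit(self, user, today=None):
        """Return True and count the edit, or False if over quota."""
        today = today or date.today()
        limit = self.limits.get(user)
        if limit is None:               # unrestricted user
            return True
        key = (user, today)
        if self.counts[key] >= limit:
            return False                # quota exhausted for today
        self.counts[key] += 1
        return True
```

The counter resets naturally each day because the date is part of the key, which leaves it up to the user whether to spend the quota on article writing or on fooling around.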
Fred Bauder, member of the arbitration committee
> From: "Arno M" <redgum46(a)lycos.com>
> Organization: Lycos Mail (http://www.mail.lycos.com:80)
> Reply-To: redgum46(a)lycos.com, English Wikipedia <wikien-l(a)Wikipedia.org>
> Date: Thu, 26 Feb 2004 12:38:49 +0600
> To: wikien-l(a)Wikipedia.org
> Subject: [WikiEN-l] Plautus - the last three days
>
> Hmm, after viewing messages about Plautus, I must say that I find Jimbo's "All
> you need is love" response most disappointing.
>
> I must say that he's a busy little beaver.
> http://en.wikipedia.org/w/wiki.phtml?title=Special:Contributions&hideminor=…
> arget=Plautus_satire&limit=500&offset=0 shows that he has made over 500 edits
> over the last three days alone. This includes some articles such as quasar,
> but mainly in advocating and splattering his, um, views all over the talk
> pages.
>
> For instance, on the Sep 11 Attacks page, Plautus is now arguing that the
> Flight 93 passengers did not charge the hijackers, and is veering that page
> off in a direction that may see this site crashing into a lawsuit or
> two. (He does seem to be saying that the families who reported the phone
> calls are all liars.) The size of the talk page is also growing.
>
> People have now left the wikipedia site, or so I hear, and there's a list of
> complaints growing on Jimbo's talk page. Let's also not forget what might
> happen if his stuff gets copied across to other sites that use wikipedia
> material as all or part of their content.
>
> My own feeling is that Plautus should be given the "Get back to where you once
> belonged" treatment.
> He is completely fouling things up here.
>
>
>
> _______________________________________________
> WikiEN-l mailing list
> WikiEN-l(a)Wikipedia.org
> http://mail.wikipedia.org/mailman/listinfo/wikien-l