> ----- Original Message -----
> From: "David Gerard" <dgerard(a)gmail.com>
> > http://en.wikipedia.org/wiki/Wikipedia:Today%27s_featured_article/October_2…
> > If I edit that page I see the code below (among lots of other code)
> > {{Wikipedia:Today's featured article/October 1, 2006}} <br clear="all">
>
>
> The {{ }} includes the article inside the double braces. [[ ]] makes a
> link, {{ }} does an inclusion.
I'm familiar enough with MediaWiki to realize that :-)
Where do the inclusions come from, and how does one schedule featured articles to appear? I tried tracing the templates but couldn't find the right ones.
Mike O
--
_______________________________________________
Surf the Web in a faster, safer and easier way:
Download Opera 9 at http://www.opera.com
Powered by Outblaze
Previously, memcached keys in MediaWiki were typically constructed like this:
$key = "$wgDBname:user:id:$id";
This does not work when you have more than one wiki in the same database,
which is possible with table prefixes. I have now introduced a memcached key
construction function called wfMemcKey(). It is used like this:
$key = wfMemcKey( 'user', 'id', $id );
The argument count is variable, and the arguments are concatenated with a
":" separator and an appropriate wiki-specific prefix. When there is no
table prefix, the result is the same as the old construction above. When
there is a table prefix, the result will have a compound prefix, e.g.
"$db-$prefix:user:id:$id".
Other similar uses of $wgDBname, such as identifying wikis in log entries,
should be considered deprecated. Instead, use wfWikiID(), which returns the
database name if there is no prefix, and a hyphenated identifier if there is
one.
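A minimal sketch of how a wfMemcKey()-style helper could build such keys. This is a simplified illustration: the real MediaWiki function reads the database name and table prefix from globals ($wgDBname, $wgDBprefix), whereas this version takes them as explicit parameters, and the function name is hypothetical.

```php
<?php
// Simplified illustration of wfMemcKey()-style key construction.
// $db and $prefix stand in for the $wgDBname / $wgDBprefix globals.
function memcKey( $db, $prefix, ...$args ) {
    // No table prefix: the wiki ID is just the database name,
    // matching the old "$wgDBname:..." construction.
    // With a prefix: use a hyphenated compound ID, "$db-$prefix".
    $wiki = ( $prefix === '' ) ? $db : "$db-$prefix";
    return $wiki . ':' . implode( ':', $args );
}
```

For example, memcKey( 'enwiki', '', 'user', 'id', 42 ) yields "enwiki:user:id:42", identical to the old construction, while memcKey( 'shared', 'en_', 'user', 'id', 42 ) yields the compound form "shared-en_:user:id:42".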
-- Tim Starling
An automated run of parserTests.php showed the following failures:
Running test TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test TODO: Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test TODO: Template with thumb image (with link in description)... FAILED!
Running test Template infinite loop... FAILED!
Running test TODO: message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test TODO: message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test TODO: HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test TODO: HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test TODO: Parsing optional HTML elements (Bug 6171)... FAILED!
Running test TODO: Inline HTML vs wiki block nesting... FAILED!
Running test TODO: Mixing markup for italics and bold... FAILED!
Running test TODO: 5 quotes, code coverage +1 line... FAILED!
Running test TODO: HTML Hex character encoding.... FAILED!
Running test TODO: dt/dd/dl test... FAILED!
Passed 412 of 429 tests (96.04%) FAILED!
Salvete,
I wanted to document a design-conversation Tim Starling
and I had the other day over IRC; WikiTeX for MediaWiki
should have the following necessary (but not necessarily
sufficient) properties:
Segregation:
WikiTeX should be deployable on a segregated (virtual)
machine which will communicate with MediaWiki by RPC
(via e.g. SOAP).
Throttling:
Processor-time per request will have Draconian absolute
limits; and per-IP-throttling may also protect against
abuse.
Tim mentioned that we might start with a subset of
WikiTeX services; say: graph, music, amsmath.
Nick,
I am using MediaWiki version 1.7 for the Wikipedia mirror. When I copied the "essential" extensions Cite.php and
ParserExtensions.php and enabled them both in LocalSettings.php, I started seeing a lot of errors. When I disable Cite.php the errors disappear, but <ref>...</ref> tags no longer work.
On googling this problem, I find that I am not the only one. How can this issue be resolved?
Thanks.
Krishna
Date: Thu, 28 Sep 2006 15:40:32 +1000
From: "Nick Jenkins"
Subject: Re: [Wikitech-l] Wikipedia mirror
To: "Wikimedia developers"
Message-ID:
Content-Type: text/plain; charset="us-ascii"
> I have already built a mirror of en.wikipedia at
> http://freeknowledge.dyndns.org/ however some of the
> pages are not rendered properly. For e.g.
> http://freeknowledge.dyndns.org/index.php/India
> Any suggestions/advice on how to fix this
Looks like you haven't installed ParserFunctions ( http://meta.wikimedia.org/wiki/ParserFunctions ) or the other extensions used on Wikipedia.
Compare: http://freeknowledge.dyndns.org/index.php/Special:Version
To : http://en.wikipedia.org/wiki/Special:Version
... you want everything under "Parser hooks" to be the same on your wiki as on Wikipedia, in order to have the best chance of Wikipedia content displaying correctly on your wiki. You can download the extensions you need from http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions , and get documentation for most of the extensions from meta.wikimedia.org (e.g. http://meta.wikimedia.org/wiki/Cite/Cite.php#How_to_install )
All the best,
Nick.
=====================================
Misinterpreting Copyright by Richard Stallman
"Die Gedanken Sind Frei": Free Software and the Struggle for Free Thought by Eben Moglen (mp3, ogg)
Free Knowledge blog
I am trying to figure out how Featured Articles works but haven't quite got it yet. I see that if I go to this URL there is monthly list
http://en.wikipedia.org/wiki/Wikipedia:Today%27s_featured_article/October_2…
If I edit that page I see the code below (among lots of other code)
----
;'''October 1'''
{{Wikipedia:Today's featured article/October 1, 2006}} <br clear="all">
[[Wikipedia:Today%27s_featured_article/October_1,_2006|view]] -
[[Wikipedia talk:Today%27s_featured_article/October_1,_2006|talk]] -
[http://en.wikipedia.org/w/wiki.phtml?title=Wikipedia:Today%27s_featured_art… history]
----
But I can't figure out where this Featured Article system lives, where the above code comes from, etc. Can someone explain this or point me to an article that does?
Mike O
Short question:
I'm getting inconsistent 403 forbidden errors when trying to read
wikipedia.org content via PHP's file_get_contents().
I believe I'm operating within the limits and terms identified here:
http://en.wikipedia.org/wiki/Wikipedia:Database_download#Please_do_not_use_a_web_crawler
Am I being blocked (inconsistently)? How would I find out?
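One frequent cause of inconsistent 403s like this is that file_get_contents() sends no User-Agent header by default, and Wikimedia's servers reject requests with missing or generic user agents. A minimal sketch of supplying one via a stream context; the helper name and UA string are illustrative, not from the original post:

```php
<?php
// Build a stream context that sends a descriptive User-Agent.
// Wikimedia asks that bots identify themselves with contact info.
function wikiContext( $userAgent ) {
    return stream_context_create( [
        'http' => [
            'header' => "User-Agent: $userAgent\r\n",
        ],
    ] );
}

// Usage (commented out so nothing is fetched here):
// $html = file_get_contents(
//     'http://en.wikipedia.org/wiki/India',
//     false,
//     wikiContext( 'MyLibraryApp/0.1 (contact@example.org)' )
// );
```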
Background:
Notwithstanding traditional librarian concerns about the authority of
Wikipedia, I'm working on ways to use Wikipedia data in the library
context.
Longer question:
My app uses the all_titles_in_ns0 export. If the search matches one
of those titles it fetches that page from en.wikipedia.org to
generate a summary and cache the result (future searches are returned
from cache).
In trying to sidestep the 403 forbidden errors (and eliminate any
complaints about scraping wikipedia.org), I've attempted to bring up
a private copy of en.wikipedia.org based on MediaWiki and the
pages-articles.xml export. This should work, as MWDumper seems to import
the contents okay, but the results leave me with a huge number of
missing pages.
So, at the end of the day, my question is: how can I either more
reliably fetch pages from wikipedia.org _or_ more reliably create a
duplicate wiki that I can scrape?
(Those who are interested can contact me privately for URLs to see
this at work. I'd rather not make them public in developmental form.)
Thank you,
Casey Bisson
__________________________________________
e-Learning Application Developer
Plymouth State University
Plymouth, New Hampshire
http://oz.plymouth.edu/~cbisson/
ph: 603-535-2256
Hi,
set in your php.ini
error_reporting = E_ALL
display_errors = On
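The same effect can be had at runtime, without editing php.ini, by putting these calls at the top of LocalSettings.php (or the extension file) while debugging. This is standard PHP, not MediaWiki-specific:

```php
<?php
// Show all notices, warnings, and errors in the browser instead of
// silently producing a blank page. Remove these lines in production.
error_reporting( E_ALL );
ini_set( 'display_errors', '1' );
```

Note that a parse error in the file containing these lines will still produce a blank page, since the file never executes; in that case the settings must live in php.ini or in a file included earlier.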
Cheers, Jimmy
> ----- Original Message -----
> From: fadeev(a)princeton.edu, Wikimedia developers <wikitech-l(a)wikimedia.org>
> Sent: 03.10.06 00:41:27
> To: wikitech-l(a)wikipedia.org
> Subject: [Wikitech-l] basic php debugging for writing mediawiki extension
> Hi,
>
> I am trying to create an extension for MediaWiki
> and found that when my extension does not compile, the wiki
> just hangs and I get a blank screen in the browser.
>
> Is it possible to somehow enable printing PHP
> compilation errors to the browser screen for debugging
> purposes?
>
> Thank you!
>
> Evgeny.
>
On 10/3/06, David Monniaux <David.Monniaux(a)free.fr> wrote:
> One lingering issue with deletions (for instance, of prank or malevolent
> entries about totally random people with personal information) is that
> our content is massively replicated:
> * on mirror sites such as Answers.com
> * in Google cache.
>
> Telling people to go contact Answers.com and Google is probably legally
> correct (we're not responsible for these sites) but, to me, somehow
> sounds like we're passing the buck (we helped create the mess in the
> first place after all).
>
> It would be neat if we could tell Google or Answers.com to flush out a
> particular article. This is probably just about the Foundation arranging
> a programmatic API with them.
Google offers a number of options for removing content from their
cache [1], but none of them are really helpful to us.
I'm sure we could set up some kind of system to automatically send out
a SOAP message or something similar to certain sites when articles are
deleted. Knowing that the technology exists is about the limit of my
knowledge in this field, so I'll send this to wikitech-l too.
Devs, would it be feasible to have some kind of system to notify these
regular updaters when certain articles are deleted?
----
[1] http://www.google.com/support/webmasters/bin/topic.py?topic=8459
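A rough sketch of what such a deletion-notification system could look like as a MediaWiki hook. 'ArticleDeleteComplete' is a real MediaWiki hook, but the endpoint URLs, payload format, and function names here are hypothetical assumptions:

```php
<?php
// Hypothetical sketch: notify mirror sites when an article is deleted.
// In LocalSettings.php $wgHooks already exists; guard for standalone use.
$wgHooks = $wgHooks ?? [];
$wgHooks['ArticleDeleteComplete'][] = 'wfNotifyMirrorsOnDelete';

// Build a simple notification body a mirror could consume
// (payload format is an assumption, not an agreed protocol).
function wfDeletionPayload( $title, $reason ) {
    return json_encode( [
        'action' => 'delete',
        'title'  => $title,
        'reason' => $reason,
    ] );
}

function wfNotifyMirrorsOnDelete( $article, $user, $reason ) {
    $mirrors = [ 'http://mirror.example.org/purge' ]; // hypothetical endpoints
    $payload = wfDeletionPayload(
        $article->getTitle()->getPrefixedText(), $reason
    );
    foreach ( $mirrors as $url ) {
        $ctx = stream_context_create( [ 'http' => [
            'method'  => 'POST',
            'header'  => "Content-Type: application/json\r\n",
            'content' => $payload,
        ] ] );
        // Fire-and-forget; a production version would queue and retry.
        @file_get_contents( $url, false, $ctx );
    }
    return true; // allow other hooks to run
}
```

In practice the hard part is not the notification itself but getting mirrors and search engines to agree on (and honor) such an endpoint.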
--
Stephen Bain
stephen.bain(a)gmail.com