The good folks at Mozilla are working on what they call "Content
Security Policy" [1], basically a whitelist for JavaScript
cross-domain access.
I'm just flagging this up here because of the potential benefit for
querying toolserver tools from wiki(p|m)edia sites. (And yes, there's
JSON, but it's not supported by most tools.)
Magnus
[1] http://people.mozilla.org/~bsterne/content-security-policy/
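For context, a CSP policy is delivered as an HTTP response header listing the origins a page may load scripts and other resources from. As a rough illustration only (the directive names below follow the early Mozilla draft and later changed; the standardized header is Content-Security-Policy with default-src, and the toolserver hostname here is just a placeholder), a policy allowing scripts from the page's own origin plus the toolserver might look like:

```
X-Content-Security-Policy: allow 'self'; script-src 'self' *.toolserver.org
```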
Circa 02:38-02:44 UTC some site downtime was reported; this is visible
as a temporary decrease in network traffic and Apache/PHP load. The site was
accessible to some people while inaccessible to others during this period.
There doesn't appear to have been any internal site problem, and we're
provisionally labeling it as a temporary external network routing problem.
Better known as... net gremlins!
-- brion
Shutting Down XSS with Content Security Policy
http://blog.mozilla.com/security/2009/06/19/shutting-down-xss-with-content-…
I'm usually the first to complain about applying technical solutions
to problems which are not fundamentally technical... but this looks
like it would be reasonably expedient to implement.
While it won't be effective for all users, the detection functionality
would be a big improvement in wrangling these problems across the
hundreds of Wikimedia projects, many of which lack reasonable
oversight of their sysops' activities.
A while ago, StringFunctions got merged in with ParserFunctions. Tim
disabled them by default before scapping, with the following comment:
/**
 * Enable string functions.
 *
 * Set this to true if you want your users to be able to implement their own
 * parsers in the ugliest, most inefficient programming language known to man:
 * MediaWiki wikitext with ParserFunctions.
 *
 * WARNING: enabling this may have an adverse impact on the sanity of your users.
 * An alternative, saner solution for embedding complex text processing in
 * MediaWiki templates can be found at:
 * http://www.mediawiki.org/wiki/Extension:Lua
 */
I'm sure we all agree that wikitext is terrible syntax. But some of
the string functions already are at least partially replicated (with
horrifying inefficiency, and significant limitations in some cases) on
enwiki anyway. Specifically:
* #len is implemented by [[Template:Str len]]. Running {{str len}}
on a string of 250 a's gives preprocessor node count 152, post-expand
include size 4597 bytes, template argument size 7430 bytes.
* #pos is implemented by [[Template:Str find]]. Trying to find b in a
string of 250 a's gives preprocessor node count 1354, post-expand
include size 5740 bytes, template argument size 50320 bytes.
* #substr is implemented by [[Template:Str sub]]. Using the same
string of a's, with start 30 and length 20, gives preprocessor node
count 1534, post-expand include size 13400 bytes, template argument
size 44578 bytes.
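For comparison, the semantics of these three parser functions are trivial. A rough Python sketch of what they compute (this models the assumed behavior, not the actual PHP implementation in StringFunctions):

```python
def pf_len(s):
    # {{#len:s}} -- length of the string
    return len(s)

def pf_pos(s, needle):
    # {{#pos:s|needle}} -- zero-based position of needle, empty string if absent
    i = s.find(needle)
    return i if i >= 0 else ''

def pf_substr(s, start, length):
    # {{#substr:s|start|length}} -- substring with the given start and length
    return s[start:start + length]

haystack = 'a' * 250
print(pf_len(haystack))           # 250
print(repr(pf_pos(haystack, 'b')))  # '' (not found)
print(pf_substr(haystack, 30, 20))
```

Each of these is a single native string operation, versus the hundreds of preprocessor nodes the template versions expand to.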
Is there any good reason not to enable these three string functions, at least?
Recent studies by Håkon Wium Lie
<http://www.princexml.com/howcome/2009/wikipedia/> clearly show that
the XHTML markup generated by widespread templates such as {{coord}}
is overly complex. Most of this we are able to fix, but some of it we
aren't, due to limitations we have created for ourselves (Håkon simply
pointed this out, as he couldn't know the reasons behind it; this is
why having a fresh look is useful). Perhaps the most serious
limitation is:
We don’t allow attributes for wikilinks.
This limitation results in several disadvantages, for example:
* Each time someone wants to style a link, they have to create a
<span> or something else somewhere inside or outside the link text. In
most cases, this harms both semantics and clarity.
* We can’t give ids to links so that we can use them in CSS and JS.
* Implementing certain microformats (such as XFN, the “url” property
in hCard/hCalendar, etc.) inside templates is impossible.
I propose extending the wikilink syntax so that links are parsed the
same way as file links.
That is, [[Special:Userlogout|log
out|id=logoutlink|style=color:red|title=This will log you out]] would
be a wikilink with style, title and id attributes. The current syntax
is a subset of this proposal, so nothing should break.
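For illustration, such a link would presumably render to XHTML along these lines (hypothetical output; how the parser sanitizes, whitelists, and orders the attributes would be an implementation detail):

```html
<a href="/wiki/Special:Userlogout" id="logoutlink" style="color:red"
   title="This will log you out">log out</a>
```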
Since the external-link syntax offers no opportunity to cleanly extend
it in the same spirit, I am currently thinking of merging it into the
wikilink syntax, leaving the current single brackets for backward
compatibility. Besides the advantages above, this would make our
syntax even friendlier (have you seen newbies trying to insert http://
into the double brackets?), and it would implicitly prohibit
Protocol://-like titles (they are all erroneous creations by newbies
anyway).
— Kalan
I've just noticed that on English Wikipedia links such as [[Pink Floyd]]'s
no longer include the 's as part of the resulting link. Is this a parser
bug, or a deliberate change in the way links are being parsed? I don't
recall seeing an announcement about it.
- Mark Clements (HappyDog)
Hi,
Sergey Chernyshev and I are pleased to announce Semantic Bundle, a
recently-released package of MediaWiki extensions based around Semantic
MediaWiki. It's meant to simplify usage of Semantic MediaWiki, by bundling
it together with 15 other extensions that are commonly used in conjunction
with SMW, into one downloadable .zip or .tgz file. You can read more about
Semantic Bundle, and download it, here:
http://www.mediawiki.org/wiki/Semantic_Bundle
To quote from that page, there are a number of benefits to Semantic Bundle:
- it provides a "best practices" set of extensions around Semantic
MediaWiki, based on user experiences
- it simplifies download, especially for those systems that don't have SVN,
and especially for those extensions (like ParserFunctions) that don't have a
downloadable version already
- it tries to guarantee working code by using tagged, stable versions of
extensions whenever possible
- it simplifies installation by providing a pre-generated list of includes
To amplify on the third point, the plan is to update the Semantic Bundle
downloadables whenever a tag is incremented on one of the tagged extensions
that it uses. If you didn't know about tags, they've been supported for a
while on the MediaWiki SVN repository, although not that many extensions use
them yet. You can see their location here:
http://svn.wikimedia.org/viewvc/mediawiki/tags/extensions/
We also encourage anyone who maintains an extension to tag it, in
order to provide "stable" versions.
Also, we hope that Semantic Bundle will serve as a model for other extension
"bundles" in the future. As the number of MediaWiki extensions continues to
grow, it becomes harder for administrators to research, download and install
every extension that might be useful to them; bundles help to streamline
that process. It should be noted that Semantic Bundle is not the first
bundle of extensions to be released, and not even the first SMW-based one;
that would be the "SMW+" package, created by Ontoprise. That one is
different, though, in that it includes MediaWiki itself, and in fact at the
moment it's a modified, patched version of MW, which makes SMW+ somewhat
more of an application in itself. So Semantic Bundle may be the first "pure"
package of extensions.
-Yaron Koren
I've not done much template work since parser functions were new. Grabbing
some old code examples, I found they didn't work anymore. Is there a workaround?
===
Ancient code (expected single space):
{{{{{subst|}}}#if:{{{par1|}}}|[[Category:{{{par1}}}{{{{{subst|}}}#if:{{{key1|}}}{{{{{subst|}}}!}}{{{key1}}}}}]] <!-- bpar1 -->
}}{{{{{subst|}}}#if:{{{par2|}}}|[[Category:{{{par2}}}{{{{{subst|}}}#if:{{{key2|}}}{{{{{subst|}}}!}}{{{key2}}}}}]] <!-- bpar2 -->
}}{{{{{subst|}}}#if:{{{par3|}}}|[[Category:{{{par3}}}{{{{{subst|}}}#if:{{{key3|}}}{{{{{subst|}}}!}}{{{key3}}}}}]] <!-- bpar3 -->
}}
Also tried (expected double space, hoped for single space):
{{{{{subst|}}}#if:{{{par1|}}}|
[[Category:{{{par1}}}{{{{{subst|}}}#if:{{{key1|}}}{{{{{subst|}}}!}}{{{key1}}}}}]] <!-- bpar1 -->
}}{{{{{subst|}}}#if:{{{par2|}}}|
[[Category:{{{par2}}}{{{{{subst|}}}#if:{{{key2|}}}{{{{{subst|}}}!}}{{{key2}}}}}]] <!-- bpar2 -->
}}{{{{{subst|}}}#if:{{{par3|}}}|
[[Category:{{{par3}}}{{{{{subst|}}}#if:{{{key3|}}}{{{{{subst|}}}!}}{{{key3}}}}}]] <!-- bpar3 -->
}}
===
Each category should be on a new line. If the par* doesn't exist, then the }}
that begins the next line would slurp up the newline, leaving no blank line.
Example {{subst:testing|par1=A|par2=B|key2=C|subst=subst:}}
As I remember, this should yield something like:
[[Category:A]] <!-- bpar1 -->
[[Category:B|C]] <!-- bpar2 -->
Current output:
[[Category:A]] <!-- bpar1 -->[[Category:B]] <!-- bpar2 -->
Did the evaluation order change, so that the inner {{subst:!}} happens first
instead of second and loses the C parameter (interpreting it as the "else"
branch)? And are the leading and trailing line breaks slurped up and ignored?
Is there any known workaround? (I tried searching Meta and elsewhere, but no joy.)
I'm going to mention this here, because it might be of interest on the
Wikimedia cluster (or it might not).
Last night I committed Extension:Minify, which is essentially a
lightweight wrapper for the YUI CSS compressor and the JSMin JavaScript
compressor. If installed, it automatically captures all content
exported through action=raw and precompresses it by removing comments,
formatting, and other human-readable elements. All of the helpful
elements still remain on the MediaWiki: pages; they just don't get
sent to users.
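To give a flavor of what such precompression does, here is a crude Python sketch of comment and whitespace stripping for CSS. (The real extension wraps the YUI compressor and JSMin; this toy version is only an illustration and would mangle some valid CSS, e.g. selectors like "a :hover".)

```python
import re

def crude_css_minify(css):
    # Strip /* ... */ comments (non-greedy, across lines).
    css = re.sub(r'/\*.*?\*/', '', css, flags=re.S)
    # Collapse runs of whitespace to a single space.
    css = re.sub(r'\s+', ' ', css)
    # Drop spaces around punctuation that CSS doesn't need.
    css = re.sub(r'\s*([{};:,])\s*', r'\1', css)
    return css.strip()

src = """
/* Hide the site notice for anons */
#siteNotice {
    display: none;
}
"""
print(crude_css_minify(src))  # #siteNotice{display:none;}
```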
Currently each page served to anons references 6 CSS/JS pages
dynamically prepared by MediaWiki, of which 4 would be needed in the
most common situation of viewing content online (i.e. assuming
media="print" and media="handheld" are not downloaded in the typical
case).
These 4 pages (MediaWiki:Common.css, MediaWiki:Monobook.css, gen=css,
and gen=js) comprise about 60 kB on the English Wikipedia. (I'm using
enwiki as a benchmark, but Commons and dewiki also have similar
numbers to those discussed below.)
After gzip compression, which I assume is available on most HTTP
transactions these days, they total 17039 bytes. The comparable
numbers with Minify applied are 35 kB raw and 9980 bytes after gzip,
for a savings of 7 kB, or about 40% of the total file size.
Now, in practical terms, 7 kB could shave ~1.5s off a download over a
36 kbps dialup connection. Or, given Erik Zachte's observation that action=raw is
called 500 million times per day, and assuming up to 7 kB / 4 savings
per call, could shave up to 900 GB off of Wikimedia's daily traffic.
(In practice, it would probably be somewhat less. 900 GB seems to be
slightly under 2% of Wikimedia's total daily traffic if I am reading
the charts correctly.)
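The arithmetic above can be checked directly (sizes are the enwiki measurements quoted above; the 500-million-calls-per-day figure is Erik Zachte's):

```python
gzipped_now = 17039        # bytes, the four pages gzipped today
gzipped_minified = 9980    # bytes, the same pages after Minify + gzip

saved = gzipped_now - gzipped_minified
print(saved)                          # 7059 bytes, i.e. ~7 kB
print(round(saved / gzipped_now, 2))  # ~0.41, i.e. ~40% of the gzipped size

# Time saved on a 36 kbps dialup line:
print(round(saved * 8 / 36000, 1))    # ~1.6 s

# Upper-bound daily savings: 500M action=raw calls at ~7 kB / 4 each:
print(round(500e6 * (saved / 4) / 1e9))  # ~880 GB/day
```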
Anyway, that's the use case (such as it is): slightly faster initial
downloads and a small but probably measurable impact on total
bandwidth. The trade-off, of course, is that users receive CSS and
JS pages from action=raw that are largely unreadable. The extension
exists if Wikimedia is interested, though to be honest I primarily
created it for use with my own more tightly bandwidth-constrained
sites.
-Robert Rohde