I've enabled a test configuration of MediaWiki's upload-by-URL ability
on http://test.wikipedia.org/
The default configuration limits URL uploads to sysops, so for now
you’ll need to be a sysop on Test Wikipedia to try it out. If everything
seems fairly problem-free we’ll start rolling this out a bit more widely
for Commons and other sites.
In addition to being able to handle large files without an ugly manual
download+reupload, the upload-by-URL functionality is also needed for
Michael Dale's future-facing work to allow an on-wiki media
picker to fetch freely-licensed files from Flickr, Archive.org, and
other places.
We may want to consider improvements to UI and workflow, but it seems to
at least work. :)
More at the Wikimedia technical blog:
http://techblog.wikimedia.org/2009/03/upload-by-url-for-testwikipediaorg/
-- brion vibber (brion @ wikimedia.org)
On Commons the widely-used Template:Information contains code for
machine-readability. It takes the parameters and puts them, urlencoded,
into the title attribute of a hidden <span>, so scripts can easily
read them (at least I guess that's what it's for; the
"machine-readability code" is not well documented).
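For a script consuming that span, reading the parameters amounts to grabbing the title attribute and url-decoding it. A minimal sketch, assuming the mechanism described above (the attribute content here is invented for illustration, not the real Template:Information markup):

```python
import re
from urllib.parse import unquote

# Hypothetical snippet of rendered page HTML; on Commons the real
# attribute content would be generated by Template:Information.
html = ('<span style="display:none" '
        'title="description=A%20world%20map&author=Example"></span>')

# Grab the title attribute of the hidden span and url-decode it.
m = re.search(r'<span[^>]*title="([^"]*)"', html)
decoded = unquote(m.group(1))

# Split "key=value&key=value" pairs into a dict.
params = dict(pair.split("=", 1) for pair in decoded.split("&"))
```

With very long attribute values, the failure described above would show up here as a truncated or unparseable `title` string rather than a clean key/value list.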
Now the parameters of Template:Information can get quite long. It seems
that if the urlencoded string is longer than 33,xxx characters, it
breaks. See for example
<http://commons.wikimedia.org/w/index.php?title=File:World_homosexuality_law…>
(at the bottom of the description).
The problem is not in the template or in the source of that specific
file. It happens to all very long attributes. The whole <span> is
corrupted in the HTML source code. I guess the problem is in the parser
or in Tidy. Does anybody know what the exact reason is and how to fix it?
Marcus Buck
User:Slomox
Hi,
I am wondering if someone could help me create a list of Page Titles,
and list of Redirect Pages from the wikidb database in MySQL. I have
downloaded the XML/SQL dumps along with the SQL dumps of the remaining
Tables and imported them into the wikidb database in MySQL. (This is the
English version of Wikipedia.)
I am looking at the Mediawiki Database Schema at
http://upload.wikimedia.org/wikipedia/commons/4/41/Mediawiki-database-schem…
. I also know that there is a list of Titles in NS0 provided along with
the dumps. This does not suit my purpose, because I would like to have
all the titles, i.e. including those outside NS0 too. I would
also like to have the list of redirects and where they redirect to.
The MediaWiki database schema mentioned above does provide a lot of
information – but I am not experienced enough to make use of it. I hope
some of you can help me.
For the list of Page Titles, I looked into the Page Table, and simply
got a list of all of the Titles. In SQL I assumed this would be the
result of the query “select page_title from wikidb.page;” – the problem
with this is that I am getting a number of titles that are repeated, so
I think I am doing something wrong.
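Nothing is wrong with the import: in the MediaWiki schema, page titles are only unique per (namespace, title) pair, so the same title can legitimately appear in NS0, its talk namespace, and so on. Selecting page_namespace alongside page_title makes each row distinct. A small sketch of the effect, using an in-memory SQLite mock of the `page` table:

```python
import sqlite3

# Minimal mock of the MediaWiki `page` table. The unique index is on
# (page_namespace, page_title), not on page_title alone.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE page ("
            "page_id INTEGER PRIMARY KEY, "
            "page_namespace INTEGER, "
            "page_title TEXT, "
            "UNIQUE (page_namespace, page_title))")
con.executemany("INSERT INTO page VALUES (?, ?, ?)",
                [(1, 0, "Berlin"),    # the article
                 (2, 1, "Berlin"),    # its talk page, same title
                 (3, 0, "Hamburg")])

# Titles alone: "Berlin" appears twice -- the "repeats" in question.
titles = [r[0] for r in con.execute("SELECT page_title FROM page")]

# Namespace + title together: every row is distinct.
pairs = list(con.execute(
    "SELECT page_namespace, page_title FROM page"))
```

Against the real wikidb the equivalent query would be `select page_namespace, page_title from wikidb.page;`.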
As far as getting the List of Redirects, I hope someone would clarify
if the way to do this would be to go through the Redirect Table, for
each rd_from – look in the Page Table for the matching Page ID, and then
get the Title from that row. This can be done using a Join – but I have
not implemented this for now. Is this the correct way to go?
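Yes, that is the right approach: `rd_from` holds the page_id of the redirect page itself, and `rd_namespace`/`rd_title` name the target, so one join against `page` gives both ends. The same in-memory SQLite mock illustrates it:

```python
import sqlite3

# Mock just the columns the join needs from `page` and `redirect`.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE page ("
            "page_id INTEGER PRIMARY KEY, "
            "page_namespace INTEGER, page_title TEXT)")
con.execute("CREATE TABLE redirect ("
            "rd_from INTEGER, rd_namespace INTEGER, rd_title TEXT)")

con.execute("INSERT INTO page VALUES (1, 0, 'UK')")
con.execute("INSERT INTO page VALUES (2, 0, 'United_Kingdom')")
# Page 1 ('UK') redirects to 'United_Kingdom' in namespace 0.
con.execute("INSERT INTO redirect VALUES (1, 0, 'United_Kingdom')")

# rd_from -> page.page_id gives the redirect's own title;
# rd_namespace/rd_title already give the target.
rows = list(con.execute(
    "SELECT p.page_title, rd.rd_namespace, rd.rd_title "
    "FROM redirect rd JOIN page p ON p.page_id = rd.rd_from"))
```

The same SELECT works unchanged against the real wikidb tables (you may also want `p.page_namespace` to distinguish redirects outside NS0).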
Thanks again to all you guys,
O. O.
I am pleased to announce that the Abuse Filter [1] has been activated
on English Wikipedia!
The Abuse Filter is an extension to the MediaWiki [2] software that
powers Wikipedia, allowing automatic "filters" or "rules" to be run
against every edit, and actions to be taken if any of those rules are
triggered. It is designed to combat simple, pattern-based vandalism,
from blanking pages to complicated evasive page-move
vandalism.
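Conceptually, each filter is a predicate over per-edit variables. AbuseFilter has its own rule syntax; this Python sketch only illustrates the shape of a page-blanking rule, with an invented size threshold:

```python
def blanking_filter(action: str, old_size: int, new_size: int) -> bool:
    """Toy version of a page-blanking rule: an edit that takes a
    substantial page down to zero bytes trips the filter."""
    return action == "edit" and old_size > 500 and new_size == 0

# A blanking edit trips the rule; a normal edit does not.
tripped = blanking_filter("edit", old_size=2000, new_size=0)
ok = blanking_filter("edit", old_size=2000, new_size=1900)
```

When a rule like this matches, the extension can then take a configured action: log the hit, show a warning, or (where enabled) block the edit.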
We've already seen some pretty cool uses for the Abuse Filter. While
there are filters for the obvious personal attacks [3], many of our
filters are there just to identify common newbie mistakes such as
page-blanking [4], give the users a friendly warning [5] and ask them
if they really want to submit their edits.
The best part is that these friendly "soft" warning messages seem to
work in passively changing user behaviour. Just the suggestion that we
frown on page-blanking was enough to stop 56 of the 78 matches [6] of
that filter when I checked. If you look closely, you'll even find that
many of the users took our advice and redirected the page or did
something else more constructive instead.
I'm very pleased at my work being used so well on English Wikipedia,
and I'm looking forward to seeing some quality filters in the near
future! While at the moment, some of the harsher actions such as
blocking are disabled on Wikimedia, we're hoping that the filters
developed will be good enough that we can think about activating them
in the future.
If anybody has any questions or concerns about the Abuse Filter, feel
free to file a bug [7], contact me on IRC (werdna on
irc.freenode.net), post on my user talk page, or send me an email at
agarrett at wikimedia.org
[1] http://www.mediawiki.org/wiki/Extension:AbuseFilter
[2] http://www.mediawiki.org
[3] http://en.wikipedia.org/wiki/Special:AbuseFilter/9
[4] http://en.wikipedia.org/wiki/Special:AbuseFilter/3
[5] http://en.wikipedia.org/wiki/MediaWiki:Abusefilter-warning-blanking
[6] http://en.wikipedia.org/w/index.php?title=Special:AbuseLog&wpSearchFilter=3
[7] http://bugzilla.wikimedia.org
--
Andrew Garrett
I'm at work on a MW extension that, among other things, uses LaTeXML [1]
to make XHTML from full LaTeX documents. One feature is the option to
render the equations in MathML, which requires the skins to be patched so
that they output the page as Content-type: application/xhtml+xml instead
of text/html.
Attached is a patch for the skins directory that allows changing the
Content-type dynamically. After applying this patch, if any code sets the
global $wgServeAsXHTML to true, the page will be output with the xhtml+xml
content type. This seems to work fine with the existing MW XHTML pages.
This has been done before, for instance in the ASCIIMath4Wiki extension
[2]. I don't want to change the Content-type unconditionally, though,
only some of the time, so that we can serve texvc-style images to browsers
or users that don't like the modified content type.
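The "only some of the time" part can be sketched as simple content negotiation: serve application/xhtml+xml only when the flag is set and the client's Accept header advertises support. The names here are illustrative, not MediaWiki's actual API; the patch itself only keys off the global flag:

```python
def pick_content_type(serve_as_xhtml: bool, accept_header: str) -> str:
    """Fall back to text/html unless the flag is set and the browser
    says it accepts XHTML."""
    if serve_as_xhtml and "application/xhtml+xml" in accept_header:
        return "application/xhtml+xml"
    return "text/html"

# A MathML-capable browser gets XHTML; a legacy one keeps text/html.
xhtml = pick_content_type(True, "application/xhtml+xml,text/html;q=0.9")
legacy = pick_content_type(True, "text/html")
```

Checking the Accept header as well would let the texvc-image fallback happen automatically for browsers that can't handle the modified content type.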
It should be possible to use this patch without breaking any existing
systems (unless someone else's extension happens to use the same global
variable name, I guess).
The patch is made on the 1:1.13.3-1ubuntu1 mediawiki package (from Ubuntu
9.04), and only modifies Monobook.php and Modern.php. There are other
skins in my installation here, but they don't seem to work very well and I
didn't see where to make the change.
Is there a better way to make MathML work in MW? Might this option be
included in a future MW release? Any feedback or alternative suggestions
are welcome.
Lee Worden
McMaster University Dept of Biology
[1] http://dlmf.nist.gov/LaTeXML/
[2] http://www.mediawiki.org/wiki/Extension:ASCIIMath4Wiki
ps. I'm not sure if this list accepts attachments - if not I'll be happy
to send it to people on request.
We have been completely overrun by registrations for the developer meet-up in
Berlin. That’s exhilarating, but forces on me the sad duty to tell you: we are
out of room, we are closing registration early.
So: if you have not yet sent a registration mail, you will not be able to attend!
Sorry. We may even have to reject some registrations we have already received.
There’s some good news too, though: anyone interested may join us at the c-base
for the party on Saturday, March 4, starting at 8pm. The developers will be there
and people from the chapter and board meeting will also come. This will be a
good opportunity for getting to know Wikimedians from all over the world.
Regards,
Daniel
Reading http://meta.wikimedia.org/wiki/Help:System_messages, one
wonders if one can set messages in LocalSettings.php instead of
editing MediaWiki:Copyrightpage. However, I just get 'Call to a member
function addMessages() on a non-object'.
I’ve just put in Wikimedia’s org application for Google Summer of Code
2009… Hopefully we’ll get in. :)
http://www.mediawiki.org/wiki/Summer_of_Code_2009
^ Add and update cool project ideas as a starting point for student
applicants!
We’ve had mixed luck in previous years with GSoC, but I think we’ve got
enough internal bandwidth this year that we can make sure there’s enough
effort put into interacting with the student candidates ahead of time to
pick the coolest and most go-get-em self-starter awesome projects and
then support them through the project term.
I’ve also tossed up a student application template if you want to get
started early. :)
http://www.mediawiki.org/wiki/Summer_of_Code_2009/Application_template
-- brion
I stopped getting page load replies for about 4-5 min, and they're coming
through now but taking several minutes...
Main unencrypted site is unusually slow right now, but working.
DB issue? Network? Secure server cough up a hairball?
I'm hallucinating it all?? (doubt that)
--
-george william herbert
george.herbert(a)gmail.com