There's an ongoing poll at en.wiki[1] regarding a FlaggedRevs-type
proposal.[2]
It combines 'flagged protection' with patrolled revisions as a compromise
measure between full flagging and no flagging. The proposal is for a
two-month trial.
Just posting here mostly as a heads-up, but also in case anybody at the
staff or Foundation level wishes to veto this preemptively or state any
possible objections.
It's uncertain whether this proposal will pass; however, I really don't
want to get into a situation where the proposal has on-wiki consensus but
there are (previously unknown) objections at a higher level.
Frustration levels are quite high regarding the quality of some of our
content (notably biographies of living persons) and FlaggedRevs has been a
long time coming.
CC'ing Erik on this e-mail as he seems to be the point person regarding
this.
MZMcBride
public(a)mzmcbride.com
[1] http://enwp.org/Wikipedia_talk:Flagged_protection_and_patrolled_revisions/Poll
[2] http://enwp.org/Wikipedia:Flagged_protection_and_patrolled_revisions
Hello List,
I know MediaWiki a little, but not well enough to avoid asking for hints here.
For research purposes, I am surveying code repositories of the top 10
web applications using a modified version of StatSVN.
Basically, I need to track source changes, over releases, in terms of
"Lines of Code (LOC) matching a certain regexp".
The functions/lines I am interested in are those somehow involved in
handling HTTP request/session/response parameters. Some roughness in the
analysis is, of course, acceptable.
I am trying to build a list of such functions/lines. What would you
suggest including? What would you grep for if you wanted to estimate which
functions are, even indirectly, related to the processing of HTTP
requests/sessions/responses?
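For instance, a first pass might look like this (a rough sketch; WebRequest
and $wgRequest are the MediaWiki request-handling names I know of, plus the
raw PHP superglobals, so the list is surely incomplete):

grep -rnE 'WebRequest|\$wgRequest|\$_(GET|POST|REQUEST|SESSION|COOKIE)' includes/ | wc -l

Extended with whatever accessor names you suggest, something along those
lines would let me count matching lines per release.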
Any input is greatly appreciated. Thanks in advance. Cheers,
-- Federico
I had occasion to move (with the redirect suppressed) a MediaWiki: page
into the Template: namespace. The intent was that the MediaWiki system
message should then fall back to its default behavior, while the page's
contents could still be used as a template under circumstances where that
was desirable.
However, after several hours the system message is still giving the same
result and doesn't recognize that its contents were removed. Obviously
this is some sort of caching issue, perhaps because the software doesn't
know how to react properly to a MediaWiki: page being moved. Is this kind
of lag likely to clear itself up in short order? How long does the message
cache persist?
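In case it matters, two things I may try in the meantime (a sketch, not a
confirmed fix): a null edit on the MediaWiki: page title, or an explicit
purge through the API, e.g.

curl -d '' 'http://example.org/w/api.php?action=purge&titles=MediaWiki:Foo'

(example.org and MediaWiki:Foo being placeholders for the real wiki and
message name). I gather the message cache expiry is governed by
$wgMsgCacheExpiry, one day by default, so the lag may also simply expire
on its own.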
-Robert Rohde
Hi,
I am looking at the dump of the English Wikipedia at
http://download.wikimedia.org/enwiki/20081008/ There is a file called
“all-titles-in-ns0.gz”, which is supposed to contain the list of page
titles. If I do
cat enwiki-20081008-all-titles-in-ns0 | wc -l
I get 5716820. On the same page, a little above under
“pages-articles.xml.bz2”, it says “enwiki 7649051 pages”.
So why are these two numbers different? Are there pages without a title?
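To investigate, I was thinking of something along these lines (a rough
sketch; it only approximates namespaces by looking for a colon in the
title, so ns0 titles like “Star Trek: First Contact” will be miscounted):

bzcat enwiki-20081008-pages-articles.xml.bz2 | grep '<title>' | grep -c ':'

my guess being that pages-articles covers more namespaces than ns0 alone.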
Thanks a lot,
O. O.
Are there any plans to tie FlaggedRevs in with the page protection system?
I heard something about a feature for making it so that FlaggedRevs is
only applied to specific pages (i.e., ones noted to have heavy
traffic/vandalism), but it appears that feature has something to do with
whitelisting pages via a config variable, which isn't very good for my use
case.
The use case I'm thinking of is this.
On the Narutopedia, nearly every time a new volume comes out we get piles
of new users trying to add unconfirmed information to the wiki, most of it
false and pure speculation. For that reason the community has now taken to
temporarily protecting articles related to recent chapters whenever a new
volume is released.
Unfortunately, this also leads to complaints on their talk pages from new
users annoyed that they cannot edit the page (despite notes that they are
free to suggest improvements on the talk page, and that the protection is
only temporary). So I was considering an alternative option.
I was thinking of something like FlaggedRevs exposed as a protection
level. Whenever an article starts collecting a lot of trash, rather than
protecting it outright we would go to the protection page and, from a list
of options [Flagged, Autoconfirmed, Protected], choose Flagged. Specific
high-traffic pages likely to attract false information could then be set
so that all new revisions need to be reviewed before they show up in the
stable version of the page shown to the public.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire)
Hi,
I attempted to import the English Wikipedia into MediaWiki by first
downloading pages-articles.xml.bz2, uncompressing it, and splitting it
using
xml2sql enwiki-20081008-pages-articles.xml
and finally importing the results using
mysqlimport -u root -p --local wikidb ./{page,revision,text}.txt
I also imported all of the SQL files on
http://download.wikimedia.org/enwiki/20081008/
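To rule out a partial import, a quick check I intend to run (a sketch;
table names are the stock MediaWiki schema, assuming no table prefix):

mysql -u root -p wikidb -e 'SELECT COUNT(*) FROM page; SELECT COUNT(*) FROM revision;'

comparing the counts with the figures on the download page.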
The problem that I am now facing is that the rendered HTML is wrong in
places, mostly at the beginning of the text on the page. For example, at
the beginning of the United_Kingdom article I get:
</tr><tr> <th colspan="2">Calling code</th> <td>+44 </td>
</tr></table>
After this I get the normal article text, i.e. “The United Kingdom of …”
etc.
The result is that the rest of the article is not formatted correctly.
For example, in IE the first paragraph is shifted into a column on the
right. In both IE and Mozilla I do not get the “navigation”, “search”,
“interaction”, “toolbox”, and “languages” sections, or the MediaWiki
sunflower picture in the top-left corner. (I do get these elements on
other pages; I just wanted to illustrate the problems that bad HTML
causes.)
Another problem I am having: at the top of each page I get “Error: image
is invalid or non-existent”. Is there a way to disable this error message?
I know that I don’t have the images, and that is not a problem for me; I
would just prefer not to have this error message in red at the top of the
article.
Any ideas on what I might be doing wrong here?
Thanks,
O. O.
Hello,
2009/3/14 David Gerard <dgerard(a)gmail.com>:
> Here's an idea: nice URLs for the history. So we don't end up with
> stupid things peppered with ? and & and = printed on mugs, travel
> guides, etc.
>
> e.g. http://en.wikipedia.org/history/Xenu for the history of
> http://en.wikipedia.org/wiki/Xenu .
This is already possible in MediaWiki, using a feature called "action
paths". It simply needs Apache rewrites to be set up and a configuration
variable within MediaWiki to be altered. There may be other implications
for internal organisation, robot behaviour, and caching, though.
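For the history example, a minimal sketch (untested; it assumes MediaWiki
is installed under /w/ and mod_rewrite is available) would be, in
LocalSettings.php:

$wgActionPaths['history'] = '/history/$1';

together with an Apache rule along the lines of:

RewriteRule ^history/(.*)$ /w/index.php?title=$1&action=history [L,QSA]

Treat this as a starting point rather than a recipe.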
MinuteElectron.
Gentlemen, it occurred to me that, under close examination, one finds that
some of the tables dumped when backing up one's wiki's database have
various degrees of temporariness, and thus, though needing to be present
in a proper dump, could perhaps be emptied of their values, saving much
space in the SQL.bz2 etc. file produced.
Looking at the mysqldump man page, one finds no perfect option for doing
so, so one instead makes one's own script:
$ mysqldump my_database |
perl -nwle '
# Tables whose rows should be omitted (their schema is kept).
BEGIN { $dontdump = "wiki_(objectcache|searchindex)" }
# Mark the header comment so the omission is visible in the dump.
s/(^-- )(Dumping data for table `$dontdump`$)/$1NOT $2/;
# Skip everything from LOCK TABLES to UNLOCK TABLES for those tables.
next if /^LOCK TABLES `$dontdump` WRITE;$/../^UNLOCK TABLES;$/;
print;'
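For comparison, a two-pass alternative using only stock mysqldump options
(untested here; substitute one's own database name and table prefix):

$ mysqldump my_database \
    --ignore-table=my_database.wiki_objectcache \
    --ignore-table=my_database.wiki_searchindex > dump.sql
$ mysqldump --no-data my_database wiki_objectcache wiki_searchindex >> dump.sql

The first pass omits the two tables entirely; the second appends their
schema without any rows.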
Though not myself daring to make any recommendations on
http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki#Tables
I am still curious which tables can always be emptied, which can be
emptied if one is willing to remember to run a maintenance script
afterwards to resurrect their contents, etc.
Hi!
We need to clean up old translations of MediaWiki messages on the be-x-old
Wikipedia. deleteDefaultMessages.php looks like the perfect candidate for
this job. However, the last changes from MediaWiki_default were made in
2007.
Is this code still functional? If so, I think it would be a good idea to
run it on all WMF wikis, especially those with a long history of
translating MediaWiki messages (before translatewiki.net).
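For a single wiki the invocation would presumably just be the usual
maintenance-script pattern (a guess; I have not run it yet):

php maintenance/deleteDefaultMessages.php

wrapped with whatever per-database selector the WMF cluster uses.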
Eugene.