From Gregory Maxwell in the "[Wikitech-l] User block changes" thread:
> If someone can't cooperate they are lost to us and we should just
> block, if they can then there is no need for fancy technical
> measures.
I agree. OT (hence the subject-name change), but related to this
general principle of just getting out of the way of people who are
fundamentally sensible:
Something that seems a bit strange to me is rollback. The current
situation I believe is that "Admins have a handy 'rollback' feature
which allows them to instant-revert changes from a user's
contributions page".
Sounds great, useful, and sensible - everything apart from the "admins
only" bit. Why stop at admins? Personally, I'm not a Wikipedia admin
(and currently the idea of yet another system that I'm an admin of in
some way holds zero appeal), but the ability to quickly undo vandalism
is useful, and could be given to far more users, for a big net win in
vandalism control. The whole "history -> click on last edit minus one
-> click edit -> type 'revert' -> click save" cycle gets very tedious
and repetitive after a while.
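For concreteness, the one-click operation being asked for boils down to "restore the newest revision not made by the most recent editor". A minimal sketch of that logic (the data model, names, and sample history here are all illustrative, not MediaWiki's actual implementation):

```python
# Toy model: a page history is a list of (editor, text) tuples, oldest
# first. All names and data below are made up for illustration.
history = [("Alice", "Good article"),
           ("Alice", "Good article, expanded"),
           ("203.0.113.5", "PAGE BLANKED"),
           ("203.0.113.5", "MORE VANDALISM")]

def rollback(history, reverter):
    """Append a revision restoring the newest text not by the last editor."""
    culprit = history[-1][0]
    for editor, text in reversed(history):
        if editor != culprit:
            return history + [(reverter, text)]
    return history  # entire history is by one editor; nothing to restore

new_history = rollback(history, "Nick")
print(new_history[-1])  # ('Nick', 'Good article, expanded')
```

Note that it reverts all consecutive edits by the offending editor in one step, which is exactly what makes it so much quicker than the manual cycle above.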
<rant>
Why do we do this? Yes, there probably has to be some point at which
we start trusting people enough to do easy rollbacks, but "admin" is
too high a bar. If someone has a login, has (say) >= 1000 edits, and
has used the system for (say) >= 3 months, there's a pretty good
chance they can spot an anon committing vandalism on pages on their
watchlist. So why don't we make undoing it easier? Why don't we help
such people more, empower them more, and make what they can already do
that bit easier and quicker? And I don't just mean me, or this
specific user and that specific user: all users who cross a certain
measurable threshold of trustworthiness and commitment should
automatically be given rollback ability. Maybe start with the entry
criteria high so that only a few people qualify initially, then
gradually lower them while the gain from lowering exceeds the pain
from misuse. That would be fine, as long as it's a systematic attempt
to empower a whole category of trusted users, as opposed to a
user-by-user, non-systematic approach.
</rant>
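The auto-grant rule being suggested is simple enough to state as code. A sketch using the thresholds floated above (>= 1000 edits, >= 3 months on the system); the function name and exact rule are assumptions, not an existing MediaWiki feature:

```python
from datetime import date, timedelta

# Illustrative eligibility check; thresholds are the ones suggested in
# the rant, everything else is made up.
def qualifies_for_rollback(edit_count, registered, today,
                           min_edits=1000, min_age=timedelta(days=90)):
    return edit_count >= min_edits and (today - registered) >= min_age

today = date(2006, 7, 15)
print(qualifies_for_rollback(1500, date(2006, 1, 1), today))  # True
print(qualifies_for_rollback(1500, date(2006, 6, 1), today))  # False
```

Lowering the bar later is then just a change to the two default parameters, which fits the "start high, lower gradually" approach.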
All the best,
Nick.
An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test Template with thumb image (wiht link in description)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Running test 5 quotes, code coverage +1 line... FAILED!
Running test HTML Hex character encoding.... FAILED!
Running test dt/dd/dl test... FAILED!
Passed 409 of 426 tests (96.01%) FAILED!
It's interesting to see other people pick up the blame/annotate project,
and totally disregard previous work.
http://meta.wikimedia.org/wiki/Annotation
I've discussed a lot of the implementation issues there and already have
blame-generating code based on DifferenceEngine, which means that as
DifferenceEngine evolves to better support different languages, the
annotation code doesn't have to evolve separately.
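The core idea of building annotation on top of a diff engine can be sketched in a few lines: walk revisions oldest to newest and attribute each surviving line to the revision that introduced it. This toy version uses Python's difflib purely for illustration; MediaWiki's DifferenceEngine operates on wikitext and is far more involved:

```python
import difflib

def blame(revisions):
    """revisions: list of (author, list_of_lines), oldest first.
    Returns a list of (author, line) for the final text."""
    annotated = []  # (author, line) pairs for the current text
    for author, lines in revisions:
        matcher = difflib.SequenceMatcher(a=[l for _, l in annotated], b=lines)
        new_annotated = []
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == "equal":
                # Unchanged lines keep their original author.
                new_annotated.extend(annotated[i1:i2])
            elif op in ("replace", "insert"):
                # New or rewritten lines are credited to this revision.
                new_annotated.extend((author, l) for l in lines[j1:j2])
            # 'delete': removed lines simply disappear from the annotation
        annotated = new_annotated
    return annotated

revs = [("Alice", ["intro", "body"]),
        ("Bob",   ["intro", "body", "footer"])]
print(blame(revs))  # [('Alice', 'intro'), ('Alice', 'body'), ('Bob', 'footer')]
```

The point of reusing the diff engine is exactly the one made above: any improvement to diffing (better tokenisation for a given language, say) improves blame output for free.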
I think it /can/ work, but it will require careful supervision by
someone who has a large amount of expertise about the MediaWiki system
and Wikimedia server cluster.
Oh, and the relevant bug report is
http://bugzilla.wikimedia.org/show_bug.cgi?id=639
--
Edward Z. Yang Personal: edwardzyang(a)thewritingpot.com
SN:Ambush Commander Website: http://www.thewritingpot.com/
GPGKey:0x869C48DA http://www.thewritingpot.com/gpgpubkey.asc
3FA8 E9A9 7385 B691 A6FC B3CB A933 BE7D 869C 48DA
Another crosspost to wikipedia-l and wikitech-l, please follow up to one
list only. Tell me offlist if you think the crossposting is useful.
I'm making some changes to user blocks. Here is the changelog entry:
* Allow blocks on anonymous users only (bug 550)
* Allow or disallow account creation from blocked IP addresses on a
per-block basis.
* Prevent duplicate blocks.
* Fixed the problem of expiry and unblocking erroneously affecting multiple
blocks.
* Fixed confusing lack of error message when a blocked user attempts to
create an account.
* Fixed inefficiency of Special:Ipblocklist in the presence of large numbers
of blocks; added indexes and implemented an indexed pager.
This will require a schema update, hopefully it can be done on Wikimedia
with a minimum of disruption.
A unique index will be introduced which will prevent duplicate blocks on
users or IP addresses. To migrate to the new schema, it is necessary that
all existing duplicate blocks be deleted. This will be done automatically
during the schema update. I *think* the winning block will be the earliest
one, with all others silently deleted, but we'll see for sure during the update.
If it's a problem for anyone that these blocks will be removed, please speak
up. I'm not going to create a list of dropped blocks unless anyone has a
really good reason why it's necessary. The original record of the duplicate
block being created, available in Special:Log, will not be deleted, and
there will be a backup of the ipblocks table in case anything goes wrong.
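For anyone curious what a dedupe-then-index migration of that shape looks like, here is a rough sketch. To be clear, the column and index names below are guesses for illustration, not taken from the actual patch or the real ipblocks schema:

```sql
-- Sketch only: names are assumptions, not the actual schema change.
-- 1. Keep the earliest block per target (lowest id wins), delete the rest.
DELETE b2
  FROM ipblocks b1, ipblocks b2
 WHERE b1.ipb_address = b2.ipb_address
   AND b1.ipb_id < b2.ipb_id;

-- 2. A unique index then makes future duplicate blocks impossible.
ALTER TABLE ipblocks ADD UNIQUE INDEX ipb_address_unique (ipb_address);
```

The ordering matters: the unique index cannot be created while duplicates still exist, which is why the deletion has to happen first during the update.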
I'll aim to put this change live in about 24 hours from now. It has to be
soon, because we won't be able to deploy any other updates to MediaWiki
until the schema is updated.
I know a lot of people have all sorts of dreams for the perfect blocking
feature, and I'm sorry to say that this won't fulfill all of them. It is a
concrete step forward, it fixes some embarrassing bugs, and it implements a
much requested feature. I'd be happy to hear about your ideas for a perfect
blocking module, just don't expect them to be live on the site within 24 hours.
-- Tim Starling
Interesting idea!
Is there *maybe* some kind of SQL injection attack possible here?
E.g. I saved an evil data source like so:
=======================================
<data table="Companies'; insert into user set user_id = '9999',
user_name = 'Nickj2', user_email = 'address@name'; insert into
user_groups set ug_user = '9999', ug_group = 'sysop'; insert into
user_groups set ug_user = '9999', ug_group = 'bureaucrat'; "
template="Infobox company">
name=Microsoft
founded=1492
revenue=$8
</data>
=======================================
And then the all-pages list for the table namespace only showed the 3
test tables:
http://www.kennel17.co.uk/testwiki/index.php?title=Special%3AAllpages&from=…
So that says to me that *maybe* it did something (otherwise I would
expect the 3 test tables + this one).
Of course, I then tried to use the "reset and email me my password"
function to get admin rights to see if it was working, but there was
no such user :-(
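The standard defence against this class of attack is to keep user input out of the SQL text entirely and pass it as a bound parameter. WikiDB itself is PHP/MySQL; this generic sqlite3 sketch only illustrates the principle:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (title TEXT)")

# Attacker-controlled value, analogous to the table="..." attribute above.
evil = "Companies'; DROP TABLE pages; --"

# Safe: a bound parameter is treated purely as data, never parsed as SQL.
conn.execute("INSERT INTO pages (title) VALUES (?)", (evil,))

rows = conn.execute("SELECT title FROM pages").fetchall()
print(rows)  # the payload is stored verbatim; no second statement ran
```

With string interpolation instead of the `?` placeholder, the quote in the payload would terminate the literal and the rest would reach the SQL parser, which is exactly the attack being probed above.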
> www.kennel17.co.uk/testwiki
I'm probably going to regret asking this, but with a hostname like
that I just have to ask: What happened to kennels numbered one through
sixteen inclusive?
All the best,
Nick.
[1] Thanks to superb work by Erik Garrison, we now have an efficient,
C-based parser that extracts header data from WMF xml dumps into csv files
readable by standard statistical software packages.
* Source for this parser will soon be web-available; stay tuned.
* The csv files will also be available online, either from
download.wikimedia.org (if the parser can be run on the WMF servers) or from
a webserver on karma or at NBER (see below).
* If you just can't wait, let us know and we'll offer express service :)
* The csv files consist of these variables with these types:
names: title,articleid,revid,date,time,anon,editor,editorid,minor
types: str,int,int,str,str,[0/1],str,int,[0/1]
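Assuming a conventional comma-separated layout with that header row (the sample data below is invented, only the column names and types come from the post), the files should load with any standard CSV reader:

```python
import csv, io

# Hypothetical two-row sample in the layout described above.
sample = """title,articleid,revid,date,time,anon,editor,editorid,minor
Main_Page,1,100,2006-07-01,12:00:00,0,Alice,42,0
Sandbox,2,101,2006-07-01,12:05:00,1,127.0.0.1,0,1
"""

reader = csv.DictReader(io.StringIO(sample))
rows = list(reader)
anon_edits = sum(int(r["anon"]) for r in rows)
print(anon_edits)  # 1
```

The same layout reads directly into R, Stata, or pandas, which is presumably the point of choosing CSV as the interchange format.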
[2] We have begun to use these csv files to produce weekly sets of
statistics.
See last week's work here:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Wikidemia/Quant/Stats200…
This week we will finish out that set of stats.
Next week's list needs your creative suggestions: Please edit directly!
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Wikidemia/Quant/Stats200…
[3] NBER has set us up with a pretty good Linux box, wikiq.nber.org, running
Fedora Core 5. We hope to have Xen instances available for researchers
interested in doing statistical analysis on the csv files within two weeks.
[4] WMF readership data continues to be irretrievably lost. What can we do
to begin saving at least some of it as soon as possible? If we were to save
only articleid for one of every hundred squid requests, and include some
indicator in the file at the end of each day, privacy concerns and
computational burdens would be minimized, and this would still be a great
start.
How can we make this happen?
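One simple realisation of the proposal, sketched as deterministic 1-in-N counter sampling (the 1-in-100 rate is from the post; the class and its names are assumptions):

```python
class SampledLogger:
    """Keep the articleid of one in every `rate` requests."""
    def __init__(self, rate=100):
        self.rate = rate
        self.count = 0
        self.kept = []

    def record(self, articleid):
        self.count += 1
        if self.count % self.rate == 0:
            self.kept.append(articleid)

log = SampledLogger(rate=100)
for i in range(1000):
    log.record(i)
print(len(log.kept))  # 10
```

Because only one request in a hundred is kept and only the articleid is stored, both the privacy exposure and the I/O load stay small, as argued above; a daily marker record in the file would then let consumers delimit days.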
Best,
Jeremy
An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test Template with thumb image (wiht link in description)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test Language converter: output gets cut off unexpectedly (bug 5757)... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Running test 5 quotes, code coverage +1 line... FAILED!
Running test HTML Hex character encoding.... FAILED!
Running test dt/dd/dl test... FAILED!
Passed 407 of 424 tests (95.99%) FAILED!
(Apologies if this message appears more than once - I have had problems
posting to the list...)
I have been working on the design of a wiki-style database extension to
MediaWiki. If you can't be bothered to read the description and want to
jump right into a demo, then skip to the bottom of this mail.
WikiDB is a MediaWiki extension that adds a new type of namespace and a
couple of extra tags to the MediaWiki software. After installing the
extension you are able to set up a table namespace in which you can define
table structures. No table needs to be defined in order to add data to it,
and data may be located anywhere on the wiki. This means that you can keep
record-type data inside the article it applies to, rather than being forced
to add it to a central store. In the true 'wiki way', if you have
information to add, you can add it straight away, without requiring that a
formal structure be set up in advance. Other people can come and refactor
your work later, once a structure is in place.
Defining a table structure adds functionality that allows you to
validate and format data, and this structure (like any wiki page) can be
modified at any time. For example, if you have an 'Albums' table, you
can specify that the 'year of release' field should be a date. All
records that don't fit this definition will then be flagged up. Change
the definition to 'integer between 1850 and 2006' and it will highlight
the rows that are invalid under the new definition. The important thing
to note is that no data is affected: data is always interpreted
according to the current field type, so changing the field type changes
how it is _displayed_, but not how it is _stored_.
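The validate-on-display idea can be sketched in a few lines: the stored values never change, and swapping the field definition just swaps the validator applied at render time. All names here are illustrative, not WikiDB's actual API:

```python
# Stored records stay untouched; only the current definition decides
# which rows get flagged. Data and names are made up for illustration.
records = [{"album": "Blue",    "year": "1971"},
           {"album": "Unknown", "year": "circa 1920"}]

def valid_int_range(value, lo, hi):
    """Validator for an 'integer between lo and hi' field definition."""
    try:
        return lo <= int(value) <= hi
    except ValueError:
        return False

# "Changing the definition" means applying a different validator.
flagged = [r for r in records if not valid_int_range(r["year"], 1850, 2006)]
print([r["album"] for r in flagged])  # ['Unknown']
```

Reverting the definition would unflag the same rows, since validity is recomputed from the unchanged stored strings each time.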
Finally, you are able to query the database for information from any table
and format it for display within any wiki page (again, no table definition
required). Ultimately this will allow complicated queries that include join
expressions and the like, although it is currently quite (very!) basic.
This extension is at a very early (but working) stage of development, and
this is a request for comment from the MediaWiki community. I am looking
for input at all levels - from code-optimisation to syntax to usage issues.
I have set up a demo on my testwiki (www.kennel17.co.uk/testwiki) which
gives a bit of an introduction to the extension and some details about
implementation. I will be adding to this as I go on. The main URL for
the extension is www.kennel17.co.uk/testwiki/WikiDB.
Please bear the following points in mind:
* The extension is in a very early stage of development. Core functionality
is in place, but many planned features are currently lacking (including some
of those mentioned above).
* The syntax is currently fairly long-winded for certain tasks. This will be
streamlined (suggestions welcome) but it is sufficient for proof-of-concept.
* There has been very little optimisation, and there is a lot of potential
for it. I welcome advice on this matter, but do not be put off by
speed/efficiency of the current version.
Thank you for reading - I look forward to your feedback.
--
Mark Clements (HappyDog)
(To contact me directly, remove the capitals from my address)