On de.wikipedia, we were discussing blocked titles, meaning titles
that would otherwise go through a cycle of deletion and
re-creation by trolls/vandals/fanatics. Currently, this is done by
putting a template on it and protecting the page.
This is rather ugly and has certain side effects, e.g., filling up the
"short pages" list.
Many would prefer a clean technical solution. The simple way would be to
add a new table to the database, listing the topics blocked from
recreation. This table would only be accessed
* when an admin clicks on an "(un)block this article" link on a
non-existing page (write access)
* when someone tries to edit a non-existing article (read access)
As table fields, I suggest title, blocking user id, and maybe date.
Unblocking a topic would, in my model, simply mean removing the entry
from the table; I don't see the need to keep a history for such a rarely
used function that any other admin can easily counteract. Perhaps a new
log would be in order, though.
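A minimal sketch of the proposed logic (Python for illustration only; the
function and field names are my own invention, not actual MediaWiki code):

```python
# Toy model of the proposed "blocked titles" table.
# Each entry records the title, the blocking admin's user id, and a date.
blocked_titles = {}  # title -> (blocking_user_id, date)

def block_title(title, admin_id, date):
    """Admin clicks "block this article" on a non-existing page (write access)."""
    blocked_titles[title] = (admin_id, date)

def unblock_title(title):
    """Unblocking simply removes the entry; no history is kept."""
    blocked_titles.pop(title, None)

def may_create(title):
    """Checked when someone tries to edit a non-existing article (read access)."""
    return title not in blocked_titles
```

The table would stay tiny, since it is only consulted for non-existing pages.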
Magnus
Hi,
MediaWiki has a fairly useful case-guessing heuristic: if a search for
"foo boo moo" finds nothing, it tries "Foo Boo Moo" and "FOO BOO MOO"
instead.
Would it be possible to do something similar for templates? The
heuristic for "template:foo boo moo" would be:
1. If that page is not found, try "template:foo-boo-moo"
2. If that page is not found, try "template:fooboomoo"
This may just be useful for English Wikipedia, but the lack of naming
convention there makes it really difficult to remember the names of
multi-word templates.
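The proposed fallback order can be sketched as follows (Python for
illustration; the `exists` callable stands in for a page-existence check
and is not a real MediaWiki function):

```python
def find_template(name, exists):
    """Try the proposed fallbacks for a multi-word template name, in order."""
    candidates = [
        name,                    # "template:foo boo moo" as given
        name.replace(" ", "-"),  # 1. "template:foo-boo-moo"
        name.replace(" ", ""),   # 2. "template:fooboomoo"
    ]
    for candidate in candidates:
        if exists(candidate):
            return candidate
    return None  # all fallbacks failed
```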
Steve
Hi all
I have written an experimental installer for extensions. It's in SVN,
maintenance/installExtension.php. It would be great if you could try it
and give some feedback. I hope this will make life much easier for users.
Basically, it fetches the files, places them in the extension dir, and
optionally patches LocalSettings.php (if the user chooses that option
and the extension provides an install.settings patch file). The
CategoryTree extension has such a file, if you want to try the patching
feature.
Extensions can be fetched from local tgz files or directories, from
remote tgz files, or from a SVN path. The installer looks up extensions
in "repositories", which are directories containing tgz files or
subdirectories.
By default, http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions
is used as the repository (in svn mode), unless MediaWiki was checked
out via SVN - in that case, the SVN path given in .svn/entries is used.
For testing, I have set up a tgz based repository at
<http://tools.wikimedia.de/~daniel/repository/extensions/>. It has a tgz
for each subdirectory in the extensions module. Note that extensions
that do not have their own subdirectory are not supported.
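The source-resolution order described above might look roughly like this
(a sketch under my own assumptions, not the actual installer code; the
function name and return convention are invented):

```python
import os

def resolve_extension(name, repository, use_svn=False):
    """Return a (source_type, location) pair for an extension.

    Mirrors the behaviour described above: repositories are directories
    containing either tgz files or one subdirectory per extension.
    """
    if use_svn:
        # In svn mode, the extension lives in a subdirectory of the repo.
        return ("svn", repository.rstrip("/") + "/" + name)
    tgz = repository.rstrip("/") + "/" + name + ".tgz"
    if repository.startswith("http://") or repository.startswith("https://"):
        return ("remote-tgz", tgz)
    if os.path.isdir(os.path.join(repository, name)):
        return ("local-dir", os.path.join(repository, name))
    return ("local-tgz", tgz)
```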
Caveats: Custom install scripts (for example to patch the database) are
not yet supported. This needs some more thought, I guess. The ability to
uninstall or deactivate extensions could be added - not sure how helpful
that would be. The present script will also not work on Windows, because
it relies on things like tar.
The installer is currently intended for use from the command line. A web
interface would raise security concerns - perhaps it could be made part
of the original installation procedure, though.
I wrote the installer in a manner that does not require changes to
extensions or the way they are loaded. In the long run, it would
probably be nicer to be able to hook up extensions without messing with
LocalSettings.php - although I'm not sure how to do that in a clean and
at the same time flexible way. We should talk about how to do that, too,
in time, but please keep that separate from talk about the present
installer.
Any input would be appreciated
-- Daniel
--
Homepage: http://brightbyte.de
> Brion Vibber wrote
> Daniel Kinzler wrote:
>
>> Basically, it fetches the files, places them in the extension dir, and
>> optionally patches LocalSettings.php (if the user chooses that option
>> and the extension provides an install.settings patch file). The
>> CategoryTree extension has such a file, if you want to try the patching
>> feature.
>
> I really, really don't like that. An improved extension system should simply
> have autodiscovery of dropped-in directories and a simpler way to enable things.
I agree in principle, but don't see a way to do this without changing
all existing extensions and/or causing some inconvenience when upgrading
an existing install to the new system. Here are some points that should be
considered:
* we'd need a way to know which file in the extension's dir should be
included on every page request. This could be done by a naming
convention, as with skins - but that would mean changing many extensions.
* there would have to be a separate directory for "auto-load"
extensions, or a file listing the extensions to load. The ability to add
something there would be just as dangerous as patching LocalSettings.
* Extensions may use configuration variables. Would they go into
LocalSettings, or into separate files? The former would require patching
again (or worse, manually inserting them), the latter seems hard to
maintain.
* Some extensions may need to "patch" the database on install. That
would be hard to do with a "drop in" scheme. While this issue isn't
addressed by my installer either, it could easily be added there.
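For the first point, a naming-convention autodiscovery could be as simple
as this sketch (Python for illustration; the "<Name>/<Name>.php" entry-file
convention is hypothetical, chosen by analogy with skins):

```python
import os

def discover_extensions(ext_dir):
    """Scan the extensions directory for subdirectories that contain a
    conventionally named entry file, "<Name>/<Name>.php" (a hypothetical
    convention; many existing extensions would need renaming to match)."""
    found = []
    for entry in sorted(os.listdir(ext_dir)):
        candidate = os.path.join(ext_dir, entry, entry + ".php")
        if os.path.isfile(candidate):
            found.append(entry)
    return found
```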
> An installer should not be necessary, IMHO, as it would be useless for the
> primary target audience of simplified extension installation -- people with
> limited hosting accounts.
Right now, it's pretty hard for people to even *get* an extension, since
they are only available from SVN (correct me if I'm wrong there). Even
if we had bundles somewhere, I think it's much easier to say "php
installExtension.php Foo" than to manually download, extract, and hook
up an extension.
Internally, the installer provides a set of classes for dealing with
repositories and resources, and for hooking up the extension. This could
easily be used by a web based installation process (maybe as part of the
original installation) - which would be what people without shell access
would probably want. This would of course require a lot of precautions
to avoid opening a huge attack vector. This is the case for any web
based installation process, no matter if it patches LocalSettings or not.
So:
* if we don't want an installer, how can the installation of more
complex extensions (e.g. those requiring db patches) be made simple for the user?
* how would we migrate existing installations and extensions to a "drop
in" scheme?
* if there's going to be a web based installer, how can it be made secure?
My current implementation is meant as a proof of concept, especially of
a system to list available extensions and fetch them from a repository.
I hope this can be reused regardless of how extensions are hooked up in
the future. As it is, it makes life easier for people with shell access,
and works with the extensions we have now.
-- Daniel
An automated run of parserTests.php showed the following failures:
Running test Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test Magic Word: {{CURRENTMONTHNAMEGEN}}... FAILED!
Running test Template with thumb image (wiht link in description)... FAILED!
Running test message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test Parsing optional HTML elements (Bug 6171)... FAILED!
Running test Inline HTML vs wiki block nesting... FAILED!
Running test Mixing markup for italics and bold... FAILED!
Running test 5 quotes, code coverage +1 line... FAILED!
Running test HTML Hex character encoding.... FAILED!
Running test dt/dd/dl test... FAILED!
Passed 412 of 429 tests (96.04%) FAILED!
I didn't change the subject line like I should have.
On 8/3/06, Mark Bell <typewritermark(a)gmail.com> wrote:
>
> Folks,
>
> This may be answered somewhere but i have not found a reference.
>
> I would like stats or a graph that shows:
>
> Number of unique editors per article X number of articles with that many
> unique editors.
>
> So for example
>
> number of unique editors | number of articles with that many unique editors
> 15                       | 356
> 30                       | 3455
>
>
> And so on. Is this even possible?
>
> M
>
--
Mark Bell
MA student in Ball State University's Digital Storytelling program
http://www.storygeek.com
"The future is here...it's just not widely distributed." - Tim O'Reilly
Let's for a minute forget about my code and look at what we want. From
what I gather from the discussion:
* discover extensions available locally, based on some meta-file in each
extension's directory.
* Have a (web?) UI for listing the available extensions, and
enable/disable them.
* That would supposedly manipulate a list of active extensions stored in
a file (or database table)
* That list would have to be read and evaluated for each page request
(it should probably be cached in memory, using the object cache or something)
* any options the extensions may have would have to be set manually in
LocalSettings.php. Since with the new system, extensions wouldn't load
until after LocalSettings.php is evaluated, extensions would have to
take care not to overwrite options when applying defaults.
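The read-and-cache step could look like this sketch (Python for
illustration; a file with one extension name per line stands in for the
object cache and list format, both of which are assumptions):

```python
import os

_cache = {"mtime": None, "extensions": []}

def active_extensions(list_file):
    """Return the list of enabled extensions, re-parsing the file only
    when its modification time changes (a crude stand-in for the object
    cache mentioned above)."""
    mtime = os.path.getmtime(list_file)
    if _cache["mtime"] != mtime:
        with open(list_file) as f:
            _cache["extensions"] = [line.strip() for line in f if line.strip()]
        _cache["mtime"] = mtime
    return _cache["extensions"]
```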
This raises some questions:
* In case of a web UI, how would it be protected? Would it be inside
MediaWiki itself, available to, say, bureaucrats?
* When an extension is enabled by the UI, some custom script may be run
to patch the db. That may require a different db user.
* It would be nice to have a way to provide basic options to the
extension on install, without having to edit LocalSettings.php
None of the above addresses how to locate and download extensions -
which is what the code I wrote mainly does. I personally feel that it's
nice to have a repository-based installer, like, say, apt or something.
So, I wrote something like that. How useful a CLI-based tool for this
really is - well, it doesn't hurt to have it, I guess. It would be very
convenient, but probably way too dangerous, to allow fetching extensions
via a web interface.
In any case, *please* let's establish a repository of tgz bundles of
extensions somewhere. It's simply silly to require people to use SVN to
get extensions, unless they want bleeding edge. It's also trivial to do
for all extensions that have their own directory. A simple script can
generate something like this automatically:
<http://tools.wikimedia.de/~daniel/repository/extensions/>
-- Daniel
--
Homepage: http://brightbyte.de
At the risk of getting a whack on the head, could we have a status
update on the toolserver, particularly with respect to English
Wikipedia? I haven't heard much since it was announced that the data
was being reloaded but there was some corruption.
http://meta.wikimedia.org/wiki/Toolserver isn't very helpful.
Sorry to be a nuisance!
Steve
The Daily Usage statistics are empty in
http://stats.wikimedia.org/EN/TablesWikipediaZZ.htm
(first table, last column on page)
Where should I look to find them?
What I'm really looking for is insights on the relationship between
load and servers, e.g. how many concurrent sessions per server, page
views per time and server etc.
Thanks,
Johannes.
Hi all,
Don't know if this is widely known (and much less, whether it is
useful), but I discovered that you can quickly make a non-linking link
simply by inserting a single line break. That is, the code:
[[Some
link]]
Will render as: [[Some link]] - just plain text.
Caveats:
- Only really works with multi-word links, as breaking anywhere but at a
space inserts an extra space:
[[Foo
]]
renders as: [[Foo ]]
- Attempting to use a "pipe trick" with a single word produces the
amusing effect of making the link totally vanish: [[Foo|
]] renders as nothing at all. (while [[Foo|]] renders as linked Foo).
- explicitly linking the name back to itself produces a normal link: [[Foo|
Foo]] is the same as [[Foo]].
- doesn't seem to work for templates
- probably not useful for anything?
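The basic effect follows from link syntax not matching across line breaks;
a deliberately oversimplified model (Python; the real parser is far more
involved, and this ignores the pipe-trick cases above):

```python
import re

# Toy model of link syntax: [[...]] must not contain a newline.
LINK = re.compile(r"\[\[([^\[\]\n]+)\]\]")

def render(text):
    """Replace well-formed links with <a> placeholders; anything the
    pattern rejects - e.g. a link broken across lines - stays plain text."""
    return LINK.sub(lambda m: "<a>%s</a>" % m.group(1), text)
```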
Anyway, just some more rather curious fringe parser behaviour.
Steve