$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
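One commonly suggested alternative to converting everything to MyISAM is to give each InnoDB table its own tablespace file, so that ibdata1 stops absorbing all table data. A minimal my.cnf sketch; note that this option only affects tables created (or rebuilt) after it is set, so reclaiming space already inside an existing ibdata1 still requires a dump, a rebuild of the data directory, and a re-import:

```ini
# my.cnf sketch -- each InnoDB table created after this is set gets its
# own .ibd file instead of growing the shared ibdata1 tablespace.
[mysqld]
innodb_file_per_table
```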
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Kind of a newbie question... I'm not even sure of the appropriate

taxonomy for my request, maybe that's why Google didn't help.
Using the latest stable release (1.16.2, I believe), I'm trying to achieve this:
Got a default setup: MediaWiki installed in /w, Apache conf edited with
an Alias. So now my wiki is accessible at http://domain.tld/wiki/
I would like to give a specific namespace (let's say "Bar") its own
URL, outside of /wiki/.
Ok, to rephrase, hopefully better:
I would like:
domain.tld/wiki/Bar:SomePage
to be in fact:
domain.tld/Bar/SomePage
And if Apache could also redirect /wiki/Bar:SomePage to /Bar/SomePage
on the client side, that would be even better, albeit not a deal breaker.
Without breaking anything in MW, of course. And if the Recent Changes or
Search tools return Bar:SomePage, that's not an issue; I don't want to
rewrite the MW core :)
Anyone got an idea?
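One possible Apache sketch for the question above, assuming MediaWiki lives in /w and the wiki is aliased at /wiki/ as described; the namespace name "Bar" and the rule placement (server/vhost config rather than .htaccess, hence the leading slashes) are assumptions:

```apache
RewriteEngine On

# Internally rewrite the short form onto the Bar: namespace,
# so /Bar/SomePage is served by MediaWiki without a client redirect.
RewriteRule ^/Bar/(.+)$ /w/index.php?title=Bar:$1 [L,QSA]

# Optionally redirect the canonical /wiki/ form to the short URL
# (the client-side rewrite mentioned above).
RewriteRule ^/wiki/Bar:(.+)$ /Bar/$1 [R=301,L]
```

Note that MediaWiki itself will still emit /wiki/Bar:SomePage links in its own output; the second rule only normalizes what the browser ends up displaying.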
Hi,
Is there a way to ensure that a template is included in all pages of a
particular namespace? I'd like to use our MediaWiki to allow people to
annotate some things in a biology project, and the wiki URLs will be
generated automatically by an external program.
Thanks,
Matthew
--
Matthew Betts, Russell Group
CellNetworks, BioQuant, University of Heidelberg
Im Neuenheimer Feld 267, 69120 Heidelberg, Germany
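For the question above: MediaWiki has no built-in per-namespace template inclusion, but a small hook can inject one at view time. A sketch for LocalSettings.php on 1.16, where Template:AnnotationHeader and the namespace constant NS_ANNOTATION are hypothetical names chosen for illustration:

```php
<?php
// LocalSettings.php sketch -- untested; the template name and the
// custom namespace constant are placeholders, not MediaWiki defaults.
$wgHooks['ArticleViewHeader'][] = 'wfAnnotationNamespaceHeader';

function wfAnnotationNamespaceHeader( &$article, &$outputDone, &$pcache ) {
	global $wgOut;
	// Only prepend the template on pages in the target namespace.
	if ( $article->getTitle()->getNamespace() === NS_ANNOTATION ) {
		$wgOut->addWikiText( '{{AnnotationHeader}}' );
	}
	return true; // let other hooks run
}
```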
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
I would like to announce the release of MediaWiki 1.16.3, which is a
security release. Three security issues were discovered.
Masato Kinugawa discovered a cross-site scripting (XSS) issue, which
affects Internet Explorer clients only, and only version 6 and
earlier. Web server configuration changes are required to fix this
issue. Upgrading MediaWiki will only be sufficient for people who use
Apache with AllowOverride enabled.
Due to the diversity of uploaded files that we allow, MediaWiki does
not guarantee that uploaded files will be safe if they are interpreted
by the client as some arbitrary file type, such as HTML. We rely on
the web server to send the correct Content-Type header, and we rely on
the web browser to respect it. This XSS issue arises due to IE 6
looking for a file extension in the query string of the URL (i.e.
after the "?") if no extension is found in the path part of the URL.
Masato Kinugawa discovered that the file extension in the path part
can be hidden from IE 6 by substituting the "." with "%2E".
To fix this issue, configure your web server to deny requests whose
query string ends in a dot followed by a short file extension. For
example, in Apache with mod_rewrite:
RewriteEngine On
RewriteCond %{QUERY_STRING} \.[a-z]{1,4}$ [nocase]
RewriteRule . - [forbidden]
Upgrading MediaWiki is necessary to fix this issue in
dynamically-generated content. This issue is easier to exploit using
dynamically generated content, since it requires no special
privileges. Accounts on both public and private wikis can be
compromised by clicking a malicious link in an email or website. For
more details, see bug 28235.
Wikipedia user Suffusion of Yellow discovered a CSS validation error
in the wikitext parser. This is an XSS issue for Internet Explorer
clients, and a privacy loss issue for other clients since it allows
the embedding of arbitrary remote images. For more details, see bug 28450.
MediaWiki developer Happy-Melon discovered that the transwiki import
feature neglected to perform access control checks on form submission.
The transwiki import feature is disabled by default. If it is enabled,
it allows wiki pages to be copied from a remote wiki listed in
$wgImportSources. The issue means that any user can trigger such an
import to occur. For more details, see bug 28449.
The localisations were updated using content from translatewiki.net.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.3.tar.gz
Patch to previous version (1.16.2), without interface text:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.3.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-i18n-1.16.3.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.3.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.16/mediawiki-1.16.3.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.16/mediawiki-i18n-1.16.3.patch.gz…
Public keys:
https://secure.wikimedia.org/keys.html
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.10 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/
iEUEARECAAYFAk2jxbAACgkQgkA+Wfn4zXn38gCWISDEZuC+Ap3Z4aBfibnuNSU1
EgCfeL2lo/4XtCuoKOwah0YbuaHyf5I=
=S2JZ
-----END PGP SIGNATURE-----
Hi,
I'm trying to output some tags just before the closing </head> tag on
each page generated by MW.
Since I want to minimize upgrade headaches, I'm trying the extension
route. But for the life of me, I can't find the right hook or syntax.
My searches of the Manual turned up the OutputPageBodyAttributes hook,
but it's 1.17 only.
Anyone got the right hook, or a simple syntax example?
Thanks a lot.
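On 1.16, the BeforePageDisplay hook combined with OutputPage::addHeadItem() is one way to emit markup just before </head> from an extension. A sketch, with the tag content as a placeholder:

```php
<?php
// LocalSettings.php or a small extension file -- sketch, untested.
$wgHooks['BeforePageDisplay'][] = 'wfAddCustomHeadTags';

function wfAddCustomHeadTags( &$out, &$skin ) {
	// addHeadItem() injects raw HTML into the <head> section;
	// the key ('custom-head-tag') just needs to be unique.
	$out->addHeadItem( 'custom-head-tag',
		'<meta name="example" content="PLACEHOLDER" />' );
	return true;
}
```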
Tim Starling <tstarling(a)wikimedia.org> wrote:
> You could add a static function to User which provides the field name
> array. If it were used in User::loadFromDatabase(), then it would be
> immune to bit rot.
Dug into this a little bit, but is it guaranteed that loadFromDatabase
will be called early enough, such as in the case of a non-logged-in user
getting a list of contributions for an article?
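The static helper Tim suggests might look roughly like the following sketch; the method name and the column list shown are illustrative, not the full user table:

```php
<?php
// Sketch of a canonical field-name list on the User class, so query
// builders (e.g. a contributors GROUP BY) stay in sync with
// User::loadFromDatabase() instead of hard-coding columns.
class User {
	public static function selectFields() {
		return array(
			'user_id',
			'user_name',
			'user_real_name',
			'user_email',
			'user_touched',
			'user_registration',
			'user_editcount',
		);
	}
}
```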
--
Greg Sabino Mullane greg(a)endpoint.com
End Point Corporation
PGP Key: 0x14964AC8
OQ <overlordq(a)gmail.com> wrote:
> I tried poking at this a bit and got as far as trying to figure out
> what UserArrayFromResult actually needed to create the objects. I had
> pictured going the cheap route and making it a two step operation,
> getting the list of contributors and then pulling in the relevant user
> data based on the obtained list.
I think the bare minimum is nothing at all, if you check out loadFromRow
inside of User.php! The resulting User object would be less than useful,
of course. Seems to me the list of "public" attributes from User is
really only user_id and user_name.
Tim Starling <tstarling(a)wikimedia.org> wrote:
> On 28/04/11 00:15, Greg Sabino Mullane wrote:
> > 1) Manually add in all the columns from the user table to the
> > GROUP BY. Painful, and subject to immediate breakage when a
> > column is added or removed from the user table.
>
> You could add a static function to User which provides the field name
> array. If it were used in User::loadFromDatabase(), then it would be
> immune to bit rot.
Yes, that's a good point: we don't really care about the user table
per se, merely MW's canonical representation of it in User.php.
An interesting approach.
> Can you add a note about this to docs/database.txt and
> docs/databases/postgres.txt? Something explaining why we can't use *
> with group by.
Done.
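For context on the docs note: PostgreSQL, following the SQL standard, rejects selecting columns that are neither grouped nor aggregated, so the query has to enumerate the user fields in both the select list and the GROUP BY rather than using *. A sketch, assuming the hypothetical static field-list helper discussed above exists:

```php
<?php
// Sketch -- assumes a static User::selectFields() helper is available.
$fields = User::selectFields();
$res = $dbr->select(
	array( 'revision', 'user' ),
	$fields,
	array( 'rev_page' => $pageId, 'rev_user = user_id' ),
	__METHOD__,
	// Every selected column must appear in the GROUP BY for Postgres.
	array( 'GROUP BY' => implode( ', ', $fields ) )
);
```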
--
Greg Sabino Mullane greg(a)endpoint.com
End Point Corporation
PGP Key: 0x14964AC8