Hi! I'm a PhD student at Rey Juan Carlos University (Spain). My research
focuses on Wikipedia performance rates and profiling techniques. I'm
writing to ask for a little help : )
First, what kind of parameters do you think would be most useful? That is:
served pages, number of database accesses, request categorization, average
response time, ...
Second, where could I find statistics and measurements covering the
operation of the Squids, databases, Apaches, and so on?
And, finally and probably most important, could I have access to some log
files from the Squid, Apache, or MySQL servers? That would be great.
Thank you very much
robchurch(a)svn.wikimedia.org wrote:
> +'ShowRawCssJs': When presenting raw CSS and JavaScript during page views
> +&$text: Text being shown
> +$title: Title of the custom script/stylesheet page
> +$output: Current OutputPage object
> +
[snip]
> +// Give hooks a chance to do formatting...
> +if( wfRunHooks( 'ShowRawCssJs', array( &$text, $this->mTitle, $wgOut ) ) ) {
> + // Wrap the whole lot in a <pre> and don't parse
> + preg_match( '!\.(css|js)$!u', $this->mTitle->getText(), $m );
> + $wgOut->addHtml( "<pre class=\"mw-code mw-{$m[1]}\" dir=\"ltr\">\n" );
> + $wgOut->addHtml( htmlspecialchars( $text ) );
> + $wgOut->addHtml( "\n</pre>\n" );
> +} else {
> + // Wrap hook output in a <div> with the right direction attribute
> + $wgOut->addHtml( "<div dir=\"ltr\">\n{$text}\n</div>" );
> +}
I find I'm a bit leery of this hook. The $text parameter is source text
on input, but may be *either* source text *or* HTML on output.
This sort of thing feels "unsafe by default": not only does the variable
change type, but it changes in an unsafe direction (e.g., a safe text
string may become unsafe HTML).
I'd rather have the hook either do its own output on $output when
returning false, or return an HTML string via another parameter.
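For the record, the second alternative might look roughly like this; the extra &$html parameter is purely illustrative, not an existing signature:

```php
// Illustrative sketch only: $text stays source text for the whole call,
// and a hook that wants to emit markup fills in the separate &$html
// parameter instead of mutating $text in place.
$html = '';
if ( wfRunHooks( 'ShowRawCssJs', array( $text, $this->mTitle, $wgOut, &$html ) ) ) {
	// No hook took over: escape the source and wrap it in a <pre>
	$wgOut->addHtml( "<pre dir=\"ltr\">\n" . htmlspecialchars( $text ) . "\n</pre>\n" );
} else {
	// $html is HTML by contract; $text was never reinterpreted as HTML
	$wgOut->addHtml( "<div dir=\"ltr\">\n{$html}\n</div>" );
}
```

That way each variable keeps a single type and a single trust level throughout.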
-- brion vibber (brion @ wikimedia.org)
robchurch(a)svn.wikimedia.org wrote:
> Revision: 23515
> Author: robchurch
> Date: 2007-06-28 18:18:42 +0000 (Thu, 28 Jun 2007)
>
> Log Message:
> -----------
> Pass the two titles in plain text as (unused by default) parameters $3 and $4 to allow additional goodies in the message (parser functions, extra links etc.)
>
> Modified: trunk/phase3/includes/SpecialMovepage.php
> ===================================================================
> --- trunk/phase3/includes/SpecialMovepage.php	2007-06-28 18:16:56 UTC (rev 23514)
> +++ trunk/phase3/includes/SpecialMovepage.php	2007-06-28 18:18:42 UTC (rev 23515)
> @@ -278,7 +278,8 @@
> 	$olink = $wgUser->getSkin()->makeKnownLinkObj( $old, '', 'redirect=no' );
> 	$nlink = $wgUser->getSkin()->makeKnownLinkObj( $new );
>
> -	$wgOut->addHtml( wfMsgExt( 'movepage-moved', array( 'parseinline', 'replaceafter' ), $olink, $nlink ) );
> +	$wgOut->addHtml( wfMsgExt( 'movepage-moved', array( 'parseinline', 'replaceafter' ),
> +		$olink, $nlink, $old->getPrefixedText(), $new->getPrefixedText() ) );
Something is wrong with $3 and $4.
If I use in [[MediaWiki:Movepage-moved]] something like:
"For what links here see [[Special:Whatlinkshere/$3]]"
After a move I get:
"For what links here see Special:Whatlinkshere/title"
but the URL still contains $3:
http://localhost/wikitest/index.php/Special:Whatlinkshere/$3
Raymond
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.11alpha (r23542).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
Reading tests from "extensions/LabeledSectionTransclusion/lstParserTests.txt"...
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* Link containing double-single-quotes '' (bug 4598) [Has never passed]
* message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* HTML nested bullet list, open tags (bug 5497) [Has never passed]
* HTML nested ordered list, open tags (bug 5497) [Has never passed]
* Fuzz testing: image with bogus manual thumbnail [Introduced between 08-Apr-2007 07:15:22, 1.10alpha (r21099) and 25-Apr-2007 07:15:46, 1.10alpha (r21547)]
* Inline HTML vs wiki block nesting [Has never passed]
* Mixing markup for italics and bold [Has never passed]
* dt/dd/dl test [Has never passed]
* Images with the "|" character in the comment [Has never passed]
* Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 526 of 544 tests (96.69%)... 18 tests failed!
When visiting Special:Boardvote from either enwikiversity or dewikiversity
(and I presume from all other wikiversities as well), the interface fails to
retrieve my user ID even though I am logged in on that project. This has
been replicated by several other users of those projects as well.
If someone could look into fixing this problem, it would be very much
appreciated. Thanks.
--
Daniel Cannon (AmiDaniel)
http://amidaniel.com
cannon.danielc(a)gmail.com
simetrical(a)svn.wikimedia.org wrote:
> > Revision: 23410
> > Author: simetrical
> > Date: 2007-06-26 04:07:27 +0000 (Tue, 26 Jun 2007)
> >
> > Log Message:
> > -----------
> > (bug 6711) Add and to allow finer control over usergroup assignment. Completely reverse-compatible with existing userrights-based setups, but can replace Makesysop and Makebot and open the door to lots of other things as well. Could be moved to extension, I guess, but it just seems a lot simpler to have one interface for all adding/removing of groups.
It does not work together with the Makesysop extension; it breaks:
Fatal error: Call to private method
UserrightsForm::showEditUserGroupsForm() from context
'MakesysopStewardForm' in
F:\xampp\htdocs\wikitest\extensions\Makesysop\SpecialMakesysop_body.php
on line 307
Changing the function to public fixes the fatal error, but then I get
warnings due to the missing third parameter in
function showEditUserGroupsForm( $username, $addable, $removable )
The extension has to be updated too.
Raymond.
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.11alpha (r23500).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
Reading tests from "extensions/LabeledSectionTransclusion/lstParserTests.txt"...
18 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* Link containing double-single-quotes '' (bug 4598) [Has never passed]
* message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* HTML nested bullet list, open tags (bug 5497) [Has never passed]
* HTML nested ordered list, open tags (bug 5497) [Has never passed]
* Fuzz testing: image with bogus manual thumbnail [Introduced between 08-Apr-2007 07:15:22, 1.10alpha (r21099) and 25-Apr-2007 07:15:46, 1.10alpha (r21547)]
* Inline HTML vs wiki block nesting [Has never passed]
* Mixing markup for italics and bold [Has never passed]
* dt/dd/dl test [Has never passed]
* Images with the "|" character in the comment [Has never passed]
* Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 526 of 544 tests (96.69%)... 18 tests failed!
On 6/27/07, brion(a)svn.wikimedia.org <brion(a)svn.wikimedia.org> wrote:
>
> Revision: 23492
> Author: brion
> Date: 2007-06-27 20:51:41 +0000 (Wed, 27 Jun 2007)
>
> Log Message:
> -----------
> Fix regression -- wfMkdirParents() started whining if target directory
> existed, instead of just giving the thumbs-up
Haha! Thank you -- I was having a nightmarish time with this a few days ago
where I kept getting cryptic error messages about being unable to write to a
directory. I eventually sorted out the problem causing the error, but was so
frustrated that I didn't get around to fixing this function.
--
Daniel Cannon (AmiDaniel)
http://amidaniel.com
cannon.danielc(a)gmail.com
In my opinion, the existing Job Queue is awfully close to being suitable for my media recoding needs. Rather than recreating a very similar mechanism just for queueing media recoding jobs, I wonder if it would be okay for me to make a few changes that would make it a little more robust and suitable to my needs.
Currently, JobQueue::pop_type() just gives up on finding a job to do if it finds that the first one it has selected already got taken by a concurrent process. I believe this could be improved easily by just returning a recursive call:
if ( $affected == 0 ) {
	wfProfileOut( __METHOD__ );
	// used to be: return false;
	// but there may still be other jobs of $type we can claim...
	return self::pop_type( $type );
}
...but since this hasn't been done already, I'm afraid there's some reason for it to be as it is. This is safe and behaves as expected when no more jobs of $type exist, and it can't break any client code, since a caller has no way of knowing whether this condition will occur when it calls pop_type(). Any other reasons not to do this?
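If unbounded recursion is a worry (a long stretch of concurrently-claimed rows would deepen the call stack by one frame each), the same retry can be written as a loop. This is only a sketch of the shape; the two helper names are made up, standing in for logic pop_type() currently does inline:

```php
// Sketch: keep trying to claim a job of $type until we win the race
// or the queue of that type is genuinely empty.
do {
	$row = $this->selectFirstUnclaimedJob( $type ); // hypothetical helper
	if ( !$row ) {
		wfProfileOut( __METHOD__ );
		return false; // no jobs of $type left at all
	}
	// Attempt the claim; $affected is 0 if a concurrent runner won the race
	$affected = $this->claimJobRow( $row ); // hypothetical helper
} while ( $affected == 0 );
```

Behaviorally this should match the recursive return, so either form ought to be fine.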
Also, I want to populate the job queue with things that cannot be run on just any machine, probably including those that currently pop() from the job queue. To prevent this from happening, I could:
- make a new table structurally identical to job, and make a child class of JobQueue that just redefines a class constant containing the table name;
- add a flagging column to the current job table that JobQueue::pop() checks for and skips over if the flag is on;
- just hardcode job types that shouldn't be claimed into pop() so they will only be taken on pop_type() calls.
I don't know how much changing database structure makes people's lives difficult at upgrade time, so I'm not sure which of these would be best in the event that my work eventually proves itself worthy of inclusion in mainline MediaWiki.
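If a schema change turns out to be the sticking point, the third option is the least invasive; a sketch, where the $restrictedTypes name is invented for illustration and is not an existing JobQueue member:

```php
// Sketch of the third option, inside JobQueue: job types listed here are
// never claimed by a plain pop(), only by an explicit pop_type() call.
private static $restrictedTypes = array( 'recodeMedia' ); // hypothetical

// Then, while pop() scans candidate rows from the job table:
if ( in_array( $row->job_cmd, self::$restrictedTypes ) ) {
	continue; // leave it for a pop_type() caller on a suitable machine
}
```

It keeps the schema untouched, at the cost of baking policy into core code rather than the database.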
Mike