On Thu, Jun 12, 2008 at 11:32 AM, Deni Symonds <symode09(a)hotmail.com> wrote:
> Hey everyone!
>
>
>
> I was just wondering how you think we could try to get an iPhone application developed. I think the best way would be
> to hold a competition and then try to get Apple to donate a MacBook
> or iPod touch or something to the winner. Apple has incorporated
> Wikipedia into OS X in the past and I think it would take a similar position with the iPhone. What do you think?
>
> cheers
>
> brown_cat
>
>
See <http://www.mediawiki.org/wiki/Mobile_browser_testing> and I think
Wikitech-l (CC'ed) would probably be more help than Foundation-l in
this.
--
Casey Brown
Cbrown1023
---
Note: This e-mail address is used for mailing lists. Personal emails sent to
this address will probably get lost.
Can anyone tell me an easy way to download all of the images and audio
files for Wiktionary? I am trying to set up an offline English
Wiktionary for an OLPC pilot school in Nepal. I have used wikix so far,
but it doesn't catch everything. Any suggestions?
Any help would be very much appreciated. Thanks.
--
Bryan W. Berry
Systems Engineer
OLE Nepal, http://www.olenepal.org
Anyone here have any experience with protocol-relative URLs, that is,
URLs of the form "//some.domain.org/file.ext"? URLs of this form are
uncommon but appear compliant with RFC 1808.
A possible application of protocol-relative URLs in MediaWiki is that
they could be used to remove the need for duplicate parses of
pages containing external (and cross-domain) links in order to support
HTTPS. With that issue out of the way, the only impediment to
high-performance SSL is connection setup, which can be addressed with
dedicated crypto cards or crypto-enhanced CPUs like the UltraSPARC T1/T2.
I've confirmed that protocol-relative URLs work in the browsers I have
ready access to. Googling around, I found
http://nedbatchelder.com/blog/200710/httphttps_transitions_and_relative_url…
which claims "The HTML 2 spec references RFC 1808 which describes this
behavior, and was written in 1995. I know this syntax works in IE6,
IE7, FF2, and Safari 2 and 3. I don't know of any browsers in which it
doesn't work."
Anyone here have practical experience with URLs of this form?
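For what it's worth, the scheme-inheriting behaviour is easy to check outside a browser too, since the RFC 1808 relative-resolution rules are implemented in most URL libraries. A minimal sketch in Python (the base and link URLs are just illustrative):

```python
from urllib.parse import urljoin

# A protocol-relative ("network-path") reference inherits the scheme
# of the base document per the relative-resolution rules.
http_base = "http://en.wikipedia.org/wiki/Main_Page"
https_base = "https://en.wikipedia.org/wiki/Main_Page"
link = "//upload.wikimedia.org/logo.png"

print(urljoin(http_base, link))   # http://upload.wikimedia.org/logo.png
print(urljoin(https_base, link))  # https://upload.wikimedia.org/logo.png
```

So the same stored link text resolves to HTTP or HTTPS depending on how the page itself was fetched, which is exactly the property that would avoid the duplicate-parse problem.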
>From: David Gerard <dgerard(a)gmail.com>
>Date: 2008/6/5
>Subject: Re: [Wikitech-l] TorBlock extension enabled
>To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>
>
>2008/6/5 Tim Starling <tstarling(a)wikimedia.org>:
>> Andrew Garrett wrote:
>
>>> The TorBlock extension will override local IP blocks to provide a
>>> consistent treatment of tor.
>
>> I've disabled this behaviour for now, so that we can have a more orderly
>> phase-in period with community discussion. Admin blocks of Tor exit nodes
>> will continue to work. The new protections which have been introduced will
>> also work, and so Tor anonymous users on the English Wikipedia will
>> typically see two block messages.
>
>
>Thanks for holding off on this :-)
>
>I've asked the other checkusers concerned about this to post useful
>information to wikitech-l about what we actually see in practice on
>en:wp (buckets of toxic waste through Tor, the fabulously illustrative
>case of Runcorn concerning softblocks, etc), so as to supply the devs
>with good info.
I don't have the records to do a statistically valid analysis of the
use of Tor by sockpuppets and vandals as compared to other types of
proxies. I'm sure that the new extension, which amounts to global
"firm" blocking of Tor exits (more than a soft block but less than a
hard block) will cut down on the use of Tor by casual or lazy vandals.
However, the firm block would not address the problem of determined
abusive users using proxies to conceal their activities. The two most
prominent cases that come to mind are Poetlister (whose sock Runcorn
downgraded blocks on Tor exits so that her other socks could use them)
and Mantanmoreland, who created a second account that exclusively used
proxies in order to avoid checkuser confirmation. Or see
http://en.wikipedia.org/wiki/Wikipedia:Requests_for_checkuser/Case/Fantevd,
where a nominally good user with 6000 edits was found to be running a
sock farm via open proxies, ultimately involving 24 accounts with 3000
edits to
Both of these accounts caused significant disruption and drama.
Blocking all proxies that exit to Wikipedia could potentially prevent
similar future situations, but not if all the puppetmaster has to do
is to keep a low profile for 90 days.
And at least on enwiki, the "moral" reason for softblocking Tor exits
(to allow people to edit from repressive locations, etc) has been
voided by the enabling of the IP block exemption.
Gmaxwell correctly pointed out in an email to checkuser-L that if Tor
exits are hardblocked, smart puppetmasters will use other proxies.
True, but we can block those proxies. We *can't* block Tor exits, at
least if the override behavior is in place. In fact, with the
override enabled, the new extension will actually *encourage*
sockpuppeteers to use Tor, because it will guarantee they will always
be able to edit as long as they have the patience to wait for their
socks to be autoconfirmed. They will no longer run the risk of
enrolling in a commercial anonymizing service only to discover that we
have blocked it.
I think this extension is a great idea and I thank all the volunteers
who worked on it, but I think the override is a very bad idea.
Thatcher
On Sun, Jun 8, 2008 at 6:48 AM, <werdna(a)svn.wikimedia.org> wrote:
> +// Define new autopromote condition
> +define('APCOND_TOR', 'tor'); // Numbers won't work, we'll get collisions
Speaking of which, is there any reason for us to ever use named
constants like APCOND_TOR rather than just strings? This isn't C,
after all. I think it would be wise to deprecate such things for the
future and just use strings, arrays of them if necessary, as with
wfMsgExt() and similar. (Although in this particular case a named
constant might be better, to be consistent with the
already-established core conditions.)
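The collision worry in that checked-in comment can be illustrated outside PHP as well; a sketch in Python, with made-up condition names and values:

```python
# Numeric condition constants defined by independent extensions can
# collide: once defined, two constants with the same value are
# indistinguishable at runtime.
APCOND_EMAILCONFIRMED = 1  # core condition (value illustrative)
APCOND_TOR = 1             # an extension happens to pick the same number
print(APCOND_TOR == APCOND_EMAILCONFIRMED)  # True -- a silent collision

# String keys are self-describing and effectively namespaced, so two
# extensions can't accidentally claim the same condition.
conditions = {"emailconfirmed": True, "tor": False}
print("tor" in conditions)  # True, and unambiguous
```

That is the trade-off the thread is pointing at: named constants match the established core style, but strings avoid the numeric-collision hazard entirely.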
I'm CCing Wikitech; I suggest we follow this thread there.
Nikola Smolenski wrote:
> (thread about interwiki bots at toolserver)
>
> Coincidentally, yesterday I released a MediaWiki extension which, if
> accepted on Wikimedia projects, may make interwiki bots much less busy.
> See http://meta.wikimedia.org/wiki/A_newer_look_at_the_interwiki_link
It also works by writing the interwikis manually. I don't think that's
the right way.
*You're not taking page moves into account. What will you do when a page
is moved (by a low-tech user who knows nothing about the global wiki)?
*The articles will still have a 'preferred' title on the interwiki wiki.
That means arguing about article titles: "Move to the English name", "No,
that's not it", "Interwikis with pages in Chinese are ugly!"...
IMHO it should be a shared table referencing the wiki and page ids.
Then you provide a special page showing all pages in that group. You'd
reference it as 'include this page in the group XX:sometitle is in'.
You can also provide some space for free-form commenting (such as
explaining the difference with another page).
Obviously, all of that must be properly logged, which with SUL should be
much easier.
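A minimal sketch of that shared-table idea, with hypothetical names (group key plus (wiki, page_id) pairs; the real thing would be a database table, not a dict):

```python
# Hypothetical shared interwiki-group store: each group key maps to the
# (wiki, page_id) pairs that describe the same concept across wikis.
interwiki_groups = {}  # group_key -> set of (wiki, page_id)

def add_to_group(group_key, wiki, page_id):
    interwiki_groups.setdefault(group_key, set()).add((wiki, page_id))

def links_for(wiki, page_id):
    """All other-wiki pages in the same group. Because the stable page id
    is stored rather than the title, local page moves don't break links."""
    for members in interwiki_groups.values():
        if (wiki, page_id) in members:
            return members - {(wiki, page_id)}
    return set()

add_to_group("en:Apple", "enwiki", 101)
add_to_group("en:Apple", "dewiki", 202)
print(links_for("enwiki", 101))  # {('dewiki', 202)}
```

The key property is the one argued for above: membership is by page id, so a rename on one wiki changes nothing in the shared table.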
As is well known, we're still a bit inconsistent and confusing about
what format various localized messages come in.
Some are to be formatted as plain text; some as wiki text; some as raw
HTML. Some are plain text, but get wiki constructs tossed in as a
preprocess step to support the plural-alternate function.
But what we do know is that things get ugly when messages *change*
format -- customized messages in use on live sites break.
One that's been changed back and forth several times lately has been
'linkstoimage', used in the file links section on Image: description pages.
English Wikipedia uses a customized <div> wrapper with an id in there:
http://en.wikipedia.org/wiki/MediaWiki:Linkstoimage
but it's been broken several times on the assumption that the message
should be output as plaintext because the default text has no markup in it.
The most recent breakage was in r36139, changing it from wikitext to
plaintext with wikitext preprocessing, apparently in order to support
{{PLURAL}}... which shouldn't even be necessary at all; it should work
fine with $wgOut->addWikiMsg as is! :)
The change caused the "<div>" tag to be literally displayed in page
output as plain text.
I've reverted the change temporarily so things will work as before until
a proper update to the message and the message calls can be done.
At a minimum, one should check what format is really being used by the
current code. When considering changing it, doing a quick survey of
existing usage is strongly recommended. Wikipedians are notorious for
using software facilities in ways we didn't expect! ;)
If it's *really* necessary to change formats in a way that could likely
break (for instance, from HTML to wikitext or plaintext, or wikitext to
plaintext, etc), the message key name should be changed. This prevents
incompatible customizations from being incorrectly displayed.
The same goes when changing parameters or the places a message gets
used; if the old version or a reasonably customized version of it
doesn't make sense in the new system, it has to be changed.
In this case, the message doesn't need to change format in order to gain
the extra capability (this is already built in to the wikitext display)
and the extra parameters used for the {{PLURAL:}} won't have any
negative effect on an old customized version in wikitext, so it doesn't
need to be renamed as long as the code retains the format.
-- brion
Just a quick note -- we encountered an interesting bit of cache
pollution today, where an article on de.wikipedia.org was reported as
having bogus article link URLs in the form used by the DumpHTML static
dumps.
This was unexpected, as DumpHTML disables $wgEnableParserCache while
it's working; it thus shouldn't save anything to the parser cache.
This may have been caused by some overaggressive parser-cache work in
FlaggedRevisions, which Aaron has fixed by adding a $wgEnableParserCache
check to FlaggedRevs::updatePageCache().
It's possible some bad pages remain in cache; they can be flushed with
an ?action=purge or null edit.
(If it's safe to do an epoch update or something to force them all out,
we could try to do that just in case, but we've had only one report of a
bad entry so far.)
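The fix as described amounts to a guard of roughly this shape (a Python sketch with invented names; the actual change is in PHP, in FlaggedRevs::updatePageCache()):

```python
# Sketch of the described fix: refuse to write to the parser cache when
# the global enable flag is off, so batch tools like DumpHTML (which
# disable the cache while rendering) cannot pollute it.
wg_enable_parser_cache = False  # DumpHTML turns this off while it runs
parser_cache = {}

def update_page_cache(title, rendered_html):
    if not wg_enable_parser_cache:  # the newly added check
        return False
    parser_cache[title] = rendered_html
    return True

print(update_page_cache("Hauptseite", "<p>...</p>"))  # False
print(parser_cache)  # {} -- nothing was written
```

Without the check, the batch renderer's DumpHTML-style output would land in the shared cache and be served to ordinary readers, which is exactly the pollution seen on de.wikipedia.org.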
-- brion
(https://bugzilla.wikimedia.org/show_bug.cgi?id=14443)
I tried asking on #wikimedia-tech but received no reply. Can someone look
up the server logs for an indication of what's failing? Thanks.
--
Max Semenik ([[User:MaxSem]])
Hi,
We're in the process of upgrading from 1.9 to 1.12 (I know, I know, I
know...), and it seems that including extensions is taking a long
time.
Profile from 1.9:
26.449 WebStart.php-conf
24.659 Setup.php
1.362 Setup.php-includes
2.117 Setup.php-misc1
1.020 Setup.php-memcached
0.570 Setup.php-SetupSession
0.117 Setup.php-globals
0.052 Setup.php-User
0.017 Setup.php-misc2
18.543 Setup.php-extensions
Seems fast, but 1.12:
594.761 WebStart.php-conf
0.012 WebStart.php-ob_start
207.409 Setup.php
50.838 Setup.php-includes
8.691 Setup.php-misc1
6.780 __autoload
9.632 Setup.php-memcached
0.685 __autoload
0.564 Setup.php-SetupSession
0.197 Setup.php-globals
0.046 Setup.php-User
0.023 Setup.php-misc2
128.051 Setup.php-extensions
This is pretty slow. It seems that for WebStart, having multiple inclusions
like this:
require_once('extensions/wikihow/SpecialThankAuthors.php');
slows things down. Has something changed, or did we miss flipping a switch
in the upgrade? What about Setup.php -- does it have the same issue?
Thanks,
Travis