In 2002, before the change of software, the IP information was visible
even when users were identified under a pseudonym. The information
appeared when one simply moved the mouse pointer over the user's name.
When the software was upgraded, this option disappeared and the IP data
of registered users became *private* data.
I would like to know whether that disappearance was discussed at the
time, and whether it was deliberate that the IP data of logged-in users
became *private* data.
Ant
> Earlier: "...In 2002, before the change of software, the ip
information was visible even when users were identified under a
pseudonym. The information could be visible when people were just
moving the mouse's pointer over the name of the user... When the
software was upgraded, this option disappeared, and the ip data of
registered users became *private* data... I would like to know if that
disappearance was discussed at that time and if it was on purpose that
ip data of logged in people became *private* data..."
Peter Blaise responds: Web servers can keep logs of IP connections;
here is a recent line from mine:
X:\www\apache2\logs\access.log 3,784 KB 2007-11-07 10-32 am ...
. . .
10.113.9.106 - - [12/Dec/2007:10:51:34 -0500] "GET
/mediawiki/index.php/Special:Search?search=%2B400+ITU+-1200+-substitute&
fulltext=Search HTTP/1.1" 200 11926
. . .
What's the reason, again, to spend the time to universally empower
browsing end users to identify IPs?
> Earlier: "...I can not help thinking that the rather ugly atmosphere
that developed on enwiki is largely due to the very large and
uncontrolled use of ... tool[s] by a minority... When one gives
specific tools to a person, that's creates a power lever which may be
used to grab bits of power. Which is more or less what is happening,
much to the dismay of those who do not have that power..."
Peter Blaise responds: Ahh, I see where you're going. Agreed. Which is
why I advocate that an admin never use admin tools to resolve their own
conflicts, and that banning/blocking et cetera be abolished, with a
moderation system developed instead to stand between the community on
one side and spam/vandalism/off-topic contributions on the other.
> Earlier: "...Other options ?..."
> Earlier: "...Unblocking Tor and anonymizing proxies, thereby making
checkuser relatively useless..."
Peter Blaise responds: Agreed. Anonymity is anonymity. If you want an
audit trail on contributions, see http://www.citizendium.com/
> Earlier: "...Though maybe we're talking about it on the wrong list..."
Peter Blaise responds: Does someone want to cross-post this dialog to
wikien-l@lists.wikimedia.org, mediawiki-l@lists.wikimedia.org, and
foundation-l@lists.wikimedia.org to connect them with this thread?
> Earlier: "...I am tired of long trolls on other lists :-)..."
Peter Blaise responds: As Ajahn Amaro said: "Have you ever noticed how
we're never traffic?" So, you don't think YOU'RE ever perceived as a
troll, then, eh? Hahahahah. Anyway, that's what the delete key and the
scroll-down arrow key are for on your keyboard: so you can participate
in an intact community and still exert your own (self-)control, paying
attention to the things that interest you and skipping the things that
interest others more than they interest you. This is apparently an
insurmountable lesson for some of us to learn: to merely, graciously,
patiently, tolerantly, acceptingly, with equivalent consideration, skip
things that interest others more than they interest us, and not call
people trolls, not call for banning, deletion, and so on.
Please copy this to your local village pump or other relevant on-wiki forum.
Werdna's #ifexist limit feature is now live. In response to complaints of
template breakage, I have increased the limit on Wikimedia wikis
temporarily, from 100 to 2000. Barring a coup, it will stay at 2000 for
about a week, and then we'll lower it to 100.
Please use this one-week period to check pages and templates that use
#ifexist heavily. Look in the HTML source of the preview or page view.
There will be a "limit report" that looks like this:
<!--
Pre-expand include size: 617515/2048000 bytes
Post-expand include size: 360530/2048000 bytes
Template argument size: 51168/2048000 bytes
#ifexist count: 1887/2000
-->
This is the limit report from
http://commons.wikimedia.org/wiki/Template:Potd/2007-12 ,
one of the pages that will break.
At the end of the week, any pages which have a #ifexist count of over 100
will cease to be rendered correctly (after the next edit or cache clear).
All #ifexist calls after the hundredth will be treated as if the target
does not exist.
In some cases it may be possible to rewrite your templates so that they
still do the same thing, but with fewer #ifexist calls. In other cases,
you will need to remove template features. Removing features is always
sad (as a software developer, I know that), but sometimes it is
necessary for the good of the project. This is one of those times.
-- Tim Starling
Hello.
I've noticed that on enwiki, featured articles have a {{featured-article}} tag at the top of the page. However, I haven't found any tag marking good articles (though there is a list of good articles).
In other language editions, neither featured nor good articles have a wiki tag at the top of the page, but the nice star is displayed in the top right corner of the page for featured articles once the page is rendered.
Does anyone know how these special pages are identified in the database? Are there common tags to identify them in other language editions? Is that info available in the complete pages-meta-history dumps?
Thanks a lot for the help.
Regards,
Felipe.
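A hedged sketch of one way to find them in the database: on wikis where
featured articles are marked by transcluding a template (the template
name below is enwiki's; other editions differ), the templatelinks table
records the transclusion, so a query along these lines lists them:

SELECT page_namespace, page_title
FROM page
JOIN templatelinks ON tl_from = page_id
WHERE tl_namespace = 10
AND tl_title = 'Featured_article';

Note that the link tables are shipped as separate SQL dumps, not inside
pages-meta-history, so for dump-based analysis you would either load
those tables or scan the wikitext for the tag itself.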
I am developing an extension for MediaWiki in which I have defined a 'PersonalUrls' hook. When creating the href for my personal URL, I use makeSpecialUrl from the Skin class, which generates a relative path. However, I need to guarantee that my page is served over HTTPS. Does the MediaWiki API provide a way for me to guarantee this?
Thanks,
Tom
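A minimal sketch of one workaround (untested; the hook handler and
special-page names here are placeholders): build the URL absolutely
with getFullURL() instead of makeSpecialUrl(), then force the scheme,
since a relative path just inherits whatever protocol the current page
was served over:

$wgHooks['PersonalUrls'][] = 'wfMyExtPersonalUrls';

function wfMyExtPersonalUrls( &$personal_urls, &$title ) {
	// Absolute URL, based on $wgServer
	$url = SpecialPage::getTitleFor( 'MySecurePage' )->getFullURL();
	// Rewrite the scheme so the link is always HTTPS
	$url = preg_replace( '!^http://!', 'https://', $url );
	$personal_urls['mysecurepage'] = array(
		'text' => 'My secure page',
		'href' => $url,
	);
	return true;
}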
Hello,
sorry if this has been covered or answered before somewhere else - but
googling didn't help :/
For a wiki I participate in, we have backups in XML and CSV format
only (please don't ask :), but it seems there is no way to import
these formats back into the MySQL DB; only the import of .sql files
is supported by phpMyAdmin.
What would one do to get these formats back into the DB? Could anyone
point us in the right direction?
All the best,
Tels
--
Signed on Sun Dec 9 12:29:28 2007 with key 0x93B84C15.
View my photo gallery: http://bloodgate.com/photos
PGP key on http://bloodgate.com/tels.asc or per email.
Mediawiki graph-extension: http://bloodgate.com/perl/graph/
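If the XML backups are MediaWiki XML dumps, one route that should work
(a sketch; adjust the paths to your installation) is the importDump.php
maintenance script, which writes through MediaWiki itself rather than
through phpMyAdmin:

php maintenance/importDump.php < backup.xml
php maintenance/rebuildrecentchanges.php

Small dumps can also be fed to Special:Import from the browser. For the
CSV files there is no standard importer; that side would need a custom
script mapping your columns onto the page/revision/text tables.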
Hi,
I'm trying to write an extension that will work basically like a "What
Links Here?" between three wikis in a wiki family. I saw something in
the mailing list archives from Brion that got interwiki link stats by
using this query: SELECT count(*) from cur where cur_text like
"%[[la:%"; I'm guessing that was for an old schema version and that it
would be looking at the revision or text tables now. Would I still need
to basically parse the wikitext of a page to find the interwiki links?
Thanks,
Courtney Christensen
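If the cross-wiki links are configured as interlanguage links, recent
MediaWiki versions track them in the langlinks table, so no wikitext
parsing is needed for those. A sketch of the extension side (untested;
column names are from the current schema):

$dbr = wfGetDB( DB_SLAVE );
// All pages on this wiki linking to the 'la' wiki, with their targets
$res = $dbr->select(
	array( 'langlinks', 'page' ),
	array( 'page_namespace', 'page_title', 'll_title' ),
	array( 'll_lang' => 'la', 'page_id = ll_from' )
);

Ordinary interwiki prefixes that are not treated as language links are
not recorded in any link table, though, so for those you would still
have to parse the wikitext.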
Anyone interested in joining the project? :)
Here is the initial content of the page
http://meta.wikimedia.org/wiki/Hello%2C_world!_project
'''[[w:en:Hello world program|Hello, world!]]''' is usually one of the
simplest possible programs in a given programming language. However, it
becomes very complex if you want to put it on every [[Wikimedia]]
project in every language.
The goal of this project is not to show one more piece of trivia
implemented in a multilingual wiki environment, but to lay the
groundwork for building multilingual bots on (Media)Wiki-based
projects. It should also serve as educational material for future bot
programmers.
The project is in an early stage of development, and all constructive
input is welcome.
== "Simple" things to do ==
* Be sure that you know what you are doing. If you make a mess on some
project, you will be blocked there. If you make a mess on all projects,
you will be blocked everywhere.
* Make a page <nowiki>[[</nowiki>User:Your'''Bot'''Name/Hello, world!]]
with the sentence "Hello, world!" on all Wikimedia projects.
** ''Simple'' note: You will need to open accounts for yourself and
your bot on all Wikimedia projects (if you haven't done so yet) and
verify your email addresses. You will also need to wait four days for
your accounts to "mature".
** It is also a matter of good manners to introduce yourself and your
bot to the people of each particular project.
*** If you don't know the particular language, it is good enough to
write the introduction in English.
*** If you don't want to copy-paste your text 1000+ times, do it with
your bot.
*** Tell something about yourself on your user page (including the
languages you speak, using Babel templates).
**** Are some Babel templates missing on some projects? Copy them from
the [[:en:|English Wikipedia]]. Of course, use a bot for that if you
are not a masochist ;)
*** Say something like "This is <YourUserName>'s test bot and it will
operate only inside its own and my user space" at your bot's user page.
*** Put interwiki links on all of your and your bot's user pages.
** Try to find some clever way to translate "Hello, world!" into a
number of languages. [http://translate.google.com/ Google Translate] is
one useful method for doing so. However, it covers only a handful of
languages, and Wikimedia has a couple of hundred. If you find a good
method, please [[Talk:Hello, world! project|let us know]].
*** Of course, it may also be possible to find translations of this
sentence on the Internet.
*** You may ask native speakers to translate the sentence.
*** And don't be disappointed if you can't get a translation. If you
have tried all possible and impossible methods, make the page in
English, your language, or some regional lingua franca.
** [[Talk:Hello, world! project|Let us know]] on how many projects and
in how many languages you wrote "Hello, world!".
* Make a bot that checks for updates to your "Hello, world!" pages and
their talk pages. Run it every day (or put it in the crontab of
'''your''' computer). The program should write its output to a subpage
of your bot's page here on Meta. Another option is to put feeds for all
of the pages into your favorite feed reader.
* [[Hello, world! project/code|Show the code]].
== Participants ==
Write it in the form:
<pre>
=== UserName, BotName ===
* phase 1
* phase 2
* phase ...
* phase n
* '''Current phase'''
</pre>
=== [[User:Millosh|Millosh]], [[User:Millbot|Millbot]] ===
* '''Making accounts.'''
[[Category:Bots]]
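As a starting point for the update-checking phase, a rough sketch
(untested; the page title is only an example, and bot frameworks wrap
this kind of call for you) that asks api.php for the latest revision
timestamp of one page, to be compared with the value stored on the
previous run:

<?php
// Latest-revision timestamp of a single "Hello, world!" page
$url = 'http://meta.wikimedia.org/w/api.php?action=query&prop=revisions'
	. '&titles=' . urlencode( 'User:ExampleBot/Hello, world!' )
	. '&rvprop=timestamp&format=php';
$data = unserialize( file_get_contents( $url ) );
foreach ( $data['query']['pages'] as $page ) {
	echo $page['revisions'][0]['timestamp'], "\n";
}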
It seems this change was causing bug 10837.
Or is there any other method for catching the user variant, i.e.
getting the variant setting for the current user?
The problem is that, for example, if a user selected "zh-hk" as the
user language and "zh-cn" as the variant, then the value of $code
should be the user language (zh-hk), not the variant (zh-cn). The
variant setting *should not* override the user language setting
(user language != user variant).
However, it seems this kind of setting is spread throughout the JS in
the kk message file. I think we could have a method that catches the
user variant without modifying the current user language in the
StubObject class.
Shinjiman
-----------------------------------------------------------------------------------------------------------------------------------------
Revision: 17173
Author: rainman
Date: 2006-10-21 18:24:39 -0700 (Sat, 21 Oct 2006)
Log Message:
-----------
Fix bug #7605. For logged-in users, use the selected variant (if any)
instead of the one from user settings.
Modified Paths:
--------------
trunk/phase3/includes/StubObject.php
Modified: trunk/phase3/includes/StubObject.php
===================================================================
--- trunk/phase3/includes/StubObject.php 2006-10-22 00:56:28 UTC (rev 17172)
+++ trunk/phase3/includes/StubObject.php 2006-10-22 01:24:39 UTC (rev 17173)
@@ -92,6 +92,15 @@
$code = $wgRequest->getVal('uselang', '');
if ($code == '')
$code = $wgUser->getOption('language');
+
+ // if variant is explicitely selected, use it instead the one from wgUser
+ // see bug #7605
+ if($wgContLang->hasVariants()){
+ $variant = $wgContLang->getPreferredVariant(false);
+ if($variant != $wgContLanguageCode)
+ $code = $variant;
+ }
+
# Validate $code
if( empty( $code ) || !preg_match( '/^[a-z]+(-[a-z]+)?$/', $code ) ) {
$code = $wgContLanguageCode;
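One possible guard (a sketch, not a tested fix): apply the variant only
when the user has not picked a different interface language, so an
explicit "zh-hk" user language is never overridden by a "zh-cn"
variant:

// Only fall back to the variant when the user language is the
// wiki's content language, i.e. the user made no explicit choice
if ( $wgContLang->hasVariants() && $code == $wgContLanguageCode ) {
	$variant = $wgContLang->getPreferredVariant( false );
	if ( $variant != $wgContLanguageCode ) {
		$code = $variant;
	}
}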
On 12/4/07, brion@svn.wikimedia.org <brion@svn.wikimedia.org> wrote:
> Revision: 28156
> Author: brion
> Date: 2007-12-04 21:18:15 +0000 (Tue, 04 Dec 2007)
>
> Log Message:
> -----------
> Parser_OldPP class had older signature function. Updating to the current one, using site-customizable formatting.
Maybe it would be a good idea to have one of the two classes inherit
most of its functions from the other? They seem to be about 80% the
same at present.
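For instance (a hypothetical sketch of the layout, with an illustrative
method name): keep the shared code in Parser and let Parser_OldPP
override only the preprocessing entry points that actually differ:

class Parser_OldPP extends Parser {
	// Only the handful of methods that genuinely differ are
	// overridden; the rest is inherited from Parser unchanged.
	function replaceVariables( $text, $args = array(), $argsOnly = false ) {
		// old-preprocessor template expansion would live here
		return $text;
	}
}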