I've gone ahead and made another change to the schema which I'd
originally passed over.
The links and brokenlinks tables are now merged to a single pagelinks
table, which records the namespace+title key pair of target links rather
than the page ID or the prefixed title.
While I've been eyeing this for a while to simplify things, doing it now
is mainly a response to the scalability problems of renaming and
deletion of widely-linked pages. These actions required updating all
linking records (hundreds of thousands in extreme cases) to maintain
consistency, and are, for instance, a significant factor in the
unpleasantness of dealing with page-move vandalism.
(This issue is similar to but separate from the issue of title updates
to all 'old' records for renaming often-edited pages, which was dealt
with by the page/revision split.)
It may be necessary to do some shakedown testing to make sure I haven't
introduced fun new bugs, but I figured better to do it now than have to
wait until the next major release. The update.php script should convert
the existing tables automatically (it will leave them in place for now...)
At some point we should also introduce the ability to run page_touched
and squid purge updates in the background, by handing the target page to
a purge daemon. This won't require database changes, though.
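The rename win can be sketched with a toy version of the merged table (a
minimal sketch using sqlite3; the table and column names are illustrative
stand-ins, not the exact MediaWiki schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Pages are identified by ID; links point at a (namespace, title) pair
# rather than a page ID, so a link can even exist before its target does.
cur.execute("CREATE TABLE page (page_id INTEGER PRIMARY KEY,"
            " page_namespace INTEGER, page_title TEXT)")
cur.execute("""CREATE TABLE pagelinks (
    pl_from INTEGER,       -- page ID of the linking page
    pl_namespace INTEGER,  -- namespace of the link target
    pl_title TEXT          -- title of the link target
)""")

# 100 pages all link to [[Widget]]
cur.execute("INSERT INTO page VALUES (1, 0, 'Widget')")
cur.executemany("INSERT INTO pagelinks VALUES (?, 0, 'Widget')",
                [(i,) for i in range(2, 102)])

# Renaming the target touches ONE row in `page`; the 100 pagelinks rows
# are untouched (they still name the old title, which becomes a red link
# or resolves via a redirect left behind).
cur.execute("UPDATE page SET page_title = 'Gadget' WHERE page_id = 1")
print(cur.rowcount)  # 1
```

Under the old ID-keyed links table, the same rename would have meant
rewriting all 100 linking rows, which is exactly the cost this change
avoids.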
-- brion vibber (brion @ pobox.com)
I'm currently running MediaWiki 1.4.4. Everything is running fine except
for one little thing.
Using Firefox 1.0.4, view this page:
http://www.thewritingpot.com/wikirevision/index.php?title=Special:Version
and take a look at the search and toolbox menus: they're missing their
bottom borders.
The funny thing is that they work on Wikipedia, so it's not a browser
compat problem. IE renders it correctly; Firefox "flashes" the line
briefly before it disappears. So what exactly is wrong? I've tried
re-uploading the style sheets relating to Monobook.
--
Edward Z. Yang Personal: edwardzyang(a)thewritingpot.com
SN:Ambush Commander Website: http://www.thewritingpot.com/
GPGKey:0x869C48DA http://www.thewritingpot.com/gpgpubkey.asc
3FA8 E9A9 7385 B691 A6FC B3CB A933 BE7D 869C 48DA
Brent 'Dax' Royal-Gordon wrote:
> The basic idea is to set up a honeypot wiki. This would probably be a
> copy of a real wiki to make it as realistic as possible. Then add a
> special page to list all anonymous IPs that have ever edited the wiki,
> and use that information to maintain block lists on other wikis.
> There should probably be warnings plastered all over it so that human
> users know not to edit the wiki anonymously.
That sounds very good. I personally maintain a wiki, and link spammers
are really, really annoying.
That said, this would only be helpful against link spammers that operate
fully automatically; otherwise, they'd find the wiki and think,
"Hey! What's this? I'm not going to spam this!"
Hi there,
I am trying to download the german "old" table on:
http://download.wikimedia.org/#wikipedia
The link says that the size of that file is 13.97 GiB.
However, the file is just 2 GB (definitely), and there are no additional
files like those described on the Wikipedia download help page.
Can somebody please give me some help on where to download the proper
file? I tried this one, but it is not complete (as gzip tells me).
My guess is that the file is stored on a server which does not support files >
2GB and that this is just a mistake.
Any suggestions?
Thank you in advance,
merlin
Hello.
I'm glad to report that MediaWiki won first prize at les Trophées du
Libre ( http://www.tropheesdulibre.org/rubrique.php3?id_rubrique=2 ),
a French event geared towards free software, in the 'special PHP prize'
category created just this year.
Many people often forget that, backing the Wikimedia Foundation's
projects, lies MediaWiki, an amazing piece of software. Today the
developers' efforts were recognized.
The Trophées du Libre, in its 2nd edition this year, gathers people from
companies, associations, and government administration: many people from
many different backgrounds, working in or with open source software,
sharing ideas, concerns (for instance software patents), and solutions.
Many thanks and kudos to all developers who make it easy & fun to work
on Wikimedia's projects!
Nicolas Weeger
I was watching connections into my server today when I saw a
linkspammer connect to one of my wikis. I headed to that wiki,
expecting to see pages defaced as normal, but was surprised to find
them intact--until I checked the configuration and realized that the
particular wiki they had tried to connect to had anonymous edits
turned off.
That's when I had an idea.
The basic idea is to set up a honeypot wiki. This would probably be a
copy of a real wiki to make it as realistic as possible. Then add a
special page to list all anonymous IPs that have ever edited the wiki,
and use that information to maintain block lists on other wikis.
There should probably be warnings plastered all over it so that human
users know not to edit the wiki anonymously.
Does this seem like a good idea?
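The core of that special page could be a single query. A minimal sketch
using sqlite3, assuming a recentchanges-style edit log where `rc_user = 0`
marks an anonymous edit and `rc_user_text` then holds the editor's IP
(the column names are borrowed for illustration, not a guaranteed schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical edit log: rc_user = 0 means an anonymous edit, and
# rc_user_text then holds the IP address instead of a username.
cur.execute("CREATE TABLE recentchanges (rc_user INTEGER, rc_user_text TEXT)")
cur.executemany("INSERT INTO recentchanges VALUES (?, ?)", [
    (0, "192.0.2.10"),    # anonymous spammer
    (0, "192.0.2.10"),    # same IP, second edit
    (42, "Alice"),        # logged-in user: excluded
    (0, "198.51.100.7"),  # another anonymous editor
])

# Every distinct anonymous IP that has ever edited the honeypot:
ips = [row[0] for row in cur.execute(
    "SELECT DISTINCT rc_user_text FROM recentchanges WHERE rc_user = 0")]
print(sorted(ips))  # ['192.0.2.10', '198.51.100.7']
```

The resulting list is what other wikis would poll to build their block
lists.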
--
Brent 'Dax' Royal-Gordon <brent(a)brentdax.com>
Perl and Parrot hacker
I have started to write some documentation about the Chinese
conversion system at meta:
http://meta.wikimedia.org/wiki/Chinese_conversion
People interested in implementing conversion systems for other
languages should take a look at it. Note that most features
implemented are not Chinese specific.
--
zhengzhu
Guys!
We made it!
More information to come when Ryo and nota have
finished the champagne.
Anthere
The no-client-cache suggestion worked!!
Thank you, Zigger.
Btw, wiki 1.4.0 on IIS 6, PHP 4.3.3 (ISAPI), MySQL 4.0.16-nt works OK
without the no-cache option.
Zigger wrote:
>IIS 6.0 may have the same problems, according to
>http://bugzilla.wikimedia.org/show_bug.cgi?id=1872 .
>
>You could try the no-client-cache suggestion in comment #3 of
>http://bugzilla.wikimedia.org/show_bug.cgi?id=1763 .
>
>Good luck,
>-- Zigger
>
>
Have a nice day!
--
Jure Špik,
Carpe diem, d.o.o., Kranj
Hi,
in response to my appointment as Chief Research Officer of the Wikimedia
Foundation, I have put together a page describing this role, as well as
a potential larger Wikimedia Research Team that I want to form. Please see
http://meta.wikimedia.org/wiki/Wikimedia_Research_Team
for details. To the Board: The proposal is largely unchanged from the
version I sent you, but it includes a note about what I call
"semi-official titles". I suggest therein that members of the Team can,
internally, use certain titles like "WRT Survey Coordinator." Please -
and that goes for non-Board members as well ;-) - let me know how you
feel about this idea; I think it could help reduce the impression that
the "Chief Research Officer" holds special authority over the other
members, and generally motivate people to join and work in certain roles.
The page includes a list of individuals I'd like to invite to join the
team; if you feel that anyone is missing from that list, please add
them. I will extend personal invitations soon, but if you see your name
on the list right now, please do indicate if you're interested (just
strike through or remove your name if you're not). Of course, if you
yourself are interested and not listed there, feel free to add yourself
to the list of members right away. There's no application procedure --
we can always deal with problems as a team if there are any.
I'm copying this to wikitech-l, as I want to encourage the developers to
take a look at the above page. I want to assure you that at no point
will anyone try to tell volunteers what to do, or what code to accept,
and any assignment to developers paid by Wikimedia will have to be made
by the Board: the Team only gives recommendations. I also absolutely
want to encourage any interested developers to join; if there are any
conflict of interest issues, we can deal with them as they arise. I have
mainly not listed developers in my list of proposed members because my
intuition is that most of them are too busy to get involved, but I'd be
happy to be proven wrong on that count.
I'm sure that some of you will be skeptical about the usefulness of a
systematic research effort: In the open source world, code is everything
and words are often considered meaningless. However, I believe strongly
that analysis should precede implementation, and that volunteer
development can be combined in useful ways with targeted, task-oriented
coding. The Research Team also has other roles, but see the page on Meta
for details.
All best,
Erik