2010/12/6, wikitech-l-request(a)lists.wikimedia.org
<wikitech-l-request(a)lists.wikimedia.org>:
> Date: Mon, 6 Dec 2010 18:50:45 +1100
> From: Andrew Dunbar <hippytrail(a)gmail.com>
>
> Could anybody help me locate a dump of mediawiki.org while the dump
> server is broken please? I only need current revisions.
Use your Google-fu; there are some dumps available at various
locations on the internet (archive.org, Pirate Bay).
What is perhaps more important: can the checksums be made available
somewhere while the dump server is down? That way, we can verify the
dumps we must now fetch from untrusted sources.
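For instance, once a checksum file is available, verification is a one-liner with md5sum. A self-contained demonstration with a stand-in file (real dump and checksum filenames will of course differ):

```shell
# Simulate a downloaded dump plus its published checksum file,
# then verify the dump the same way you would verify a real one.
echo "dump contents" > dump.xml
md5sum dump.xml > dump.xml.md5        # stands in for the published .md5 file
md5sum -c dump.xml.md5                # prints "dump.xml: OK" if the file is intact
```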
I hope the dump server goes back online soon, I need the SQL dumps for
research into link mining...
Regards,
Lars Buitinck
I've been chipping away at our skins system lately; there's a lot we can
do to improve it.
Right now much of it doesn't work well for an ecosystem that promotes
the creation of a wide variety of skins.
- footerlinks is duplicated across skins and is hardcoded (I fixed this
in trunk, and footericons too; however, these could be improved with
helper methods, as we still have unwanted boilerplate)
- our directory structure is odd
-- SkinName.php with skinname/ beside it means skins are not
self-contained and can't simply be dropped into their own directory
-- core skins get a first-class skins directory with proper
autoloading, stylepath and localstylepath (1.17) variables, and
conventions for including skin resources like
skins/{skinname}/path/to/file.ext; meanwhile we say that any skin we
don't include in core should go in extensions, use the extensions
assets path, do without localstylepath, the
skins/{skinname}/path/to/file.ext convention, and autoloading
entirely, and include extra boilerplate/code that native skins don't
need. That does not promote the creation of new third-party skins
--- in fact, besides the 4 skins in extensions/skins/ that were in svn,
which don't appear to be really notable skins and haven't been touched
in half a year, the 3rd-party skins out there don't even use the
extension features; they all just have you place them inside skins/ the
same way the native skins are installed.
- we have page end boilerplate you have to copy into a skin (we can
probably fix this somewhat similarly to the way headelement was added)
- toolbox is hardcoded into the skin and needs to be copied (this one
definitely should be fixed)
- many of our toolbox, portlet, and nav links need very verbose loops
and hardcoded special cases, which would be better handled inside a
common helper method
-- it should also be possible to differentiate between the page/talk and
other navlinks the way Vector does, without having to hardcode it or add
a pile of extra PHP
- building a sidebar should require minimal hardcoded special cases and
PHP (SEARCH, TOOLBOX, etc.; special cases shouldn't be hardcoded into
the skin)
- we need a method of building a search form that is flexible with
respect to styles but minimizes hardcoding and boilerplate; most skins
should be able to drop in a search box or search bar with a single
method call if they don't want to do any special customization.
- the large block of common boilerplate inside the content area also
isn't that nice for building 3rd party skins
I've been working on improving the system for the past few days, and
I've also come up with a new idea for a skin packaging and installation
convention.
To show it off I ported WordPress' P2 theme into a wiki skin and
committed it to
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/skins/p2wiki/
The result being http://trunk.new.wiki-tools.com/index.php?useskin=p2wiki
The convention should work with currently stable versions of MediaWiki
as well. It is also compatible with adding new features to newer
versions of MediaWiki to strip away the boilerplate needed to package a
skin for distribution, as well as with $wgStylePath conventions.
Currently the convention works by packaging up a skin like I did with
p2wiki, using extension-style techniques to add a new skin. It can be
installed into skins/skinname/, with
`require_once("$IP/skins/skinname/skinname.php");` added to
LocalSettings.php.
That convention will work in already released versions of MediaWiki. For
future versions we may want to consider adding autoloading code for this
style of skins, and adding some convention based defaults.
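To make the shape of that convention concrete, a minimal skinname.php
might look roughly like this (the class and file names here are
illustrative sketches, not taken from p2wiki):

```php
<?php
// skins/skinname/skinname.php -- illustrative sketch of extension-style
// skin registration; names are hypothetical.
if ( !defined( 'MEDIAWIKI' ) ) {
	die( 'This file is part of a MediaWiki skin and cannot be run standalone.' );
}

// Make the skin selectable (e.g. ?useskin=skinname or in user preferences).
$wgValidSkinNames['skinname'] = 'SkinName';

// Load the skin class on demand, extension-style, instead of relying on
// core's skins/ autoloading.
$wgAutoloadClasses['SkinSkinName'] = dirname( __FILE__ ) . '/SkinName.skin.php';
```

The point of the sketch is that everything a skin needs is declared from
inside its own directory, so the whole thing can be dropped into
skins/skinname/ and enabled with a single require_once.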
As an ideal, it would be nice if some way down the road it's possible to
build a skin without the use of any php, even if you have to enable an
extension to add the actual template system or whatever.
As for places for putting skins and distributing them, Lcawte proposed
adding a /trunk/skins/ for the new convention.
There was a bit of discussion on IRC where some downsides, upsides, and
counter-issues were brought up.
- Adding /trunk/skins would require some (possibly small?) changes to
the Translation extension.
- ExtensionDistributor would also need tweaks.
-- However it was brought up that ExtensionDistributor doesn't work with
/trunk/extensions/skins/ anyways, at least not if you're only trying to
get one skin out.
-- I think tweaking ExtensionDistributor to support /trunk/skins would
be quite easy (at the simplest it's probably a case of making small
tweaks to make a SkinDistributor that uses /trunk/skins instead of
/trunk/extensions and whatnot)
It's not part of distribution, but the new installer's ability to detect
extensions and let you install them from the installer came up as well.
However, skins generally don't have their own set of configuration
(Vector does, but ideally a bunch of skins that are nothing but separate
themes for the site should not require special configuration of each
one), so there isn't really much use for sharing configuration
infrastructure. Additionally, if we do add an autoloader for the new
style of skin, there's not really any point in having the installer
detect skins: if they're in a spot the installer can find them, they'll
already be autoloaded anyway.
Oh, and lastly... if anyone knows of any really good WordPress or other
CMS themes or templates, I might consider porting some as examples.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hi everyone,
The code review team has been doing a fantastic job of clearing out
the backlog, which you can see here:
http://toolserver.org/~robla/crstats/crstats.html
(uncheck "ok" on the checkboxes on the bottom to see new commits)
There's some release planning issues that we have to sort out:
1. When will we branch 1.17?
2. When will we push 1.17wmf1 to production?
3. When will we release MediaWiki 1.17?
Trevor and Roan believe that ResourceLoader will be ready to deploy in
January. However, looking at the backlog of code reviews, I don't
think it's realistic to assume we'll have everything else ready in
January. My assumption here is that we need to be through the code
review backlog prior to pushing what is currently in trunk into
production. Simply extrapolating from the October/November rate of
code review, March is looking more like the target, and that assumes
we keep up the rapid pace of review. What seems realistic without
being complacent?
One thing I think will help the rate of code review is for more
developers to comment on commits more frequently. I know many of you
already chime in on Code Review, which is great. If you can do your
best job at providing a full code review (commenting rather than
marking "ok"), that will hopefully catch some of the most immediate
problems so that the code review team can focus on the more subtle
issues and perhaps do a quicker review. Longer term, Roan has a new
feature brewing to the Code Review system ("sign-offs") that will make
it easier for you to provide some useful metadata for the code review
team to use.
With respect to MediaWiki 1.17 releases, I think we can actually
release sooner after the first production deployment than we did after
1.16 if we pair our usual testing with a small amount of formal QA and
test automation (especially focusing on the installer and a small
amount of alternate DB testing, e.g. SQLite). What seems realistic here?
Longer term, everyone has stated a desire to move to more frequent
deploys. I think that's a conversation worth having, but let's not
have it as part of this thread.
Thanks
Rob
Hi,
I want to authorize uploading of CML files
<http://en.wikipedia.org/wiki/Chemical_Markup_Language> on my wiki
<http://wiki.jmol.org/index.php/Main_Page>.
These files have an XML format with a limited set of possible root elements.
I would like them to be recognized by MediaWiki with a specific MIME
type: chemical/x-cml
I'm doing some tests on a virtual machine Ubuntu 10.10 with default
MediaWiki setup (MW 1.15.5, PHP 5.3.3, MySql 5.1.49).
I have made several modifications to the configuration:
- $wgFileExtensions[] = 'cml'; in LocalSettings.php to allow the .cml
extension
- chemical/x-cml cml; in includes/mime.types to attach the MIME type to
the extension
- $wgXMLMimeTypes = array_merge( $wgXMLMimeTypes, array( ... ) ); in
LocalSettings.php so that MimeMagic correctly detects CML files as
chemical/x-cml
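Put together, the LocalSettings.php side of these changes would look
roughly like the following. Note that the 'cml' root-element key in
$wgXMLMimeTypes is my assumption (the original left the array elided);
CML allows several root elements, and each one used would need its own
entry:

```php
// LocalSettings.php -- sketch of the changes described above.
// The root-element key ('cml') is an assumption; adjust it to the
// actual root elements your CML files use.
$wgFileExtensions[] = 'cml';
$wgXMLMimeTypes = array_merge( $wgXMLMimeTypes, array(
	'cml' => 'chemical/x-cml',
) );
```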
When I upload the file, everything is ok and in the debug log the file seems
correctly detected as chemical/x-cml.
Even in the database, the file is stored with the correct MIME type.
But when I display the File: page for this file, I see "MIME type:
unknown/unknown" displayed by MediaWiki (same problem in the debug log).
What do I need to do to have the MIME type correctly detected once the
file is uploaded?
I want to develop a media handler for this kind of file, so I need
correct MIME type detection.
Thanks in advance.
Nico
Hi everyone,
On IRC, Trevor led the charge "to Etherpad!", and some of us
followed. This was the result:
http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.17
This is an aggressive plan, starting with branching for 1.17 early
next week. It is by no means official; Tim and Mark H are both named,
but neither has weighed in, so don't take this as "the plan", but as
a proposal for discussion here.
Rob
On Fri, Dec 3, 2010 at 2:35 PM, Trevor Parscal <tparscal(a)wikimedia.org> wrote:
> Agreed - branching will not prevent bugs from getting fixed; we will be
> merging in a limited set of changes (fixes to bugs) while ignoring new
> and irrelevant development.
>
> - Trevor
>
> On 12/3/10 2:26 PM, Roan Kattouw wrote:
>> 2010/12/3 Platonides<Platonides(a)gmail.com>:
>>> So if we can cope with reviewing, I would wait (I think
>>> RobLa took it into account). Who will work into stabilizing the branch?
>>> A branch will have less attention (bug hunting) than trunk
>> Well to keep review catchup manageable, I think we need to either
>> branch or declare something like a feature freeze and freeze for minor
>> stuff in trunk. My general opinion on this sort of thing is that trunk
>> should not generally be subject to such freezes.
>>
>> Roan Kattouw (Catrope)
>>
>> _______________________________________________
>> Wikitech-l mailing list
>> Wikitech-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
Do not block. The people of the USA need to know the truth. Plus, our leaders should lead in a way that the people of the USA would like them to. Thank you.
-----Original Message-----
From: Rob Lanphier
Sent: 9/23/2010 12:59:56 AM
To: Wikimedia developers
Subject: [Wikitech-l] Drafting the upcoming engineering overview
Hi everyone,
As you probably know, we're trying to get into the habit of providing
a monthly overview of all WMF-sponsored engineering activity. The
September update was posted to the techblog here:
http://techblog.wikimedia.org/2010/09/wmf-engineering/
For October, we'd like to draft this in public so as to get the
information out a little sooner, and to give you all the opportunity
to help out. Here's where we're drafting this:
http://www.mediawiki.org/wiki/WMF_Engineering_Overview_October_2010
Here's a very simple way you can help. If you see something on the
list that you're interested in, but don't see the status for yet, ping
one of us, then be bold and add what you learn to the appropriate wiki
page. If you do know the status, by all means add it.
Another useful thing to do: you'll notice that many of the project
pages that the status post links to are pretty sparse. Same rules
apply there. We'd love to get help keeping this up to date.
Thanks!
Rob
> There is a way to download wikidumps for any project / language, the
> data is from early 2009. I will detail the steps as a note for future
> reference.
>
[...]
> Next step is either to copy a file using SCP or start your own FTP
> server on the EC2 instance and download the files that you need. You
> will need to pay a small fee but this is in the range of cents.
>
> Best,
>
> Diederik
>
Thank you for your comprehensive description of accessing the archived data
in EC2.
Personally, I will consider getting a dump that way in order to continue
working. However, I am curious whether there is any news about the
regular server. The last update at
http://wikitech.wikimedia.org/view/Dataset1 was ten days ago, and I
wonder how likely it is that the server will be fixed, e.g., next week.
Is there anybody here on the list who can update the wiki page?
Thanks!
Sven
Hello,
Would it be possible to get commit rights to the 1.16wmf4 branch?
For now, I would focus on merging revisions for the CodeReview
extension, no core work yet :p
cheers,
--
Ashar Voultoiz
Quick note before I go to sleep: with combined efforts from Casey,
Chad, Sam and me, we now have a mailing list that all CodeReview
comments are sent to: mediawiki-codereview(a)lists.wikimedia.org [1],
requested in bug 22046 [2]. It's nice as another stalk list, or if you
miss knowing what's going on from the old days when commenting on code
meant replying to mediawiki-cvs e-mails on wikitech-l.
The list is set up so only CodeReview itself can post to it. Replies
are set to go to wikitech-l, but of course you should use CR's own
commenting mechanism to reply to a comment instead. Just like
mediawiki-cvs, this list is not archived. Tim and I are the current
list admins.
Roan Kattouw (Catrope)
[1] https://lists.wikimedia.org/mailman/listinfo/mediawiki-codereview
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=22046