Hi everyone,
I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution but nothing was apparent.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
Kind Regards,
Hugo Vincent,
Bluewater Systems.
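(For what it's worth: as far as I know, pandoc has a MediaWiki reader, so `pandoc -f mediawiki -t latex` may already cover much of this. Failing that, a minimal regex-based Python sketch of the conversion idea follows; the function name and the handful of constructs handled are my own choices, and a real converter would need a proper parser for templates, tables, and nesting.)

```python
import re

def wikitext_to_latex(text):
    """Rough sketch: convert a few common wikitext constructs to LaTeX.
    A real converter needs a proper parser (templates, tables, nesting)."""
    # Headings: handle === X === before == X == so the longer marker wins.
    text = re.sub(r"^===\s*(.+?)\s*===\s*$", r"\\subsection{\1}", text, flags=re.M)
    text = re.sub(r"^==\s*(.+?)\s*==\s*$", r"\\section{\1}", text, flags=re.M)
    # Emphasis: '''bold''' before ''italic'' for the same reason.
    text = re.sub(r"'''(.+?)'''", r"\\textbf{\1}", text)
    text = re.sub(r"''(.+?)''", r"\\emph{\1}", text)
    # Internal links: [[Page|label]] or [[Page]] -> keep only the visible text.
    text = re.sub(r"\[\[(?:[^]|]*\|)?([^]]+)\]\]", r"\1", text)
    return text

sample = "== Intro ==\nThis is '''important'''. See [[Main Page|the main page]]."
print(wikitext_to_latex(sample))
# -> \section{Intro}
#    This is \textbf{important}. See the main page.
```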
I've been informally mentoring André, Tiago, Diego, and César. They
are four students at Minho University who are currently working on a
project to improve DB2 database support in MediaWiki.
So far, they've:
- Fixed several outstanding issues with DB2 support involving
character encoding, Windows vs. Linux differences, etc.
- Added DB2 support to the new MediaWiki 1.17 Installer and Updater
- Put in the appropriate Updater sql patches to reflect database
schema changes since 1.14
MediaWiki already had some DB2 support, but it had been broken since
1.15 and was never complete. As a result of their work, it's now possible
to successfully install MediaWiki on DB2 out of the box and to use the
core wiki features.
I'll shortly commit their first patch using my SVN account (leonsp).
I've taken some care to look over the code and make sure it abides by
the MediaWiki code guidelines.
Regards,
Leons Petrazickis
http://lpetr.org/blog/
I've modified MediaWiki:Common.js to include the following:
document.getElementById("p1").innerHTML="New text!";
I then created a page called testPage and inserted:
<p id="p1">Hello World!</p>
I'm expecting the text to change upon load but nothing is happening.
Prior to the <p> tag addition I had an alert that was working as
expected so at least I know Common.js is working.
What is the best practice for adding custom JavaScript code in
MediaWiki? While we're at it, is there a really good tutorial anywhere on
including AJAX calls without needing an extension?
Thanks - Tod
After the question came up a few times, Ryan Kaldari and I decided to
document the official list of supported browsers and the information and
logic that surrounds this topic.
http://www.mediawiki.org/wiki/Supported_browsers
- Trevor
MediaWiki is currently developed in two branches, with everything going
to trunk, and a smaller number of revisions also being copied to the
stable branch.
That's also the way we want to continue working, having a trunk and a
stable branch.
Currently we store the release notes in a file called RELEASE-NOTES in
the phase3 root of each branch. This is a problem when adding a revision
for backport, since its release notes are not suitable for the trunk
RELEASE-NOTES (there they would belong in HISTORY), but they do need
to be added to the REL1_X RELEASE-NOTES.
So we end up with release notes added in a separate commit, or with
revisions that produce merge conflicts when merged. Either way is inconvenient.
Thus, I propose that, from the point we branch 1.18, we keep the stable
branch release notes in trunk, and add trunk release notes in a
different file. trunk and branch RELEASE-NOTES would effectively be the
same file (a revision modifying RELEASE-NOTES and not tagged for backport
would be a bug). This simple change would give us much cleaner merges.
When tagging a new branch, the trunk RELEASE-NOTES would move to
HISTORY, the trunk release notes file to RELEASE-NOTES and a new one
would be created for trunk.
Hi,
I was planning to build an API sandbox for MediaWiki as part of GSoC 2011.
While going through the MediaWiki documentation, I found that many API
modules need POST rather than GET requests. I am planning a Flickr-style
API sandbox for MediaWiki, along the lines of
<http://www.flickr.com/services/api/explore/?method=flickr.contacts.getList>.
In Flickr, the documentation of every method has a link to the API
Explorer (sandbox), where the user can test different values for each
parameter and the result is displayed in a div on the same page.
The MediaWiki sandbox will display the available parameters that can be
used for a particular method (as a drop-down where the values are known
in advance). The user can fill in the form with his own values. The
sandbox will then send a GET or POST (AJAX) request as appropriate (using
jQuery or some other JS library) and display the result in a div on the
same page. The page shall also display a URL for executing the same AJAX
request.
To make the API sandbox really useful, it should ideally have automatic
PHP code generation too (I don't know if that is overambitious). For
example, for login and logout, the user could just give his user ID and
password, and the PHP code he would need to write is displayed
automatically.
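The GET-vs-POST dispatch such a sandbox needs could be sketched as follows. The hard-coded module set here is purely illustrative; in MediaWiki the information about which modules must be POSTed is available from the API itself.

```python
from urllib.parse import urlencode

# Assumed, illustrative table of API modules that require POST; a real
# sandbox should take this from the API's own module metadata.
POST_MODULES = {"login", "logout", "edit", "delete", "move"}

def build_request(endpoint, params):
    """Sketch of how a sandbox might prepare an api.php call.
    Returns (method, url, body): GET puts the parameters in the query
    string, POST carries them in the request body instead."""
    action = params.get("action", "")
    query = urlencode(sorted(params.items()))
    if action in POST_MODULES:
        return ("POST", endpoint, query)
    return ("GET", endpoint + "?" + query, None)

# Example: a read-only query goes out as GET.
method, url, body = build_request(
    "https://example.org/w/api.php",
    {"action": "query", "titles": "Main Page", "format": "json"},
)
```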
I eagerly await your suggestions.
--
Salil
IRC : _Salil_
On Wed, Apr 27, 2011 at 12:43 PM, Chad <innocentkiller(a)gmail.com> wrote:
> Getting the merges reviewed [0] and making sure we have a good set of
> release notes. That's what I know of, and Reedy's been working on the
> latter.
>
I've been thinking about this over the past few days, and I've got a proposed
release schedule and development roadmap to carry us through the rest of the
year.
As we all know, 1.17 is due to drop Real Soon Now. To summarize for those who
don't know the status: it's pretty much done. In talking with Roan, Tim and Sam
earlier this week, we discussed that we're pretty much ready to drop a
1.17beta1.
Tim was concerned about the release notes, but as I pointed out in my previous
e-mail, Sam's tidied this up (and it's low-hanging fruit if someone wants to
check behind us for sanity). That being said, I don't see any reason why we
can't drop a beta1 sometime this week. Give it a week, and drop a beta2. Wait
another week, then go final I think, all depending on what response we get from
the betas.
As for 1.18, I say we branch it the same day we drop 1.17 final (making the
branch is easy). There's still quite a bit of code to review, but going ahead
and giving ourselves a cutoff point will make catching up easier. Large projects
still outstanding in 1.18 to review are the img_metadata merge, and rewrites of
Skin and Action code. By branching soon I think we can try to get a release out
by the end of summer, at the very latest.
Looking ahead to 1.19, I'd like to do the same and branch soon after 1.18 has
been dropped. Since 1.19's a little further out and hasn't started taking shape
yet, I'd like to go ahead and propose what sort of release we should aim for.
Going back over the past couple of releases, we've had quite a few "rewrites"
of major portions of code. While these are a necessary part of the process of
developing MW, they are difficult to review due to their complexity. This
complexity also makes it more likely for things to break. If I may be so bold,
I would like to ask that 1.19 not contain any of these rewrites. Let's focus on
making it a bugfix/cleanup release. Personally I think it would make for a very
clean and polished release, as well as reducing the time for us to review and
ship it.
If we go this route, I don't see any reason we couldn't ship 1.19 by year end
(or if we really push, 11.11.11, as the other thread suggested). I
think it would
put us in a really good place to move forward into 2012, and help get us back
into a somewhat regular release pattern.
I really would love to hear from people to see if they think I'm crazy or if
this could work out fairly well. I know it's pretty tl;dr for most people, but
the ones who read it are the ones I wanna hear from anyway ;-)
-Chad
Enhanced media player goodies like embedding have been slowly coming along,
with a handy embedding option now available in the fancy version of the
media player running on Commons. This lets you copy a bit of HTML you can
paste into your blog or other web site to drop in a video and make it
playable -- nice! Some third-party sites will also likely be interested in
standardish ways of embedding offsite videos from Youtube, Vimeo, and other
providers.
There's a lightweight standard out in the wild called oEmbed which I've
previously worked with on StatusNet: identi.ca uses it to show thumbnails
for flickr photos, youtube & vimeo videos, and other such things that get
called out in posts. Basically it specifies a discovery system for
automating the embedding process, so you can get a thumbnail image and/or
inline player HTML fragment that fits a reasonable size.
http://oembed.com/
https://bugzilla.wikimedia.org/show_bug.cgi?id=25854
I'm interested in bringing oEmbed provider and consumer support to
MediaWiki, but there are a couple limitations: currently there's no exposed
license metadata (highly desired for Wikimedia's usage, obviously) and
cross-site embedding for videos currently requires the consuming site to
either drop raw HTML into their site (dangerous!) or maintain a second
domain for iframe content (difficult).
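On the consumer side, the request itself is simple once discovery has found the provider's endpoint. A sketch of constructing the call per the oEmbed spec (the endpoint and target URLs here are illustrative, not a specific provider's):

```python
from urllib.parse import urlencode

def oembed_request_url(endpoint, target_url, maxwidth=None, maxheight=None):
    """Build a consumer-side oEmbed request: the provider endpoint takes
    the target page's URL plus optional maxwidth/maxheight size
    constraints, and returns JSON (or XML) embedding metadata."""
    params = {"url": target_url, "format": "json"}
    if maxwidth:
        params["maxwidth"] = maxwidth
    if maxheight:
        params["maxheight"] = maxheight
    # Sorted only to make the output deterministic.
    return endpoint + "?" + urlencode(sorted(params.items()))

# Example (illustrative endpoint): ask for a player at most 400px wide.
url = oembed_request_url("https://provider.example/services/oembed",
                         "http://example.org/some/page", maxwidth=400)
```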
There's a discussion going on on the oEmbed list about updating the
standard, so if any of you out there have an interest in the wonderful world
of on-web media embedding and how we can make it work better for MediaWiki,
do feel free to pop your nose in. :)
http://groups.google.com/group/oembed/browse_thread/thread/99ca7193a3c3c11f…
-- brion
Dear all,
there are two empty files in the distribution. Are they of any use?
/maintenance/archives/patch-page_no_title_convert.sql
/maintenance/archives/patch-image_reditects.sql
Best regards,
Johannes Weberhofer
--
Johannes Weberhofer
Weberhofer GmbH, Austria, Vienna
Dear all!
I'm currently packaging MediaWiki (for openSUSE), and have a question related to "includes/zhtable".
I think this part can be stored in a separate package, as many systems do not require Chinese translations. Generating ZhConversion.php works nicely, but I do not know how to package all the pieces:
* How should this package be labeled and described (e.g. mediawiki-zhtable)?
* Are the *.manual files needed on a target server?
* Are the *.txt files needed on the target server?
* Is only ZhConversion.php to be packaged?
If someone uses this package, how is it to be included in MediaWiki?
Looking forward to your help...
Best regards,
Johannes
--
Johannes Weberhofer
Weberhofer GmbH, Austria, Vienna