Some Wikimedia developers have requested help improving their English
writing skills. Erika Hanson is volunteering her proofreading skills to
help by leading interactive proofreading sessions -- we ran a pilot
today and it was successful enough that we'll do another one next weekend.
Participants paste what they've written into the Etherpad
http://notes.wikimediadc.org/p/english-lessons by Thursday night, so
Erika has time to read it and mark problem areas in boldface. Then,
during the tutorial session, Erika gives the learner lessons in
spelling, grammar, punctuation, etc., for about 15 minutes per student.
If she has time within the 15 minutes, she'll then move on to matters of
style. Discussion can take place in the Etherpad chat.
If you want to participate, email Erika (cc'd) with the dates/times that
you can attend (your two choices are below -- I don't know whether Erika
can do both times or just the more popular one), and put something you've
written into the Etherpad by the end of Thursday. It should be at least
200 words long -- a blog entry, an email, a school essay, whatever.
ALBUQUERQUE, 1:30pm-3:30pm, Sat, May 5 2012
UTC, 19:30-21:30 Sat, May 5 2012
AMSTERDAM, 21:30-23:30 Sat, May 5 2012
MUMBAI, 1am-3am Sun, May 6 2012
or
ALBUQUERQUE, 10am-noon, Sun, May 6 2012
MUMBAI, 9:30pm-11:30pm, Sun, May 6 2012
UTC, 16:00-18:00, Sun, May 6 2012
AMSTERDAM, 18:00-20:00, Sun, May 6 2012
If people learn effectively from this, then we could repeat it.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
2012/4/24 Samuel Klein <meta.sj(a)gmail.com>
> Where's the latest thread on the Timed Media Handler progress?
>
> I am meeting with MIT Open CourseWare tomorrow - they want to expand
> the set of videos they released last year under CC-SA, starting with
> categories / vids that would fill gaps on Wikipedia. Any thoughts
> on how to make that collaboration more effective would be welcome.
>
> SJ
>
>
You can upload them to the Internet Archive if Wikipedia has temporary issues
with videos. When the problems are fixed, we can move them from the Internet
Archive to Wikimedia Commons.
--
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT <http://code.google.com/p/avbot/> |
StatMediaWiki<http://statmediawiki.forja.rediris.es>
| WikiEvidens <http://code.google.com/p/wikievidens/> |
WikiPapers<http://wikipapers.referata.com>
| WikiTeam <http://code.google.com/p/wikiteam/>
Personal website: https://sites.google.com/site/emijrp/
Hi,
One of LibreOffice's killer features is exporting documents in MediaWiki
format. It's not perfect, but it's nevertheless very useful for
converting papers written by academics and their students into Wikipedia
articles.
Unfortunately, it's broken in the latest LibreOffice releases:
https://bugs.freedesktop.org/show_bug.cgi?id=46509
Is there, by any chance, anybody here who is familiar with the world
of LibreOffice development and can help get it fixed?
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
My "cherry-picking" answers are inline:
>
>> == embedding in mw or not / can we look at both? ==
>>
>> And we do want embedding, right? And the MW extension would require
>> -lite, or wouldn't it?
>>
>> Can we set up two labs instances, one with classic and one with -lite and
>> a MediaWiki instance, so users can already
>> see how embedded pads feel and then vote for having them (and/or also
>> for which Etherpad version)?
>>
> Again, no embedding. Let's ignore this exists. It's great that Thomas
> wrote this for MediaWiki, but it's not a production-level solution.
I am interested to know why you think my work is not
production-level. The only thing to improve is
https://bugzilla.wikimedia.org/show_bug.cgi?id=36319 . Perhaps I/we can
fix this at, or before, the hackathon.
>> Anyone started with these already? Do we have packages? For both? Are
>> they on external repos? (on labs we can include 3rd party, in prod. we
>> can't).
>> (can help with puppet code to add repos and keys)
>>
>> What about the extension review? Can bump? Remaining issues?
>>
>> == Bug tracking ==
>>
>> BZ - Migrate to etherpad-lite
>> https://bugzilla.wikimedia.org/show_bug.cgi?id=34953
>>
>> RT - umbrella ticket for all things Etherpad
>> http://rt.wikimedia.org/Ticket/Display.html?id=2018
>>
>> 1180: (Nobody) Update etherpad and do some configuration [open]
>> 1247: (Nobody) Etherpad connection failures [new]
>> 1270: (Nobody) https://etherpad.wikimedia.org/ points to the blog [open]
>> 1673: (Nobody) install etherpad search [stalled]
>> 1690: (Nobody) etherpad reports all logged in users as IP 127.0.0.1 [open]
>> 1720: (Nobody) Etherpad improvement - install etherpadlite [open]
>> 2536: (Nobody) Please upgrade Etherpad server, preserving all old
>> Etherpads and URLs [new]
>> 2555: (dzahn) Please update Etherpad default text [resolved]
>> 2751: (Nobody) HTTPS on etherpad is broken [new]
>>
>> These open more questions:
>>
>> - Does etherpad-lite have search?
No. Because it stores changesets, searching is currently not possible.
>> - Are we hoping for/expecting less connection failures? (re:
>> connection analysis).
EtherpadLite (EPL) works well.
One thing that works in neither EP nor EPL, but which I have already
proposed in https://github.com/Pita/etherpad-lite/issues/190,
is the "Ultra feature request: make etherpad-lite offline-capable (HTML5)".
>> - The 127.0.0.1 issue is due to setup in production, we'd have to
>> check again and do not have an easy solution so far.
>> - Can we preserve all old etherpads and URLs when switching to -lite ?
>> when upgrading ?
That would be possible. The URLs look the same: http://server/p/padname
>
>> Would it help if i copied the content of these tickets to public
>> Bugzilla as well?
Bugzilla already has two components where these aspects could perhaps be
filed as "enhancement" requests:
i)
https://bugzilla.wikimedia.org/enter_bug.cgi?product=Wikimedia&component=Et…
ii)
https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions…
Tom (Wikinaut)
Hi,
Is there a function that looks something like
getUrl( 'ruwiki' );
and returns 'https://ru.wikipedia.org'?
WikiMap.php is supposed to have such functions, but it gives me
strange results on my local test wiki.
Of course, it's possible that I screwed something up in my CentralAuth
configuration.
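For illustration, here is a rough Python sketch of what such a lookup might do, assuming the usual Wikimedia convention that a database name is a language code plus a project suffix. The `get_url` name and the suffix table are hypothetical stand-ins, not the actual WikiMap.php API:

```python
# Hypothetical sketch only -- NOT the real WikiMap.php interface.
# Assumes dbnames follow the <language code> + <project suffix> convention.
SUFFIXES = {
    "wiktionary": "wiktionary.org",
    "wikisource": "wikisource.org",
    "wiki": "wikipedia.org",  # checked last: other suffixes are longer
}

def get_url(dbname):
    """Map a database name like 'ruwiki' to a canonical site URL."""
    for suffix, domain in SUFFIXES.items():
        if dbname.endswith(suffix):
            lang = dbname[: -len(suffix)]
            return f"https://{lang}.{domain}"
    raise ValueError(f"unknown project suffix in {dbname!r}")
```

For example, `get_url('ruwiki')` would yield `https://ru.wikipedia.org` under these assumptions.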
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hi all,
I hope this is the right mailing list for this enquiry - apologies if not!
We're using MediaWiki to run our ORBIT project site, see
http://orbit.educ.cam.ac.uk
The ORBIT project is an Open Educational Resources project supporting
interactive pedagogy in maths / science / ICT teaching.
We've got some funding available to make some tweaks to the codebase /
extensions, mainly to do with the appearance and usability of the
wiki, see
http://www.sciencemedianetwork.org/Blog/20120427_Mediawiki_tweaks
Resulting code would be Open Source.
What's the best way of going about this? If there are people willing
to do the work, we could hire them. Or if different people can do
different bits, we could put out a bounty?
Any thoughts and offers for help would be greatly appreciated!
Many thanks,
Bjoern
http://orbit.educ.cam.ac.uk
Hi, it's been almost 4 years since we came up with the idea of
implementing OAuth in MediaWiki. I think it's time to start.
The question now is whether it should be part of core or an extension
for MediaWiki. I myself would rather make it an extension, since there
is probably no use for it in most installations, except for large wikis.
Quote:
OAuth provides a standard protocol to negotiate secure access tokens
and to provide third-party tools (web or client) with granular access
to private resources. This protocol does not reveal usernames or
passwords to the third-party tool. Offering OAuth-based authorization
on MediaWiki wikis will increase the reusability of their data and spur
the creation of an ecosystem of apps around MediaWiki.
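The idea in the quote can be illustrated with a toy sketch (Python, purely illustrative -- not a proposed MediaWiki API, and real OAuth involves a full signed token-exchange protocol): an access token that encodes granular scopes and is verified without the third-party tool ever seeing the user's password.

```python
# Toy illustration of OAuth's core idea: a scoped, verifiable token
# instead of a shared password. All names here are hypothetical.
import hashlib
import hmac
import secrets

SERVER_SECRET = secrets.token_bytes(32)  # known only to the wiki server

def issue_token(username, scopes):
    """Grant a third-party tool access limited to the given scopes."""
    payload = f"{username}|{','.join(sorted(scopes))}"
    sig = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def check_token(token, required_scope):
    """Verify the token's signature, then check the requested scope."""
    payload, _, sig = token.rpartition("|")
    expected = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    _username, scopes = payload.split("|")
    return required_scope in scopes.split(",")
```

A tool holding a token scoped to "edit" could edit on the user's behalf but would be refused a "delete" action, which is the granularity the quote describes.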
Is there anyone who is willing to help with this? If no one is
interested, or there are no comments, I would start a new extension
called OAuth, whose only purpose would be to enable this feature in
MediaWiki.
Hi all. I have a bold proposal (read: evil plan).
To put it briefly: I want to remove the assumption that MediaWiki pages always
contain wikitext. Instead, I propose a pluggable handler system for different
types of content, similar to what we have for file uploads. So, I propose to
associate a "content model" identifier with each page, and have handlers for
each model that provide serialization, rendering, an editor, etc.
The background is that the Wikidata project needs a way to store structured data
(JSON) on wiki pages instead of wikitext. Having a pluggable system would solve
that problem along with several others, like doing away with the special cases
for JS/CSS, the ability to maintain categories etc separate from body text,
manage Gadgets sanely on a wiki page, or several other things (see the link below).
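As a rough illustration of the proposed shape (a Python sketch; the class and function names are hypothetical, and the real implementation would be PHP inside MediaWiki core): each page carries a content-model id, and a registry maps that id to a handler providing serialization and rendering.

```python
# Illustrative sketch of a pluggable content-handler registry.
# Names (ContentHandler, render_page, etc.) are invented for this example.
import json

class ContentHandler:
    """One handler per content model: serialize, unserialize, render."""
    def serialize(self, content): raise NotImplementedError
    def unserialize(self, blob): raise NotImplementedError
    def render(self, content): raise NotImplementedError

class WikitextHandler(ContentHandler):
    def serialize(self, content): return content
    def unserialize(self, blob): return blob
    def render(self, content):
        return f"<p>{content}</p>"  # stand-in for the wikitext parser

class JsonHandler(ContentHandler):
    def serialize(self, content): return json.dumps(content)
    def unserialize(self, blob): return json.loads(blob)
    def render(self, content):
        return "<pre>" + json.dumps(content, indent=2) + "</pre>"

# The "content model" identifier stored with each page selects the handler.
HANDLERS = {"wikitext": WikitextHandler(), "json": JsonHandler()}

def render_page(model, blob):
    handler = HANDLERS[model]
    return handler.render(handler.unserialize(blob))
```

Under this scheme, a Wikidata page stored as JSON and an ordinary article would go through the same code path, differing only in which handler the page's model id selects.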
I have described my plans in more detail on meta:
http://meta.wikimedia.org/wiki/Wikidata/Notes/ContentHandler
A very rough prototype is in a dev branch here:
http://svn.wikimedia.org/svnroot/mediawiki/branches/Wikidata/phase3/
Please let me know what you think (here on the list, preferably, not on the talk
page there, at least for now).
Note that we *definitely* need this ability for Wikidata. We could do it
differently, but I think this would be the cleanest solution, and would have a
lot of mid- and long term benefits, even if it's a short term pain. I'm
presenting my plan here to find out if I'm on the right track, and whether it is
feasible to put this on the road map for 1.20. It would be my (and the Wikidata
team's) priority to implement this and see it through before Wikimania. I'm
convinced we have the manpower to get it done.
Cheers,
Daniel
I just wanted to make you aware that enotifs on enwiki (like on dewiki)
are now sent with an additional "convenience link" in the footer.
The link allows mail recipients to easily "unwatch" the page by
following it (users must log in first if they are not already logged in).
See https://bugzilla.wikimedia.org/show_bug.cgi?id=36331