Hi,
I'm Pubudu Fernando and I'm currently a first year Computer Science and
Engineering student at the University of Moratuwa in Sri Lanka. For my GSoC
'14 project, I would like to work on switching Semantic Forms
autocompletion to use the Select2 library.
Best regards,
Pubudu Fernando
Hello and welcome to the latest edition of the WMF Engineering Roadmap
and Deployment update.
The full log of planned deployments next week can be found at:
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_March_24th
Notable items...
== Tuesday ==
* MediaWiki deploy window, currently following the 1.23 schedule
** group1 to 1.23wmf19: All non-Wikipedia sites (Wiktionary, Wikisource,
Wikinews, Wikibooks, Wikiquote, Wikiversity, and a few other sites)
** <https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf19>
** Schedule:
<https://www.mediawiki.org/wiki/MediaWiki_1.23/Roadmap#Schedule_for_the_depl…>
* Switching the Compact Personal Bar Beta Feature discussion page to
Flow
** <https://www.mediawiki.org/wiki/Talk:Compact_Personal_Bar>
== Wednesday ==
* Enable Hovercards on all wikis as a BetaFeature
** <https://www.mediawiki.org/wiki/Beta_Features/Hovercards>
== Thursday ==
* MediaWiki deploy window, currently following the 1.23 schedule
** group2 to 1.23wmf19 (all Wikipedias)
** group0 to 1.23wmf20 (test/test2/testwikidata/mediawiki)
** <https://www.mediawiki.org/wiki/MediaWiki_1.23/wmf20>
* As part of the MediaWiki rollout, the Typography Refresh beta feature
will move from the VectorBeta extension/Beta Feature to the Vector skin in
MediaWiki core. This effectively updates the default skin for all
users with improved readability.
** See the summary of changes:
<https://www.mediawiki.org/wiki/Typography_refresh#Summary_of_changes>
** This will roll out with the 1.23wmf20 branch mentioned above, which
means it will gradually roll out to all wikis following the normal
progression (testwikis & mediawiki.org on Thursday -> non-wikipedias
(eg Commons etc) on Tuesday -> All wikis following Thursday).
Thanks, and as always, questions welcome,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
I realize this has been discussed on this list rather recently (starting with Jon's CologneBlue question), and I realize some exploratory work has started (or at least was considered), but I'm submitting this anyway. I came up with the general idea first (really!), just haven't had time to write it down before.
Several people asked me to reply regarding the CologneBlue thread – consider this my response. :)
Comments (here or on the talk page) would be very welcome. I haven't gotten anyone to formally commit to mentoring this project yet; hopefully that can be sorted out in time.
https://www.mediawiki.org/wiki/User:Matma_Rex/Separating_skins_from_core_Me…
Project synopsis:
MediaWiki core includes four skins, and allows site administrators to create and install additional ones. However, the process is less than pleasant, due to several related problems (lack of documentation, more than one "correct" way to make a skin work, directory layout that makes packaging and (un)installation difficult, core skins and MediaWiki itself being interdependent, and possibly others).
I intend to solve at least two of the aforementioned issues by devising and documenting a saner directory layout for skins (and applying it to the four core ones) and then carefully disentangling them from MediaWiki code, removing cross-dependencies and making it possible for non-core skins to have the same level of control over all aspects of the look&feel as core ones currently have. This would make the lives of both skin creators and site administrators wishing to use a non-default skin a lot easier.
If everything goes well, the process would culminate in moving the core skins out of core, to separate git repositories. This would require coordination with MediaWiki release managers (to have them shipped in the release tarballs the way certain extensions are shipped now) and Wikimedia Foundation Operations team members (to ensure the deployment of the new system on Wikimedia wikis goes smoothly), so it cannot be made a part of my core proposal.
--
Matma Rex
If you're looking for a paid open source-related internship this summer
and you missed the deadline for OPW*/GSoC**, check out
http://openhatch.org/blog/2014/summer-internships-for-open-source-enthusias…
. Deadlines vary from March 31st to May 15th to "none" and eligibility
requirements vary (for some, you do have to be a student).
--
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation
* A few organizations, such as Mozilla, have extended their OPW deadline
to March 31st. Wikimedia has not.
** Or maybe you submitted an OPW/GSoC application but aren't sure it'll
get accepted.
OK, this is killing me. I'm trying to upload files to Commons (using
PHP/CURL).
* I can upload local files with my own bot user.
* I can upload from remote URLs using OAuth, /if the user is an admin/
What I can't figure out is how to upload local files via OAuth. It's either
"File upload param file is not a file upload; be sure to use
multipart/form-data for your POST and include a filename in the
Content-Disposition header."
or (trying to add multipart/form-data to the header)
"The authorization headers in your request are not valid: Invalid signature"
Is there any example code for uploading local files to Commons via OAuth? A
trick I can't find? Anything?
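For what it's worth, here is my understanding of how the signing is supposed to work, as a simplified sketch in Python rather than PHP (the keys, tokens, and nonce are all placeholders). Per OAuth 1.0a, a multipart/form-data body is *excluded* from the signature base string, so only the oauth_* parameters (plus any query-string parameters) get signed; including the form fields in the signature might explain the "Invalid signature" error.

```python
# Sketch of OAuth 1.0a HMAC-SHA1 signing for a multipart/form-data POST
# (e.g. an api.php file upload). Key point: with a multipart body, the
# body parameters are NOT part of the signature base string -- only the
# query string and the oauth_* parameters are.
# All keys, tokens, and the nonce below are placeholders.
import base64, hashlib, hmac, time, urllib.parse

def pct(s):
    # RFC 3986 percent-encoding as required by OAuth 1.0a
    return urllib.parse.quote(s, safe="~-._")

def sign_multipart_post(url, oauth_params, consumer_secret, token_secret):
    # Sort and encode only the oauth_* (and query-string) parameters;
    # the multipart body is deliberately left out of the base string.
    pairs = sorted((pct(k), pct(v)) for k, v in oauth_params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base = "&".join(["POST", pct(url), pct(param_str)])
    key = f"{pct(consumer_secret)}&{pct(token_secret)}".encode()
    digest = hmac.new(key, base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

params = {
    "oauth_consumer_key": "example-consumer-key",   # placeholder
    "oauth_token": "example-access-token",          # placeholder
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": str(int(time.time())),
    "oauth_nonce": "abc123",                        # placeholder
    "oauth_version": "1.0",
}
sig = sign_multipart_post(
    "https://commons.wikimedia.org/w/api.php",
    params, "consumer-secret", "token-secret")
```

The resulting signature goes into the Authorization header, while the file itself travels only in the multipart body, unsigned.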
Cheers,
Magnus
Hello everyone,
My name is Konarak Ratnakar. I'm planning on creating a new
localisation update service, named LUv2, as a Summer of Code project.
You can read the full details on the proposal page at
https://www.mediawiki.org/wiki/Extension:LocalisationUpdate/LUv2.
I would be glad to answer any questions you might have. Just post
them on the talk page.
Best,
--
Konarak Ratnakar
Hi,
I'm Erick Guan, a second-year student and Wikipedian since 2008 who is
interested in MediaWiki development.
In general, the goals of this project are to redesign the
Special:SearchTranslations page, fix bugs, and build an alternative backend
based on ElasticSearch. This would make Extension:Translate more powerful
and easier to use.
The Wikimedia Language Engineering team has described one-stop translation
search in detail on the GSoC feature ideas page. The Bugzilla entry and the
visual spec define exactly what it will be and what we want to achieve.
It's not hard to work on, and I intend to actually make it happen.
Here is my proposal:
https://www.mediawiki.org/wiki/User:Fantasticfears/GSoC_2014_2#One_stop_tra…
--
Regards,
Erick Guan/管啸 (fantasticfears)
Hi everyone, I am a GSoC 2014 participant. I am looking forward to
contributing to the "Book Management in Wikibooks/Wikisource" project for
Wikimedia.
Here is a link to my proposal page on mediawiki.org:
https://www.mediawiki.org/wiki/User_talk:Hrishy. Please have a look at it
and suggest any edits I can make.
Hi!
I think it would be a good idea to introduce support for files in TeX and
ABC/Lilypond (Score extension) formats, so that such files could be hosted
on Commons.
This would simplify the maintenance of formulas and music across projects,
as well as allow mathematical and musical notation to be referenced from
Wikidata.
Eugene.