Hi,
While working on implementing VERP for MediaWiki[1], Nemo pointed
me to Tyler's recommendation[2] on shifting from PHPMailer to Swift
Mailer[3]. Quoting Tyler's words:
"PHPMailer has everything packed into a few classes, whereas Swift_Mailer
actually has a separation of concerns, with classes for attachments,
transport types, etc. A result of this is that PHPMailer has two different
functions for embedding multimedia: addEmbeddedImage() for files and
addStringEmbeddedImage() for strings. Another example is that PHPMailer
supports only two bodies for multipart messages, whereas Swift_Mailer will
add in as many bodies as you tell it to since a body is wrapped in its own
object. In addition, PHPMailer only really supports SMTP, whereas
Swift_Mailer has an extensible transport architecture, and multiple
transport providers. (And there's also plugins, and monolog integration,
etc".
My mentors also think this is a good idea, and Nemo
recommended adding it to my GSoC project deliverables here (
https://www.mediawiki.org/wiki/VERP#Deliverables ). However, we need more
community consensus on this, since the switch to Swift Mailer would have
to happen first, with VERP then built as a plugin on top of it. I have
opened a BZ ticket for this (
https://bugzilla.wikimedia.org/show_bug.cgi?id=63483 ).
Please comment on this thread or on the BZ ticket regarding the switch,
as it is the necessary first step. The discussions we have had on this so
far are here:
https://www.mediawiki.org/wiki/Talk:VERP#Swift_Mailer_and_VERP__40928.
[1]: https://www.mediawiki.org/wiki/VERP
[2]:
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Third-party_compon…
[3]: http://swiftmailer.org/
Thanks,
Tony Thomas <http://tttwrites.in>
FOSS@Amrita <http://foss.amrita.ac.in>
*"where there is a wifi,there is a way"*
Hi, today GeoData spatial searches were switched from Solr to
Elasticsearch, for now only on testwiki. We're going to test it there
for a while before proceeding to production wikis. Suggestions on where
to start are welcome - while GeoData is populated on many projects, it is
rarely used, with a few exceptions:
5825 eswiki
2046 enwiki
223 zhwiki
216 svwiki
155 fawiki
130 ruwiki
100 dewiki
75 itwiki
61 frwiki
55 arwiki
54 jawiki
I would like to start with one project; however, only itwiki and kowiki
are using Elasticsearch among the large Wikipedias.
And finally, appreciation: this was made possible only thanks to awesome
help from our search team, Nik Everett and Chad Horohoe. You kick ass
guys!
--
Best regards,
Max Semenik ([[User:MaxSem]])
On second thought, do we want to add an optional "affiliation" field to the signup form, so that the affiliation goes at the end of the username in parentheses?
- DGarry (WMF)
- Fred (DesignSolutionsInc)
- David (MIT)
- ...
So the signup form would look like this:
-------------------------------------------------------------
| |
| [ Username preview in large green font ] |
| |
| Username: |
| ___________________ |
| Password: |
| ___________________ |
| Password 2: |
| ___________________ |
| Email (optional): |
| ___________________ |
| Affiliation (optional; if your editing is related to work): |
| ___________________ |
| |
-------------------------------------------------------------
I.e.
-------------------------------------------------------------
| |
| [ "Gryllida (FOO)" in large green font ] |
| |
| Username: |
| _Gryllida__________ |
| Password: |
| ___________________ |
| Password 2: |
| ___________________ |
| Email (optional): |
| ___________________ |
| Affiliation (optional; if your editing is related to work): |
| _FOO_______________ |
| |
-------------------------------------------------------------
Gryllida.
On Sun, 23 Feb 2014, at 1:25, rupert THURNER wrote:
> hi,
>
> could wmf please extend the mediawiki software in the following way:
> 1. it should know "groups"
> 2. allow users to store an arbitrary number of groups with their profile
> 3. allow the user to select one of their joined "groups" to attach to an edit when saving
> 4. add a checkbox "COI" to an edit, meaning "potential conflict of interest"
> 5. display and filter edits marked with COI in a different color in history
> views
> 6. display and filter edits done for a group in a different color in
> history views
> 7. allow members of a group to receive notifications about activity on the group
> page, or when the group is mentioned in an edit/comment/talk page.
>
> reason:
> currently it is quite cumbersome to participate as an organisation. it is
> quite cumbersome for people as well to detect COI edits. the most prominent
> examples are employees of the wikimedia foundation, and GLAMs. users tend
> to create multiple accounts, and try to create "company accounts". the main
> reasons for this behaviour are (examples, but of course generally valid):
> * have a feedback page / notification page for the swiss federal archive
> for other users
> * make clear whether an edit is done privately or as a wmf employee
>
> this then would allow the community to create new policies, e.g. the german
> community might cease using company accounts, and switch over to this
> system. this proposal is purely technical. current policies can still be
> applied if people do not need something else, e.g. wmf employees may
> continue to use "sue gardner (wmf)" accounts.
>
> what do you think?
>
> best regards,
> rupert
> -------------------
> swissGLAMour, http://wikimedia.ch
Currently I do:
* clone a repo
* setup git & hooks
# patch 1:
* apply my changes
* commit
* execute git-review
# patch 2:
* apply my changes
* commit
FAIL - the new commit depends on the previous commit - I can't push
What am I supposed to do in order to push multiple separate patches?
GIT-IDIOT way please, no long explanations, just commands and examples. Thanks
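For reference, the usual fix (a minimal sketch; it assumes the remote is
named "origin" and the target branch is "master") is to start each
independent patch on its own branch based on master, so the second commit
does not sit on top of the first:

    git checkout -b patch1 origin/master
    # ...edit files for patch 1...
    git commit -a
    git review

    git checkout -b patch2 origin/master   # branch off master again, NOT off patch1
    # ...edit files for patch 2...
    git commit -a
    git review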
I'm trying to understand what our current situation is and what our
choices are around HTML templating systems and MediaWiki, so I'm gonna
note what I think I understand so far in this mail and then would love
for people to correct me. TL;DR - did we already consense on a
templating system and I just missed it?
Description: An HTML template system (also known as a templating
engine) lets you (the programmer) write something that looks more like a
document than it looks like code, then has hooks/entry points/macro
substitution points (for user input and whatnot) that then invoke code,
then emits finished HTML for the browser to render.
Examples: PHP itself is kinda a templating language. In the PHP world,
Smarty is a somewhat more mature/old-school choice. Mustache.js is a
popular modern choice. And in other languages, you'd typically reach for the
templating offered by popular MVC frameworks, e.g. Django or Jinja in Python.
Spectrum of approaches: One approach treats HTML as a string ("here's a
bunch of bytes to interpolate"). From a security perspective, this is
dangerously easy to have vulnerabilities in, because you just naively
insert strings. Then on the other end of the spectrum, you have code
that always keeps the document object model (DOM) in memory, so the
programmer is abstractly manipulating that data model and passing around
an object. Sure, it spits out HTML in the end, but inherent in the
method for turning those objects into HTML is a sanitization step, so
that's inherently more secure. There's some discussion at
https://www.mediawiki.org/wiki/Parsoid/Round-trip_testing/Templates . I
presume we want the latter, but that the former model is more performant?
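To make that contrast concrete, here's a tiny PHP sketch (deliberately not
using any of the libraries named below; Html::element is MediaWiki's
existing escaping helper):

    // String end of the spectrum: user input is pasted straight into
    // markup, so a value like '<script>...</script>' becomes live HTML.
    $html = "<p>Hello, $userName!</p>";

    // Escaping end: the value is treated as text, not markup, so the same
    // input is rendered inert. DOM-based systems go further and keep an
    // object model in memory, but the key property is the same: values are
    // never trusted as markup.
    $html = Html::element( 'p', array(), "Hello, $userName!" );
    // or with plain PHP:
    $html = '<p>' . htmlspecialchars( "Hello, $userName!" ) . '</p>';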
We talked about this stuff in
https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-02-21
and
https://www.mediawiki.org/wiki/Talk:Architecture_Summit_2014/HTML_templatin…
. Based on that plus
https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_clusters#HTML_t…
it seems like we are supposed to get consensus on which system(s) to
use, and we kind of have four things we could choose:
* oojs - https://www.mediawiki.org/wiki/OOjs_UI -- could use this
toolkit with one of the template approaches below, or maybe this is
enough by itself! Currently used inside VisualEditor and I am not sure
whether any other MediaWiki extensions or teams are using it? This is a
DOM-based templating system.
Template approaches (possibly competing):
* MVC framework - Wikia has written their own templating library,
Nirvana. Owen Davis is talking about this tomorrow in the
RFC review meeting.
https://www.mediawiki.org/wiki/Requests_for_comment/MVC_framework
* mustache.js stuff - Ryan Kaldari and Chris Steipp mentioned this I think?
* Knockout-compatible implementation in Node.js & PHP
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library…
and
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library…
, being worked on by Gabriel Wicke, Matt Walker, and others. DOM-based.
There's also an OutputPage refactor suggested in
https://www.mediawiki.org/wiki/Requests_for_comment/OutputPage_refactor
that's part of the HTML Templating RFC Cluster
https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_clusters#HTML_t…
.
I guess my biggest question right now is whether I have all the big
moving parts right in my summary above. Thanks.
--
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation
There've been some issues reported lately with image scaling, where
resource usage on very large images has been huge (problematic for batch
uploads from a high-resolution source). Even scaling time for typical
several-megapixel JPEG photos can be slower than desired when loading up
into something like the MMV extension.
I've previously proposed limiting the generatable thumb sizes and
pre-generating those fixed sizes at upload time, but this hasn't been a
popular idea because of the lack of flexibility and potentially poor
client-side scaling or inefficient network use sending larger-than-needed
fixed image sizes.
Here's an idea that blends the performance benefits of pre-scaling with the
flexibility of our current model...
A classic technique in 3d graphics is
mip-mapping<https://en.wikipedia.org/wiki/Mip-mapping>,
where an image is pre-scaled to multiple resolutions, usually each 1/2 the
width and height of the next level up.
When drawing a textured polygon on screen, the system picks the most
closely-sized level of the mipmap to draw, reducing the resources needed
and avoiding some classes of aliasing/moiré patterns when scaling down. If
you want to get fancy you can also use trilinear
filtering<https://en.wikipedia.org/wiki/Trilinear_filtering>,
where the next-size-up and next-size-down mip-map levels are combined --
this further reduces artifacting.
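As a rough sketch of the selection step (the function name and the numbers
are made up for illustration, not an existing MediaWiki API):

    /**
     * Pick the smallest pre-generated mipmap level that is still at least
     * as wide as the requested thumbnail, so the scaler never has to read
     * the full-size original.
     */
    function pickMipLevelWidth( $originalWidth, $requestedWidth ) {
        $levelWidth = $originalWidth;
        while ( intval( $levelWidth / 2 ) >= $requestedWidth ) {
            $levelWidth = intval( $levelWidth / 2 );
        }
        return $levelWidth;
    }

    // A 10000px-wide original with an 800px thumbnail request:
    // candidate levels are 10000, 5000, 2500, 1250, 625, ...
    // 1250 is the smallest level still >= 800, so the thumb is scaled
    // from the 1250px level rather than the 10000px original.
    pickMipLevelWidth( 10000, 800 ); // returns 1250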
I'm wondering if we can use this technique to help with scaling of very
large images:
* at upload time, perform a series of scales to produce the mipmap levels
* _don't consider the upload complete_ until those are done! a web uploader
or API-using bot should probably wait until it's done before uploading the
next file, for instance...
* once upload is complete, keep on making user-facing thumbnails as
before... but make them from the smaller mipmap levels instead of the
full-scale original
This would avoid changing our external model -- where server-side scaling
can be used to produce arbitrary-size images that are well-optimized for
their target size -- while reducing resource usage for thumbs of huge
source images. We can also still do things like applying a sharpening
effect on photos, which people sorely miss when it's not applied.
If there's interest in investigating this scenario I can write up an RfC
with some more details.
(Properly handling multi-page files like PDFs, DjVu, or paged TIFFs could
complicate this by making the initial rendering extraction pretty slow,
though, so that needs consideration.)
-- brion
Following change I189ba71de[0], the hierarchical list in
Special:Allpages becomes a simple alphabetic pager if the total number
of pages exceeds a safety threshold. The threshold is designed to
protect wikis on which the load generated by building the hierarchical
list would be prohibitively expensive (bug 56840[1]).
I189ba71de resolved the immediate operational issue, but there is a
further question of whether we want to keep the hierarchical list at
all, especially given that it cannot be enabled (in its current
implementation, at least) on larger installations.
From my perspective, the ideal outcome of this discussion would be
that we agree that the hierarchical list is a poor fit for the
MediaWiki of today, and we resolve to remove it from core.
According to stats.grok.se, enwiki's Special:Allpages receives
approximately 158 hits a day.[2]
[0]: https://gerrit.wikimedia.org/r/#/c/94690/
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=56840
[2]: http://stats.grok.se/en/latest90/Special:Allpages
---
Ori Livneh
ori(a)wikimedia.org
I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries, by providing the data as an executable
JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of
CSRF protection tokens. This allows compromise of the wiki's user accounts,
say by changing the user's email address and then requesting a password
reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
further compromise, especially on private wikis where the set of allowed
file types is broader than on public wikis. Note that CSRF allows
compromise of a wiki from an external website even if the wiki is behind
a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens is not sufficiently secure. Instead we use various
more secure random number generators, depending on what is available on
the platform. Windows users are strongly advised to install either the
openssl extension or the mcrypt extension for PHP so that MediaWiki can
take advantage of the cryptographic random number facility provided by
Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
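For extension authors the switch is roughly the following (a minimal
sketch, assuming the MWCryptRand static methods described above):

    // Before: mt_rand() is predictable and unsuitable for security tokens.
    $token = md5( mt_rand() );

    // After: ask MWCryptRand for 32 hex characters drawn from the
    // strongest random source available on the platform.
    $token = MWCryptRand::generateHex( 32 );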
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible
with other extensions or perhaps even the MediaWiki core alone, although
this is not confirmed at this time. A denial-of-service attack (infinite
loop) is also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
*********************************************************************
What's new?
*********************************************************************
MediaWiki 1.19 brings the usual host of bug fixes and new features.
A comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math,
and render as PNG by default.
* MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy
piles of generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the
user link.
* Searching for blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki
cache is used (used in the API and the Interwiki extension).
Internationalization
--------------------
* More gender support (for instance in user lists).
* Added languages: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Release notes
-------------
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git). The relevant
commits for these releases will therefore not appear in our SVN
repository. If you use SVN checkouts of MediaWiki for version control,
you need to migrate these to Git. If you are using tarballs, there should
be no change in the process for you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may
not work immediately, but should work later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list
at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz
Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html