Are you good at swearing? WE NEED YOU
Huggle 3 comes with vandalism prediction: it precaches the diffs,
including their contents, even before they are enqueued. Each edit has
a so-called "score", a numerical value; the higher it is, the more
likely the edit is vandalism.
If you want to help us improve this feature, a "score words" list
needs to be defined for every wiki where Huggle is about to be used,
for example on the English Wikipedia.
Each list has the following syntax:
(see https://en.wikipedia.org/w/index.php?title=Wikipedia:Huggle/Config&diff=573…)
score-words(score):
a list of words separated by commas; the list can contain newlines,
but the commas must be present
For example:
score-words(200):
these, are, some, words, which, presence, of, increases, the, score,
each, word, by, 200,
So, if you know English better than I do, which you likely do, go
ahead and improve the configuration file there. No worries: Huggle's
config parser is very tolerant of syntax errors.
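As an illustration of how such a list could drive scoring, here is a
minimal Python sketch. The parsing rules and function names here are my
own assumptions based on the syntax shown above, not Huggle's actual
implementation:

```python
import re

# Hypothetical parser for the score-words syntax described above; an
# illustrative sketch, NOT Huggle's real config parser.
def parse_score_words(config_text):
    """Turn score-words(N): blocks into a {word: score} mapping."""
    scores = {}
    pattern = r'score-words\((\d+)\):\s*(.*?)(?=score-words\(|\Z)'
    for match in re.finditer(pattern, config_text, re.S):
        score = int(match.group(1))
        for word in match.group(2).split(','):
            word = word.strip().lower()
            if word:
                scores[word] = score
    return scores

def score_edit(edit_text, scores):
    """Sum the scores of all configured words appearing in the edit."""
    text = edit_text.lower()
    return sum(s for w, s in scores.items() if w in text)

config = """score-words(200):
these, are, some, words,"""
scores = parse_score_words(config)
```

In this sketch every matched word adds its score, so an edit containing
several flagged words accumulates a higher total, matching the "each
word by 200" idea in the example.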
If you have any other suggestions for improving Huggle's prediction,
go ahead and tell us!
I have a log of what happens when the commands:
sudo apt-get install mediawiki2latex
mediawiki2latex -u https://en.wikipedia.org/wiki/Adam_Ries -o AdamRies.pdf
are entered on the command line of Ubuntu (13.10). Better than TV...
Happy to send it to anyone.
Fred
Hi, in response to bug 54607 [1], we've changed the semantics of the
mobileformat parameter to action=parse.
== Summary ==
Previously, it accepted the strings 'html' or 'wml' (later just
'html') and modified the structure of the output (see below). This was
problematic because you needed to retrieve the HTML from the output in
different ways, depending on whether mobileformat was specified or
not. Now, mobileformat is a boolean parameter: if a 'mobileformat'
parameter is present in the request, it will be treated as "the output
should be mobile-friendly", regardless of its value, and the output
structure will be the same. For compatibility with older callers,
mobileformat=(html|wml) will be special-cased to return the older
structure for at least 6 months from now. These changes will start
being rolled out to the WMF sites starting tomorrow, Tuesday October
24th, and the process will be complete by October 31st.
== Examples ==
=== Non-mobile parse ===
api.php?action=parse&format=json
{
"parse": {
"title": "...",
"text": {
"*": "foo"
}
}
}
api.php?action=parse&format=xml
<?xml version="1.0"?>
<api>
<parse title="..." displaytitle="...">
<text xml:space="preserve">foo</text>
</parse>
</api>
=== Parse that outputs mobile HTML, old style ===
api.php?action=parse&format=json&mobileformat=html
{
"parse": {
"title": "API",
"text": "foo"
}
}
api.php?action=parse&format=xml&mobileformat=html
<?xml version="1.0"?>
<api>
<parse title="..." text="foo" displaytitle="...">
</parse>
</api>
=== Parse that outputs mobile HTML, new style ===
api.php?action=parse&format=...&mobileformat
Same as for non-mobile parses.
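For callers, the practical effect can be sketched in Python. The helper
below is hypothetical, but the two JSON shapes are exactly the example
responses above:

```python
import json

# Hypothetical helper showing how a caller can read the parsed HTML
# uniformly; only the legacy mobileformat=html|wml special case
# flattens "text" to a bare string.
def extract_parse_html(response_text, old_mobile_style=False):
    """Return the HTML from an action=parse JSON response."""
    data = json.loads(response_text)
    text = data["parse"]["text"]
    if old_mobile_style:
        # Legacy mobileformat=html|wml: "text" is a plain string.
        return text
    # Non-mobile parses and the new boolean mobileformat: {"*": html}.
    return text["*"]

new_style = '{"parse": {"title": "API", "text": {"*": "foo"}}}'
old_style = '{"parse": {"title": "API", "text": "foo"}}'
```

With the new boolean semantics, the `old_mobile_style` branch becomes
unnecessary: one code path handles both mobile and non-mobile parses.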
== FAQ ==
Q: I didn't use mobileformat before, does anything change for me?
A: No.
Q: I use mobileformat=html, will my bot/tool be broken now?
A: No, you will have 6 months to switch to the new style.
Q: I'm only planning to use mobileformat, what should I do?
A: Just use the new style.
Q: How did this format discrepancy appear in the first place?
A: To err is human.
-----
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=54607
--
Best regards,
Max Semenik ([[User:MaxSem]])
Currently I do:
* clone a repo
* setup git & hooks
# patch 1:
* apply my changes
* commit
* execute git-review
# patch 2:
* apply my changes
* commit
FAIL - the new commit depends on the previous commit - I can't push
What am I supposed to do in order to push multiple separate patches?
GIT-IDIOT way please, no long explanations, just commands and examples. Thanks
Following change I189ba71de[0], the hierarchical list in
Special:Allpages becomes a simple alphabetic pager if the total number
of pages exceeds a safety threshold. The threshold is designed to
protect wikis on which the load generated by the process of generating
the hierarchical list would be prohibitively expensive (bug 56840[1]).
I189ba71de resolved the immediate operational issue, but there is a
further question of whether we want to keep the hierarchical list at
all, especially given that it cannot be enabled (in its current
implementation, at least) on larger installations.
From my perspective, the ideal outcome of this discussion would be
that we agree that the hierarchical list is a poor fit for the
MediaWiki of today, and we resolve to remove it from core.
According to stats.grok.se, enwiki's Special:Allpages receives
approximately 158 hits a day.[2]
[0]: https://gerrit.wikimedia.org/r/#/c/94690/
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=56840
[2]: http://stats.grok.se/en/latest90/Special:Allpages
---
Ori Livneh
ori(a)wikimedia.org
I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of
private data across domain origin boundaries, by providing the data as
an executable JavaScript file. In MediaWiki 1.18 and later, this
includes the leaking of CSRF protection tokens. This allows compromise
of the wiki's user accounts, say by changing the user's email address
and then requesting a password reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead
to further compromise, especially on private wikis where the set of
allowed file types is broader than on public wikis. Note that CSRF
allows compromise of a wiki from an external website even if the wiki
is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to
generate password reset tokens is not sufficiently secure. Instead we
use various more secure random number generators, depending on what is
available on the platform. Windows users are strongly advised to
install either the openssl extension or the mcrypt extension for PHP
so that MediaWiki can take advantage of the cryptographic random
number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers
in contexts where security is required are encouraged to instead make
use of the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
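The same principle applies outside PHP. As a sketch in Python: security
tokens should come from the operating system's CSPRNG (which is what a
class like MWCryptRand wraps on the PHP side) rather than from a
seedable generator like mt_rand():

```python
import random
import secrets

# A Mersenne Twister (the algorithm behind PHP's mt_rand) is fully
# predictable once its internal state is recovered -- unsuitable for
# password-reset tokens.
mt = random.Random(1234)                # seedable, hence reproducible
weak_token = "%032x" % mt.getrandbits(128)

# A CSPRNG-backed token, analogous in spirit to MWCryptRand:
strong_token = secrets.token_hex(16)    # 32 hex chars of OS entropy
```

The two tokens look alike, but only the second is safe for security
decisions, because its bits come from the OS entropy source rather
than a deterministic algorithm.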
A long-standing bug in the wikitext parser (bug 22555) was discovered
to have security implications. In the presence of the popular
CharInsert extension, it leads to cross-site scripting (XSS). XSS may
be possible with other extensions or perhaps even the MediaWiki core
alone, although this is not confirmed at this time. A
denial-of-service attack (infinite loop) is also possible regardless
of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
*********************************************************************
What's new?
*********************************************************************
MediaWiki 1.19 brings the usual host of various bugfixes and new features.
A comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math,
and render as PNG by default.
* MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to
  copy piles of generic styles from MonoBook or Vector's css.
* The default user signature now contains a talk link in addition to the
user link.
* Searching blocked usernames in block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the
  interwiki cache is used (used in the API and the Interwiki
  extension).
Internationalization
--------------------
* More gender support (for instance in user lists).
* Added language: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Release notes
-------------
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git). So the
relevant commits for these releases will not be appearing in our SVN
repository. If you use SVN checkouts of MediaWiki for version control,
you need to migrate these to Git. If you are using tarballs, there
should be no change in the process for you.
Please note that any WMF-deployed extensions have also been migrated
to Git, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release
may not work instantly, but should later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing
list at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz
Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
Hi,
Just like every first Tuesday of the month [1], the Engineering
Community team has reviewed and fine tuned the monthly and quarterly plans:
https://www.mediawiki.org/wiki/Engineering_Community_Team/Meetings
With Google Code-in and Outreach Program for Women starting soon, this
month is expected to be very busy.
We also have our team's quarterly review, which we expect will be useful
to sync our longer term plans with our actual activities - see
https://www.mediawiki.org/wiki/Wikimedia_Engineering/2013-14_Goals#Wikimedi…
Last week we had the monthly ECT Showcase, just like every last Tuesday
of the month [1]. There we presented a selection of tasks completed in
October. You can watch the video at
http://www.youtube.com/watch?v=Y3Gl-oR2ucU (copy to Commons pending,
help welcome).
Next week we will have our monthly IRC meeting, just like every second
Tuesday of the month. You can propose topics to be discussed at
https://www.mediawiki.org/wiki/Engineering_Community_Team/Meetings#2013-11-…
That is all for now. We will use this thread for future updates.
Questions? Just ask.
[1] Just joking, we only completed a first iteration.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
I've been bothered for a while by the mess we have in
resources/jquery/ – 3rd party libraries, custom libraries we have to
maintain, and directly MW-related code using mediaWiki.* APIs, all
mixed together in the same directory. So I went and audited the .js we
have inside resources/jquery/ and have written up an RFC on it:
https://www.mediawiki.org/wiki/Requests_for_comment/Isolate_custom_jQuery_l…
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
I came across Gerrit change 79948[1] today, which makes "VectorBeta"
use a pile of non-free fonts (with one free font thrown in at the end
as a sop). Is this really the direction we want to go, considering
that in many other areas we prefer to use free software whenever we
can?
Looking around a bit, I see this has been discussed in some "back
corners"[2][3] (no offense intended), but not on this list and I don't
see any place where free versus non-free was actually discussed rather
than being brought up and then seemingly ignored.
In case it helps, I did some searching through mediawiki/core and
WMF-deployed extensions for font-family directives containing non-free
fonts. The results are at
https://www.mediawiki.org/wiki/User:Anomie/font-family (use of
non-staff account intentional).
[1]: https://gerrit.wikimedia.org/r/#/c/79948
[2]: https://www.mediawiki.org/wiki/Talk:Wikimedia_Foundation_Design/Typography#…
[3]: https://bugzilla.wikimedia.org/show_bug.cgi?id=44394
TL;DR SUMMARY: check out this short, silent, black & white video:
https://brionv.com/misc/ogv.js/demo/ -- anybody interested in a side
project on in-browser audio/video decoding fallback?
One of my pet peeves is that we don't have audio/video playback on many
systems, including default Windows and Mac desktops and non-Android mobile
devices, which don't ship with Theora or WebM video decoding.
The technically simplest way to handle this is to transcode videos into
H.264 (.mp4 files) which is well supported by the troublesome browsers.
Unfortunately there are concerns about the patent licensing, which has held
us up from deploying any H.264 output options though all the software is
ready to go...
While I still hope we'll get that resolved eventually, there is an
alternative -- client-side software decoding.
We have used the 'Cortado <http://www.theora.org/cortado/>' Java applet to
do fallback software decoding in the browser for a few years, but Java
applets are aggressively being deprecated on today's web:
* no Java applets at all on major mobile browsers
* Java usually requires a manual install on desktop
* Java applets disabled by default for security on major desktop browsers
Luckily, JavaScript engines have gotten *really fast* in the last few
years, and performance is getting well in line with what Java applets can
do.
As an experiment, I've cross-compiled Xiph's ogg, vorbis, and theora
C libraries to JavaScript using emscripten
<https://github.com/kripken/emscripten> and written a wrapper that
decodes Theora video from an .ogv stream and draws the frames into a
<canvas> element:
* demo: https://brionv.com/misc/ogv.js/demo/
* code: https://github.com/brion/ogv.js
* blog & some details:
https://brionv.com/log/2013/10/06/ogv-js-proof-of-concept/
It's just a proof of concept -- the colorspace conversion is incomplete so
it's grayscale, there's no audio or proper framerate sync, and it doesn't
really stream data properly. But I'm pleased it works so far! (Currently it
breaks in IE, but I think I can fix that at least for 10/11, possibly for
9. Probably not for 6/7/8.)
Performance on iOS devices isn't great, but is better with lower resolution
files :) On desktop it's screaming fast for moderate resolutions, and could
probably supplement or replace Cortado with further development.
Is anyone interested in helping out or picking up the project to move it
towards proper playback? If not, it'll be one of my weekend "fun" projects
I occasionally tinker with off the clock. :)
-- brion