Hi all,
With all the talk about turning on $wgSecureLogin for WMF sites, there have
been a lot of misconceptions about how the option works, and differences of
opinion about how it should work in the future.
I started:
https://www.mediawiki.org/wiki/Requests_for_comment/Login_security
It would be great to get feedback on the "Longer Term Questions" section.
Also, if anyone isn't entirely clear about how the preferences work,
hopefully this will provide some clarification.
The OpenID extension loads a CSS file via ResourceLoader (RL):
$myResourceTemplate = array(
    'localBasePath' => $path . '/skin',
    'remoteExtPath' => 'OpenID/skin',
    'group' => 'ext.openid',
);
...
$wgResourceModules['ext.openid.icons'] = $myResourceTemplate + array(
    'styles' => 'openid.css',
    'dependencies' => array(
        'ext.openid'
    )
);
openid.css comprises lines such as
#openid_provider_OpenID_icon { /*@embed*/ background-image: url(icons/OpenID_large.png); }
Question:
=========
How can I construct the background-image filename from a value in one of
the OpenID PHP modules?
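One possible approach (a sketch only, untested): skip ResourceLoader for the
dynamic rule and emit it inline from PHP, where the value is known. Here $out
is the relevant OutputPage and $provider is a hypothetical variable holding
the provider name:

// Sketch: build the icon URL from a PHP value and emit the CSS inline.
// Note this loses the /*@embed*/ data-URI optimization that ResourceLoader's
// CSSMin applies to the static stylesheet.
global $wgExtensionAssetsPath;
$iconUrl = "$wgExtensionAssetsPath/OpenID/skin/icons/{$provider}_large.png";
$out->addInlineStyle(
    "#openid_provider_{$provider}_icon { background-image: url($iconUrl); }"
);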
Hi, I'm a grad student at CMU studying network security in general and
censorship / surveillance resistance in particular. I also used to work
for Mozilla; some of you may remember me in that capacity. My friend
Sumana Harihareswara asked me to comment on Wikimedia's plans for
hardening the encyclopedia against state surveillance. I've read the
discussion to date on this subject, but it was kinda all over the map,
so I thought it would be better to start a new thread. Actually I'm
going to start two threads, one for general site hardening and one
specifically about traffic analysis. This is the one about site
hardening, which should happen first. Please note that I am subscribed
to wikitech-l but not wikimedia-l (but I have read the discussion over
there).
The roadmap at
https://blog.wikimedia.org/2013/08/01/future-https-wikimedia-projects/
looks to me to have the right shape, but there are some missing things
and points of confusion.
The first step really must be to enable HTTPS unconditionally for
everyone (whether or not logged in). I see on the roadmap that there is
concern that this will lock out large groups of users, e.g. from China;
a workaround simply *must* be found for this. Everything else that is
worth doing is rendered ineffective if *any* application layer data is
*ever* transmitted over an insecure channel. There is no point worrying
about traffic analysis when an active man-in-the-middle can inject
malicious JavaScript into unsecured pages, or a passive one can steal
session cookies as they fly by in cleartext.
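To be concrete about "unconditionally": every plain-HTTP request should be
answered with nothing but a redirect. A minimal sketch in PHP (the real
deployment would do this at the frontend proxies, and the hostname handling
here is simplified):

// Sketch: refuse to serve content over plain HTTP; redirect before any
// application data (cookies, page content) can go out on the wire.
if ( empty( $_SERVER['HTTPS'] ) || $_SERVER['HTTPS'] === 'off' ) {
    header( 'Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'],
        true, 301 );
    exit;
}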
As part of the engineering effort to turn on TLS for everyone, you
should also provide SPDY, or whatever they're calling it these days.
It's valuable not only for traffic analysis' sake, but because it offers
server-side efficiency gains that (in theory anyway) should mitigate the
TLS overhead somewhat.
After that's done, there's a grab bag of additional security refinements
that are deployable immediately or with minimal-to-moderate engineering
effort. The roadmap mentions HTTP Strict Transport Security; that should
definitely happen. All cookies should be tagged both Secure and HttpOnly
(which renders them inaccessible to accidental HTTP loads and to page
JavaScript); now would also be a good time to prune your cookie
requirements, ideally to just one which does not reveal via inspection
whether or not someone is logged in. You should also do
Content-Security-Policy, as strict as possible. I know this can be a
huge amount of development effort, but the benefits are equally huge -
we don't know exactly how it was done, but there's an excellent chance
CSP on the hidden service would have prevented the exploit discussed
here:
https://blog.torproject.org/blog/hidden-services-current-events-and-freedom…
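For concreteness, here is roughly what those header and cookie settings look
like at the PHP level. This is a sketch, not the WMF configuration: the
max-age, policy, and cookie name are placeholders, $token stands in for the
real session value, and MediaWiki has its own cookie plumbing
($wgCookieSecure, $wgCookieHttpOnly):

// Sketch: HSTS plus a strict CSP, and a cookie flagged Secure + HttpOnly.
header( 'Strict-Transport-Security: max-age=31536000; includeSubDomains' );
header( "Content-Security-Policy: default-src 'self'" );
setcookie( 'session', $token, 0, '/', '',
    true,  // Secure: never sent over plain HTTP
    true   // HttpOnly: invisible to page JavaScript
);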
Several people raised concerns about Wikimedia's certificate authority
becoming compromised (whether by traditional "hacking", social
engineering, or government coercion). The best available cure for this
is called "certificate pinning", which is unfortunately only doable by
talking to browser vendors right now; however, I imagine they would be
happy to apply pins for Wikipedia. There's been some discussion of an
HSTS extension that would apply a pin
(http://tools.ietf.org/html/draft-evans-palmer-key-pinning-00) and it's
also theoretically doable via DANE (http://tools.ietf.org/html/rfc6698);
however, AFAIK no one implements either of these things yet, and I rate
it moderately likely that DANE is broken-as-specified. DANE requires
DNSSEC, which is worth implementing for its own sake (it appears that
the wikipedia.org. and wikimedia.org. zones are not currently signed).
Perfect forward secrecy should also be considered at this stage. Folks
seem to be confused about what PFS is good for. It is *complementary* to
traffic analysis resistance, but it is not useless in the absence of it.
What it does is provide defense in depth against a server compromise by
a well-heeled entity who has been logging traffic *contents*. If you
don't have PFS and the server is compromised, *all* traffic going back
potentially for years is decryptable, including cleartext passwords and
other equally valuable info. If you do have PFS, the exposure is limited
to the session rollover interval. Browsers are fairly aggressively
moving away from non-PFS ciphersuites (see
<https://briansmith.org/browser-ciphersuites-01.html>; all of the
non-"deprecated" suites are PFS).
Finally, consider paring back the set of ciphersuites accepted by your
servers. Hopefully we will soon be able to ditch TLS 1.0 entirely (all
of its ciphersuites have at least one serious flaw). Again, see
<https://briansmith.org/browser-ciphersuites-01.html> for the current
thinking from the browser side.
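As a sketch of what paring back might look like, here is a PFS-leaning
OpenSSL cipher string. Applying it to a PHP stream context is purely
illustrative (the 'ciphers' SSL context option needs a recent PHP); on the
real servers this would live in the web server or TLS terminator config, and
the exact list is a judgment call:

// Sketch: prefer ECDHE/DHE (forward-secret) suites, exclude known-bad ones.
$ctx = stream_context_create( array( 'ssl' => array(
    'ciphers' => 'ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA:' .
                 'DHE-RSA-AES128-SHA:!RC4:!aNULL:!eNULL',
) ) );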
zw
As mentioned earlier this week, we deployed an initial version of the OAuth
extension to the test wikis yesterday. I wanted to follow up with a few
more details about it (although if you're just
curious about OAuth in general, I recommend starting at oauth.net, or
https://www.mediawiki.org/wiki/Auth_systems/OAuth):
* Use it: https://www.mediawiki.org/wiki/Extension:OAuth#Using_OAuth should
get you started towards using OAuth in your application.
* Demo: Anomie set up an excellent initial app (which I think counts as our
first official, approved consumer) here:
https://tools.wmflabs.org/oauth-hello-world/. Feel free to try it out and
get a feel for the user experience!
* Timeline: We're hoping to get some use this week, and deploy to the rest
of the WMF wikis next week if we don't encounter any issues.
* Bugs: Please open Bugzilla tickets for any issues you find or any
enhancement requests--
https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions…
And some other details for the curious:
* Yes, you can use this on your own wiki right now! It's meant to work in
both single-wiki and shared environments, so the defaults should work on a
standalone wiki. Input and patches are welcome if you have any issues
setting this up on your own wiki.
* TLS: Since a few of you seem to care about https... The extension
currently implements OAuth 1.0a, which is designed to be used without https
(except to deliver the shared secret to the app owner when the app is
registered), so calls to the API don't need to use https; see the signing
sketch after this list.
* Logging: All edits are tagged with the consumer's id (CID), so you can
see when OAuth was used to contribute an edit.
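To illustrate the TLS point (a minimal sketch with hypothetical variable
names; the extension's actual signing code may differ in detail): in OAuth
1.0a each request carries an HMAC-SHA1 signature computed over the request,
so the secrets themselves never travel with it.

// Sketch: OAuth 1.0a request signature. The secrets are used only as the
// HMAC key; the request carries the signature, never the secrets.
$base = 'GET&' . rawurlencode( $apiUrl ) . '&' . rawurlencode( $normalizedParams );
$key  = rawurlencode( $consumerSecret ) . '&' . rawurlencode( $tokenSecret );
$signature = base64_encode( hash_hmac( 'sha1', $base, $key, true ) );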
Enjoy!
<quote name="Sumana Harihareswara" date="2013-08-23" time="16:16:21 -0400">
> Thanks to Mark Holmquist for maintaining http://etherpad.wmflabs.org for
> the past long while. It is going down in 2 weeks, so please retrieve
> your text.
>
> I recommend that you:
>
> * go into your browser history
> * search it for etherpad.wmflabs.org
> * go to each of those pads and copy-and-paste the content someplace,
> preferably on a public wiki, even if it's just in your userspace
> * replace the content of the Etherpad with a link to the wiki page
> you've moved the text to
Has anyone made an automatic etherpad -> wiki script yet? I'd love to
maintain a text file listing the important pads I want synced to a wiki
page for public consumption; the file would probably need to map each pad
to its wiki page. My duckduckgo searching wasn't successful.
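If nothing exists yet, the shape of it is simple enough. A rough sketch
(untested; it assumes Etherpad Lite's plain-text export endpoint and leaves
the actual api.php edit POST, which needs an edit token, as a comment):

// Sketch: each line of pads.txt is "<pad-name> <wiki-page-title>".
foreach ( file( 'pads.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES ) as $line ) {
    list( $pad, $page ) = preg_split( '/\s+/', trim( $line ), 2 );
    $text = file_get_contents( "http://etherpad.wmflabs.org/p/$pad/export/txt" );
    // ...then POST $text to the wiki via api.php?action=edit for $page.
}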
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hello!
Here is your deployment highlights email for next week!
The full schedule can be found at:
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_August_26th
== Monday ==
* MediaWiki 1.22wmf14 will roll out to the second group of wikis:
All non-Wikipedia sites (Wiktionary, Wikisource, Wikinews, Wikibooks,
Wikiquote, Wikiversity, and a few other sites)
Information about MediaWiki 1.22wmf14:
https://www.mediawiki.org/wiki/MediaWiki_1.22/wmf14
* At the same time, Wikidata Phase 2 will be enabled on Wikivoyage.
== Tuesday ==
* The Mobile team will be enabling Notifications on mobile sites for all
Echo-enabled projects.
== Wednesday ==
* The much-anticipated new search backend will be deployed to
mediawiki.org.
* SecureLogin will be enabled for all wikis. The full (including
technical) details are
available here:
https://www.mediawiki.org/wiki/Requests_for_comment/Login_security
The official 'plan of record' for what will be happening can be seen
in RobLa's email here:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-August/071441.html
== Thursday ==
* MediaWiki 1.22wmf14 will be rolled out to all Wikipedias.
* CodeEditor support will be enabled for all JS and CSS pages on all wikis.
As always, let me know if you have any questions.
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hi,
a shameless plug to remind you of my regular blogging about Bugzilla use:
if you deal a lot with Wikimedia Bugzilla and with bug reports of varying
quality (for example because you triage bug reports), I recommend taking a
look at my latest blog post, which covers Greasemonkey scripts that can
save you some time:
http://blogs.gnome.org/aklapper/2013/08/23/bugzillatips-triage-helpertools-…
Comments, testing, feedback, patches, unicorns welcome.
Cheers & enjoy the weekend!
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
Hi,
It would be great if a few pairs of eyes could take a look at
https://meta.wikimedia.org/wiki/Tech/News/2013/34 before I send it to
translators, to check that I haven't missed anything super-important
or misunderstood what the commits are about.
The tech newsletter is aimed at non-expert Wikimedians whose knowledge
of English may be limited, so the language may seem vague or naive to
developers. If you see factual errors, please correct them (or let me
know directly), but please keep the language simple :)
Many thanks for your help.
--
Guillaume Paumier
Technical Communications Manager — Wikimedia Foundation
Hello,
The Wikimedia Language Engineering team will be hosting a bug triage
session on Wednesday, August 28th 2013 at 17:00 UTC (10:00 PDT) for
some of the bugs affecting languages written right-to-left (RTL).
During this one-hour session we will be using the etherpad
linked below to collaborate. We have already listed some bugs, but
please feel free to add more bugs (or file new ones!), and comments
about what you’d like to see addressed during the session. You can
send questions directly to me by email or on IRC (nick: arrbee). Please
see below for the event details.
Thank you.
regards
Runa
=== Event Details ===
# What: Bug triage session for RTL language bugs
# Date: August 28, 2013 (Wednesday)
# Time: 1700-1800 UTC, 1000-1100 PDT (Timezone conversion:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20130828T1700
)
# IRC Channel: #mediawiki-i18n (Freenode)
# Etherpad: https://etherpad.wikimedia.org/p/BugTriage-i18n-2013-08
Questions can be sent to: runa at wikimedia dot org
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation