Maybe you can try GNU Unifont: http://en.wikipedia.org/wiki/GNU_Unifont
TTF format: http://www.lgm.cl/trabajos/unifont/index.en.html
Original Message
Sender: Yuri Astrakhan <yastrakhan(a)wikimedia.org>
Recipient: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
Date: Wednesday, Jul 30, 2014 05:33
Subject: [Wikitech-l] Do we have a universal font in production?
I'm trying to render an image which uses characters from all of the languages supported by WP. Is there a single font deployed on the production servers that includes all scripts? Any simple font would do, preferably TTF Arial-style. Thanks!
The Wikimedia Research Hackathon on August 6 and 7 takes place parallel to
the general Wikimania Hackathon in London.
Wikimania Hackathon information is available at
https://wikimania2014.wikimedia.org/wiki/Hackathon
Research Hackathon information is available at
https://meta.wikimedia.org/wiki/Research:Labs2/Hackathons/August_6-7th,_2014
From the Research Hackathon info page: this "is an opportunity for anyone
interested in research on wikis, Wikipedia, and other open collaborations
to meet, share ideas, and work together. It's being organized by
researchers in academia and the Wikimedia Foundation, but we want anyone
interested in research to participate. Whether or not you consider yourself
a researcher, or would ever want to be one, come with questions, answers,
data, code, crazy ideas... or just your insatiable curiosity."
Local participation will occur at Wikimania London and in Philadelphia, PA,
US. Remote participation is possible and will include researchers and
community members globally.
Please see the Research Hackathon information page for scheduling and
sign-up details.
Further questions may be directed to Aaron Halfaker (ahalfaker(a)wikimedia.org)
or Leila Zia (leila(a)wikimedia.org).*
Pine
*A $1 fine will be imposed by Oliver Keyes on anyone who misspells Leila's
name or misdirects emails to the WMF Executive Director.
Hi everyone,
I’d like to announce an organizational change at the Wikimedia Foundation
in the Platform Engineering group. For those who aren't terribly
interested in how WMF's org chart looks, you can skip the rest of this
email. :-)
Yesterday, we formalized “Release Engineering” as a team, and promoted
Greg Grossmeier to “Release Team Manager” with everyone on the team
reporting to him.
In addition to Greg, the new team comprises:
* Antoine Musso
* Chris McMahon
* Dan Duvall
* Mukunda Modell
* Rummana Yasmeen
* Sam Reed
* Zeljko Filipin
They are broadly responsible for the lifecycle of code from the point
that a developer is ready to check it in through its deployment on our
site, maintaining the processes and tools that reduce negative user
impact of site software changes while simultaneously making software
change deployment efficient and joyful.
On a more detailed level, here are just a few things the group is responsible for:
* Code and bug report hosting - currently Gerrit and Bugzilla, but in
the glorious future, Phabricator
* Test infrastructure - the team maintains the Beta Cluster, with help
from TechOps
* Test automation - building the Cucumber/RSpec-based infrastructure
for automating browser tests
* Manual testing - actually looking at the product and making sure it
does what all the robots tell us it should be doing
* Test tools - tools that developers can use to test their own code
such as Vagrant
* Deployment tooling - the infrastructure we use to push code out to
production, like scap
More information about the team can be found here:
https://www.mediawiki.org/wiki/Wikimedia_Release_and_QA_Team
You may notice that that page has been around a while (August 2013).
Greg and Chris McMahon have been leading this as a “virtual team” for
the past year, with shared goal-setting and day-to-day organization.
This has demonstrated that there is a strong case for creating a
formalized team.
Please join me in congratulating Greg and wishing the newly formalized
team continued success!
Rob
This interesting bot showed up on hackernews today:
https://news.ycombinator.com/item?id=8018284
While in this instance access to anonymous editors' IP addresses is
definitely useful for identifying edits with a probable conflict of
interest, it makes me wonder what the history is behind the fact that
anonymous editors are identified by their IP addresses on WMF-hosted wikis.
IP addresses are closely guarded for registered users, so why aren't
anonymous users identified by a hash of their IP address in order to
protect their privacy as well? The exact same functionality of being able
to see all edits by a given anonymous IP would still exist; the IP itself
just wouldn't be publicly available, and would be protected with the same
access rights as registered users'.
The "use case" that makes me think of that is someone living in a
totalitarian regime making a sensitive edit and forgetting that they're
logged out. Or just being unaware that appearing anonymous on the wiki
doesn't stop their local authorities from figuring out who they are based
on IP address and time. Understanding that they're somewhat protected when
logged in and not when logged out requires a certain level of technical
understanding. The easy way out of this argument is to state that these
users should be using Tor or something similar. But I still wonder why we
have this double standard of protecting registered users' privacy in
regards to IP addresses and not applying the same for anonymous users, when
simple hashing would do the job.
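The proposal above can be sketched in a few lines. Note one caveat that makes "simple hashing" slightly less simple: a plain unsalted hash of an IPv4 address is trivially reversible by brute force (there are only about 4.3 billion addresses), so a keyed hash with a server-side secret is assumed here. All names below are hypothetical, not anything MediaWiki actually implements.

```python
import hmac
import hashlib

# Assumption: a server-side secret key, guarded like any other credential.
# Without it, an attacker could hash all ~4.3 billion IPv4 addresses and
# reverse the mapping.
SECRET_KEY = b"server-side-secret"

def anon_editor_id(ip: str) -> str:
    """Return a stable pseudonymous label for an anonymous editor.

    The same IP always maps to the same label, so "all edits by this
    anonymous user" still works; the raw IP is never exposed.
    """
    digest = hmac.new(SECRET_KEY, ip.encode("utf-8"), hashlib.sha256).hexdigest()
    return digest[:12]  # short label suitable for edit histories
```

Because the mapping is deterministic per key, range blocks and per-IP contribution lists would keep working, while only holders of the key (the same people who can see registered users' IPs today) could recover the address.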
Ambassadors (and developers),
I am tremendously happy to announce that the new PDF rendering service is
live for testing on the cluster. At this time, while we shake out
production bugs, it is only available via Special:Book using the 'e-book
(PDF, ocg latex renderer)' option. You can also render a specific page by
mangling a 'Download as PDF' sidebar URL as shown in [1]. Specifically, you
need to change the 'writer' GET param to rdf2latex.
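That URL mangling can be done mechanically. A minimal sketch, assuming a made-up example URL (the real links come from the 'Download as PDF' sidebar, and the old writer value shown here is only a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def set_writer(url: str, writer: str = "rdf2latex") -> str:
    """Return the URL with its 'writer' GET parameter set/replaced."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["writer"] = writer  # swap in the new OCG LaTeX renderer
    return urlunsplit(parts._replace(query=urlencode(query)))

# Hypothetical sidebar URL for illustration only.
url = "https://en.wikipedia.org/w/index.php?title=Special:Book&bookcmd=render_article&writer=old"
print(set_writer(url))
```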
Among other things, this service should have significantly better RTL and
non-Latin language support.
We do have two known large bugs:
* We do not yet have table support
* Lots of images fail to render -- this is a recent regression so we should
have a fix quickly.
If you have additional bugs to report, please file a bug in Bugzilla under
the Collection MediaWiki extension [2].
For fun plots see ganglia [3] or graphite [4] under the ocg/pdf node.
Note: During the deployment the new renderer was available in the sidebar.
This was reverted fairly quickly, but some pages may still have the link in
cache. It will go away on the next page render / purge.
[1]
https://en.wikipedia.org/w/index.php?title=Special:Book&bookcmd=render_arti…
[2]
https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions…
[3] http://ganglia.wikimedia.org/latest/?c=PDF%20servers%20eqiad
[4] graphite.wikimedia.org
~Matt Walker
Wikimedia Foundation
I have experience with Ubuntu but MediaWiki says that Ubuntu is
unsupported. Which Linux distro would people recommend, and which distro of
Linux does WMF use for MediaWiki? I am thinking about installing Debian but
am open to any suggestions that have a friendly UX.
Solaris is an option also.
Pine
Hi!
I have the following setup: I develop on my local machine (PhpStorm) and
save the files in one of the Vagrant shared folders. I load my JS files
like this:
public static function onBeforePageDisplay( OutputPage &$out, Skin &$skin ) {
    $out->addModules( "ext.Pydio" );
}
The thing is that sometimes after I make a change, I see that my JavaScript
file is either not fully loaded (and I get an "unexpected end of file"
error), or loaded with some invisible characters (\u0) right after the end
of the file (and I get an "unexpected token ILLEGAL" error). It looks like
this:
http://i.imgur.com/GGxtD89.png - my code in phpstorm
http://i.imgur.com/I8UR8Ly.png - my error in Chrome developer tools
The JavaScript itself is totally fine. In fact, the problem can appear even
when I just add a comment to the file.
My first thought was that it's a Vagrant shared folders bug. That doesn't
seem to be true -- the file on the virtual machine looks normal.
Could it be a ResourceLoader issue? Has anyone seen this behavior before?
Cheers,
-----
Yury Katkov
Hello Wikimanians,
This is Osama Khalid, from the Arabic Wikipedia and Wikimedia Commons.
I was wondering whether anyone here has a good number of the raw
pagecount files downloaded, so I can copy them to a hard disk during
Wikimania 2014. They have been very helpful for me in trying to
determine which articles need the most attention, but my sample size
has always been too small, since the files are very large and
dumps.wikimedia.org restricts parallel downloads.
If you have a bunch of them that you can bring, please let me know.
Regards,
Hi everybody,
I was on the brink of celebrating the one-year anniversary of a patch I submitted still being open, but today it was finally merged!
https://gerrit.wikimedia.org/r/77645
The old User::comparePasswords() and User::crypt() functions have been replaced with a new password hashing API. This means MediaWiki now natively supports Bcrypt and PBKDF2 as replacement password hashing algorithms. Furthermore, the system allows seamless transitioning, meaning users’ password hashes will be updated automatically the next time they log in.
This means that MD5 is almost out the door, which is a big win (a follow-up patch, https://gerrit.wikimedia.org/r/149658, changes the default to PBKDF2, meaning any wiki that upgrades to 1.24 would automatically switch away from MD5).
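The seamless-transition idea described above is the classic rehash-on-login pattern: a stored hash records which algorithm produced it, and when a password verifies under an outdated scheme, it is immediately re-hashed with the current one. A rough self-contained sketch follows -- this is not MediaWiki's actual API, and all names and the storage format are made up for illustration:

```python
import hashlib
import hmac
import os

def pbkdf2_hash(password: str, salt: bytes = None, rounds: int = 10000) -> str:
    """Current scheme: PBKDF2-SHA256, stored as 'pbkdf2:<rounds>:<salt>:<hash>'."""
    salt = salt or os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return "pbkdf2:%d:%s:%s" % (rounds, salt.hex(), dk.hex())

def md5_hash(password: str) -> str:
    """Legacy scheme being retired (unsalted MD5)."""
    return "md5:" + hashlib.md5(password.encode()).hexdigest()

def verify_and_upgrade(password: str, stored: str) -> tuple:
    """Check the password; on a successful MD5 login, return an upgraded hash.

    Returns (ok, stored_hash). The caller persists stored_hash, so a user
    with an MD5 hash is silently migrated to PBKDF2 on their next login.
    """
    if stored.startswith("md5:"):
        ok = hmac.compare_digest(stored, md5_hash(password))
        return ok, (pbkdf2_hash(password) if ok else stored)
    _, rounds, salt, dk = stored.split(":")
    dk2 = hashlib.pbkdf2_hmac("sha256", password.encode(),
                              bytes.fromhex(salt), int(rounds))
    return hmac.compare_digest(dk, dk2.hex()), stored
```

The key property is that no mass migration is needed: old hashes stay valid until each user next logs in, at which point the plaintext is briefly available to produce the stronger hash.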
I’d like to thank Aaron Schulz, Chris Steipp, Krinkle, and many others who helped get this through.
--
Tyler Romeo
0x405D34A7C86B42DF
I am pleased to announce (after far too long a delay, my apologies)
that Mark Hershberger and Markus Glaser will take on the task of
managing the third-party releases of MediaWiki for another year.
Congrats Mark and Markus!
I'd like to explicitly thank the International Consortium team for their
proposal. The choice was indeed a hard one to make and I'm glad we had
the second proposal to add context and perspective to Mark's and
Markus's.
There will be a few changes to this work this coming year, mostly in
terms of transparency. Mark and Markus have agreed to conduct quarterly
reviews for their work and the output of those meetings will be publicly
shared. That means you'll (soon) be able to see the quarterly goals and
their progress. Also, we hope to see more collaboration between Mark and
Markus and the wider community in the supportive work; the work that
everyone can help with to make the releases as good as they can be.
We'll be monitoring this collaboration in the review process over the
course of the year, and we currently plan to call out collaboration goals
in next year's call for proposals.
I look forward to another year of working with Mark and Markus!
Best,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |