Welcome Yuri and Adam! Great to have you on the growing mobile team!
Best,
Alolita
On Mon, Mar 18, 2013 at 10:32 AM, Dan Foy <dfoy(a)wikimedia.org> wrote:
> Welcome Yuri and Adam! We're glad to have you join us!
>
> - Dan
>
>
>
> On Mon, Mar 18, 2013 at 10:29 AM, Tomasz Finc <tfinc(a)wikimedia.org> wrote:
>>
>> Greetings all,
>>
>> I'm pleased to announce that the mobile department has two new staff
>> members. Yuri Astrakhan & Adam Baso join as sr. software developers on
>> the mobile partner team. In this role Yuri and Adam will support
>> projects like Wikipedia Zero, SMS/USSD, and J2ME to further the reach
>> of our projects in geographic areas that have both financial and
>> technical impediments to access Wikipedia. They will be working
>> closely with Kul and Dan from the global development group.
>>
>> Yuri was heavily involved in Wikipedia-related projects from 2005 to
>> 2007, developing the API framework and querying subsystem, contributing
>> to pywikibot code, and making millions of changes as yurikbot, while at
>> the same time working as a software consultant for several large
>> banks. In 2008 Yuri joined a small hedge fund to lead the development
>> of an automated trading platform. While there, Yuri continued various
>> open source projects, such as a time-series database (timeseriesdb).
>>
>> After over five years, Yuri has rejoined the MediaWiki community and
>> will be working for us from New York.
>>
>> Adam spent the past seven years working in the field of information
>> security, specializing in application security, identity management,
>> and encryption in the retail, government, and banking sectors. Adam
>> led the OWASP Minneapolis-Saint Paul chapter for a couple of years,
>> and proudly organized the OWASP AppSec USA 2011 conference. Adam and
>> his wife are relocating to San Francisco from Minneapolis-Saint Paul,
>> and they look forward to the opportunity to live in such a thriving
>> software-friendly community.
>>
>> The mobile group is excited and proud to welcome both Yuri & Adam as
>> sr. engineers to the partner team.
>>
>> This completes the team and allows them to work aggressively to reach
>> our 4 billion page target through outreach projects like Wikipedia
>> Zero.
>>
>> Please join me in welcoming Yuri and Adam to the Wikimedia Foundation!
>>
>> --tomasz
>>
>> _______________________________________________
>> Wmfall mailing list
>> Wmfall(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wmfall
>
>
>
>
> --
> Dan Foy
> Technical Manager, Mobile Partnerships
> Wikimedia Foundation
--
Alolita Sharma
Director of Engineering
Language Engineering
Wikimedia Foundation
Hello,
I am going to teach a graduate-level course on quantum physics (quantum
simulation) at the University of Hannover, Germany. At the moment, I am
evaluating Wikiversity for collaboration with other lecturers and
students on the creation of lecture notes.
As the notes will contain a significant amount of mathematical content,
it would be extremely useful to be able to export them to LaTeX for
distributing printed copies. As far as I know, the Wiki2LaTeX extension
for MediaWiki [1] provides this function. Would it be possible to enable
this extension on Wikiversity?
Best regards,
Hendrik
[1] <http://www.mediawiki.org/wiki/Extension:Wiki2LaTeX>
--
Dr. Hendrik Weimer Phone: +49-511-762-4836
Institut für Theoretische Physik Fax: +49-511-762-3023
Leibniz Universität Hannover E-Mail: hweimer(a)itp.uni-hannover.de
Appelstr. 2, 30167 Hannover, GERMANY http://www.itp.uni-hannover.de/~weimer/
https://github.com/Wikinaut/MySimpleCertViewer
Because I couldn't find one on the net, I wrote a tiny certificate
viewer in PHP to inspect fingerprints (MD5, SHA1, SHA256) and other data.
Maybe it can serve as a starting point for baking your own cert inspectors.
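For comparison, the core of such a viewer is just hashing the certificate's DER bytes. A minimal sketch in Python (not the PHP tool above; the host name in the usage comment is only an example):

```python
import hashlib

def fingerprints(der_bytes: bytes) -> dict:
    """Colon-separated hex fingerprints (MD5, SHA1, SHA256) of DER cert bytes."""
    return {
        algo: ":".join(f"{b:02X}" for b in hashlib.new(algo, der_bytes).digest())
        for algo in ("md5", "sha1", "sha256")
    }

# To inspect a live server certificate (network required):
#   import ssl
#   pem = ssl.get_server_certificate(("www.wikimedia.org", 443))
#   print(fingerprints(ssl.PEM_cert_to_DER_cert(pem))["sha256"])
```

The same function works on any DER blob, e.g. one read from a `.crt` file.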
T.
On Thu, 14 Mar 2013 12:55:16 -0700, Brion Vibber <bvibber(a)wikimedia.org>
wrote:
> Text captchas will have a 'question' subfield to be presented; image
> captchas will have a 'url' field which should be loaded as the image.
> 'type' and 'mime' will vary, and probably shouldn't be used too closely.
Some captchas (IIRC reCAPTCHA) won't give you easy access to the image.
And this plan won't be compatible with newer captcha types, such as the
KittenAuth-like category of image CAPTCHAs.
Differentiating between the types just to support text CAPTCHAs (which are
really the easiest CAPTCHAs to break) also sounds unfortunate.
We might just have to do something that outputs a blob of HTML, or a URL
to an HTML document (either as a frame URL, or a URL to fetch the HTML
blob from).
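To make the trade-off concrete, here is a hedged sketch of a client-side dispatcher. The 'type', 'question', and 'url' fields come from the proposal quoted above; the 'html' passthrough branch and the function itself are hypothetical, illustrating the generic-blob fallback suggested in this message:

```python
def render_captcha(info: dict) -> str:
    """Turn a captcha description dict into an HTML snippet.

    'type'/'question'/'url' follow the proposed API fields; the 'html'
    branch is the opaque-blob fallback, which avoids per-type handling.
    """
    ctype = info.get("type")
    if ctype == "question":   # text captcha: present the question
        return f"<label>{info['question']} <input name='captcha'></label>"
    if ctype == "image":      # image captcha: load the URL as an image
        return f"<img src='{info['url']}' alt='captcha'><input name='captcha'>"
    if "html" in info:        # opaque blob: embed as-is, no type knowledge needed
        return info["html"]
    raise ValueError(f"unsupported captcha type: {ctype!r}")
```

Note how every new captcha type forces a new branch, whereas the blob branch handles anything the server can express as HTML.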
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
Perhaps you've read that Google will drop its RSS services, including
the RSS Chrome extension.
I want to inform you about how MediaWiki can help.
+ https://www.mediawiki.org/wiki/Extension:RSS
consumes RSS feeds from elsewhere and renders them on one or more MW wiki pages
+ https://www.mediawiki.org/wiki/Extension:WikiArticleFeeds
generates RSS feed(s) from the content of MW page(s): authoring feeds by
editing a MediaWiki page
I maintain both extensions and made them compatible a while ago; both are
in git/gerrit.
Documentation is up to date, and they run with the latest MediaWiki
versions (i.e. core master).
Before you ask: yes, using E:RSS you can render feeds in one MW wiki
which are generated in the same or a different wiki by E:WikiArticleFeeds.
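Under the hood, consuming a feed is little more than parsing the RSS XML. A minimal sketch in Python (E:RSS itself is PHP; this only illustrates the idea):

```python
import xml.etree.ElementTree as ET

def feed_items(rss_xml: str):
    """Yield (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        yield (item.findtext("title", ""), item.findtext("link", ""))

# A tiny feed, as E:WikiArticleFeeds might emit from a wiki page:
SAMPLE = """<rss version="2.0"><channel><title>Demo</title>
<item><title>Hello</title><link>http://example.org/1</link></item>
</channel></rss>"""
```

`list(feed_items(SAMPLE))` then gives the items ready for rendering on a wiki page.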
T.
Hello,
I'm an undergraduate in China. I would like to learn some detailed
information about your organization's information security methods and
regulations, because I want to use Wikimedia as an example and introduce
its security regulations in my information security lecture.
If possible, would you please give me a description of the information
security regulations and methods your organization applies, in terms of
software, hardware, network, data storage, and personnel?
Thank you very much :)
--
Sent from my Gmail
Voldemort,徐中强
http://about.me/xzq
Is there some list or tool that identifies Wikipedia pages that are
slow to parse?
My interest is mainly in the context of Lua deployment. I would like
to identify non-obvious templates that may be having an appreciable
impact on performance. (As opposed to things like {{cite}}, where the
performance problems are already well-known.)
I don't suppose the database stores a "time for last parse" somewhere?
On enwiki we've already converted most of the string templates to Lua,
along with several formatting templates (e.g. {{rnd}}, {{precision}}),
{{coord}}, and a number of others. And there is work underway on a
number of the more complex overhauls (e.g. {{cite}}, {{convert}}).
However, it would be nice to identify problematic templates that may
be less obvious.
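Absent a stored "time for last parse", one low-tech signal is the NewPP limit report the parser leaves as an HTML comment in rendered pages; its counters (node count, include size, etc.) correlate with parse cost. A sketch of extracting it (the exact metric names vary by MediaWiki version, so this parses them generically):

```python
import re

LIMIT_REPORT = re.compile(r"<!--\s*NewPP limit report(.*?)-->", re.S)

def limit_report(html: str) -> dict:
    """Parse the NewPP limit report comment from rendered page HTML.

    Returns {metric: (used, limit)} for lines like
    'Preprocessor node count: 1234/1000000'. Metrics present depend on
    the MediaWiki version; an empty dict means no report was found.
    """
    m = LIMIT_REPORT.search(html)
    if not m:
        return {}
    out = {}
    for line in m.group(1).strip().splitlines():
        name, _, value = line.partition(":")
        used, _, limit = value.strip().partition("/")
        if used:
            out[name.strip()] = (used, limit or None)
    return out
```

Ranking pages by, say, preprocessor node count would surface templates worth a closer look.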
-Robert Rohde
aka Dragons_flight
Hi guys,
Does anybody know about serial programming using termios?
I found this page in the Serial Programming wikibook: http://en.wikibooks.org/wiki/Serial_Programming/termios
I have some questions; please let me know if anybody has ever worked with termios.
Thanks