Also, the old Google Maps no longer shows the Wikipedia layer.
> Do you have a specific article that talks about this? Do remember that the new Google Maps is still in beta. It's (still) impossible to use on older computers because it is so slow and laggy - it's possible the WMF could lobby them to keep it around.
> Mono
I have been thinking about this for a while, and now finally managed to
write it down as a proposal. Details are on Meta at the following link;
the intro to the proposal is below:
<http://meta.wikimedia.org/wiki/A_proposal_towards_a_multilingual_Wikipedia>
I tried to anticipate some possible questions and provide answers on the
page. Besides that, I obviously hope that Wikimania could provide a place
to start this conversation. And yes, I am aware that the proposal would
lead to a very restrictive solution, but imagine what good it could already
achieve! And since it is not meant to replace anything, but to enrich our
current projects... well, read for yourself.
Cheers,
Denny
Wikipedia provides knowledge in more than 200 languages. Whereas a small
number of languages are fortunate enough to have a large Wikipedia, many of
the language editions are far from providing a comprehensive encyclopedia
by any measure. There are several approaches to closing this gap, mostly
focusing on increasing the number of contributors to the small language
editions or on improving the provision of automatic or semi-automatic
translations of articles. Both are viable. In the following we present a
proposal for a different approach, which is based on the idea of a
multilingual Wikipedia.
Imagine a small extension to the template system, where a template call
like *{{F12}}* would not be expanded by a call to the template
Template:F12, but rather to Template:F12/en, i.e. the template name
suffixed with the selected language code of the reader of the page. A
template call such as *{{F12:Q64|Q5119|Q183}}* can be expanded by
Template:F12/en into *“Berlin is the capital of Germany.”* and by
Template:F12/de into *“Berlin ist die Hauptstadt Deutschlands.”* (in the
example, the template parameters Q64, Q5119 and Q183 refer to the Wikidata
items for Berlin, capital and Germany respectively, which the templates
query for the label in the respective language). Sentence by sentence, a
simple article could be built up this way.
That wiki would consist of *content*, i.e. the article pages, possibly
just a simple series of template calls, and *frames*, i.e. the templates
that lexicalize the parameters of a given template call into a sentence
(note that “sentence” should not be taken literally here; it could be a
table, an image, anything). The frames could be implemented in normal wiki
template syntax, in Lua, in a novel mechanism, or a mix of these; this
would be up to the communities creating them.
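To make this concrete, here is a minimal sketch of what the English frame
could look like if it were written as a Scribunto/Lua module. It is only an
illustration, not part of the proposal: the module name, the render entry
point, and the use of the Wikibase client's mw.wikibase.getLabel() for label
lookup are assumptions; the proposal leaves the exact mechanism open.

    -- Hypothetical Module:F12/en, the English frame for the example call
    -- {{F12:Q64|Q5119|Q183}} above. Assumes a Wikidata label lookup such as
    -- mw.wikibase.getLabel() is available to the frame.
    local p = {}

    -- Renders "<item 1> is the <item 2> of <item 3>." using the labels of
    -- the three Wikidata items passed as parameters.
    function p.render(frame)
        local subject  = mw.wikibase.getLabel(frame.args[1]) -- Q64   -> "Berlin"
        local relation = mw.wikibase.getLabel(frame.args[2]) -- Q5119 -> "capital"
        local object   = mw.wikibase.getLabel(frame.args[3]) -- Q183  -> "Germany"
        return subject .. " is the " .. relation .. " of " .. object .. "."
    end

    return p

A German frame (say, a hypothetical Module:F12/de) would take the same
parameters but lexicalize them as "Berlin ist die Hauptstadt Deutschlands.",
while the article page itself would remain a plain sequence of frame calls.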
Read the rest here:
<http://meta.wikimedia.org/wiki/A_proposal_towards_a_multilingual_Wikipedia>
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/681/51985.
Hello,
The Wikimedia Language Engineering team will be hosting a bug triage
session on Wednesday, August 28th 2013 at 17:00 UTC (10:00 PDT) for
some of the bugs that exist in languages written from right to left
(RTL). During this one-hour session we will be using the Etherpad
linked below to collaborate. We have already listed some bugs, but
please feel free to add more (or file new ones!), along with comments
about what you'd like to see addressed during the session. You can
send questions directly to me by email or on IRC (nick: arrbee). Please
see below for the event details.
Thank you.
regards
Runa
=== Event Details ===
# What: Bug triage session for RTL language bugs
# Date: August 28, 2013 (Wednesday)
# Time: 1700-1800 UTC, 1000-1100 PDT (Timezone conversion:
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20130828T1700 )
# IRC Channel: #mediawiki-i18n (Freenode)
# Etherpad: https://etherpad.wikimedia.org/p/BugTriage-i18n-2013-08
Questions can be sent to: runa at wikimedia dot org
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
The Open Knowledge Foundation is looking for a part time (20 hours per
week) community manager to help build our growing network of cultural
heritage professionals, researchers and hackers working to open up
data and content held by galleries, libraries, archives and museums
(GLAMs) on the web.
Role description: The Open Knowledge Foundation is recruiting a
community manager with experience in the field of open cultural
heritage data and/or the Digital Humanities to work on the OpenGLAM
initiative and the Digitised Manuscripts to Europeana project.
Deadline: Applications close on 16 September 2013.
More details at: http://okfn.org/jobs/#OpenGLAMCommunityManager
I think it would be useful to share this with your local Wikimedia groups,
especially those active in GLAM work.
Tom
--
Everton Zanella Alvarenga (also Tom)
OKF Brasil - Rede pelo Conhecimento Livre
http://br.okfn.org
The Berkman Center just came out with a report on the public
discussions surrounding the SOPA-PIPA actions, drawing on the Media
Cloud work by Yochai Benkler and others.
It provides context for the discussions on the English Wikipedia, and
captures the differences between the grassroots and top-down decisions
by the different organizations and media channels that took part in the
blackout.
An interactive time visualization shows how the conversation was driven
at different times by different communities:
http://cyber.law.harvard.edu/research/mediacloud/2013/mapping_sopa_pipa/#
SJ
---------- Forwarded message ----------
Publication Release: July 25
Social Mobilization and the Networked Public Sphere: Mapping the
SOPA-PIPA Debate
Dear Friends and Colleagues,
The Berkman Center for Internet & Society is pleased to announce the
release of a new publication from the Media Cloud project, Social
Mobilization and the Networked Public Sphere: Mapping the SOPA-PIPA
Debate, authored by Yochai Benkler, Hal Roberts, Rob Faris, Alicia
Solow-Niederman, and Bruce Etling.
Social Mobilization and the Networked Public Sphere: Mapping the
SOPA-PIPA Debate
From the abstract: In this paper, we use a new set of online research
tools to develop a detailed study of the public debate over proposed
legislation in the United States that was designed to give prosecutors
and copyright holders new tools to pursue suspected online copyright
violations. Our study applies a mixed-methods approach by combining
text and link analysis with human coding and informal interviews to
map the evolution of the controversy over time and to analyze the
mobilization, roles, and interactions of various actors.
This novel, data-driven perspective on the dynamics of the networked
public sphere supports an optimistic view of the potential for
networked democratic participation, and offers a view of a vibrant,
diverse, and decentralized networked public sphere that exhibited
broad participation, leveraged topical expertise, and focused public
sentiment to shape national public policy.
We also offer an interactive visualization that maps the evolution of
a public controversy by collecting time slices of thousands of
sources, then using link analysis to assess the progress of the debate
over time. We used the Media Cloud platform to depict media sources
(“nodes”, which appear as circles on the map with different colors
denoting different media types). This visualization tracks media
sources and their linkages within discrete time slices and allows
users to zoom into the controversy to see which entities are present
in the debate during a given period as well as who is linking to whom
at any point in time.
The authors wish to thank the Ford Foundation and the Open Society
Foundation for their generous support of this research and of the
development of the Media Cloud platform.
About Media Cloud
Media Cloud, a joint project of the Berkman Center for Internet &
Society at Harvard University and the Center for Civic Media at MIT,
is an open source, open data platform that allows researchers to
answer complex quantitative and qualitative questions about the
content of online media. Using Media Cloud, academic researchers,
journalism critics, and interested citizens can examine what media
sources cover which stories, what language different media outlets use
in conjunction with different stories, and how stories spread from one
media outlet to another. We encourage interested readers to explore
Media Cloud.
The Berkman Center for Internet & Society at Harvard University was
founded to explore cyberspace, share in its study, and help pioneer
its development. For more information, visit
http://cyber.law.harvard.edu/.
This is a forward from the wikitech-ambassadors list.
https://meta.wikimedia.org/wiki/HTTPS is now updated with the new date.
-------- Original Message --------
Subject: [Wikitech-ambassadors] Fwd: HTTPS for logged in users delayed.
New date: August 28
Date: Wed, 21 Aug 2013 11:30:51 -0700
From: Rob Lanphier <robla(a)wikimedia.org>
Reply-To: Coordination of technology deployments across
languages/projects <wikitech-ambassadors(a)lists.wikimedia.org>
To: Coordination of technology deployments across languages/projects
<wikitech-ambassadors(a)lists.wikimedia.org>
Hi everyone,
After assessing the current readiness (or lack thereof) of our HTTPS
code, we've decided to postpone the deployment for a week. We have a
number of things that we'd like to get cleaner resolution on:
* Use of GeoIP vs enabling on a per-wiki basis
* Use of a preference vs login form checkbox vs hidden option vs
sensible default
* How interactions with login.wikimedia.org will work
* Validation of our HTTPS test methodology
The new plan is to deploy on Wednesday, August 28 between 20:00 UTC
and 23:00 UTC. Prior to that, we plan on having a very limited
deployment to our test wikis, and we're also planning to deploy to
mediawiki.org. Assuming this is sorted out and we have made our test
deployments by end of day Monday, August 26, we should have time to
validate our assumptions and give people time to see the new system in
action.
More info is (or will be) available here:
https://meta.wikimedia.org/wiki/HTTPS
(or here if you prefer: http://meta.wikimedia.org/wiki/HTTPS )
Thanks everyone for your patience.
Rob
_______________________________________________
Wikitech-ambassadors mailing list
Wikitech-ambassadors(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors
On 21 August 2013 07:49, Terry Chay <tchay(a)wikimedia.org> wrote:
...
> Luckily, the standard for the Movement is consensus, not catering to every extremist view with 100% buy-in.
As a Commons user responsible for over 2.5 million edits, I would hope
that the WMF do not label or quickly dismiss me as an "extremist" if I
raise some questions about this notification.
I am concerned about how many valuable bot activities a mandated move
to https might break. Some will be fixed by operators such as myself
changing account preferences to force an opt-out or rewriting code;
however, many useful bot activities have semi-retired operators,
particularly on Commons, and some are bound to never be fixed, so their
value will be lost. In planning this change, has some support effort
been allocated to fixing or re-hosting the bots that break (such as
taking the option of 'remotely' setting community-identified useful
bots to opt out of https, at least for a test period, rather than
forcing an opt-in), and has there been a survey of this impact?
Though I agree we don't expect "100% buy-in", as an active volunteer,
batch uploader and bot writer, I would have expected a friendly,
non-confrontational and relaxed opportunity to raise and consider these
issues in an RFC or other consensus-building discussion on my home
project, rather than, apparently, no buy-in being needed from us unpaid
volunteers and content creators.
Thanks,
Fae
--
faewik(a)gmail.com http://j.mp/faewm