FYI
---------- Forwarded message ---------
From: Antoine Musso <hashar+wmf(a)free.fr>
Date: Fri, Sep 16, 2016 at 6:00 PM
Subject: [Wikitech-l] Wikis back to 1.28.0-wmf.18 (was: Re: Upgrade of
1.28.0-wmf.19 to group 1 is on hold)
To: <wikitech-l(a)lists.wikimedia.org>
On 14/09/16 23:05, Antoine Musso wrote:
> <snip>
> For now. I am holding the train. Will reassess tomorrow and ideally
> push group1 at 19:00 UTC then follow with group2 at 20:00UTC.
..
>
> Up-to-date status:
> https://tools.wmflabs.org/versions/
>
> MW-1.28.0-wmf.19 deployment blockers
> https://phabricator.wikimedia.org/T143328
Hello,
I pushed 1.28.0-wmf.19 on Thursday at 19:10 UTC. It was quickly
noticed that Wikidata was no longer able to dispatch updates to the
wikis, which is tracked in:
https://phabricator.wikimedia.org/T145819
This Friday, I was busy debugging the issue and did not notice a task
about account creation being broken:
https://phabricator.wikimedia.org/T145839
As soon as I saw that, I reverted to 1.28.0-wmf.18 at 12:50 UTC.
Account creation was impossible from Thursday 19:10 UTC until
Friday 12:50 UTC. Please pass the word around as needed.
My deep apologies for not finding out earlier how significant the
impact was.
I will write an incident report this afternoon with actionables.
--
Antoine "hashar" Musso
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
The Wikimedia Foundation's Discovery and Research teams recently hosted an
introductory workshop on the SPARQL query language and the Wikidata Query
Service.
We made the video stream <https://www.youtube.com/watch?v=NaMdh4fXy18> and
materials
<https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/2016_SPARQL_Wor…>
(demo queries, slide decks) from this workshop publicly available.
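As a small taste of what the service does, a minimal query from Python
(not one of the workshop demos; it assumes only the public endpoint at
https://query.wikidata.org/sparql and the requests library) might look
roughly like this:

import requests

ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .   # instance of: house cat
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""

response = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"})
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["item"]["value"], row["itemLabel"]["value"])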
Guest speakers:
- Ruben Verborgh, *Ghent University* and *Linked Data Fragments*
- Benjamin Good, *Scripps Research Institute* and *Gene Wiki*
- Tim Putman, *Scripps Research Institute* and *Gene Wiki*
- Lucas, *@WikidataFacts*
Dario and Stas
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hello,
We have a small task force at Schema.org (myself and a few others)
that has been adding Schema.org mappings into Wikidata to help with
alignment.
However, an issue came up where we did not have a complete 'equivalent
class' relation between the two topics but needed to say 'external
subclass' instead. Example: https://www.wikidata.org/wiki/Q474191
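For illustration only, a quick check of which statements on that item
already point at schema.org URIs could look roughly like the Python
sketch below (assuming the public WDQS endpoint and the requests
library; the filtering is deliberately loose):

import requests

ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
SELECT ?prop ?value WHERE {
  wd:Q474191 ?prop ?value .
  FILTER(isIRI(?value) && CONTAINS(STR(?value), "schema.org"))
}
"""

r = requests.get(ENDPOINT, params={"query": QUERY, "format": "json"})
r.raise_for_status()
for row in r.json()["results"]["bindings"]:
    print(row["prop"]["value"], "->", row["value"]["value"])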
We searched through Talk pages and queried Wikidata properties, but did
not find anything close to the meaning of 'external subclass' or a
property that expresses 'skos:closeMatch'.
Does anyone know if a useful property like 'external subclass' is
available to help us with our mapping?
Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
Tomorrow I plan to apply the following update to the Stable Interface Policy:
https://www.wikidata.org/wiki/Wikidata_talk:Stable_Interface_Policy#Propose…
Please comment there if you have any objections.
Thanks!
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Dear users, developers and all people interested in semantic wikis,
We are very happy to announce the program of the 13th Semantic MediaWiki
Conference in Frankfurt am Main, Germany! People from business, academia
and non-profit organizations will give a variety of very interesting
talks and presentations about applying semantic wikis as well as about
newest developments in the field.
We would also like to remind you that the early bird registration
period ends on September 14, 2016. Please register via Tito [0] if you
have not done so already.
Important facts reminder:
* Dates: September 28th to 30th 2016 (Wednesday to Friday)
* Location: German Institute for International Educational Research
(DIPF), Schloßstraße 29, Frankfurt am Main, Germany.
* Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g. users, developers, consultants, business
representatives and researchers.
News on the program:
* The tutorial program has been announced and made available [1]
* The conference program has now also been announced and made available
[2]. We are very happy that the keynote will be given by Prof. Dr. Sören
Auer of the Fraunhofer Institute for Intelligent Analysis and Information
Systems IAIS, Enterprise Information Systems.
* A social program is also being prepared [3].
* We encourage contributions for poster sessions about semantic wikis;
for a list of topics, see [4]. Please send your proposals by e-mail to
one of the program chairs (CC).
* Presentations will generally be video and audio recorded and made
available for others after the conference.
Organization:
* Deutsches Institut für Internationale Pädagogische Forschung (DIPF)
[5] and Open Semantic Data Association e. V. [6] have become the
official organisers of SMWCon Fall 2016.
* Very special thanks go to Wikimedia Deutschland [7] for being our
platinum supporter as well as ArchiXL [8] for being our gold supporter.
If you have questions you can contact Sabine Melnicki, Kendra Sticht and
Christoph Schindler (Program Chairs), Karsten Hoffmeyer (General Chair)
or Lia Veja (Local Chair) by e-mail (CC).
We will be happy to see you in Frankfurt!
Sabine Melnicki, Kendra Sticht and Christoph Schindler (Program Board)
[0] <https://ti.to/smwconfall2016/frankfurt>
[1] <https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Tutorial_Day>
[2]
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Conference_Days>
[3]
<https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016#Social_program>
[4] <https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2016/Announcement>
[5] <http://www.dipf.de/en/about-us/contact>
[6] <http://www.opensemanticdata.org/>
[7] <http://www.wikimedia.de/>
[8] <http://www.archixl.nl/>
Hello,
What is the programming language used in infobox templates? How can one
transmit/map an infobox from English into other languages?
Best Regards,
Eng. Amany Slamaa
Teaching Assistant, Computer and Information Science
Hi,
I have had this idea for some time now but never got around to testing
it or writing it down.
DBpedia extracts detailed context information in Quads (where possible) on
where each triple came from, including the line number in the wiki text.
Although each DBpedia extractor is independent, this context opens a
small window for combining output from different extractors, such as the
infobox statements we extract from Wikipedia and the very recent citation
extractors we announced [1].
I attach a very small sample from the article about Germany where I
filter the related triples and order them by the line number they were
extracted from, e.g.:
dbr:Germany dbo:populationTotal "82175700"^^xsd:nonNegativeInteger
  <http://en.wikipedia.org/wiki/Germany?oldid=736355524#absolute-line=66&template=Infobox_country&property=population_estimate&split=1&wikiTextSize=10&plainTextSize=10&valueSize=8> .

<https://www.destatis.de/DE/PresseService/Presse/Pressemitteilungen/2016/08/PD16_295_12411pdf.pdf;jsessionid=996EC2DF0A8D510CF89FDCBC74DBAE9F.cae2?__blob=publicationFile>
  dbp:isCitedBy dbr:Germany
  <http://en.wikipedia.org/wiki/Germany?oldid=736355524#absolute-line=66> .
Looking at the wikipedia article we see:
|population_estimate = 82,175,700<ref>{{cite web|url=https://www.destatis.de/DE/PresseService/Presse/Pressemitteilungen/2016/08/PD16_295_12411pdf.pdf;jsessionid=996EC2DF0A8D510CF89FDCBC74DBAE9F.cae2?__blob=publicationFile|title=Population at 82.2 million at the end of 2015 – population increase due to high immigration|date=26 August 2016|work=destatis.de}}</ref>
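To make the pairing concrete, here is a rough sketch (my own
illustration in Python, assuming the fragment layout shown in the
context URIs above; the quads themselves would come from any N-Quads
reader) of grouping statements and citations extracted from the same
wiki-text line:

from collections import defaultdict
from urllib.parse import urlsplit, parse_qs

def context_key(context_uri):
    """Return (article path, absolute line) parsed from a context URI."""
    parts = urlsplit(context_uri)
    fields = parse_qs(parts.fragment)
    line = int(fields.get("absolute-line", ["-1"])[0])
    return parts.path, line

def group_by_line(quads):
    """Group (subject, predicate, object) triples by the wiki-text line
    recorded in their context URI."""
    grouped = defaultdict(list)
    for s, p, o, ctx in quads:
        grouped[context_key(ctx)].append((s, p, o))
    return grouped

# Triples sharing a key are candidate (statement, citation) pairs, e.g. the
# dbo:populationTotal value and the destatis.de citation on line 66 above.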
Could this approach be a good candidate for reference suggestions in
Wikidata? (This particular one is already a reference, but the anthem and
GDP in the attachment are not, for example.)
There are many things that can be done to improve the matching, but
before getting into details I would like to see whether this idea is
worth exploring further.
Cheers,
Dimitris
[1] http://www.mail-archive.com/dbpedia-discussion%40lists.sourceforge.net/msg07739.html
--
Dimitris Kontokostas
Department of Computer Science, University of Leipzig & DBpedia Association
Projects: http://dbpedia.org, http://rdfunit.aksw.org,
http://aligned-project.eu
Homepage: http://aksw.org/DimitrisKontokostas
Research Group: AKSW/KILT http://aksw.org/Groups/KILT
Hi Wikidatans,
After passing my 500th edit on Wikidata (#Whee!), I was hoping to dip my
toe into something that would involve a larger-scale project, like
adding database information to Wikidata.
There's a database I use all the time that is excellent, rich, deep, and
well-deployed -- at JewishGen.org
main search page: http://www.jewishgen.org/Communities/Search.asp
example page:
http://data.jewishgen.org/wconnect/wc.dll?jg~jgsys~community~-524980
I started a Property proposal here:
https://www.wikidata.org/wiki/Wikidata:Property_proposal/Place#JewishGen_Lo…
I have also contacted the folks over at JewishGen to ask if they might
provide me with raw data, initially even just the locality page IDs, and
then hopefully the richer, fuller data that's in the database.
I was wondering:
(a) Is this the typical approach people use when importing data? (A
rough sketch of one possible import path follows below.)
(b) Do you have any advice / best practices to share?
(c) Should I also try to do a wget to scrape this data (if that's even
possible)? Do people do this to grab data?
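For context on (a), once a dedicated property exists, a common import
path is a small bot script. The sketch below uses pywikibot; the
property ID "P9999" and the item/ID pairs are placeholders only, since
the JewishGen property proposal is still open:

import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

# (item QID, JewishGen locality ID) pairs, e.g. from a CSV provided by
# JewishGen. These values are illustrative only.
mappings = [("Q12345", "-524980")]

for qid, locality_id in mappings:
    item = pywikibot.ItemPage(repo, qid)
    claim = pywikibot.Claim(repo, "P9999")  # placeholder for the proposed property
    claim.setTarget(locality_id)
    item.addClaim(claim, summary="Add JewishGen locality ID")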
I envision this information being used as part of a unique identifier
that could be built into infoboxes, and perhaps even as a sort of
templatized box (although I don't hugely love the issue of restricted /
redirected editing away from Wikipedia). But I would really like to see
this information have a pathway into Wikipedia. I think it would improve
a lot of these town pages, many of which are stubs.
Best -- and thanks in advance for any advice,
Erika
*Erika Herzog*
Wikipedia *User:BrillLyle <https://en.wikipedia.org/wiki/User:BrillLyle>*