Forwarding because this (ambitious!) proposal may be of interest to people
on other lists. I'm not endorsing the proposal at this time, but I'm
curious about it.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Denny Vrandečić <vrandecic(a)gmail.com>
Date: Sat, Sep 29, 2018 at 6:32 PM
Subject: [Wikimedia-l] Wikipedia in an abstract language
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Semantic Web languages allow us to express ontologies and knowledge bases in a
way meant to be particularly amenable to the Web. Ontologies formalize the
shared understanding of a domain. But the most expressive and widespread
languages that we know of are human natural languages, and the largest
knowledge base we have is the wealth of text written in human languages.
We look for a path to bridge the gap between knowledge representation
languages such as OWL and human natural languages such as English. We
propose a project that simultaneously exposes that gap, allows
collaboration on closing it, makes progress widely visible, and is highly
attractive and valuable in its own right: a Wikipedia written in an abstract language to
be rendered into any natural language on request. This would make current
Wikipedia editors about 100x more productive, and increase the content of
Wikipedia by 10x. For billions of users this will unlock knowledge they
currently do not have access to.
My first talk on this topic will be on October 10, 2018, 16:45-17:00, at
the Asilomar in Monterey, CA during the Blue Sky track of ISWC. My second,
longer talk on the topic will be at the DL workshop in Tempe, AZ, October
27-29. Comments are very welcome as I prepare the slides and the talk.
Link to the paper: http://simia.net/download/abstractwikipedia.pdf
Cheers,
Denny
_______________________________________________
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l(a)lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
<mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
Hello,
In the pre-conference of SWIB18 [1], Stacy Allison-Cassin and Dan Scott
led a great workshop yesterday on "Wikibase: configure, customize,
and collaborate".
Among other things, the panel discussion showed strong interest in a
decentralized way of using Wikidata through a network of Wikibase instances.
To implement it, we have identified the following needs:
* The Wikibase Docker instance has to be updated to the current version
of the software.
* A user community has to be built and remain in close interaction with
the development team.
* Working import and export scripts between Wikidata and Wikibase have
to be achieved.
* Connecting properties have to be developed so that the instances can
interoperate.
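As a rough illustration of the import/export need above, here is a minimal sketch in Python (standard library only) of the kind of building block such a script would start from: fetching one entity from Wikidata's `wbgetentities` API action. The helper names are invented for illustration, not part of any existing tool.

```python
import json
from urllib.parse import urlencode

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def export_url(entity_id, api=WIKIDATA_API):
    """Build a wbgetentities request URL for one entity (e.g. 'Q42')."""
    params = {
        "action": "wbgetentities",
        "ids": entity_id,
        "format": "json",
    }
    return api + "?" + urlencode(params)

def extract_labels(api_response, entity_id):
    """Pull the language -> label mapping out of a wbgetentities response."""
    entity = json.loads(api_response)["entities"][entity_id]
    return {lang: v["value"] for lang, v in entity.get("labels", {}).items()}

# A canned response in the wbgetentities shape, so no network access is needed:
sample = json.dumps({
    "entities": {
        "Q42": {"labels": {"en": {"language": "en", "value": "Douglas Adams"}}}
    }
})
print(export_url("Q42"))
print(extract_labels(sample, "Q42"))
```

An import script for a Wikibase instance would then push the extracted data back through the corresponding `wbeditentity` action, with property identifiers remapped between the instances.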
Does the Wikidata community agree with this proposal? Does the
development team?
Is it necessary to open a second mailing list dedicated to Wikibase?
Where is the best place to discuss all of these things?
Best Regards
Baptiste
[1] http://swib.org/swib18/index.html
For le lieu imaginaire
Baptiste de Coulon
Information management consultant
le lieu imaginaire
rue des Oeillets 14
2502 Bienne
Switzerland
+41 78 636 32 17
bdc(a)lelieuimaginaire.ch
lelieuimaginaire.ch
Dear all,
Here at DBpedia, we are about to apply for a Wikimedia grant under the code name GlobalFactSyncRE. The ultimate goal of the project is to extract all infobox facts and their references found in over 120 Wikipedia language editions and produce a tool for Wikipedia editors that detects and displays differences across infobox facts in an intelligent way to help sync infoboxes between languages and/or Wikidata. The extracted references will also be used to enhance Wikidata.
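To make the cross-language comparison concrete, here is a minimal sketch (not DBpedia's actual code; the function name and data are invented for illustration) of the core check such a tool performs for one infobox fact:

```python
def find_conflicts(fact_by_lang):
    """Given {language: value} for one infobox fact, group languages by
    the value they report; return the groups if they disagree, else None."""
    groups = {}
    for lang, value in fact_by_lang.items():
        groups.setdefault(value, []).append(lang)
    return groups if len(groups) > 1 else None

# A hypothetical 'population' fact as reported by three Wikipedia editions:
conflict = find_conflicts({"en": "3769495", "de": "3769495", "fr": "3669491"})
print(conflict)  # more than one value group -> the fact needs syncing
```

In the real project each value would additionally carry its extracted reference, so editors can judge which source to trust when syncing.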
This is our third, and hopefully successful, application for a Wikimedia grant. The full proposal is available at https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSyncRE
The evaluation review of the previous application was already quite good and the reviewers were interested in supporting the project; however, the project idea had to be improved to score higher and receive the funding. To that end, we made the following updates to the proposal:
i) Updates regarding technological capabilities: created prototypes for some core ideas in the previous proposal, switched to a monthly release cycle of all extractions.
ii) Changes to the proposal itself: updated the template mapping statistics, elaborated more on the Freebase inclusion in Wikidata.
iii) New prototype and new ideas: the UI has been updated, sync symbols can be shown with a gadget, new tests include data from the Dutch and German national libraries, and a live version extracts data from Wikipedia on request and compares it to Wikidata.
A complete summary of all the updates can be found at https://meta.wikimedia.org/wiki/Grants_talk:Project/DBpedia/GlobalFactSyncRE
We are happy to hear your opinion and discuss particular aspects of the project idea.
We are looking forward to a fruitful collaboration with you and we thank you for your feedback!
Thank you and all the best,
Milan
===================================
Project grant: GlobalFactSyncRE
Grantee: Sebastian Hellmann (hellmann(a)informatik.uni-leipzig.de)
===================================
--
Milan Dojchinovski
http://www.dojchinovski.mk/
Institute for Applied Informatics (InfAI) @ Leipzig University
Knowledge Integration and Linked Data Technologies (KILT) / AKSW
Goerdelerring 9, Leipzig, Germany
Hello,
An advanced Wikidata training will take place on 15-16 December 2018 in
Mumbai (most probably). Asaf Bartov (User:Ijon) will be the primary trainer
of this workshop.
Background and objectives
A few Wikidata trainings have taken place in India in recent years, notably User:Asaf
(WMF)/2017 Technical trainings in India
<https://meta.wikimedia.org/wiki/User:Asaf_(WMF)/2017_Technical_trainings_in…>.
After Asaf's training, we saw several Wikidata developments in India,
including the formation of WikiProject India, a few label-a-thons, and
the Indian Independence label-a-thon. Now, we are trying to take the Wikidata
experience in India to the next level.
The plan is:
Wikimedians in India who are
- actively working on Wikidata and
- aware of basic Wikidata editing
will attend the training to
- learn more about advanced Wikidata editing
- plan future Wikidata activities in India
Please learn more about this workshop:
https://meta.wikimedia.org/wiki/CIS-A2K/Events/Advanced_Wikidata_Training_2…
Note: Wikimedians in India are eligible to join.
The link above hopefully covers required details about the event, and, if
you have questions, please let me know. Kindly spread the word to
interested Wikimedians.
Thanks
Tito Dutta
Note: If I don't reply to your email in 2 days, please feel free to remind
me over email or phone call.
Hi,
I just noticed that BlazeGraph takes an undue amount of time for a
rather simple type of query. The following times out:
SELECT ?item ?itemLabel
WHERE
{
  ?item wdt:P31 [] .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
} LIMIT 10
Manually forcing a specific query plan makes the query work in <200ms:
SELECT ?item ?itemLabel
WHERE
{
  { SELECT * WHERE { ?item wdt:P31 [] } LIMIT 10 }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
But of course the original query should normally be streaming and not
depend on any such smartness to push LIMIT inwards.
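For anyone scripting around this in the meantime, here is a small Python helper (the function name is mine, not part of any API) that emits the working subquery form shown above for an arbitrary pattern and limit, so the inner LIMIT is always pushed in before the label service runs:

```python
def limited_label_query(pattern, limit, language="en"):
    """Wrap a SPARQL pattern in an inner LIMITed subquery so the label
    service only ever sees the first `limit` bindings (the workaround
    for the timeout described above)."""
    return (
        "SELECT ?item ?itemLabel\n"
        "WHERE\n"
        "{\n"
        f"  {{ SELECT * WHERE {{ {pattern} }} LIMIT {limit} }}\n"
        f'  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "{language}". }}\n'
        "}\n"
    )

print(limited_label_query("?item wdt:P31 []", 10))
```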
Cheers,
Markus