Hi Martynas and all, 

Thanks for this engaging Wikidata RDF conversation. 

Wikidata RDF developments are exciting, especially for 
eventually coding with IBM's Watson and related AI. See this related conversation, for example - 

"Poster Title: Not Elementary, My Dear Watson - Extending Watson for Question Answering on Linked Open Government Data

RPI doctoral student Amar Viswanathan Kannan, working with Prof. James Hendler, presented this poster at yesterday's Cognitive Colloquium at Yorktown:

Linked Data, stored as RDF graphs, lets users traverse heterogeneous knowledge bases with relative ease. It also allows data to be viewed from different perspectives and provides multiple conceptualizations of the data, which becomes very important owing to the heterogeneous nature of the web. While traditional Linked Data technologies like the SPARQL Protocol and RDF Query Language (SPARQL) allow us to access Linked Data knowledge bases, it requires considerable skill to design queries against Linked Data triple stores. It is also a shift from looking at data as traditional RDBMS databases to looking at it as knowledge graphs. The growing acceptance of Linked Data triple stores as general-purpose knowledge bases for a variety of domains has created a need to access such knowledge with greater ease. Enter Watson, IBM's flagship Question Answering system. It has been at the forefront of Question Answering systems for being able to answer factoid questions on the “Jeopardy!” quiz show with pinpoint precision. The architecture of the DeepQA system, of which Watson is an application, has captured the imagination of the Artificial Intelligence community, which has long strived to build cognitive systems. The DeepQA system excels at generating hypotheses and gathering evidence to refute or support these hypotheses. It also evaluates all the evidence and provides analytics."

https://www.linkedin.com/groups/Poster-Title-Not-Elementary-My-6729452.S.5933952670367760387?trk=groups_most_popular-0-b-ttl&goback=%2Egmp_6729452
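
On the abstract's point that it takes considerable skill to hand-design queries against a Linked Data triple store, here is a minimal sketch of what such a query might look like from Python, using only the standard library. The endpoint URL is a placeholder and the wdt:P31 / wd:Q5 identifiers ("instance of" / "human") are assumed Wikidata-style vocabulary, not a description of any particular live service.

# Minimal sketch: posting a hand-written SPARQL query to a (placeholder) endpoint.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://example.org/sparql"  # placeholder endpoint, not a real service

QUERY = """
PREFIX wd:   <http://www.wikidata.org/entity/>
PREFIX wdt:  <http://www.wikidata.org/prop/direct/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?item ?label WHERE {
  ?item wdt:P31 wd:Q5 ;      # "instance of" (P31) "human" (Q5) -- assumed vocabulary
        rdfs:label ?label .
  FILTER(LANG(?label) = "en")
}
LIMIT 10
"""

url = ENDPOINT + "?" + urllib.parse.urlencode({"query": QUERY})
request = urllib.request.Request(
    url, headers={"Accept": "application/sparql-results+json"})
with urllib.request.urlopen(request) as response:
    results = json.loads(response.read().decode("utf-8"))

for binding in results["results"]["bindings"]:
    print(binding["item"]["value"], binding["label"]["value"])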

Cheers, Scott



On Thu, Oct 30, 2014 at 1:03 PM, Cristian Consonni <kikkocristian@gmail.com> wrote:
2014-10-30 17:34 GMT+01:00 Markus Krötzsch <markus@semantic-mediawiki.org>:
> On 30.10.2014 11:49, Cristian Consonni wrote:
>>
>> 2014-10-29 22:59 GMT+01:00 Lydia Pintscher <lydia.pintscher@wikimedia.de>:
>>>
>>> Help with this would be awesome and totally welcome. The tracking bug
>>> is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143
>>
>>
>> Speaking of totally awesome (aehm :D):
>> * see: http://wikidataldf.com
>> * see this other thread:
>> https://lists.wikimedia.org/pipermail/wikidata-l/2014-October/004920.html
>>
>> (If I can ask, having the RDF dumps in HDT format [again, see the
>> other thread] would be really helpful)
>
>
> We are using OpenRDF. Can it do HDT? If yes, this would be easy to do. If
> no, it would be easier to use a standalone tool to transform our dumps. We
> could still do this. Do you have any recommendations on what we could use
> there (i.e., a memory-efficient command-line conversion script for N3 -> HDT)?

It seems that OpenRDF does not support HDT creation (see [1]).
I have been using the rdf2hdt tool, obtained by compiling the devel
branch of the hdt-cpp library [2], which is developed by the group
that is proposing the HDT specification to the W3C.
C

[1] https://openrdf.atlassian.net/browse/SES-1874
[2] https://github.com/rdfhdt/hdt-cpp/tree/devel
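
For anyone who wants to try this on the dumps, below is a minimal sketch of driving rdf2hdt from Python once the devel branch of hdt-cpp has been compiled (roughly: clone the repository, then run ./autogen.sh, ./configure and make in the checkout). The filenames and the "-f ntriples" flag are assumptions on my part, not documented behaviour; check the help output of the rdf2hdt binary you built.

# Minimal sketch: converting a decompressed N-Triples dump to HDT via rdf2hdt.
import subprocess

DUMP_NT = "wikidata-statements.nt"      # hypothetical decompressed N-Triples dump
OUTPUT_HDT = "wikidata-statements.hdt"  # HDT file to produce

# Invoke the rdf2hdt binary built from hdt-cpp's devel branch.
# The "-f ntriples" input-format flag is an assumption; consult
# `rdf2hdt --help` for the options your build actually accepts.
subprocess.check_call(["rdf2hdt", "-f", "ntriples", DUMP_NT, OUTPUT_HDT])
print("Wrote", OUTPUT_HDT)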

_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l



--
- Scott MacLeod - Founder & President  
- 415 480 4577
http://worlduniversityandschool.org 
- World University and School - like Wikipedia with MIT OpenCourseWare (not endorsed by MIT OCW) - is incorporated as a nonprofit university and school in California and is a U.S. 501(c)(3) tax-exempt educational organization, both effective April 2010. 

World University and School is sending you this because of your interest in free, online, higher education. If you don't want to receive these, please reply with 'unsubscribe' in the subject line. Thank you.