Hi Ghislain,
On Sat, Oct 28, 2017 at 9:54 AM, Ghislain ATEMEZING <ghislain.atemezing(a)gmail.com> wrote:
> Hello emarx,
> Many thanks for sharing KBox. Very interesting project!

Thanks!
> One question: how do you deal with different versions of the KB, like the case here of the Wikidata dump?
KBox works with so-called KNS (Knowledge Name Service) servers, so any
dataset publisher can run their own KNS.
Each dataset has its own KN (Knowledge Name) that is distributed over the KNS.
E.g., the KN of the Wikidata dump is
https://www.wikidata.org/20160801.
> Do you fetch their repo every xx time?
No, the idea is that each organization will have its own KNS, so users can
add whichever KNS they want.
Currently, all datasets available in the KBox KNS are served by the KBox team.
You can check all of them at kbox.tech, or via the command line (
https://github.com/AKSW/KBox#how-can-i-list-available-knowledge-bases).
> Also, to avoid your users having to re-create the models, you can
> pre-load "models" from the LOV catalog.
We plan to share all LOD datasets in KBox; we are currently discussing
this with the W3C, and
DBpedia might have its own KNS soon.
Regarding the LOV catalog, you can help by asking them to publish their
catalog in KBox.
best,
<emarx/>
http://emarx.org
Cheers,
Ghislain
2017-10-27 21:56 GMT+02:00 Edgard Marx <digamarx(a)gmail.com>:
Hey guys,
I don't know if you already know about it,
but you can use KBox for Wikidata, DBpedia, Freebase, Lodstats...
https://github.com/AKSW/KBox
And yes, you can also use it to merge your graph with one of those....
https://github.com/AKSW/KBox#how-can-i-query-multi-bases
cheers,
<emarx>
On Oct 27, 2017 21:02, "Jasper Koehorst" <jasperkoehorst(a)gmail.com>
wrote:
I will look into the size of the .jnl file, but shouldn't it be located
where the Blazegraph instance behind the SPARQL endpoint is running, or is
this a special flavour?
I was also thinking of looking into a GitLab runner that could occasionally
generate an HDT file from the .ttl dump, if our server can handle it. For
this, an md5 sum file would be preferable, or should a timestamp be
sufficient?
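That md5-gated rebuild could be sketched roughly as below. This is only an illustration, not the actual runner: the file names (dump.ttl, dump.ttl.md5, wikidata.hdt) are hypothetical, and it assumes the rdf2hdt converter from hdt-cpp is on the PATH.

```shell
# Sketch of an md5-gated HDT rebuild (hypothetical file names).
# needs_rebuild DUMP SUMFILE: exits 0 if DUMP changed since SUMFILE was
# written (or no checksum exists yet); exits 1 if the recorded md5 matches.
needs_rebuild() {
    dump=$1
    sum=$2
    if [ -f "$sum" ] && md5sum -c "$sum" >/dev/null 2>&1; then
        return 1  # checksum matches: dump unchanged, skip regeneration
    fi
    return 0
}

# Usage in the runner (rdf2hdt from hdt-cpp assumed available):
#   if needs_rebuild dump.ttl dump.ttl.md5; then
#       rdf2hdt dump.ttl wikidata.hdt && md5sum dump.ttl > dump.ttl.md5
#   fi
```

Compared with a timestamp, the checksum avoids pointless regeneration when the dump is re-downloaded unchanged, at the cost of reading the whole file once.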
Jasper
On 27 Oct 2017, at 18:58, Jérémie Roquet <jroquet(a)arkanosis.net> wrote:
2017-10-27 18:56 GMT+02:00 Jérémie Roquet <jroquet(a)arkanosis.net>:
2017-10-27 18:51 GMT+02:00 Luigi Assom <itsawesome.yes(a)gmail.com>:
> I found and want to share this resource:
>
http://www.rdfhdt.org/datasets/
>
> there is also Wikidata dump in HDT
The link to the Wikidata dump seems dead, unfortunately :'(
… but there's a file on the server:
http://gaia.infor.uva.es/hdt/wikidata-20170313-all-BETA.hdt.gz (i.e.
the link was missing the “.gz”)
--
Jérémie
_______________________________________________
Wikidata mailing list
Wikidata(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
--
"*Love all, trust a few, do wrong to none*" (W. Shakespeare)