Thanks Guillaume!
One more question: what is the CPU frequency (in GHz)?
On Tue, 4 Jun 2019 at 12:25, Guillaume Lederrey
<glederrey(a)wikimedia.org> wrote:
Hello,
Does anybody know the minimal hardware requirements (disk size and
RAM) for loading the Wikidata dump into Blazegraph?
The actual hardware requirements will depend on your use case. But for
comparison, our production servers are:
* 16 cores (hyper threaded, 32 threads)
* 128G RAM
* 1.5T of SSD storage
The downloaded dump file
wikidata-20190513-all-BETA.ttl is 379G.
The bigdata.jnl file, which stores all the triple data in Blazegraph,
is 478G and still growing.
I have a 1T disk, but it is almost full now.
The current size of our jnl file in production is ~670G.
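To turn those figures into a rough disk-sizing rule, here is a minimal
Python sketch. The ~1.8x jnl-to-dump ratio and the 20% slack are
assumptions derived from the numbers quoted above, not official
guidance, and the variable names are illustrative:

  # Rough disk sizing for loading the Wikidata TTL dump into Blazegraph.
  # Assumption: the final bigdata.jnl ends up roughly prod_jnl_gb / dump_gb
  # times the dump size (~1.8x, from the figures quoted above).
  dump_gb = 379            # wikidata-20190513-all-BETA.ttl
  prod_jnl_gb = 670        # current production bigdata.jnl
  ratio = prod_jnl_gb / dump_gb        # ~1.77

  est_jnl_gb = dump_gb * ratio
  # You keep the dump and the jnl on disk at the same time, plus ~20% slack.
  est_disk_gb = dump_gb + est_jnl_gb * 1.2
  print(f"estimated jnl ~{est_jnl_gb:.0f}G, suggested disk ~{est_disk_gb:.0f}G")

That lands around 1.2T, which is consistent with a 1T disk filling up
during the load and with production running on 1.5T of SSD.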
Hope that helps!
Guillaume
Thanks,
Adam
--
Guillaume Lederrey
Engineering Manager, Search Platform
Wikimedia Foundation
UTC+2 / CEST
_______________________________________________
Wikidata mailing list
Wikidata(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata