On Fri, Mar 1, 2013 at 12:12 AM, Bináris <wikiposta(a)gmail.com> wrote:
Hi Benjamin,
there is a similar task, though I am not sure it fits here. Hungarian
Wikipedia lists the Red List status of species of plants and animals
(threatened, not threatened, etc.). There is a request to update this by bot
from http://www.iucnredlist.org/.
My bot imported a bunch of these over the past two days from English
Wikipedia categories. You can see some of the logs at [1] and [2]. I don't
think it is fully complete, but it is a huge start. My bot can import from
any Wikipedia project, so if Hungarian Wikipedia has these sorted into
categories as well, it can complement the enwp data.
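For anyone curious what a category-based import bot actually submits, here is a minimal sketch of building the parameters for the Wikibase API's wbcreateclaim action, which is how a statement gets attached to an item. The specific IDs below (Q140, P141, Q211005) are illustrative placeholders, not taken from the thread, and a real bot would obtain an edit token from the API first.

```python
import json


def build_claim_params(entity_id, property_id, target_item_id, token):
    """Build the POST parameters for MediaWiki's wbcreateclaim action,
    which adds an item-valued statement to a Wikidata entity."""
    return {
        "action": "wbcreateclaim",
        "entity": entity_id,          # item receiving the statement
        "property": property_id,      # property of the statement
        "snaktype": "value",
        # item-type values are passed as a JSON-encoded entity reference
        "value": json.dumps({
            "entity-type": "item",
            "numeric-id": int(target_item_id.lstrip("Q")),
        }),
        "token": token,
        "format": "json",
    }


# Illustrative only: hypothetical conservation-status claim
params = build_claim_params("Q140", "P141", "Q211005", "dummy+token")
```

A bot would POST these parameters to the wiki's api.php once per category member, so sorting articles into status categories is enough structure to drive the import.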
It is also a biological property, it is also listed in templates, and it
also originates from an external database. I told them to wait for Wikidata
phase 2, which is now live, but what comes next? Do you think these tasks
are related?
I am planning on importing taxonomy data from enwp (and possibly other
projects) hopefully next week. There's a quick page at [3] that an enwp
user who is familiar with the taxonomy templates put together for me. If
other languages have similar templates, it would be good to add them to that
list as well. I am also going to look into supplementing this with
information from Wikispecies if possible.
2013/3/1 Benjamin Good <ben.mcgee.good(a)gmail.com>
I am considering the task of converting the templates from the gene
articles in Wikipedia (http://en.wikipedia.org/wiki/Portal:Gene_Wiki) to
use/create Wikidata assertions. This involves an extensive update of the
template structure as well as the code for the bot that keeps them in sync
with external public databases (https://bitbucket.org/sulab/pygenewiki).
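As a rough illustration of what the conversion would involve, the first step is pulling structured values out of the existing article templates. The sketch below naively parses |key = value pairs from a template invocation; the template name GNF_Protein_box and the field names are examples of the Gene Wiki infobox style, and the real templates are considerably more complex (nested templates, multi-line values), so treat this as a toy.

```python
import re


def extract_template_params(wikitext, template_name):
    """Naively extract |key = value pairs from the first occurrence of a
    template. Assumes no nested templates inside the parameter values."""
    m = re.search(
        r"\{\{\s*" + re.escape(template_name) + r"(.*?)\}\}",
        wikitext,
        re.DOTALL,
    )
    if not m:
        return {}
    params = {}
    # group(1) is everything between the template name and the closing braces
    for part in m.group(1).split("|")[1:]:
        if "=" in part:
            key, _, value = part.partition("=")
            params[key.strip()] = value.strip()
    return params


# Toy gene-article snippet (field names are illustrative)
sample = "{{GNF_Protein_box | Name = Reelin | Hs_EntrezGene = 5649 }}"
fields = extract_template_params(sample, "GNF_Protein_box")
```

Once values like a gene's Entrez ID are extracted this way, the bot could create the corresponding Wikidata statements and eventually have the template read them back, instead of storing the data in article wikitext.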
I am not very familiar with enwp's coverage of genes. Does this require
updates on the Wikidata side or the Wikipedia side?
More specifically, I'm thinking about working with a Google Summer of Code
student on this project.
Given a time frame of now through August, would it make sense for us to
pursue this objective directly in the context of Wikidata (through the
public API)? Or would it be better for us to install our own instance of
the Wikibase software (kept in sync with code updates) and develop the new
gene wiki bot code locally, with the aim of switching to the public API
later? Or is it too early to consider this project?
I would think the sooner the data can go onto the main Wikidata database,
the better.
I want to get involved and support Wikidata with this important data, but
I'm hesitant to ramp up development (especially with a student) in a
moving-target situation.
Any thoughts?
thanks!
-Ben
--
Bináris
[1] https://www.wikidata.org/wiki/User:Legobot/properties.js/Archive/2013/02/27
[2] https://www.wikidata.org/wiki/User:Legobot/properties.js/Archive/2013/02/28
[3] https://www.wikidata.org/wiki/User:Legobot/taxon
--Legoktm