Hello everyone,

Thank you for all your replies and suggestions! I will test them.

@Magnus: sorry for posting in the wrong issue tracker. I couldn’t figure out the structure of Bitbucket… 

If I use a simple query like https://www.wikidata.org/w/api.php?action=query&list=search&format=json&srwhat=text&srinfo=totalhits&srqiprofile=classic&srprop=titlesnippet&srlimit=1&srsearch=Ensor then it indeed works fine. This is how it looks in OpenRefine:


But the results of such a query are poor and difficult to work with in OpenRefine when we are trying to control the matches for 37,000 persons…
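For reference, the same plain search call can be scripted outside OpenRefine. A minimal sketch, assuming Python 3 with the requests library installed:

    # Run the simple full-text search query from the URL above and print
    # the total number of hits plus the first matching item.
    import requests

    params = {
        "action": "query",
        "list": "search",
        "format": "json",
        "srwhat": "text",
        "srinfo": "totalhits",
        "srqiprofile": "classic",
        "srprop": "titlesnippet",
        "srlimit": 1,
        "srsearch": "Ensor",
    }
    response = requests.get("https://www.wikidata.org/w/api.php", params=params)
    data = response.json()
    print(data["query"]["searchinfo"]["totalhits"])   # total number of hits
    print(data["query"]["search"])                    # first result; 'title' holds the Q-id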

The other option in OpenRefine, the reconciliation service, won't launch at the moment. This is how it looks:
- I have added the service in OpenRefine just like any other service:


- and all I see now is this ‘working’ screen, even before I can actually launch the reconciliation service (a quick probe of the endpoint itself is sketched below):


- in October and November 2016 I could choose a ‘type’, as in this screenshot from the GitHub conversation:


- And these were the very good results from the three tests that I ran back then:


I understand that the service can be unstable, but it is a pity that I have seen it working so well and now it doesn’t anymore… Hopefully somebody can figure this out.
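To help narrow the problem down, the endpoint can also be probed directly, outside OpenRefine. A rough sketch, assuming the service follows the standard single-query mode of the OpenRefine Reconciliation Service API (the exact parameters Magnus's tool expects may differ):

    # Probe https://tools.wmflabs.org/wikidata-reconcile/ directly:
    # first fetch the service metadata, then send one reconciliation query.
    import json
    import requests

    ENDPOINT = "https://tools.wmflabs.org/wikidata-reconcile/"

    # 1. Service metadata: a responsive endpoint should return JSON here.
    meta = requests.get(ENDPOINT, timeout=30)
    print(meta.status_code, meta.headers.get("content-type"))

    # 2. A single reconciliation query for "Ensor" (standard 'query' parameter).
    query = {"query": "Ensor", "limit": 3}
    result = requests.get(ENDPOINT, params={"query": json.dumps(query)}, timeout=30)
    print(result.json())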

In the meantime I will test Mix’n’match and other suggestions. Thank you for the advice. 

Best regards,
Alina
 
--
Available Mon, Tue, Wed, Thu

PACKED vzw - Expertisecentrum Digitaal Erfgoed
Rue Delaunoystraat 58 bus 23
B-1080 Brussel
Belgium

e: alina@packed.be
t: +32 (0)2 217 14 05
w: www.packed.be

On 28 Jan 2017, at 13:00, wikidata-request@lists.wikimedia.org wrote:

Send Wikidata mailing list submissions to
wikidata@lists.wikimedia.org

To subscribe or unsubscribe via the World Wide Web, visit
https://lists.wikimedia.org/mailman/listinfo/wikidata
or, via email, send a message with subject or body 'help' to
wikidata-request@lists.wikimedia.org

You can reach the person managing the list at
wikidata-owner@lists.wikimedia.org

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Wikidata digest..."


Today's Topics:

  1. Re: Wikidata reconciliation service and OpenRefine
     (Sandra Fauconnier)
  2. Re: Wikidata reconciliation service and OpenRefine
     (Antonin Delpeuch (lists))
  3. Re: Wikidata reconciliation service and OpenRefine
     (Antonin Delpeuch (lists))


----------------------------------------------------------------------

Message: 1
Date: Fri, 27 Jan 2017 14:12:53 +0100
From: Sandra Fauconnier <sandra.fauconnier@gmail.com>
To: "Discussion list for the Wikidata project."
<wikidata@lists.wikimedia.org>
Subject: Re: [Wikidata] Wikidata reconciliation service and OpenRefine
Message-ID: <2DB623F2-DBA1-4271-AEC4-0EC90B341298@gmail.com>
Content-Type: text/plain; charset="utf-8"

+1 from someone who would be so extremely happy (and much more productive) if such a service were implemented in OpenRefine.

I also added it as a task to Phabricator, feel free to comment, add suggestions… https://phabricator.wikimedia.org/T146740

Best, Sandra/User:Spinster

On 26 Jan 2017, at 19:00, Thad Guidry <thadguidry@gmail.com> wrote:

Everyone,

Yes, our OpenRefine API can use Multiple Query Mode (reconciling an entity by using multiple columns / WD properties):

https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation-Service-API#multiple-query-mode
I do not think that Magnus has implemented our Multiple Query Mode yet, however.
The bounty issue https://github.com/OpenRefine/OpenRefine/issues/805 that I created and funded on BountySource.com is to fully implement the Multiple Query Mode API and ensure that it works correctly in the latest OpenRefine 2.6 RC2.
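As a sketch of what a multiple-query-mode request body looks like (following the wiki page above; the type and property values below are purely illustrative, and the wikidata-reconcile endpoint reportedly does not support this mode yet):

    # Illustrative multiple-query-mode payload per the Reconciliation
    # Service API wiki page: several named queries sent in one request.
    import json
    import requests

    queries = {
        "q0": {
            "query": "James Ensor",
            "type": "Q5",                        # instance of: human
            "properties": [
                {"pid": "P569", "v": "1860"},    # date of birth (illustrative)
                {"pid": "P27", "v": "Belgium"},  # country of citizenship (illustrative)
            ],
            "limit": 3,
        },
    }
    # Endpoint taken from this thread; swap in any reconciliation service
    # that implements multiple query mode.
    response = requests.post(
        "https://tools.wmflabs.org/wikidata-reconcile/",
        data={"queries": json.dumps(queries)},
    )
    print(response.json())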

Happy Hacking anyone :)
Let us know if we can answer any questions regarding OpenRefine or the Reconcile API on our own mailing list:
http://groups.google.com/group/openrefine/

-Thad


On Thu, Jan 26, 2017 at 11:18 AM AMIT KUMAR JAISWAL <amitkumarj441@gmail.com> wrote:
Hey Alina,

Thanks for letting us know about this.

I'll start testing it after configuring OpenRefine (as its API is
implemented in WMF).

Can you share the open task related to this?

Cheers,
Amit Kumar Jaiswal

On 1/26/17, Antonin Delpeuch (lists) <lists@antonin.delpeuch.eu> wrote:
Hi Magnus,

Mix'n'match looks great and I do have a few questions about it. I'd like
to use it to import a dataset, which looks like this (these are the first
100 lines):
http://pintoch.ulminfo.fr/34f8c4cf8a/aligned_institutions.txt

I see how to import it in Mix'n'match, but given all the columns I have
in this dataset, I think that it is a bit sad to resort to matching on
the name only.

Do you see any way to do some fuzzy-matching on, say, the URLs provided
in the dataset against the "official website" property? I think that it
would be possible with the (proposed) Wikidata interface for OpenRefine
(if I understand the UI correctly).
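As a rough illustration of that idea only (a sketch, not an existing Mix'n'match or OpenRefine feature), URLs could be normalised before comparison:

    # Normalise two URLs (drop scheme, leading "www.", trailing slash)
    # so that cosmetic differences do not prevent a match.
    from urllib.parse import urlparse

    def normalise(url):
        parsed = urlparse(url.strip().lower())
        host = parsed.netloc or parsed.path      # tolerate bare "example.org" inputs
        if host.startswith("www."):
            host = host[4:]
        path = parsed.path.rstrip("/") if parsed.netloc else ""
        return host + path

    def urls_match(a, b):
        return normalise(a) == normalise(b)

    print(urls_match("http://www.example.org/", "https://example.org"))  # True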

In this context, I think it might even be possible to confirm matches
automatically (when the matches are excellent on multiple columns). As
the dataset is rather large (400,000 lines) I would not really want to
validate them one after the other with the web interface. So I would
need a sort of batch edit. How would you do that?

Finally, once matches are found, it would be great if statements
corresponding to the various columns could be created in the items (if
these statements don't already exist). With the appropriate reference to
the dataset, ideally.

I realise this is a lot to ask - maybe I should just write a bot.

Alina, sorry to hijack your thread. I hope my questions were general
enough to be interesting for other readers.

Cheers,
Antonin


On 26/01/2017 16:01, Magnus Manske wrote:
If you want to match your list to Wikidata, to find which entries
already exist, have you considered Mix'n'match?
https://tools.wmflabs.org/mix-n-match/

You can upload your names and identifiers at
https://tools.wmflabs.org/mix-n-match/import.php

There are several mechanisms in place to help with the matching. Please
contact me if you need help!

On Thu, Jan 26, 2017 at 3:58 PM Magnus Manske
<magnusmanske@googlemail.com> wrote:

   Alina, I just found your bug report, which you filed under the wrong
   issue tracker. The git repo (source code, issue tracker etc.) is here:
   https://bitbucket.org/magnusmanske/reconcile

   The report says it "keeps hanging", which is so vague that it's
   impossible to debug, especially since the example linked on
   https://tools.wmflabs.org/wikidata-reconcile/
   works perfectly fine for me.

   Does it not work at all for you? Does it work for a time, but then
   stops? Does it "break" reproducibly on specific queries, or at
   random? Maybe it breaks for specific "types" only? At what rate are
   you hitting the tool? Do you have an example query, preferably one
   that breaks?

   Please note that this is not an "official" WMF service, only parts
   of the API are implemented, and there are currently other technical
   limitations on it.

   Cheers,
   Magnus

   On Thu, Jan 26, 2017 at 3:35 PM Antonin Delpeuch (lists)
   <lists@antonin.delpeuch.eu> wrote:

       Hi,

        I'm also very interested in this. How did you configure your
        OpenRefine to use Wikidata? (Even if it does not currently work,
        I am interested in the setup.)

        There is currently an open issue (with a nice bounty) to improve
        the integration of Wikidata in OpenRefine:
        https://github.com/OpenRefine/OpenRefine/issues/805

       Best regards,
       Antonin

       On 26/01/2017 12:22, Alina Saenko wrote:
Hello everyone,

I have a question for people who are using the Wikidata reconciliation
service: https://tools.wmflabs.org/wikidata-reconcile/ It was working
perfectly in my OpenRefine in November 2016, but since December it has
stopped working. I already have contacted Magnus Manske, but he hasn’t
responded yet. Does anyone else experience problems with the service and
know how to fix it?

I’m using this service to link big lists of Belgian artists (37.000) and
performance art organisations (1.000) to Wikidata as a preparation to
upload contextual data about these persons and organisations to
Wikidata. This data will come from the Kunstenpunt database
(http://data.kunsten.be/people). Wikimedia user Romaine
(https://meta.wikimedia.org/wiki/User:Romaine) is helping us with this
project.

Best regards,
Alina


--
Available Mon, Tue, Wed, Thu

PACKED vzw - Expertisecentrum Digitaal Erfgoed
Rue Delaunoystraat 58 bus 23
B-1080 Brussel
Belgium

e: alina@packed.be
t: +32 (0)2 217 14 05
w: www.packed.be






--
Amit Kumar Jaiswal
Mozilla Representative : http://reps.mozilla.org/u/amitkumarj441
Kanpur | Uttar Pradesh | India
Contact No : +91-8081187743
Web : http://amitkumarj441.github.io | Twitter : @AMIT_GKP
LinkedIn : http://in.linkedin.com/in/amitkumarjaiswal1
PGP Key : EBE7 39F0 0427 4A2C


------------------------------

Message: 2
Date: Fri, 27 Jan 2017 13:13:27 +0000
From: "Antonin Delpeuch (lists)" <lists@antonin.delpeuch.eu>
To: wikidata@lists.wikimedia.org
Subject: Re: [Wikidata] Wikidata reconciliation service and OpenRefine
Message-ID: <588B4777.9050309@antonin.delpeuch.eu>
Content-Type: text/plain; charset=utf-8

Hi Magnus,

The dataset is essentially this one: http://isni.ringgold.com/

I am currently augmenting it with Ringgold IDs (P3500) using ORCID (this
should be completed in a few days). This alignment only adds two columns
(Ringgold ID and organization type) which should not impact the task of
matching it with Wikidata (as there are virtually no Ringgold IDs in
Wikidata yet).

Cheers,
Antonin

On 27/01/2017 09:18, Magnus Manske wrote:
Hi Antonin,

mix'n'match is designed to work with almost any dataset, thus uses the
common denominator, which is names, for matching.

There are mechanisms to match on other properties, but writing an
interface for public consumption for this would be a task that could
easily keep an entire team of programmers busy :-)

If you can give me the whole list to download, I will see what I can do
in terms of auxiliary data matching. Maybe a combination of that, manual
matches (or at least confirmations on name matches), and the OpenRefine
approach will give us maximum coverage.

It appears Kunstenpunt has no Wikidata property yet. Maybe Romaine could
start setting one up? That would help in terms of synchronisation, I believe.

Cheers,
Magnus




------------------------------

Message: 3
Date: Fri, 27 Jan 2017 21:44:52 +0000
From: "Antonin Delpeuch (lists)" <lists@antonin.delpeuch.eu>
To: wikidata@lists.wikimedia.org
Subject: Re: [Wikidata] Wikidata reconciliation service and OpenRefine
Message-ID: <588BBF54.9080204@antonin.delpeuch.eu>
Content-Type: text/plain; charset=utf-8

I have just rounded up the bounty to $300. This is a dream feature, we
need it! :)

Antonin






------------------------------

Subject: Digest Footer

_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


------------------------------

End of Wikidata Digest, Vol 62, Issue 23
****************************************