Would the Wikidata community be interested in linking together Wikidata
pages with Freebase entities? I've proposed a new property to link the two
datasets here:
http://www.wikidata.org/wiki/Wikidata:Property_proposal/all#Freebase_identi…
Freebase already has interwiki links for many entities, so it wouldn't be
too hard to automatically determine the corresponding Wikidata pages. This
would allow people to mash up both datasets and cross-reference facts more
easily.
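The title-to-item lookup such a bot would need is already exposed by the wbgetentities API, which accepts a site and page title instead of an item ID. Below is a minimal sketch of the response handling, assuming each Freebase interwiki link gives us an English Wikipedia title; the sample structures are illustrative, not taken from a real response.

```python
# Sketch: resolve Wikipedia titles (taken from Freebase's interwiki
# links) to Wikidata item IDs via the wbgetentities API with
# sites=enwiki&titles=... -- only URL building and response parsing
# are shown here; no network call is made.
import urllib.parse

API = "https://www.wikidata.org/w/api.php"

def lookup_url(titles, site="enwiki"):
    """Build a wbgetentities URL for up to 50 page titles."""
    query = urllib.parse.urlencode({
        "action": "wbgetentities",
        "sites": site,
        "titles": "|".join(titles),
        "props": "sitelinks",
        "format": "json",
    })
    return f"{API}?{query}"

def extract_title_to_qid(api_response, site="enwiki"):
    """From a decoded wbgetentities response, map article titles to item IDs."""
    mapping = {}
    for qid, entity in api_response.get("entities", {}).items():
        if not qid.startswith("Q"):   # non-Q keys mark titles with no item
            continue
        sitelink = entity.get("sitelinks", {}).get(site)
        if sitelink:
            mapping[sitelink["title"]] = qid
    return mapping
```

Batching up to 50 titles per request would keep the number of API calls manageable even for a large mapping run.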
--
Shawn Simister
Knowledge Developer Relations
Google
Over the past weeks the community of the Dutch Wikipedia has been working hard to solve interwiki conflicts. A few months ago we had more than 14,000 interwiki conflicts; today it is fewer than 10,800.
While fixing these interwiki conflicts, we often fix them on other wikis as well. But are other Wikipedias also actively working through interwiki conflicts on a large scale?
Romaine
Heya folks :)
Here's your weekly dose of Wikidata updates:
http://meta.wikimedia.org/wiki/Wikidata/Status_updates/2013_06_14
Have a great weekend!
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Technical Projects
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Heya folks :)
We've just deployed new code to wikidata.org. We fixed a few bugs, most notably:
* https://bugzilla.wikimedia.org/show_bug.cgi?id=45244 - Shows
statements from "oldid" version, not "diff" version
* Fixed a number of issues in translatable messages that made it hard
to translate them
In addition to that there is also shiny new stuff:
* Bene* wrote a new special page to set language links without
JavaScript enabled (Special:SetSiteLink)
* Made the first version of the geocoordinate datatype available \o/
This means that you will be able to enter the coordinates of a city,
for example, as soon as someone has created the necessary property for
it. (There are a few waiting at
http://www.wikidata.org/wiki/Wikidata:Property_proposal.) It's still a
bit wonky, so please do let me know about any issues you find. We're
already aware of:
** https://bugzilla.wikimedia.org/show_bug.cgi?id=49385 - cardinal
directions in geocoordinate datatype are not localized
** https://bugzilla.wikimedia.org/show_bug.cgi?id=49386 - apostrophe
issue in geocoordinate UI
** https://bugzilla.wikimedia.org/show_bug.cgi?id=49387 - property
parser function needs to support geocoordinate datatype
Please subscribe to these bugs to stay up-to-date and please vote on
the ones you care about most to help us prioritize. (You can find a
list of all open bugs here:
https://bugzilla.wikimedia.org/buglist.cgi?emailcc1=1&resolution=---&emailt…)
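Once a coordinate property exists, adding a value through the API should look roughly like any other wbcreateclaim call, with the coordinate serialized as a JSON value. Here is a sketch under those assumptions; "P999" is a placeholder, since the actual coordinate properties were still being proposed at the time, and the exact parameter names should be checked against the API docs.

```python
# Sketch of the POST parameters a wbcreateclaim call for the new
# geocoordinate datatype could take. Property ID "P999" is hypothetical.
import json

def coordinate_claim_params(item_id, property_id, lat, lon, precision=0.0001):
    """Build wbcreateclaim parameters for a globe-coordinate statement."""
    value = {
        "latitude": lat,
        "longitude": lon,
        "precision": precision,
        "globe": "http://www.wikidata.org/entity/Q2",  # Earth
    }
    return {
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": property_id,
        "snaktype": "value",
        "value": json.dumps(value),
        "format": "json",
        # a 'token' parameter obtained from the API is also required
    }

params = coordinate_claim_params("Q64", "P999", 52.52, 13.405)
```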
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
I think Poland may do better than average because Polish people, out of
national pride, have made a special effort to be well documented in the
English Wikipedia and to represent a Polish point of view on topics like
the city of Gdansk.
One fascinating thing about Wikidata is that it provides access to all of
the wonderful concepts shared in the Wikiverse, so now sites like Ookaboo
can collect pictures of many beautiful places that don't exist in en
Wikipedia.
On the other hand, I'm also interested in the other end of the curve:
those elite concepts which are represented widely across the Wikipedias.
Surely this is connected with subjective importance, with some flavor
of "global" appeal, whatever that turns out to mean. Any chance
you could run a report on those?
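Given the per-item language-link data the original analysis was built on, the requested report reduces to ranking items by sitelink count. A small sketch; `item_sitelinks` stands in for data extracted from a Wikidata dump, and the sample entries are made up.

```python
# Sketch of the "other end of the curve": rank items by how many
# language editions link to them. The sample data is hypothetical.
from collections import Counter

def most_widely_linked(item_sitelinks, top_n=3):
    """Return (item, language-count) pairs, most widely covered first."""
    counts = Counter({item: len(langs) for item, langs in item_sitelinks.items()})
    return counts.most_common(top_n)

item_sitelinks = {
    "Q64": {"en", "de", "fr", "pl"},          # made-up coverage sets
    "Q2": {"en", "de", "fr", "pl", "ja"},
    "Q1234": {"pl"},
}
```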
-----Original Message-----
From: Mathieu Stumpf
Sent: Thursday, June 13, 2013 4:51 AM
To: wikidata-l(a)lists.wikimedia.org
Subject: Re: [Wikidata-l] Visualisations of The Most Unique Wikipedias
According to Wikidata
On 2013-06-12 at 22:22, Klein,Max wrote:
> Hello Wikidatians,
>
> I made a few visualizations of the distribution of language links
> in Wikidata items. You can also use these stats to see which items
> represent Wikipedia articles that are unique to a language and
> compare the uniqueness of all languages. I also investigated all the
> items with just two language links, to look at Wikipedia "pairs".
>
> See the full analysis:
> http://notconfusing.com/the-most-unique-wikipedias-according-to-wikidata/
> [1]
Interesting! Could you also create that kind of visualisation by
topic: how much uniqueness comes from biographies of local football
players, compared with historical events or abstract concepts?
Also, on a completely unrelated topic, perhaps you could explain to
me in private what you mean by "Create a communal house to live in",
which is on your public to-do list; it sounds interesting. :P
--
Association Culture-Libre
http://www.culture-libre.org/
_______________________________________________
Wikidata-l mailing list
Wikidata-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Based on all the feedback gathered during the RfC about a possible
inter-project links interface [1], User:Tpt has created a content card
prototype (image: [2]).
To activate it, follow these steps:
1. Go to your common.js file. For a user named "Test" in English
Wikipedia, it would be: https://en.wikipedia.org/wiki/User:Test/common.js
2. Modify it and paste this line:
mw.loader.load('//www.wikidata.org/w/index.php?title=User:Tpt/interproject.js&action=raw&ctyp…');
3. Save and go to any Wikipedia page
4. You should see an icon next to the article title; if you don't,
refresh your browser cache. *Instructions*: Internet Explorer: hold down
the Ctrl key and click the Refresh or Reload button. Firefox: hold down
the Shift key while clicking Reload (or press Ctrl-Shift-R). Google
Chrome and Safari users can just click the Reload button.
What it does:
- It displays an icon next to the article title
- When you hover your mouse over the icon it shows a *content card*.
- The content card displays information from Wikidata: label, image,
link to Commons gallery, and link to edit Wikidata.
What it is supposed to do in the future when Wikidata supports sister
projects:
- It will display contents or links to sister projects
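The data the content card shows can all be pulled from a single wbgetentities response (labels, claims, sitelinks). A rough Python sketch of that extraction, assuming P18 ("image") is the property the prototype reads for the thumbnail; the response structure below is abridged.

```python
# Sketch: extract the fields a content card displays from a decoded
# wbgetentities entity record (props=labels|claims|sitelinks).
# P18 as the image property is an assumption about the prototype.
def content_card_data(entity, lang="en"):
    """Return the label, image file name, and Commons sitelink for an item."""
    label = entity.get("labels", {}).get(lang, {}).get("value")
    image = None
    for claim in entity.get("claims", {}).get("P18", []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") == "value":
            image = snak["datavalue"]["value"]
            break
    commons = entity.get("sitelinks", {}).get("commonswiki", {}).get("title")
    return {"label": label, "image": image, "commons": commons}
```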
Please leave your feedback on the Request for comments, thanks!
http://meta.wikimedia.org/wiki/Requests_for_comment/Interproject_links_inte…
Cheers,
Micru
[1]
http://meta.wikimedia.org/wiki/Requests_for_comment/Interproject_links_inte…
[2] http://commons.wikimedia.org/wiki/File:Content-card-prototype.png
Hi Daniel,
I started working on the DBpedia release and just wanted to check on
the current status of the Wikidata dumps. I saw that RDF data and RDF
URIs like http://www.wikidata.org/entity/Q1 are already available.
Cool! Do you think there will be RDF dumps soon, i.e. in the next few
weeks?
If not, could you guys prepare a dump of the sitelinks table, as you
suggested below? If it's not too much effort, it would be great if you
could generate CSV or a similarly simple format. We won't put the data
into a DB; we just extract it, so we would have to write a parser for
SQL INSERT statements. CSV would be much simpler.
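For what it's worth, a minimal INSERT-statement parser for this table need not be much code. A rough sketch, assuming the table's column layout is (row id, item id, site id, page title) and standard MySQL quoting; real dumps may have edge cases this regex misses.

```python
# Rough fallback: turn MySQL dump INSERT statements for the
# wb_items_per_site table into CSV rows of (item, site, page title).
# The assumed column layout is (row_id, item_id, site_id, page_title).
import csv
import io
import re

ROW = re.compile(r"\((\d+),(\d+),'((?:[^'\\]|\\.)*)','((?:[^'\\]|\\.)*)'\)")

def sitelinks_sql_to_csv(sql_text, out):
    """Write one CSV row per tuple found in the INSERT statements."""
    writer = csv.writer(out)
    for _row_id, item_id, site_id, title in ROW.findall(sql_text):
        title = re.sub(r"\\(.)", r"\1", title)  # undo backslash escaping
        writer.writerow([f"Q{item_id}", site_id, title])

sql = ("INSERT INTO `wb_items_per_site` VALUES "
       "(1,64,'enwiki','Berlin'),(2,64,'dewiki','Berlin');")
buf = io.StringIO()
sitelinks_sql_to_csv(sql, buf)
```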
Thanks a lot for your help!
Christopher
On 4 May 2013 23:36, Daniel Kinzler <daniel.kinzler(a)wikimedia.de> wrote:
> On 04.05.2013 19:13, Jona Christopher Sahnwaldt wrote:
>> We will produce a DBpedia release pretty soon, I don't think we can
>> wait for the "real" dumps. The inter-language links are an important
>> part of DBpedia, so we have to extract data from almost all Wikidata
>> items. I don't think it's sensible to make ~10 million calls to the
>> API to download the external JSON format, so we will have to use the
>> XML dumps and thus the internal format.
>
> Oh, if it's just the language links, this isn't an issue: there's an additional
> table for them in the database, and we'll soon be providing a separate dump of
> that table at http://dumps.wikimedia.org/wikidatawiki/
>
> If it's not there when you need it, just ask us for a dump of the sitelinks
> table (technically, wb_items_per_site), and we'll get you one.
>
>> But I think it's not a big
>> deal that it's not that stable: we parse the JSON into an AST anyway.
>> It just means that we will have to use a more abstract AST, which I
>> was planning to do anyway. As long as the semantics of the internal
>> format will remain more or less the same - it will contain the labels,
>> the language links, the properties, etc. - it's no big deal if the
>> syntax changes, even if it's not JSON anymore.
>
> Yes, if you want the labels and properties in addition to the links, you'll have
> to do that for now. But I'm working on the "real" data dumps.
>
> -- daniel
>