Hi,
regarding a current topic in Germany, the publication of the
timetable data of Deutsche Bahn (the German national railway company)
and their willingness to discuss it with other open-data supporters,
it may be a good idea to provide an expiration date for Wikidata
records. In their open letter to Mr. Kreil [1] they announced that
providing the timetable data openly may cause problems if, for
example, somebody uses outdated data.
Marco
[1] http://www.db-vertrieb.com/db_vertrieb/view/service/open_plan_b.shtml
Heya folks :)
Denny and I will be doing another round of Wikidata office hours. You
can come and ask your questions about Wikidata - technical and
non-technical. The next ones will be:
* 5 November at 17:30 UTC
(http://www.timeanddate.com/worldclock/fixedtime.html?hour=17&min=30&sec=0&d…)
in German
* 6 November at 17:30 UTC
(http://www.timeanddate.com/worldclock/fixedtime.html?hour=17&min=30&sec=0&d…)
in English
Both of them will happen on IRC in #wikimedia-office on freenode. Logs
will be published afterwards for everyone who can't attend.
I hope to see many of you there.
Cheers
Lydia
PS: Don't forget the Wikidata Main Page design that still needs your help:
http://lists.wikimedia.org/pipermail/wikidata-l/2012-October/001104.html
;-)
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Heya folks :)
The Wikidata team is excited to let you know that http://wikidata.org
is now online and running the repository code \o/ You can now create
items and add language links to articles on the various Wikipedias.
Thank you to everyone who helped.
There are some caveats! We're still testing things and fixing minor
bugs. Please let us know about any problems you find and go easy on
the new wiki.
We're now working on phase two (infoboxes) and at the same time on
getting the client ready for deployment on the Hungarian Wikipedia.
Unfortunately I can't yet give you a date for when that will happen.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Hello again,
[crosspost to wikidata-l]
At Tuesday 30 October 2012 17:16:45 DaB. wrote:
> Hello all,
>
> wikidata has made it onto the Toolserver today. The database is called
> "wikidatawiki_p" (don't ask me…) and its home is on s3. Because I guessed
> that many joins to it will happen in the future, I also created copies on
> all clusters except s6 and s7 (those will follow later); so you can
> join it in the same way as commons.
I found the following additional useful tables in the Wikidata database
and created views for them. If I missed something, please tell me. I am
also interested in a URL for the documentation of the Wikidata tables.
* site_identifiers
* sites
* wb_changes
* wb_id_counters
* wb_items_per_site
* wb_terms
Sincerely,
DaB.
--
Userpage: [[:w:de:User:DaB.]] — PGP: 2B255885
I have added some code to my bot to create more localized labels.
A simple example: my bot creates an item with a German (de) label
containing the letter "ß". Then my bot additionally creates a "de-ch"
label with that letter replaced by "ss".
What I am not sure about is how to handle wikis that have language
variants. Let's take
<http://sr.wikipedia.org/wiki/%D0%92%D0%B8%D0%BA%D0%B8%D0%BF%D0%B5%D0%B4%D0%…>
as an example page.
What my bot knows is:
* srwiki page title is "Википедија"
* label in "sr-ec" is "Википедија"
* label in "sr-el" is "Vikipedija"
* srwiki has local language code "sr" and displays variant sr-ec by default
So which labels should I add to Wikidata for an item linking to this
srwiki page? Only sr-ec and sr-el?
Or should I additionally add a label with code sr that has the same
content as sr-ec?
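A minimal sketch of the label logic discussed here, with the open
question modeled as a flag (the function names and the fallback rule
are my assumptions, not documented Wikidata API behavior):

```python
# Known facts from the sr.wikipedia.org example in this mail:
variant_labels = {
    "sr-ec": "Википедија",  # Cyrillic variant label
    "sr-el": "Vikipedija",  # Latin variant label
}

def labels_to_add(variants, default_variant, include_base=False, base_code="sr"):
    """Return the label dict the bot would write to Wikidata.

    include_base models the open question above: should a plain
    base-language label ("sr") duplicate the default variant?"""
    labels = dict(variants)
    if include_base:
        labels[base_code] = variants[default_variant]
    return labels

# The de/de-ch case from the beginning of the mail, for comparison:
def de_ch_label(de_label):
    """Derive a Swiss German (de-ch) label by replacing "ß" with "ss"."""
    return de_label.replace("ß", "ss")
```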
Merlissimo
P.S.: If different wikis share the same language code, like enwiki and
simplewiki (and specieswiki once enabled), my algorithm is not
deterministic in choosing which normalized title becomes the label. The
normalized titles of the other wikis with the same language code are
added as aliases. I hope that is OK.
> Any feature requests?
Yes, I can think of several. To begin with, I would like a feature where the bot operator only needs to give the bot a Wikipedia page. The bot would then fetch the interwiki links from that page and add them as sitelinks to Wikidata.
I am thinking of a command like this one: script.py -site:Hermione Granger -lang:fr
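A hypothetical sketch of that workflow, collecting the interwiki links
from one article's wikitext so a bot could push them as sitelinks (the
function name, the simplified language-code regex, and the example
wikitext are all my assumptions):

```python
import re

# Simplified interwiki-link pattern: [[xx:Title]] or [[xx-yy:Title]].
# A real bot would check the code against the list of known wikis.
INTERWIKI = re.compile(r"\[\[([a-z]{2,3}(?:-[a-z]+)?):([^\]|]+)\]\]")

def sitelinks_from_wikitext(home_lang, title, wikitext):
    """Return a {language: title} dict of sitelinks for one article,
    including the home page itself."""
    links = {home_lang: title}
    for lang, target in INTERWIKI.findall(wikitext):
        links[lang] = target.strip()
    return links
```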
Also, I have the same questions that Merlissimo already raised in the "features needed by langlink bots" thread.
Cheers,
Snaevar
----- Original Message -----
From: Joan Creus
Sent: 10/24/12 05:08 PM
To: Discussion list for the Wikidata project.
Subject: [Wikidata-l] New release of Pywikidata: feature requests?
Hi all,
I've released a new version of Pywikidata @
https://github.com/jcreus/pywikidata (not really a release, since there is
no version numbering; I should begin, maybe...).
New features include:
- Only the properties which have changed are pushed to the server.
This makes it much faster, and it is the way it should be
(according to the spec).
- 'uselang' is no longer accepted, per spec.
- Bot flag allowed.
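The first point, pushing only the changed properties, amounts to
diffing the item's last-fetched state against its edited state. A rough
sketch of the idea (not the actual Pywikidata code; all names here are
made up):

```python
def changed_properties(original, current):
    """Return only the key/value pairs that differ between the item as
    fetched from the server (original) and as edited locally (current).
    Deleted keys are reported with the value None."""
    diff = {}
    for key in set(original) | set(current):
        if original.get(key) != current.get(key):
            diff[key] = current.get(key)
    return diff
```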
Right now there's some more stuff left to do (including adapting to
changes to the API, and adding the claim system [still in beta]), but are
there any feature requests for things that would be useful for bots?
Regarding Pywikipediabot integration, I think someone is working on it
(thanks!); still, I think it would be better to wait. Pywikipedia is a
mature project while Pywikidata is still constantly evolving (mostly due
to API changes, which break it). So I'd wait until Pywikidata has
matured too and Wikidata is deployed to the WMF wikis.
By the way, pull requests will be gladly accepted :).
Joan Creus (joancreus@freenode, Joancreus@wikipedia)
Hi all,
I have a question or rather proposal regarding the JSON representation [1].
The "Geo" example on the JSON page[1] implies that there won't be a
fixed representation for data types. Instead of the "value" key that
all the other examples use, the Geo example uses "longitude" and
"latitude". Wouldn't a representation like the following be more
appropriate?
"value": {
"latitude" : 32.233,
"longitude" : -2.233,
}
That is, if "snaktype": "value", then there has to be a "value" key
with a data type specific value object.
Something that IMHO would also be useful is a way to specify the data
type; this could be optional. For the Geo example, something like the
following would make sense:
"datatype": "geo"
Without such a definition, a consumer would have to derive the data
type from the keys and/or the lexical representation of the values,
which would usually be a tough task.
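Under the proposed shape, a consumer could dispatch on the explicit
"datatype" key instead of sniffing the value's structure. A small
sketch (the type name "geo" and the helper function are assumptions
taken from this mail, not from the actual spec):

```python
import json

# Hypothetical snak following the proposed representation: an explicit
# "datatype" plus a type-specific object under a single "value" key.
snak = json.loads("""
{
  "snaktype": "value",
  "datatype": "geo",
  "value": { "latitude": 32.233, "longitude": -2.233 }
}
""")

def decode_value(snak):
    """Dispatch on the declared datatype instead of guessing from keys."""
    if snak["snaktype"] != "value":
        return None
    if snak["datatype"] == "geo":
        v = snak["value"]
        return (v["latitude"], v["longitude"])
    raise ValueError("unknown datatype: " + snak["datatype"])
```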
Cheers,
Andreas
1. https://meta.wikimedia.org/wiki/Wikidata/Development/Phase_2_JSON
--
Telefon/Phone +49 6441 87087-32 · Telefax/Fax +49 6441 87087-17
E-Mail a.schultz(a)mes-semantics.com · Web http://mes-semantics.com
________________________________________________________________
MediaEvent Services GmbH & Co. KG
Berlin Office: Stendaler Straße 4 · 10559 Berlin · Germany
Wetzlar Office: Charlotte-Bamberg-Str. 6 · 35578 Wetzlar · Germany
Handelsregister/Commercial Register: Amtsgericht Wetzlar HRA 4015
USt-IdNr./VAT Reg.No. DE206509024
Komplementärin: MediaEvent Services Verwaltungs GmbH
Handelsregister/Commercial Register: Amtsgericht Wetzlar HRB 5079
Geschäftsführer/Managing Director: Tim Ebert
________________________________________________________________
Hi,
Lydia mentioned in her summary a major discussion about Wikidata in
the Hebrew Wikipedia. The discussion was in Hebrew, of course, so I'll
give a short summary of it here.
Eleven people supported the installation of Wikidata. Nobody objected \o/
Despite the wide support, some issues and questions were raised:
1. How is the coordination with interwiki links bot operators progressing?
Will the bots be smart enough not to do anything to articles that are
already listed in the repository and have the correct links displayed?
Will the bots be smart enough to update the repo in the transition
period, when some Wikipedias have Wikidata and some don't?
Will the bots be smart enough not to do anything with articles that
have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?
2. What are the numbers after the Q in the titles on the repo site?
I replied that they are just sequential identifiers without any
additional meaning. Maybe this could be added to the FAQ.
3. Several people complained about instability in the links editing
pages in the repo: They saw messages about network problems when they
tried to edit links. I experienced this a couple of times, too. I also
saw a complete crash with a "memory full" error once.
4. Somebody noticed that the testing sites don't support unified
accounts (CentralAuth). The production system will, right?
5. Somebody complained that it's too easy to remove a link from a repo
- clicking the "remove" link is enough. I mentioned it in a bug
report:
https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
6. And this is probably the biggest issue: The workflow for adding an
interlanguage link is cumbersome and in some cases the interface
elements are undiscoverable.
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore