On 22 May 2018 at 13:01, <wikitech-l-request(a)lists.wikimedia.org> wrote:
> Send Wikitech-l mailing list submissions to
> wikitech-l(a)lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> or, via email, send a message with subject or body 'help' to
> wikitech-l-request(a)lists.wikimedia.org
>
> You can reach the person managing the list at
> wikitech-l-owner(a)lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikitech-l digest..."
>
>
> Today's Topics:
>
> 1. TechCom Radar 2018-05-19 (Kate Chapman)
> 2. [nominations needed] Wikimedia Technical Conference
> (Deborah Tankersley)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 21 May 2018 13:42:49 -0700
> From: Kate Chapman <kate(a)cascadiatm.com>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: [Wikitech-l] TechCom Radar 2018-05-19
> Message-ID: <a56da236-11a4-6898-9102-a171a15882b8(a)cascadiatm.com>
> Content-Type: text/plain; charset=utf-8
>
> Hi All,
>
> Here are the minutes from this week's TechCom meeting:
>
> TechCom held a public meeting at the Hackathon in Barcelona. These
> minutes are the result of that meeting.
>
> * Detailed Meeting Notes:
> <https://etherpad.wikimedia.org/p/wmhack2018-techcom>
>
> * No public IRC RFC meeting on 2018-05-23
>
> * No internal TechCom meeting on 2018-05-23
>
> * There is a request for feedback on the future of extension management
> (not an official TechCom RFC yet, but is being monitored)
> <https://www.mediawiki.org/wiki/Requests_for_comment/Extension_management_feedback>
>
> * Held a public discussion of TechCom’s Platform Architecture Principles
> <https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/MediaWiki_Platform_Architecture_Principles>
> Notes:
> <https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/MediaWiki_Platform_Architecture_Principles/Hackathon_Notes_2018>
>
> * TechCom-RFC Inbox was empty so no new RFCs were triaged:
> <https://phabricator.wikimedia.org/tag/techcom-rfc/>
>
> You can also find our meeting minutes at
> <https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>
>
> See also the TechCom RFC board
> <https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.
>
> --
> Kate Chapman
> TechCom Facilitator (Contractor)
>
>
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 22 May 2018 00:47:53 +0200
> From: Deborah Tankersley <dtankersley(a)wikimedia.org>
> To: wikitech-l <wikitech-l(a)lists.wikimedia.org>, A public mailing
> list about Wikimedia Search and Discovery projects
> <discovery(a)lists.wikimedia.org>
> Subject: [Wikitech-l] [nominations needed] Wikimedia Technical
> Conference
> Message-ID:
> <CABo2faawcr0soHZn4ju5xThQPD2dog+rypLYG3mMm9fAGfK4og@mail.gmail.com>
> Content-Type: text/plain; charset="UTF-8"
>
> Hello,
>
> We recently announced the new Wikimedia Technical Conference (TechConf)
> during the closing session of the Barcelona Hackathon on May 20, 2018. We
> are sending this email to give an update on the planning and organization,
> and to let everyone know how the nomination process will work for those
> interested in attending.
>
> The Wikimedia Technical Conference will take place in Portland, OR, USA on
> October 22-25, 2018. As mentioned in previous emails [1][2] and on the
> wiki page [3], this conference will focus on the cross-departmental
> program called Platform Evolution. We will provide more information and
> context as the process moves along.
>
> For this conference, we are looking for diverse stakeholders,
> perspectives, and experiences that will help us make informed decisions
> about the future evolution of the platform. We need people who can create
> and architect solutions, as well as those who make decisions on funding
> and prioritization for the projects.
>
> Later this week, we will send out a form with more detailed information on
> the nomination process and how to nominate people (including yourself) to
> attend this conference, along with the skills, experiences, and/or
> backgrounds that we are looking for. Due to the time needed for visa
> applications and other constraints, the deadline for nominations will be
> June 8th. Please make sure that you don’t miss the deadline!
>
> If you have any questions, please post them on the talk page [4].
>
> [1] https://lists.wikimedia.org/pipermail/mediawiki-l/2018-April/047367.html
> [2] https://lists.wikimedia.org/pipermail/wikitech-l/2018-April/089738.html
> [3] https://mediawiki.org/wiki/Wikimedia_Technical_Conference/2018
> [4] https://www.mediawiki.org/wiki/Talk:Wikimedia_Technical_Conference/2018
>
> Cheers from the Program Committee:
> Kate, Corey, Joaquin, Greg, Birgit and TheDJ
>
> --
>
> deb tankersley
>
> Program Manager, Engineering
>
> Wikimedia Foundation
>
>
> ------------------------------
>
> End of Wikitech-l Digest, Vol 178, Issue 36
> *******************************************
>
The Wikidata development team released to production the first version of
Wikidata support for lexicographical data:
https://lists.wikimedia.org/pipermail/wikidata/2018-May/012090.html.
A large issue of *The Signpost* was published with several
thought-provoking pieces:
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2018-05-24.
Multilingual captions will be an early feature of structured data on
Commons. A request for feedback is here:
https://commons.wikimedia.org/wiki/Commons:Structured_data/Get_involved/Feedback_requests/Multilingual_Captions_and_MediaInfo
What's making you happy this week? You are welcome to comment in any
language.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
Hello everyone,
To better serve the technical communities that build free and open source software for the movement, as well as the communities who use Wikimedia's APIs to interact with our projects, the Wikimedia Foundation is making some structural changes. The Technical Engagement team is a new team in the Technology department of the Wikimedia Foundation, reporting to the Foundation's Chief Technology Officer (CTO), Victoria Coleman. This new team has two sub-teams: the Wikimedia Cloud Services team and the Technical Advocacy team. Bryan Davis will manage the Technical Engagement team. He will also lead the hiring process for a new Developer Advocacy Manager position, which will take over some of the management duties.
The Wikimedia Cloud Services team will continue to focus on maintaining the Wikimedia Cloud VPS infrastructure as a service <https://en.wikipedia.org/wiki/Cloud_computing#Infrastructure_as_a_service_.…> platform, the Toolforge platform as a service <https://en.wikipedia.org/wiki/Platform_as_a_service> project, and additional supporting technologies used in the Cloud Services environment such as the Wiki Replica databases and the hosting infrastructure for dumps.wikimedia.org <https://dumps.wikimedia.org/>. The existing team of Andrew Bogott, Arturo Borrero Gonzalez, Brooke Storm, and Chase Pettet will be joined by James Hare in the role of Product Manager. The team is also hiring for a fifth Operations Engineer and for a part-time technical support contractor.
The Technical Advocacy team will focus on creating improved documentation for Wikimedia APIs and services as well as providing support for technical contributors and API consumers. The new team is being formed by moving the Foundation's Developer Relations team to the Technology department, with the exception of Rachel Farrand who will remain in Community Engagement in close collaboration with other event organizers. Andre Klapper and Srishti Sethi are both taking the role of Developer Advocate in the new team. A developer advocate is someone whose primary responsibility is to make it easy for developers to use a platform. Typically they do this by producing example software, tutorials, and other documentation explaining how to use the platform's products and services. Sarah R. Rodlund will also be joining the team as a Technical Writer. Technical writing has many subspecialties. Sarah will be focusing on improving our existing documentation by helping create a style guide and editing existing documentation to fit with that guide. She will also be supporting volunteers who are interested in practicing their technical writing skills on Wikimedia documentation. The team will be hiring for a Developer Advocacy Manager role in July. This new person will help round out the skills of the team and will take the lead in developing their programs.
The Technical Engagement team will work with other teams inside the Wikimedia Foundation as well as groups at affiliate organizations and the larger Wikimedia volunteer community to provide technical outreach services and support. We hope to continue to grow the number of people involved in our programs until we can confidently say that we are providing the best help possible to the hundreds of volunteer developers, designers, technical writers, and end users of the Wikimedia movement's APIs and services. We will continue to be involved in existing programs to attract and support new technical contributors like the Wikimedia Hackathons, Outreachy, and Google Summer of Code. We also hope to find new ways to connect with new and existing technical contributors as we support the Wikimedia movement's 2030 strategic direction and the shared goals of knowledge as a service and knowledge equity.
Very excited to be getting started down the path of strengthening our developer advocacy program!
Best wishes,
Victoria Coleman
Chief Technology Officer
Wikimedia Foundation
1 Montgomery Street, Suite 1600
San Francisco, CA 94104
+1-650-703-8112
vcoleman(a)wikimedia.org
Hello!
With MediaWiki core now requiring PHP 7.0 or HHVM as the minimum
version, CodeSniffer will also require the same, and will target those
versions as the minimum for the code it checks, starting with
version 20.0.0.
19.0.0 contains new sniffs and bug fixes, and is the last version to
support PHP 5.5 and 5.6. I will have libraryupgrader upgrade all
repositories directly to 20.0.0. If you want to continue to support
older PHP versions in your project, upgrade your project to use 19.0.0
directly. libraryupgrader will skip projects using that version, since
it means you explicitly want to support older PHP versions.
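For projects that need to stay on 19.0.0, the pin in composer.json might look like the following sketch (this assumes the sniffs are declared under require-dev, as is typical; merge with your project's existing dev dependencies):

```json
{
    "require-dev": {
        "mediawiki/mediawiki-codesniffer": "19.0.0"
    }
}
```

Using an exact version rather than a caret constraint is what signals to libraryupgrader that the project should be skipped.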
Here's the changelog for 19.0.0:
* Add break and continue to ParenthesesAroundKeywordSniff (Umherirrender)
* Check if the default of null is in the type list of @param (Umherirrender)
* Do not enforce name for traits with phpunit annotations (Umherirrender)
* Don't allow 'iterable' type hint (Kunal Mehta)
* Prevent usage of nullable and void type hints (Kunal Mehta)
* Prohibit PHP's vanilla execution (Max Semenik)
* Reorganize PHP 7.0 compatibility sniffs into a category (Kunal Mehta)
And for 20.0.0:
* Require PHP 7 or HHVM to run (Kunal Mehta)
* Document why we still need to keep ScalarTypeHintUsageSniff (Kunal Mehta)
* Drop PHP7UnicodeSyntaxSniff sniff (Kunal Mehta)
-- Legoktm
Hi,
MediaWiki core's master branch now requires a minimum of PHP 7.0 or
HHVM[1], as decided upon in the RfC. CI has been updated to no longer
run PHP 5.5 tests.
There are already quite a few patches[2] to remove code that is no longer
necessary, or to start using new features :) But out of caution,
we're waiting until after next week's Wikimedia train deploy before
merging those, to ensure the requirement bump doesn't need to be
reverted for any reason.
It's important to remember that while nearly all of the new PHP 7.0
features can now be used, they still need to be supported in HHVM
(with hhvm.php7.all disabled). Notably, this means that scalar type
hints cannot be used. PHPCS should prevent those features from being
used (more on that in a separate email).
Thanks to everyone who made this finally happen, especially the
Wikimedia Operations team :)
-- Legoktm
[1] https://gerrit.wikimedia.org/r/#/c/405216/
[2] https://gerrit.wikimedia.org/r/#/q/topic:new-php
Sorry for cross-posting!
Reminder: Technical Advice IRC meeting again **today, Wednesday 3-4 pm
UTC** on #wikimedia-tech.
The Technical Advice IRC meeting is open to all volunteer developers,
topics, and questions. These can be anything from "how to get started" to
"who would be the best contact for X" to specific questions about your project.
If you know already what you would like to discuss or ask, please add your
topic to the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
Hope to see you there!
Michi (for WMDE’s tech team)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us make it happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
(Society for the Promotion of Free Knowledge). Registered in the register
of associations of the Amtsgericht Berlin-Charlottenburg under number
23855 B. Recognized as charitable by the Finanzamt für Körperschaften I
Berlin, tax number 27/681/51985.
INTRODUCTION
Some technical topics are broached here relating to the crowdsourcing of dialogue system content and behavior.
COLLABORATIVE AUTHORING
The content and behavior of a dialogue system can be represented in a number of ways.
Firstly, the content and behavior of a dialogue system can be represented in programming language source code files. Collaborative authoring, in this case, is a matter of integrated development environments, source code repositories, and version control systems.
Secondly, some or all of the content and behavior of a dialogue system can be separated from the source code files as data stored in some data format or in a database. Collaborative authoring, in this case, could require custom software tools.
Thirdly, a number of services, cognitive services, can encapsulate the content and behavior of a dialogue system. Collaborative authoring, in this case, could require utilization of such services or related user interfaces.
Fourthly, the content and behavior of a dialogue system can be represented as a set of interrelated, URL-addressable, editable pages. Servers can provide content for a number of different content types, for example hypertext for content authors and other formats for dialogue system user agents. Server-side scripting can be utilized to generate pages and generated pages can contain client-side scripting.
Fifthly, the content and behavior of a dialogue system can be represented as a set of interrelated, URL-addressable, editable diagrams.
Sixthly, the content and behavior of a dialogue system can be represented in transcript form. Transcript-based user interfaces may resemble instant messaging applications, scrollable sequences of speech bubbles, with speech bubbles coming from the left and right sides, such that users can edit the content in dialogue systems’ speech bubbles. Users could opt to view more than plain text in speech bubbles. There could also be vertical, colored bands in one or both margins, visually indicating discourse behaviors, moves, objectives or plans which span one or multiple utterances.
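As a rough illustration of the transcript representation just described (all names here are invented for this sketch, not taken from any existing tool), a transcript might be modeled as an editable sequence of speech bubbles, each tagged with a speaker side and optional discourse moves:

```python
from dataclasses import dataclass, field

@dataclass
class Bubble:
    speaker: str  # "system" renders on the left, "user" on the right
    text: str     # editable content shown in the bubble
    # Discourse behaviors, moves, objectives or plans; these would back
    # the vertical colored bands in the margins.
    moves: list = field(default_factory=list)

@dataclass
class Transcript:
    bubbles: list = field(default_factory=list)

    def edit(self, index: int, new_text: str) -> None:
        """Users can edit the content in the system's speech bubbles."""
        self.bubbles[index].text = new_text

# A two-turn transcript; the user then rewrites the system's bubble.
t = Transcript([
    Bubble("system", "Hello! How can I help?", moves=["greeting"]),
    Bubble("user", "What is a wiki dialogue system?"),
])
t.edit(0, "Hi! How can I help you today?")
```

The `edit` call mirrors a user rewriting a system speech bubble in place, while the `moves` list carries the spans that the marginal bands would visualize.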
COLLABORATIVE DEBUGGING
Debugging dialogue systems is an important topic. Debugging scenarios include switching from interactions with dialogue systems to authoring processes such that dialogue context data is preserved.
NATURAL LANGUAGE GENERATION AND UNDERSTANDING
Natural language generation can produce editable structured documents from the data stored in databases and knowledgebases. Generated content can contain, beyond natural language, data and program logic to facilitate the processing of constrained or unconstrained edits. Edits to generated content can result in changes to stored data.
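A minimal sketch of that round trip, with invented names and a deliberately constrained edit format: a sentence is generated from stored data, and an edit to the generated sentence is parsed back into a change to that data.

```python
import re

# Toy knowledge store standing in for a database or knowledgebase.
data = {"capital": {"France": "Paris"}}

def generate(country: str) -> str:
    """Generate a natural language sentence from stored data."""
    return f"The capital of {country} is {data['capital'][country]}."

def apply_edit(country: str, edited: str) -> None:
    """Process a constrained edit: only the object of the sentence may
    change. Edits outside the constrained region are rejected rather
    than silently written back to the store."""
    # Note: assumes `country` contains no regex metacharacters.
    m = re.fullmatch(rf"The capital of {country} is (.+)\.", edited)
    if m is None:
        raise ValueError("edit falls outside the constrained region")
    data["capital"][country] = m.group(1)

# A user edits the generated sentence; the store is updated to match.
apply_edit("France", "The capital of France is Lyon.")
```

Regenerating the sentence after the edit reflects the updated store, illustrating how edits to generated content can result in changes to stored data.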
COMPUTER-AIDED WRITING
Computer-aided writing can assist content authors and help assure quality. Software can, generally speaking, provide users with information, warnings and errors with regard to tentative edits. Software can support users with regard to spelling, grammar, word selection, readability, and text coherence and cohesion. Software can measure the neutral point of view of natural language. Software can also check tentative edits for logical consistency with respect to data stored in databases and knowledgebases.
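As a sketch of that kind of checking (the individual checks and names are illustrative, not a real tool), software might grade a tentative edit into information, warnings, and errors, including a crude logical-consistency check against stored data:

```python
import re

def check_edit(text: str, capitals: dict) -> list:
    """Return (severity, message) findings for a tentative edit."""
    findings = []
    # Information: minor stylistic issue.
    if "  " in text:
        findings.append(("info", "contains a double space"))
    # Warning: readability heuristic on sentence length.
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if len(sentence.split()) > 40:
            findings.append(("warning", "long sentence may hurt readability"))
    # Error: consistency against stored data, for one toy claim shape.
    for m in re.finditer(r"(\w+) is the capital of (\w+)", text):
        city, country = m.group(1), m.group(2)
        expected = capitals.get(country)
        if expected is not None and expected != city:
            findings.append(("error",
                f"inconsistent with stored data: "
                f"the capital of {country} is {expected}"))
    return findings
```

A real system would draw its claim patterns and facts from the knowledgebase rather than hard-coding one sentence shape, but the severity-graded findings are the point of the sketch.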
WIKI DIALOGUE SYSTEMS
Exploration into the collaborative authoring and debugging of dialogue systems could result in new wiki technologies. Wiki dialogue systems could resemble spoken language dialogue systems with transcript-based user interfaces, users able to easily switch between dialogue-based interactions and the editing of dialogue system content and behavior.
Best regards,
Adam Sobieski
http://www.phoster.com/contents/
We have a very old wiki which has basically never been updated for the past
decade and which was proving stubbornly resistant to updating several years
ago. And now the owner of the server has drifted away, but we do still
have control over the domain name itself. The best way that we can think
of to update everything is to scrape all of the pages/files, add them to a
brand-new, updated wiki on a new server, then point the domain to that new
server. Yes, user accounts will be broken, but we feel that this is the
most feasible solution unless someone else has another idea.
However, there are a lot of pages on meritbadge.org, which is the wiki I'm
talking about. Any suggestions for how to automate this scraping process?
I can scrape the HTML off every page, but what I really want is the
wikitext of every page.
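One way to automate this, assuming the old wiki's api.php is still reachable (the endpoint URL below is a guess; check Special:Version or the page source for the real script path), is to walk list=allpages and fetch each page's wikitext via prop=revisions. A very old MediaWiki may return continuation data under "query-continue" rather than "continue", so the loop may need adjusting:

```python
import json
import urllib.parse
import urllib.request

API = "https://meritbadge.org/wiki/api.php"  # assumed endpoint; verify first

def api_get(api_url: str, params: dict) -> dict:
    """Perform one MediaWiki API request and decode the JSON response."""
    query = urllib.parse.urlencode({**params, "format": "json"})
    with urllib.request.urlopen(f"{api_url}?{query}") as resp:
        return json.load(resp)

def all_titles(api_url: str):
    """Yield every page title using list=allpages with continuation."""
    params = {"action": "query", "list": "allpages", "aplimit": "500"}
    while True:
        data = api_get(api_url, params)
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

def extract_wikitext(data: dict) -> str:
    """Pull the raw wikitext out of a prop=revisions query result."""
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["*"]

def dump_all(api_url: str, out_dir: str = ".") -> None:
    """Fetch wikitext for every page and write one .wiki file each."""
    for title in all_titles(api_url):
        data = api_get(api_url, {"action": "query", "prop": "revisions",
                                 "rvprop": "content", "titles": title})
        safe = title.replace("/", "__")
        with open(f"{out_dir}/{safe}.wiki", "w", encoding="utf-8") as f:
            f.write(extract_wikitext(data))

# To run the scrape (network access required):
# dump_all(API)
```

Alternatives worth considering: Special:Export can emit an XML dump for batches of titles, which importDump.php can load into the new wiki with history intact, and if anyone regains shell access to the old server, dumpBackup.php produces a full XML dump directly.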
Bart Humphries
bart.humphries(a)gmail.com
(909)529-BART(2278)