I'm referencing [1] in an attempt to get Cirrus Search (Elasticsearch) to
search and return results from Flow boards in MW 1.25. Some of the scripts
like FlowSearchConfig.php don't seem to be in REL1_25, but they do seem to
be in REL1_26. Does this mean I won't be able to get this to work in 1.25
or is there another (older) set of instructions on configuring ES to
search Flow content?
Daren
[1] https://www.mediawiki.org/wiki/Flow/Architecture/Search
--
__________________
http://enterprisemediawiki.org
http://mixcloud.com/darenwelsh
http://www.beatportfolio.com
On Wed, Dec 2, 2015 at 1:58 PM, Grace Gellerman <ggellerman(a)wikimedia.org>
wrote:
<snip>
> === Reading Infrastructure ===
> * Block: ApiSandbox is still blocked on whoever owns oojs-ui for
> https://phabricator.wikimedia.org/T91148 - ACTION! let's figure out an
> alternative approach (James F suggests separate component in mw.Widgets)
</snip>
Could people weigh in on the ticket about what it might take to get this
into mw.Widgets instead? James noted, as has been discussed on this list,
that at the moment people are leery of adding any further bloat to oojs-ui
(particularly if it's a nontrivial number of KB like this). A fuller
solution would entail splitting out critical oojs-ui components in a manner
where they're more easily loaded piecemeal, but it sounded like that's more
involved.
I do wonder, though, whether we've spent much time studying how easy it
would be to split out at least some part of oojs-ui, or to make new
functionality going forward part of the oojs-ui family without being as
monolithic?
-Adam
I'm not very familiar with this, but wouldn't this need a bigger change in LinksUpdate? Put another way: how would a wiki know if a page gets created after it was linked, so it could mark the link blue instead of red?
Sent from my HTC
----- Reply to message -----
From: "Alex Monk" <krenair(a)gmail.com>
To: "Wikimedia developers" <wikitech-l(a)lists.wikimedia.org>
Subject: [Wikitech-l] Ifexists across wikis
Date: Sun., Dec. 6, 2015 18:04
I don't think there is a way to get a database name from an interwiki
prefix.
Also, whether a page is known or not does not just depend on a simple
database lookup. Extensions can add arbitrary rules about which titles
should be considered known or not. EducationProgram, GlobalUserPage, and
WikimediaIncubator all do this.
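For illustration, here is a minimal sketch of the kind of rule Alex
describes, using MediaWiki's TitleIsAlwaysKnown hook (which is the
mechanism extensions like GlobalUserPage rely on). The namespace check
below is a made-up example, not what any of those extensions actually
does:

```php
<?php
// Sketch for LocalSettings.php / an extension setup file.
// Force every page in a given namespace to be treated as "known"
// (i.e. rendered as a blue link) even if it does not exist locally.
$wgHooks['TitleIsAlwaysKnown'][] = function ( Title $title, &$isKnown ) {
	// NS_USER is just an example; a real extension would apply its
	// own logic here, e.g. consult a central wiki's database.
	if ( $title->inNamespace( NS_USER ) ) {
		$isKnown = true;
	}
	return true;
};
```

Because any extension can register such a hook, "does this title exist?"
is not answerable by a single database lookup.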
On 6 December 2015 at 16:26, Lars Aronsson <lars(a)aronsson.se> wrote:
> If I write a [[link]] it will be blue if the page exists and red otherwise.
> But if I write [[:sw:link]] that will be an external or cross-wiki link,
> that is never red, as if it were impossible to know whether that page
> existed in Swahili Wikipedia.
>
> But determining the existence of a page is just a quick database table
> lookup, and all databases run on WMF's servers, so it shouldn't be more
> expensive to look up a cross-wiki link, as long as it is one of WMF's
> wikis.
>
> In Wiktionary, it is common to link to entries in foreign languages both
> on the local wiki and to the native wiki for that language. For example,
> in English Wiktionary the entry for "blue" links to the Swahili word
> "bluu"
> both on en.wiktionary and on sw.wiktionary, using the template
> {{t+|sw|bluu}}.
>
> https://en.wiktionary.org/wiki/blue#Translations
>
> But since the Afrikaans translation "blou" doesn't have an entry on the
> Afrikaans Wiktionary, another template is used: {{t|af|blou}}. And it is
> a pain to know which one of these two templates to use. If it were possible
> in {{#ifexists}} to determine the existence of a page in another wiki,
> only one template would be needed, and the bot job to change to the right
> template would not be needed.
>
> #ifexist already works across namespaces (well, of course), so is there any
> good reason it shouldn't work across wikis?
>
> Oddly, the documentation says #ifexist is an "expensive" parser function.
> That doesn't make much sense to me. It's as if red/blue links were
> expensive, and most of our list pages should be banned.
> https://www.mediawiki.org/wiki/Help:Extension:ParserFunctions#.23ifexist
>
>
> --
> Lars Aronsson (lars(a)aronsson.se)
> Aronsson Datateknik - http://aronsson.se
>
>
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> * Two questions for Services or Multimedia or whoever:
> https://phabricator.wikimedia.org/T66214#1842439 - ACTION! email thread,
> keep discussion on task, though
Here's another one from the Scrum of Scrums a couple of days earlier. Just a
nudge to keep the discussion moving. Please do comment.
-Adam
Please see below if you want to follow test failure messages for some of
the key Reading Web (formerly Mobile web) technologies.
-Adam
---------- Forwarded message ----------
From: Bahodir Mansurov <bmansurov(a)wikimedia.org>
Date: Wed, Nov 25, 2015 at 5:53 PM
Hello all,
As you may know, we used to get emails about browser test failures for
MobileFrontend, Gather, and QuickSurveys on this mailing list (along
with qa-alerts(a)lists.wikimedia.org). Some people seem to have muted
those (rightly so if they are not interested), while other interested
people may not have been getting them because they are not subscribed to
reading-wmf. So, in order to reach a broader audience and stop bugging
folks who are not interested in these emails, we will no longer send the
automated emails to this list.
I encourage the reading web team, its product manager, editing team
members who contribute to MobileFrontend, and anyone else who likes
reading emails that start with the word "FAILURE:" in bold red to
subscribe to qa-alerts(a)lists.wikimedia.org.
I used the following filter to only get the emails I'm interested in:
to:(qa-alerts@lists.wikimedia.org) (MobileFrontend OR QuickSurveys OR Gather)
Thank you,
Baha
Hello,
We often have the case of a change to an extension depending on a
pending patch to MediaWiki core. I upgraded our CI scheduler -
Zuul - a couple of weeks ago, and it now supports marking dependencies
even across different repositories.
Why does it matter? To make sure the dependency is fulfilled, one
usually either:
* CR-2s the patch until the dependent change is merged, or
* writes a test that exercises the required patch in MediaWiki.
With the first solution (no test), once both are merged, nothing
prevents someone from cherry-picking a patch without the patch it
depends on - for example, for MediaWiki minor releases or Wikimedia
deployment branches.
When a test covers the dependency, it fails until the dependent change
is merged, which is rather annoying.
Zuul now recognizes the header 'Depends-On' in commit messages, similar
to 'Change-Id' and 'Bug'. 'Depends-On' takes a change-id as its
parameter, and multiple ones can be added.
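A commit message footer using it might look like this (the change-ids
below are placeholders, not real changes):

```
Use the new core hook in the extension

Change-Id: Iaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
Depends-On: Ibbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
Depends-On: Icccccccccccccccccccccccccccccccccccccccc
```

Each 'Depends-On' line names the change-id of a patch, possibly in
another repository, that this one needs.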
When a patch is proposed in Gerrit, Zuul looks for Gerrit changes
matching its 'Depends-On' headers and verifies whether any are still
open. If so, it crafts git references for the open patches so all the
dependencies can be tested as if they had been merged.
Real world example
------------------
The ContentTranslation extension is tested together with the Wikidata
one and was not passing its tests. Wikidata created a patch, and we did
not want to merge it until we had confirmed that ContentTranslation
passes properly with it.
The Wikidata patch is https://gerrit.wikimedia.org/r/#/c/252227/
Change-Id: I0312c23628d706deb507b5534b868480945b6163
On ContentTranslation we indicated the dependency:
https://gerrit.wikimedia.org/r/#/c/252172/1..2//COMMIT_MSG
+ Depends-On: I0312c23628d706deb507b5534b868480945b6163
That change-id is the Wikidata patch.
Zuul:
* received the ContentTranslation patch
* looked up its change-id and found the Wikidata patch
* created git references in both repos pointing to the proper patches
Jenkins:
* zuul-cloner cloned both repos and fetched the references created by
the Zuul service
* ran the tests
* SUCCESS
That confirmed to us that the Wikidata patch actually fixed the issue
for ContentTranslation. Hence we CR+2'd both, and everything merged
fine.
Please take a moment to read upstream documentation:
http://docs.openstack.org/infra/zuul/gating.html#cross-repository-dependenc…
Wikidata/ContentTranslation task:
https://phabricator.wikimedia.org/T118263
--
Antoine "hashar" Musso