Hey guys,
I've been running my own MediaWiki for a few months now.
For mobile devices we use the MobileFrontend Extension, which works
great out of the box.
Now I've received a request to serve the desktop and mobile frontends on
different domains.
It is explained here:
https://www.mediawiki.org/wiki/Extension:MobileFrontend
Therefore I disabled the autodetection ($wgMFAutodetectMobileView) and
set the mobile URL template ($wgMobileUrlTemplate) for our domain
to de-m.%h0.%h1, which should result in the mobile view being rendered
on de-m.induux.com.
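For reference, here is how I understand the relevant LocalSettings.php lines should look (a sketch based on the values above, in case I mistyped something):

```php
// Disable automatic device detection so the domain alone decides the view
$wgMFAutodetectMobileView = false;

// %h0, %h1, ... are the dot-separated segments of the host name,
// so induux.com should become de-m.induux.com
$wgMobileUrlTemplate = 'de-m.%h0.%h1';
```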
Did I forget something?
Thanks in advance for your help.
Best regards
Daniel Hauck
Hi,
Is there a way to get a list of messages that are used in the wiki, for example with {{int: a-message }}, but for which no message is defined?
Thanks!
Ad
On ArchWiki, setting rdnamespace to any value (for example [1]) results in
an internal database query error:
{
    "error": {
        "code": "internal_api_error_DBQueryError",
        "info": "[e0c8b223] Database query error"
    }
}
The query works fine without the rdnamespace parameter. Also note that the same
query on e.g. Wikipedia is completely fine even with the rdnamespace parameter [2].
I've been able to reproduce this on localhost on my testing wiki simply by
upgrading from 1.22 to 1.25 (ArchWiki upgraded to 1.25 from 1.23). I've tried
running the refreshLinks.php maintenance script [3], but it did not fix the issue.
Any ideas on what is going on and how to fix this?
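To see the actual SQL error hiding behind the opaque [e0c8b223] reference, it may help to temporarily enable exception details in LocalSettings.php (standard debugging settings; they should be turned off again on a public wiki):

```php
// Temporarily show the real exception and SQL error text in API
// responses instead of the short reference code
$wgShowExceptionDetails = true;
$wgShowSQLErrors = true;
$wgShowDBErrorBacktrace = true;
```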
[1]: https://wiki.archlinux.org/api.php?action=query&titles=Main%20page&prop=red…
[2]: https://en.wikipedia.org/w/api.php?action=query&titles=Main%20Page&prop=red…
[3]: https://www.mediawiki.org/wiki/Manual:RefreshLinks.php
--
jlk
Hey guys,
I'm having a bit of trouble connecting to a MySQL database, and I'd love a
little help with this problem!
I have my wiki site spread out amongst three t2.micro instances on the AWS
free tier. Free of fee is the way to go! ;)
The site is load balanced behind two Varnish nodes, which are themselves
load balanced behind two HAProxy nodes.
The MySQL database was housed on one node that all three web nodes
could access. That worked absolutely fine!! But the db had the
annoying tendency to crash every once in a while.
So I came up with the idea of making the database HA as well. I fired up
some more free-tier nodes: two for HAProxy and two more for master/master
MySQL. I then imported my MediaWiki database into both MySQL nodes.
Whenever I try to use one of the new databases I set up, the site goes down
with the message:
"Sorry! This site is experiencing technical difficulties.
Try waiting a few minutes and reloading.
(Cannot access the database)"
However, I found that I can access the database from each of the web
nodes on the MySQL command line. I can reach both the database VIP
that's load balanced behind keepalived/HAProxy and either of the two
database nodes sitting behind the VIP individually.
For example, from the first web host to the database VIP, using the database
credentials I have in LocalSettings.php:
[root@ops:~] #mysql -uadmin -p -h db.example.com -e "show databases"
Enter password:
+--------------------+
| Database |
+--------------------+
| certs |
| information_schema |
| jfwiki |
| mysql |
| performance_schema |
+--------------------+
Again from the first web host, I can do the same to both database nodes
individually:
[root@ops:~] #mysql -uadmin -p -h db1.example.com -e "show databases"
Enter password:
+--------------------+
| Database |
+--------------------+
| information_schema |
| jfwiki |
| mysql |
| performance_schema |
+--------------------+
[root@ops:~] #mysql -uadmin -p -h db2.example.com -e "show databases"
Enter password:
+--------------------+
| Database |
+--------------------+
| certs |
| information_schema |
| jfwiki |
| mysql |
| performance_schema |
+--------------------+
I've also verified that I can use the wiki database with the account
information in LocalSettings.php.
So what I would like to know is: if I can contact and use the wiki database
from each web host to each database on the command line, VIP and all, why
can't MediaWiki work with it? I would think that MediaWiki could use any
database you can contact and use on the command line. I've also disabled
SELinux on all hosts just to be sure that wasn't causing an issue.
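For what it's worth, the MediaWiki side of this boils down to a few LocalSettings.php variables. Here's a sketch using the host and database names from the tests above (db.example.com, jfwiki, admin come from my examples; the password is obviously a placeholder):

```php
// Point MediaWiki at the load-balanced VIP rather than an individual node.
// These mirror the command-line tests shown above.
$wgDBtype     = 'mysql';
$wgDBserver   = 'db.example.com';   // the keepalived/HAProxy VIP
$wgDBname     = 'jfwiki';
$wgDBuser     = 'admin';
$wgDBpassword = 'secret';           // placeholder
```

One thing worth double-checking: MediaWiki connects from PHP, so the MySQL grants have to match the web nodes' source addresses as the server sees them (e.g. admin@'%' vs. admin@'localhost'), and HAProxy's connection timeouts need to be long enough for PHP's database connections.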
I'd be very glad to get any suggestions on how to fix this!!
Thanks,
Tim
How can I troubleshoot this?
--
GPG me!!
gpg --keyserver pool.sks-keyservers.net --recv-keys F186197B
Hi,
I just submitted my site to
https://developers.google.com/speed/pagespeed/insights/ and I get:
Consider Fixing: Leverage browser caching: Setting an expiry date
or a maximum age in the HTTP headers for static resources instructs the
browser to load previously downloaded resources from local disk
rather than over the network. Leverage browser caching for the following
cacheable resources:
http://www.xxx.com/w/load.php?debug=false&lang=en&modules=mediawiki.legacy.…
(5 minutes)
http://www.xxx.com/w/load.php?debug=false&lang=en&modules=site&only=scripts…
(5 minutes)
http://www.xxx.com/w/load.php?debug=false&lang=en&modules=site&only=styles&…
(5 minutes)
http://www.xxx.com/w/load.php?debug=false&lang=en&modules=startup&only=scri…
(5 minutes)
This is because, for users who are not logged in, the resource modules are
requested without "&version=...", since OutputPage.php sets
$version = null;
if ( $group === 'user' ) {
    // Get the maximum timestamp
    $timestamp = 1;
    foreach ( $grpModules as $module ) {
        $timestamp = max( $timestamp, $module->getModifiedTime( $context ) );
    }
    // Add a version parameter so cache will break when things change
    $version = wfTimestamp( TS_ISO_8601_BASIC, $timestamp );
}
with the comment:
we shouldn't be putting timestamps in Squid-cached HTML
and then, when the request is finally handled, it runs:
if ( is_null( $context->getVersion() ) || $errors ) {
    $maxage = $wgResourceLoaderMaxage['unversioned']['client'];
    $smaxage = $wgResourceLoaderMaxage['unversioned']['server'];
    // If a version was specified we can use a longer expiry time since
    // changing version numbers causes cache misses
} else {
    $maxage = $wgResourceLoaderMaxage['versioned']['client'];
    $smaxage = $wgResourceLoaderMaxage['versioned']['server'];
}
which defaults to 5 minutes for unversioned modules and 30 days for
versioned modules.
My problem is that we are on a shared server with no Squid caching, so I
think that we do want versioning so that we can get the normal 30-day
maxage.
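In the meantime, one possible workaround (a sketch, not a proper fix: it raises the expiry for unversioned requests rather than adding the version parameter) would be to override the defaults quoted above in LocalSettings.php:

```php
// Default is 5 minutes for unversioned load.php requests; with no Squid
// layer in front, let browsers cache them for a day instead.
$wgResourceLoaderMaxage['unversioned']['client'] = 86400;
$wgResourceLoaderMaxage['unversioned']['server'] = 86400;
```

The trade-off is that CSS/JS changes could then take up to a day to reach returning visitors.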
Can anyone give me advice?
Maybe MediaWiki should contain an optional variable
$wgVersionedModules = true;
so that the line above can read:
if ( $group === 'user' || $wgVersionedModules === true ) {
Thanks for any feedback,
Eli
Hi,
I want to import wiki pages, exported by the dump script, into other wikis.
Maybe I missed a parameter, but is there a way to overwrite all existing
pages with the imported versions? Is there an existing solution for this?
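In case it's relevant: the usual way to load such a dump is the importDump.php maintenance script. As far as I know, it does not delete anything; the imported revisions are added to the history of existing pages. A sketch (paths and file names are placeholders):

```shell
cd /path/to/mediawiki            # placeholder install path
php maintenance/importDump.php < dump.xml
php maintenance/rebuildrecentchanges.php
```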
Greetings
Frank
I am installing MediaWiki 1.25 on FreeBSD 10-STABLE, with Apache 2.24,
PHP 5.6.11, and Postgres 9.3.9.
I can step through all the data entry screens of the mw-config
script, but after I click Continue in response to the screen:
'By pressing "Continue", you will begin the installation of
MediaWiki.'
I get no response from MediaWiki. Apache takes one of my eight
cores to 100% utilization, but there is never any HTTP response from
the MediaWiki configuration script. After about 15 or 16 minutes,
the browser times out, but the Apache process continues running at
100% CPU.
Can anyone suggest some possible sources of this problem, or issues
that can be eliminated as the source of this trouble? The web
database username and password can successfully connect to Postgres
manually and create a table, etc. The pgsql (installation) database
user and password can successfully create and drop databases, etc.
The database to be used already exists, and is owned by the web
database username (role).
But the installation script never completes, so I have not been
able to generate a LocalSettings.php file. How can I get past
this hurdle?
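One thing that might sidestep the hanging web installer: MediaWiki also ships a command-line installer that writes LocalSettings.php directly. A sketch, assuming shell access (the path, names, and passwords below are placeholders, not values from my setup):

```shell
cd /usr/local/www/mediawiki      # placeholder install path
php maintenance/install.php --dbtype postgres --dbname wikidb \
    --dbuser wikiuser --dbpass secret \
    --installdbuser pgsql --installdbpass secret \
    --pass adminpass "My Wiki" admin
```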
Thank you for your time. Please holler if I can provide more
information.
Jim