Hello,
I installed the latest version of MediaWiki to a subdomain called wiki on
my HostGator Virtual Private Server.
I edited \wiki\config.php to allow uploads.
I can upload files, but they do not display on any of the wiki pages.
Here is an example
http://wiki.marylandshallissue.org/index.php?title=Maryland_Handgun_Qualifi…
Any help appreciated.
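For anyone hitting the same symptom: files that upload successfully but do not display on pages are often a path mismatch rather than the upload switch itself. A minimal sketch of the relevant settings, normally placed in LocalSettings.php (the paths shown are MediaWiki defaults, not taken from this message, and your subdomain layout may differ):

```php
<?php
// Sketch of the usual upload-related settings (defaults shown).
$wgEnableUploads = true;

// Where uploaded files live on disk, and the URL path used to serve them.
// If these don't match the subdomain's document root, uploads succeed
// but the image links on wiki pages return 404s.
$wgUploadDirectory = "$IP/images";
$wgUploadPath      = "$wgScriptPath/images";
```

Checking the broken image URL in the browser (does it 404?) usually tells you whether the problem is the URL path or file permissions on the images directory.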
Hi,
Thanks for your answer!
The current revision ID for the Main Page listed in the page table is
555 (field page_latest). This is also the highest value of the field
rev_id for that page in the revision table.
Any other possible sources for the error?
Best,
Sebastian
On 08.08.15 at 14:00, mediawiki-l-request(a)lists.wikimedia.org wrote:
> Message: 2
> Date: Fri, 7 Aug 2015 09:05:41 -0400
> From: John <phoenixoverride(a)gmail.com>
> To: MediaWiki announcements and site admin list
>   <mediawiki-l(a)lists.wikimedia.org>
> Subject: Re: [MediaWiki-l] Cannot edit Main Page
> Message-ID:
> <CAP-JHp=ebKbeMa=5MzoppgtqsGb6J2r4WKc+nRVsHBF0-F5abw(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> Does the revision table have a row for the current revision ID listed
> in the page table?
>
> On Fri, Aug 7, 2015 at 8:51 AM, Sebastian Sulger
> <sebastian.sulger(a)uni-konstanz.de> wrote:
>> > [Original message quoted in full; trimmed here, since the same
>> > message appears in full later in this digest.]
--
Sebastian Sulger
FB Sprachwissenschaft
Universität Konstanz
http://ling.uni-konstanz.de/pages/home/sulger
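One way to test the validity of that page-table entry directly is to check whether page_latest points at a revision row that still has text behind it. A sketch of such a query for the MediaWiki 1.14 schema (`prefix_` is a placeholder for the wiki's actual table prefix, and the title/namespace are assumptions for a default main page):

```sql
-- Does page_latest (555 here) resolve to a revision row, and does
-- that revision's rev_text_id resolve to a row in the text table?
-- NULLs in the right-hand columns pinpoint where the chain breaks.
SELECT p.page_id, p.page_latest, r.rev_id, r.rev_text_id, t.old_id
FROM prefix_page p
LEFT JOIN prefix_revision r ON r.rev_id = p.page_latest
LEFT JOIN prefix_text t ON t.old_id = r.rev_text_id
WHERE p.page_namespace = 0 AND p.page_title = 'Main_Page';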
Here is one that just bit me bad for several days of head-scratching while installing MariaDB 10.1.6. It kept complaining about a permissions problem trying to access the data directory, which I checked multiple times, shotgunning changes until I had all the files "777", and it STILL wouldn't work!
Keep in mind that you need at least "r-x" permission *all the way up to root* in order to access a directory.
In my case, the data directory was in "/usr/local/var/mysql" and some smart-ass package management system had assumed ownership/groupship of "/usr/local/var" and set the perms to "rwxr-x---". I changed that to "rwxr-xr-x" and it worked.
Be certain whatever user is running your webserver (certainly NOT "root") has at least "r-x" permission all the way up to "/".
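The "all the way up to root" check can be scripted so no ancestor gets missed. A small sketch (the data-directory path is the one from the message; substitute your own):

```shell
# Print the permissions of a path and of every ancestor directory,
# so a missing "r-x" anywhere up the chain is easy to spot.
check_path_perms() {
  d=$1
  while [ "$d" != "/" ]; do
    ls -ld "$d"
    d=$(dirname "$d")
  done
  ls -ld /
}

# Path from the message; adjust to your data directory.
check_path_perms /usr/local/var/mysql 2>/dev/null || true
```

util-linux's `namei -l /usr/local/var/mysql` prints much the same thing in one command, where available.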
:::: The idea that the genetics of novel lifeforms will stay put in GMOs released on farms... is laughable... We are unable to stop the accumulation of risk of catastrophic change resulting from nuclear radiation, the greenhouse effect, or genetic engineering. -- David Holmgren
:::: Jan Steinman, EcoReality Co-op ::::
Hi,
Hoping that someone can help.
I have recently restored two wikis from a backup after a hard disk
failure. The wikis make use of a single MySQL database, each of them
using prefixed tables. One of the wikis is running just fine after the
restore; the other one is not.
The issue is that all of the content on the malfunctioning wiki's main
page is gone; the page just displays the sample text "There is currently
no text in this page. You can search ..." Even worse, I cannot edit the
wiki's main page: clicking the edit button when registered brings up the
message "No such section --- You tried to edit a section that does not
exist. Since there is no section , there is no place to save your edit."
What's also confusing is that all of the other pages seem to be fine; I
can search for pages, and clicking on "Random page" does what it's
supposed to do. I can also edit all other pages except for the main page.
Has anyone come across anything remotely similar?
I am pretty sure it has to be an error within the database. If I export
an XML dump of all the pages, the main page is not among the pages in
the XML. However, the page table contains an entry for the main page.
How can I test the validity of that entry?
Best,
Sebastian
Version information:
MediaWiki: 1.14.0
PHP: 5.2.11
MySQL: 5.1.37
--
Sebastian Sulger
FB Sprachwissenschaft
Universität Konstanz
http://ling.uni-konstanz.de/pages/home/sulger
Hi,
For those who are interested in the Cargo extension, I just released a
two-page "quick reference"/"cheat sheet" that covers the functionality of
Cargo, as well as that of the extensions Semantic Forms, External Data and
ParserFunctions, and a little bit of core MediaWiki. It's available in both
PDF and PNG versions. You can find links to it here:
https://www.mediawiki.org/wiki/Extension:Cargo/Other_documentation
-Yaron
Hey guys,
I've set up MediaWiki behind Varnish 4, and I've figured out a way to get
updates to the wiki to pass through to the web server. But of course any
edits you make to the wiki site don't get updated unless you clear the
Varnish cache.
I tried putting this into the Varnish config under vcl_recv in an attempt
to allow the site to be updated when you edit the wiki:
# Allows you to edit the wiki
if (req.url ~ "&action=submit($|/)") {
return (pass);
ban(req.url);
}
And in the LocalSettings.php file on the mediawiki setup, I have the
following:
$wgUseSquid = true;
$wgSquidServers = array( 'xx.xx.xx.xx', 'xx.xx.xx.xx' );
$wgUsePrivateIPs = true;
Any ideas on how I can get MediaWiki to update its content automatically
from behind Varnish?
Thanks,
Tim
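For reference, with $wgUseSquid enabled MediaWiki sends HTTP PURGE requests to the servers in $wgSquidServers whenever a page changes; Varnish just needs to accept them. A commonly used vcl_recv sketch for Varnish 4 (untested against this setup; the ACL addresses are placeholders for your MediaWiki web servers):

```vcl
acl purge {
    "xx.xx.xx.xx";   # placeholder: your MediaWiki web server(s)
}

sub vcl_recv {
    # Let MediaWiki's own PURGE requests invalidate cached pages.
    if (req.method == "PURGE") {
        if (!client.ip ~ purge) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
    # Never cache edit/submit/history requests.
    if (req.url ~ "action=(edit|submit|history)") {
        return (pass);
    }
}
```

Note that `ban()` placed after `return (pass)` (as in the snippet above) never executes, since `return` ends the subroutine; the PURGE handling replaces the need for it.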
List members:
I have a strange problem to report tonight. I will try to give as much
information as possible.
As part of an upgrade of a MediaWiki 1.14 installation to 1.25, I
installed the Math extension on my server. This is a fully-dedicated
server to which I have root and ssh access. I even made sure to place
instances of every conceivably required executable in the /usr/bin
directory, where any user, including system users, could access it. That
includes texvc, texvccheck, and the rest of the texvc tool chain, in
addition to the classic LaTeX executables.
I just went through the full routine to troubleshoot math rendition errors:
https://www.mediawiki.org/wiki/Manual:Troubleshooting_math_display_errors#.…
ls -lH `which gs` `which latex` `which dvips` `which convert`
Check.
ls -lH `which dvipng`
Check.
texvc /home/wiki/tmp /home/wiki/math "y=x+2"
And here is where I start having some problems.
When I create a directory called "images" in a /private/ directory
(~/private), and in that directory I create the directories "math" and
"tmp," and then run
texvc images/tmp images/math "y=x+2"
I get an output that shows the steps it took to render that math. The
output ends with one of the generated tags. And when I go to the
directory ~/private/images/math, there lies a new file. I download that
onto my own machine, and it shows the display I expected. In fact, here
it is, attached.
Now: I go to public_html, or, say, public_html_ru (or any of several
second-language directories for second-language versions of my wiki),
and execute
texvc images/tmp images/math "y=x+2"
I get almost the same output as before. Almost. To the last tag, I see
appended a minus sign.
Then I look in the math subdirectory. NO FILE.
So I tried building a new math subdirectory in a dummy images directory. The
texvc program executed flawlessly. Then I copied that recursively to the
Russian CreationWiki images directory, and ran the texvc command again.
NO FILE. And a minus sign at the end.
Now I see the modern MediaWiki puts a directory called "lockdir" in the
images directory (but not in the math directory).
Why should that make any difference?!?
Let me anticipate the solution some of you are going to recommend: why
don't I just use "Mathoid" for all maths? Well, a funny thing happened
over the weekend when I tried that. The Mathoid service had a crash.
It's down. And you know what? It's still down. For everybody. I checked
that out with the site "Is It Down for Everyone or Just Me." Result:
"It's not just you! This site looks down from here."
http://mathoid.testme.wmflabs.org/
Besides: my associates and I pay good money for a fully dedicated server
with root and secure-shell service. We expect to be able to install
LaTeX on it and to have it work consistently whenever I specify
temporary and permanent math-image-storage directories that do exist. We
can't understand why the executable should work sometimes, but not other
times. Nor why it should quit working when all I do is copy in another
directory!
I've asked my associate to submit a trouble ticket with our Web hosts. I
start to wonder whether I am seeing some kind of security issue that is
stopping LaTeX and the texvc program from executing on any directory
that "looks funny."
But I thought I'd check with you guys as well, in case I'm missing
something.
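A quick way to see whether the working and failing locations differ in anything besides permissions/ownership is to dump the mode bits of both trees side by side. A sketch (the directory names are the ones from the message; `stat -c` assumes GNU coreutils):

```shell
# Print owner:group, octal mode, and name for a directory and its
# immediate children; run it on both images trees and diff the output.
compare_tree_perms() {
  for d in "$@"; do
    find "$d" -maxdepth 1 -exec stat -c '%U:%G %a %n' {} \;
  done
}

# Directory names from the message; adjust to your layout.
compare_tree_perms ~/private/images ~/public_html/images 2>/dev/null || true
```

texvc (and the latex/dvipng programs it shells out to) needs write access to both the tmp and math directories as the user it runs as, so any row that differs between the working and failing trees is a suspect.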
Other data: the setenforce 0 and setenforce 1 commands both return
"selinux has been disabled." Make of that what you will, but to me it
looks as though our hosts didn't want to encumber us with that. (I don't
know exactly which distro of Linux we're running--maybe CentOS. It's some
kind of Red Hat-derived system, because it uses the "rpm" package
system and "yum" to install, both of which I know like the back of my
hand. I used those to install most LaTeX executables, plus "ocaml" so I
could build "texvc" and its associated applications.)
User "Temlakos"
Hello all
I have a fresh 1.25 installation and I just can't upload any files to it.
LocalSettings.php, php.ini etc must all be good, because the files are
actually uploaded and go through the validation procedure. But then they
are rejected for lack of metadata in the database. This is what the
$wgDebugLogFile log says:
FSFile::getProps: /tmp/phphGwtp2 loaded, 1092779 bytes, application/pdf.
mime: <application/pdf> extension: <pdf>
UploadBase::detectScript: checking for embedded scripts and HTML stuff
UploadBase::detectScript: no scripts found
ZipDirectoryReader: Fatal error: zip file lacks EOCDR signature. It probably isn't a zip file.
UploadBase::detectVirus: virus scanner disabled
FSFile::getProps: Getting file info for /tmp/phphGwtp2
MimeMagic::doGuessMimeType: analyzing head and tail of /tmp/phphGwtp2 for magic numbers.
MimeMagic::doGuessMimeType: magic header in /tmp/phphGwtp2 recognized as application/pdf
MimeMagic::guessMimeType: guessed mime type of /tmp/phphGwtp2: application/pdf
MimeMagic::improveTypeFromExtension: improved mime type for .pdf: application/pdf
wfShellExec: /bin/bash '/path/to/wiki/includes/limit.sh' ''\''pdfinfo'\'' -enc UTF-8 -l 9999999 -meta '\''/tmp/phphGwtp2'\''' 'MW_INCLUDE_STDERR=;MW_CPU_LIMIT=180; MW_CGROUP='\'''\''; MW_MEM_LIMIT=1048576; MW_FILE_SIZE_LIMIT=102400; MW_WALL_CLOCK_LIMIT=180; MW_USE_LOG_PIPE=yes'
PdfImage::retrieveMetaData: 'pdftotext' '/tmp/phphGwtp2' -
wfShellExec: /bin/bash '/path/to/wiki/includes/limit.sh' ''\''pdftotext'\'' '\''/tmp/phphGwtp2'\'' - ' 'MW_INCLUDE_STDERR=;MW_CPU_LIMIT=180; MW_CGROUP='\'''\''; MW_MEM_LIMIT=1048576; MW_FILE_SIZE_LIMIT=102400; MW_WALL_CLOCK_LIMIT=180; MW_USE_LOG_PIPE=yes'
wfShellExec: /bin/bash '/path/to/wiki/includes/limit.sh' ''\''pdfinfo'\'' -enc UTF-8 -l 9999999 -meta '\''/tmp/phphGwtp2'\''' 'MW_INCLUDE_STDERR=;MW_CPU_LIMIT=180; MW_CGROUP='\'''\''; MW_MEM_LIMIT=1048576; MW_FILE_SIZE_LIMIT=102400; MW_WALL_CLOCK_LIMIT=180; MW_USE_LOG_PIPE=yes'
PdfImage::retrieveMetaData: 'pdftotext' '/tmp/phphGwtp2' -
wfShellExec: /bin/bash '/path/to/wiki/includes/limit.sh' ''\''pdftotext'\'' '\''/tmp/phphGwtp2'\'' - ' 'MW_INCLUDE_STDERR=;MW_CPU_LIMIT=180; MW_CGROUP='\'''\''; MW_MEM_LIMIT=1048576; MW_FILE_SIZE_LIMIT=102400; MW_WALL_CLOCK_LIMIT=180; MW_USE_LOG_PIPE=yes'
FSFile::getProps: /tmp/phphGwtp2 loaded, 1092779 bytes, application/pdf.
UploadBase::verifyExtension: mime type application/pdf matches extension pdf, passing file
Pulling file metadata from cache key mydbname:file:a6811db2a20b11dd8f7546a86ef2d92f
[exception] [c096eb4d] /wiki/index.php/%CE%95%CE%B9%CE%B4%CE%B9%CE%BA%CF%8C:%CE%91%CE%BD%CE%AD%CE%B2%CE%B1%CF%83%CE%BC%CE%B1 MWException from line 419 of /path/to/wiki/includes/filerepo/file/LocalFile.php: Could not find data for image 'Fek-1996-a-0007-hocr.pdf'.
The exception occurs in function loadExtraFromDB():
/**
* Load lazy file metadata from the DB.
* This covers fields that are sometimes not cached.
*/
Does anyone know why it is trying to pull metadata from the database for
a file that is just being uploaded? What metadata would that be?
Line 6 from the bottom of the log, "PdfImage::retrieveMetaData", says
that PdfImage uses the term "metadata" for the text layer of the pdf,
but that seems unrelated to the doings of LocalFile.php.
I'm using Special:Upload for the upload (the URL-encoded string you see
in the last log line says "Ειδικό:Ανέβασμα", which is the Greek built-in
Special:Upload) and I have no upload-related extensions installed.
Any ideas?
Z
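For what it's worth, the "data for image" in that exception is the row MediaWiki keeps in the `image` table, which is written when the upload completes. A sketch of a direct check (table prefix omitted; add yours if you use one):

```sql
-- Is there a metadata row for the file that just failed?
-- An empty result here is consistent with the exception above.
SELECT img_name, img_size, img_media_type
FROM image
WHERE img_name = 'Fek-1996-a-0007-hocr.pdf';
```

The log line "Pulling file metadata from cache key mydbname:file:..." also suggests checking whether a stale object cache (memcached/APC) is answering for a row that was never written.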