On Mediawiki.org and on Wikipedia too, this works:
[[File:Nuvola apps download manager.png|40px|link=http://www.cnn.com/]]
It uses that image and links the image to the URL. On my default MW (1.13), though, this doesn't work. Can the default MW version also have this functionality, please? Or is this something that was introduced in 1.14?
Yeah, I've tried image-link extensions, but they're hard to work with.
thanks
Eric
We just updated from 1.13.2 to 1.14.0, and I see that a feature has disappeared from Special:MovePage. It used to display a checkbox, "Update any redirects that point to the original title." Now that checkbox is gone, even when a page has redirects. Is this a regression? That checkbox is great - it reduced our double redirects significantly.
DanB
Hi,
I have MediaWiki on a Windows-based Apache server. When I set it up,
the URL was:
http://server/mediawiki
I then went into my AD DNS and added an alias to server called "wiki".
In apache, I added a virtual host for wiki which set the document root
to the mediawiki folder.
The virtual host works because when I go to http://wiki/ it rewrites the
URL to http://wiki/mediawiki/index.php/Main_Page. The problem is, of
course, that /mediawiki/ should not be part of the URL. I would need it
to be http://wiki/index.php/Main_Page.
The original URL still works but I am confused as to why setting it up
with a vhost doesn't work.
Any ideas how I can get this to work?
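For reference, this is roughly what I added to the Apache config (hostnames and paths are approximations of my setup, not the exact file):

```apache
# Roughly my added vhost (document root points at the mediawiki folder;
# the Windows path is a placeholder)
<VirtualHost *:80>
    ServerName wiki
    DocumentRoot "C:/www/mediawiki"
</VirtualHost>
```

I also wondered whether $wgScriptPath in LocalSettings.php has to be changed to match, since it still contains "/mediawiki".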
Hi List,
I'm new here; this is my first post.
Last week I was asked to move one of our wikis (see
http://uwe-kern.de/winfwiki/index.php/Spezial:Version) from
"uwe-kern.de/winfwiki/index.php/" to the new domain
"wiki.intermoves.de/index.php/". The server is still the same, only the
document root changed (namebased virtual hosting).
So I copied all the wiki files into the new directory, removed the images
directory from the old location, linked it into the new one, and changed
$wgScriptPath in the new location's LocalSettings.php to match the new
URL:
old:
$wgScriptPath = "/winfwiki";
new:
$wgScriptPath = "";
All other paths still have their default values.
So the wiki is now accessible via both URLs.
When an article that links to other articles using "[[articlename]]"
syntax is written by someone using the "uwe-kern.de/winfwiki/index.php/" URL,
and someone else using "wiki.intermoves.de/index.php/" clicks the links, they
get the Apache "Not Found" error page:
Not Found
The requested URL /winfwiki/index.php/Testseite was not found on this
server.
Apache Server at wiki.intermoves.de Port 80
This isn't always the case; sometimes it works and sometimes it doesn't.
If a page that contains broken links is then opened for reading using the
new "wiki.intermoves.de/index.php/" URL and saved again, the links
always work for "wiki.intermoves.de/index.php/" but not always for
"uwe-kern.de/winfwiki/index.php/".
Can it be that paths are also stored in the DB rather than calculated at runtime?
We deactivated all caching and accelerating engines.
This is what LocalSettings says about caching:
## Shared memory settings
$wgMainCacheType = CACHE_NONE;
$wgMemCachedServers = array();
# When you make changes to this configuration file, this will make
# sure that cached pages are cleared.
$configdate = gmdate( 'YmdHis', @filemtime( __FILE__ ) );
$wgCacheEpoch = max( $wgCacheEpoch, $configdate );
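In case the stale links come from rendered HTML kept in the DB-backed parser cache, this is the SQL I would try to clear it (a guess on my part: objectcache is the default table name in the MW schema, and the mysql credentials are placeholders):

```shell
# Sketch: SQL that should empty MediaWiki's DB-backed object/parser cache.
# objectcache is the default table name; credentials below are placeholders.
sql="DELETE FROM objectcache;"
echo "$sql"
# e.g.: echo "$sql" | mysql -u wikiuser -p winfwiki_db
```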
Any help or hint is very welcome, as we can't move our wiki until this issue
is resolved :(
Best regards,
Heiner Wulfhorst
PS:
I'm also available via ICQ (130715519) or AIM (just2blue4u)
How can I prevent pages from showing up in Special:NewPages?
2009/3/2 Daniel Barrett <danb(a)vistaprint.com>
> How about subscribing to the RSS feed produced by Special:NewPages?
>
>
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
Hello,
Thanks for the response. I did some more testing, and it seems that diff3 is not the cause (the test server experiences the same slowdown whether the variable is set to false or not). Is there an internal PHP routine that compares the two revisions when a hist/cur/undo link is clicked?
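For reference, the toggle I was flipping on the test server (per the suggestion quoted below) was just this in LocalSettings.php:

```php
// LocalSettings.php -- the setting toggled during testing
$wgDiff3 = false;
```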
Thanks
Tim
> ------------------------------
>
> Message: 6
> Date: Sun, 01 Mar 2009 12:58:52 +1100
> From: Tim Starling <tstarling(a)wikimedia.org>
> Subject: Re: [Mediawiki-l] possible revision comparison optimization
> with diff3?
> To: mediawiki-l(a)lists.wikimedia.org
> Message-ID: <gocq4u$tst$1(a)ger.gmane.org>
> Content-Type: text/plain; charset=ISO-8859-1
>
> tlg wrote:
> > Hello, I run a sort of semi-busy wiki, and I have been experiencing
> > difficulties with its CPU load lately, with load jumping to as high as 140
> > at noon (not 1.4, not 14, but ~140). Obviously this brought the site to a
> > crawl. After investigation I have found the cause: multiple diff3
> > comparisons were called at the same time.
> >
> > To explain the cause of this needs a little background explanation. The wiki
> > I run deals with the edit of large text files. It is common to see pages
> > with hundreds of kb of pure text on any given wiki page. Normally my servers
> > would be able to handle the edit requests of these pages.
> >
> > However, it seems that searchbots/crawlbots (from both search engines and
> > individual users) have been hitting my wiki pretty hard lately. Each of
> > these bots tries to copy all the pages, including the revision history of
> > each of these 100 kB wiki text pages. Since each page could have
> > potentially hundreds of edits, for every single large text file, hundreds
> > of revision-history diffs (from lighttpd/apache -> php5 -> diff3?) are
> > spawned.
>
> diff3 is invoked in two cases: on page save when there is an edit
> conflict, and when someone clicks "undo". Neither is particularly
> vital to the operation of the wiki, so the first thing you should do
> is turn them both off, using
>
> $wgDiff3 = false;
>
> in LocalSettings.php. Then see if that fixes your load problems. If it
> does, then you were right about diff3 being the problem. Next you
> should look at your logs to find out where the edits or undo requests
> are coming from.
>
> If the problem is undo requests from search engine crawlers, you could
> fix the problem by disabling anonymous edits. This will prevent the
> bots from accessing the undo link.
>
> Please tell us what you find, because it's likely that you're not the
> only one having this problem.
>
> -- Tim Starling
Hello,
Our MediaWiki setup includes a number of languages, with a common DB called MC_WIKICAFE and per-language
DBs called MC_WIKICAFE_EN, MC_WIKICAFE_HE, and so on. So far, it works flawlessly. The only problem we have
is running the runJobs.php script. I was unable to find a way to set the language runJobs is supposed
to run with, or to make runJobs run for all the languages in the system.
I know we're not the only ones with such a setup, so how do people usually do that?
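One approach I considered is an untested sketch like the following: it assumes one LocalSettings file per language (all paths and names below are placeholders), and relies on --conf, the standard maintenance-script option for pointing at a specific settings file.

```shell
# Sketch: build one runJobs.php invocation per language wiki.
# Paths and language codes are placeholders for the real setup.
cmds=""
for lang in en he; do
  cmds="$cmds
php maintenance/runJobs.php --conf /srv/wiki/LocalSettings_${lang}.php"
done
echo "$cmds"   # review the commands, then run each one (or pipe to sh)
```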
Thanks,
Dave
Hey guys.
For my thesis project I am doing a ground-up redesign of MediaWiki's
interaction design (blueprints, not construction).
I'll be looking at how people use it, from the user / contributor /
administrator roles, and at how we can help make it better and easier.
From the many wikis I have run, it seems most people are afraid to
edit or contribute because it is so complicated (or seems so).
Would anyone be interested in talking to me about what they do and don't
like, and what they have a hard time with as an administrator?
Also, what do your users have issues with?
Thanks,
Adam Meyer
ameyer(a)g.risd.edu
Industrial + Interaction Designer
http://www.adam-meyer.com
Hello,
I would like to receive an email when a user creates a new page.
I added these lines at the end of my LocalSettings.php:
$wgEnotifWatchlist = true;
# Declare this twice so it works for all versions
$wgUsersNotifiedOnAllChanges =
$wgUsersNotifedOnAllChanges = array( 'jonathan' );
But this activates a notification for every change in the wiki, not just
the creation of a new page.
So, do you know how I can enable notification only for the creation
of a new page?
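I also wondered whether a small hook in LocalSettings.php could do it; an untested sketch (the ArticleInsertComplete hook name and signature are my reading of the 1.14-era documentation, and the address is a placeholder):

```php
// LocalSettings.php -- untested sketch; the address is a placeholder
$wgHooks['ArticleInsertComplete'][] = 'notifyOnNewPage';
function notifyOnNewPage( $article, $user ) {
    $title = $article->getTitle()->getPrefixedText();
    mail( 'jonathan@example.com',
          "New wiki page: $title",
          'Created by ' . $user->getName() );
    return true; // allow other hooks to run
}
```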
Thanks a lot,
Sincerely,
Jonathan