The first words the user* sees on every page are "Take me back".
Back to Mom's house? Back to the future? Back to the previous page?
Why can't you just use "Older interface", "Traditional interface", etc.?
*Wikipedia, Commons, etc. users who have chosen the Vector interface,
especially those who then haven't logged in again for a month
and no longer remember what this is all about.
Hey,
Currently the Maps extension [0] allows you to specify marker-specific data
[1] (a title, further text, and the icon to use). The way this is done is
not very clean and needs improvement. Someone poked me about this at
Wikimania, but I don't know how to improve upon the current syntax, hence
this discussion.
What's bad about the current approach:
* A third level of parameters is needed, which is rather insane. (The first
level is the regular parser function parameters, separated by pipes.
The second level is coordinates or addresses, separated by
semicolons. The third level, for the marker-specific data, uses tildes as
delimiters.)
* Not very readable.
* Unsuited for long text.
* Excludes all the delimiters from usage in the text.
* Problems with links in the wikitext.
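For reference, a sketch of what the current three-level syntax looks like (the coordinates, texts, and icon names are made up for illustration; see [1] for the actual format):

```
{{#display_points:
  52.5163, 13.3779~Brandenburg Gate~Some longer description~gate.png;
  48.8584, 2.2945~Eiffel Tower~More text here~tower.png
  |width=400
}}
```

Pipes separate the parser function parameters, semicolons separate the points, and tildes separate each point's title, text, and icon.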
Does anyone have an idea for a better approach? You can either reply here,
or write out your ideas on the discussion page [2].
[0] http://www.mediawiki.org/wiki/Extension:Maps
[1] http://mapping.referata.com/wiki/Help:Marker_data
[2] http://mapping.referata.com/wiki/Help_talk:Marker_data
Cheers
--
Jeroen De Dauw
* http://blog.bn2vs.com
* http://wiki.bn2vs.com
Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69
66 65!
--
Hello to all!
I am currently working on interwiki transclusion [1].
In the proposed approach, we currently retrieve remote templates:
1) using wfGetLB( 'wikiid' )->getConnection() if the remote wiki has a
wiki ID in the interwiki table
2) using the API otherwise
In case 1, it seems that retrieving a template from a remote DB is
just as expensive as retrieving it from the local DB, so we don't
store the template's wikitext locally.
In case 2, the retrieved wikitext is cached in the transcache table
for an arbitrary time.
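A rough sketch of the two paths described above (only wfGetLB() and getConnection() come from the actual proposal; the surrounding function and the query details are illustrative):

```php
// Sketch only: fetch the wikitext of a template from another wiki.
function fetchRemoteTemplate( $wikiid, $templateTitle ) {
	if ( $wikiid !== false ) {
		// Case 1: the interwiki table gives us a wiki ID, so we can
		// read the remote wiki's database through the load balancer.
		$dbr = wfGetLB( $wikiid )->getConnection( DB_SLAVE, array(), $wikiid );
		// ... SELECT the current revision text of $templateTitle ...
	} else {
		// Case 2: no wiki ID, so fall back to the remote api.php and
		// cache the retrieved wikitext in the transcache table.
		// ... HTTP request, then INSERT into transcache ...
	}
}
```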
I have two questions about this system:
* Is it better to use the transcache table or memcached for the
API-retrieved templates?
* Should we cache the DB-retrieved templates with memcached?
An advantage of memcached here is that it is shared by all the WMF
wikis, whereas the transcache table belongs to each wiki individually.
Thanks in advance
Best regards
--
Peter Potrowl
http://www.mediawiki.org/wiki/User:Peter17
[1] http://www.mediawiki.org/wiki/User:Peter17/Reasonably_efficient_interwiki_t…
Hello, I hope I'm not being inappropriate posting to these lists, but
I'm looking to hire a mediawiki expert on a consulting/freelance basis
to help me with a project.
I am developing a website that will serve as a directory for real
estate agents (essentially one page per agent). I'm looking for
somebody to do the following for me:
*Set up the MediaWiki software on a server
*Help design the layout of appropriate templates and other add-ons
*Write scripts for pywikipediabot, and use it to translate data into the wiki
*Assist in writing scripts to data-mine the web for real estate agent information
Please email me to discuss the project further.
Thanks,
--Misha Zaitzeff
Hey,
Having a tag implementation next to the current parser function one would be
acceptable. Replacing the parser function with a tag extension is not. So that
would not solve the current problem.
Cheers
--
Jeroen De Dauw
* http://blog.bn2vs.com
* http://wiki.bn2vs.com
Don't panic. Don't be evil. 50 72 6F 67 72 61 6D 6D 69 6E 67 20 34 20 6C 69
66 65!
--
On 19 July 2010 14:34, Platonides <Platonides(a)gmail.com> wrote:
> Why not use a tag instead of a parserfunction?
> You could have one point per line, with parameters (which could be
> named) separated by pipes.
Hi,
Here are the notes from a discussion a few of us had at Wikimania.
Attendees, please add anything I may have missed. Once we make some
progress on some of the short-term goals, we can have another status
update via phone or WebEx.
http://www.mediawiki.org/wiki/Test_framework_07_11_2010
-p
--
Priyanka Dhanda
Code Maintenance Engineer
Wikimedia Foundation
http://wikimediafoundation.org
Hello,
We would like some help in figuring out the quickest and easiest way to disable link checking in the mediawiki software. By this I mean that when rendering a page, there is a check to see if the original image exists. If it's found, a link is constructed to the appropriate thumbnail size. If not, an upload link is constructed instead. It would be a great help to us if someone familiar with the code could give us some info on the best way to force it to construct the image link even if it's not there when it's rendered. I hope this is clear enough. If not, I'll try to clarify.
Brent
Widernet.org
On Sun, Jul 11, 2010 at 5:42 AM, Siebrand Mazeland <s.mazeland(a)xs4all.nl> wrote:
> Hi,
> Just to inform you about the NOW running live streams from Wikimania about MediaWiki.
> See http://toolserver.org/~reedy/wikimania2010/jazzhall.html
> Runs until 13.00 CEST TODAY/NOW!
Shame. This requires some plugin stuff.
I run a custom Vector installation for a corporate-ish setup. Since
the wiki is readonly unless people log in I've hidden some of the
scarier wiki stuff. Here's my diff of Vector.php:
Index: skins/Vector.php
===================================================================
--- skins/Vector.php	(revision 68624)
+++ skins/Vector.php	(working copy)
@@ -435,13 +435,13 @@
 			'tagline',
 		),
 		'places' => array(
-			'privacy',
-			'about',
-			'disclaimer',
+#			'privacy',
+#			'about',
+#			'disclaimer',
 		),
 		'icons' => array(
-			'poweredbyico',
-			'copyrightico',
+#			'poweredbyico',
+#			'copyrightico',
 		),
 	);
 	$footerlinksClasses = array(
@@ -582,7 +582,7 @@
 	private function renderPortals( $portals ) {
 		// Force the rendering of the following portals
 		if ( !isset( $portals['SEARCH'] ) ) $portals['SEARCH'] = true;
-		if ( !isset( $portals['TOOLBOX'] ) ) $portals['TOOLBOX'] = true;
+#		if ( !isset( $portals['TOOLBOX'] ) ) $portals['TOOLBOX'] = true;
 		if ( !isset( $portals['LANGUAGES'] ) ) $portals['LANGUAGES'] = true;
 		// Render portals
 		foreach ( $portals as $name => $content ) {
I could just add hooks to munge those things. E.g.:
wfRunHooks( 'SkinVectorExecuteFooterlinks', array( &$this, &$footerlinks ) );
And:
wfRunHooks( 'SkinVectorRenderPortalsPortals', array( &$this, &$portals ) );
But I thought I'd ask if someone (particularly Trevor) has suggestions
on how to do it better. I can't see a quick and sane way to do it for
the general case, since Vector uses a different $footerlinks structure
than MonoBook.
That's fine for my purposes, but might not be such a good idea for
MediaWiki. I think per-skin hooks aren't an inherently bad idea
though.
It would be used similarly to how you can use SkinBuildSidebar now;
here's something from my LocalSettings.php:
# Hide scary stuff in the sidebar from users that aren't logged in.
# Derived from
# http://www.mediawiki.org/wiki/Manual:Interface/Sidebar#Change_sidebar_conte…
$wgHooks['SkinBuildSidebar'][] = 'efHideSidebar';
function efHideSidebar( $skin, &$bar ) {
	global $wgUser;
	if ( !$wgUser->isAllowed( 'edit' ) ) {
		unset( $bar['vf-navigation-users'] );
		unset( $bar['TOOLBOX'] );
	}
	if ( !$wgUser->isAllowed( 'block' ) ) {
		unset( $bar['vf-navigation-admins'] );
	}
	return true;
}
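If a per-skin footer hook like the one suggested above existed, a handler in LocalSettings.php might look like this (the hook name and the shape of $footerlinks are the ones proposed in this mail, not an existing MediaWiki interface):

```php
# Hypothetical handler for the proposed SkinVectorExecuteFooterlinks hook.
$wgHooks['SkinVectorExecuteFooterlinks'][] = 'efHideFooterLinks';
function efHideFooterLinks( $skin, &$footerlinks ) {
	global $wgUser;
	if ( !$wgUser->isAllowed( 'edit' ) ) {
		// Drop the 'places' group (privacy/about/disclaimer) for readers.
		$footerlinks['places'] = array();
	}
	return true;
}
```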