Is it possible to create a page whose title includes a question mark? E.g.
to create the page "What is life?"
I was trying with "¿Y qué es la felicidad?" ("And what is happiness?"):
the first character is no problem, but the final "?" doesn't work.
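My guess so far is that a raw "?" starts the URL query string when a
title is typed straight into the address bar, so the final character
needs to be percent-encoded as %3F there (links made from wikitext
should encode it automatically). A quick PHP check of the encoded form,
in case that is the problem:

<?php
// Show the percent-encoded URL for a title ending in "?" (the host and
// path here are placeholders for my local install).
$title = '¿Y qué es la felicidad?';
echo 'http://localhost/wiki/index.php/' . rawurlencode( $title ), "\n";
// prints .../index.php/%C2%BFY%20qu%C3%A9%20es%20la%20felicidad%3F
?>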
Thanks,
Pablo
--
Francois,
I have set allowuploads=true in my LocalSettings.php file, set
allowedMaxSize to 20 MB, and made the upload directory writable ...
but I am still getting the same error. I fail to understand what the
problem could be.
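For reference, here is what I believe the stock MediaWiki setting names
should look like in LocalSettings.php (guessed, since I have not checked
them against my DefaultSettings.php; PHP's own limits in php.ini also
apply):

<?php
// Guessed sketch of the stock setting names; check DefaultSettings.php
// for the exact names in your MediaWiki version.
$wgEnableUploads   = true;              // master switch for uploads
$wgMaxUploadSize   = 20 * 1024 * 1024;  // 20 MB, in bytes
$wgUploadDirectory = "$IP/images";      // must be writable by the web server
// PHP caps uploads too; php.ini needs at least:
//   upload_max_filesize = 20M
//   post_max_size       = 20M
?>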
Hey, thanks for the instructional reply. That's what I've liked about this
list since joining: there's always something new to learn by just keeping up
with the traffic.
Yeah, given the hairiness of that part of the rendering code, I agree
that fixing it would be insignificant beside the potential for breakage.
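Just to check that I have understood, here is a toy sketch of the
pass-by-pass behaviour you describe below (my own illustration, nothing
like the real Parser.php, and the regexes are deliberately naive):

<?php
// Each markup type gets its own pass over the WHOLE text, so no pass
// knows what other markup it is "inside".
$text = 'See [http://example.com the [[Main Page]] docs] here.';

// Pass 1: internal links, everywhere at once.
$text = preg_replace( '/\[\[([^]|]+)\]\]/',
    '<a href="/wiki/$1">$1</a>', $text );

// Pass 2: external links. The [[...]] is already HTML by now, so this
// pass happily nests one link inside another.
$text = preg_replace( '/\[(https?:\/\/\S+) ([^]]*)\]/',
    '<a href="$1">$2</a>', $text );

echo $text, "\n";
// The output nests <a> inside <a>: exactly "not doing the right thing".
?>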
Kynnin S
On Fri, 8 Apr 2005 12:56:45 +0100 rowan.collins(a)gmail.com wrote:
> On Apr 7, 2005 5:16 AM, Kynnin Scott <kynnins(a)sfu.ca> wrote:
> > I noticed this behaviour in someone else's edit. It appears that they
> > had habitually wikified a term despite the fact that they were
> > including it as the text of an external link. I realise this shouldn't
> > be supported, but the behaviour that it exhibits should probably be
> > considered as not "doing the right thing".
>
> Hm, yes, that is odd behaviour, and definitely a bug, but it's
> tempting to just say we won't try to fix it. My reasoning is that
> there's no meaningful use for that combination of markup, unless
> somebody really, really wanted the text "[[...]]" in their link
> caption, which seems like something they could live without.
>
> > This doesn't really break anything or pose any security issues (that
> > I can think of), but the code handling wikitext rendering should
> > probably not try to wikify double-square-bracket-delimited text in
> > external links anyway.
>
> The problem is that the code that does that is tortuous enough as it
> is, in order to deal with exactly that kind of context-specific
> behaviour. Every now and then, someone shifts it around, and new
> things break; if they're major, that leads to yet more shifting
> around; repeat...
>
> For those that haven't looked deep into the code, it's very tempting
> to think of the code as "knowing where it is", and doing things
> differently inside or outside certain markup; but actually it performs
> one step for every instance of, say, internal links, in the whole
> text, and then goes back and does something else. So things like this
> have to be dealt with by doing things in the right order, stripping
> bits of text out and putting them back later, and all sorts of odd
> tricks like that. My suspicion is that fixing this without breaking
> anything else would be a pretty major challenge.
>
> --
> Rowan Collins BSc
> [IMSoP]
I have just installed the latest stable version 1.4.0 on an external Windows
server. Things went well during the install but now I am running into a
strange problem:
When I reload any wiki page in my browser (either Safari or Firefox), it
comes back blank. This also happens on Firefox on the PC. The source it
returns is: <html><body></body></html>
So I clear the browser's cache ... reload the page ... and everything is
normal.
Any suggestions as to the cause and/or a solution to this problem?
I am a wiki newbie, so any help would be appreciated.
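One guess I still need to test: if the blank page is a stale client-side
cache entry, there is apparently a LocalSettings.php switch to turn
client caching off while debugging (setting name taken on faith from the
docs; I have not verified it exists in 1.4):

<?php
// Guessed sketch: stop browsers from caching rendered pages while
// chasing the blank-page problem.
$wgCachePages = false;
?>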
- Bryan
"Every time I see an adult on a bicycle, I no longer despair for the human
race." -H.G. Wells
I thought Magnus would also post a message on this list, as he did
yesterday on gmane.science.linguistics.wikipedia.technical ... but he
didn't, so I quote his mail:
************************
Magnus Manske wrote:
Here it is! The millionth pseudo-parser I wrote for wiki(p|m)edia! :-)
Written as a single class, it takes a MediaWiki-style markuped (is that
a word?) source and generates the XML flavor Timwi and I have been using
in all our unfinished projects! ;-)
Try it out at
http://www.magnusmanske.de/wikipedia/wiki2xml.php
Just paste a wiki source text in, and get the XML. As you will notice,
it wasn't written for speed.
It is not a "real" parser, but the structure is similar to what a parser
generator would produce, except that it takes a few shortcuts here and
there. This could be the heart of a *real* export function. Just write
an XML-to-PDF generator (and replace the templates, and get rid of the
categories and language links) and you're done! :-)
*******************************
That was just to let you know of something I really like :D
François
> Quoting Jan Steinman, from the post of Wed, 30 Mar:
> well, hardlinks, surprisingly enough, are possible in Woodnose, but M$
> will not admit it openly, rather burying this fact deep in some MSDN
> article they hope nobody reads :-)
I promise I won't read them (-: I have no idea what Woodnose is
(although I understand it's an MS thingie)...
Ira replies:
> sounds like he was looking for a different thing, if I understood:
> different CSS for different pages in the SAME wiki...
Exactly, that would be great, and a bit foolproof. (Am I in the right
room for that kind of request?)
An example of what I mean:
Every page (database page) with the name
wiki/index.php/tp-nameofthepage would use tp.css; likewise
wiki/index.php/job-nameofthepage would use job.css, and so on.
Or: every page linked below wiki/index.php/thispage inherits the CSS
that I assigned to thispage.
But any other solution is welcome.
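To make the idea concrete, a sketch of what I imagine in
LocalSettings.php (the BeforePageDisplay hook and OutputPage::addStyle()
are taken from newer MediaWiki documentation, so treat the names as
assumptions rather than a recipe for 1.4):

<?php
// Guessed sketch: pick a stylesheet from the page-name prefix, so that
// "tp-Whatever" uses tp.css and "job-Whatever" uses job.css.
function wfPrefixStylesheet( $out ) {
    $name = $out->getTitle()->getText();
    if ( preg_match( '/^([a-z]+)-/', $name, $m ) ) {
        $out->addStyle( "/styles/{$m[1]}.css" ); // e.g. /styles/tp.css
    }
    return true;
}
$wgHooks['BeforePageDisplay'][] = 'wfPrefixStylesheet';
?>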
I was surprised that there is so little information about wiki farms at
meta or anywhere else about MediaWiki, even though it could be so
interesting.
--
Comments on my recumbent-bike route:
http://huijgen.ligfiets.net/mediawiki/index.php/Ligstromen
Exporting a given wiki article to PDF is nearly trivial using a
text-mode fetch (links or wget), tidy, html2ps and ps2pdf:
<?php
$page = $_GET['page'];
// Quote the page name so shell metacharacters can't break (or hijack)
// the pipeline below.
$safe = escapeshellarg( $page );
$content = `export HOME=/home/apache; links -source \
  http://localhost/wikids/index.php/$safe | clip.pl | \
  /usr/local/bin/tidy | /usr/local/bin/html2ps | /usr/bin/ps2pdf - -`;
header( 'Content-type: application/pdf' );
header( "Content-Disposition: attachment; filename=$page.pdf" );
print $content;
?>
clip.pl is a simple perl filter that strips out the nav bar and the like
and leaves only the article div; any perl neophyte should be able to
hack it together (a rough PHP stand-in follows below). I added a
"Get PDF of this page" link to the standard template on the intranet
wiki I administer; it basically just calls
[http://evo.mydomain.com/pdfgen/index.php?page={{NAMESPACE}}:{{PAGENAMEE}} Get PDF of this page]
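For anyone who does not want to write the perl, here is roughly what the
clipping step does, as a PHP stand-in (the "bodyContent" and
"printfooter" markers are guesses based on the MonoBook skin, so adjust
them for yours):

<?php
// clip-like filter: read the page HTML on stdin and keep only the
// article body, dropping the nav bar and friends. Marker strings are
// MonoBook guesses; tidy downstream rebalances the clipped markup.
$html  = file_get_contents( 'php://stdin' );
$start = strpos( $html, '<div id="bodyContent"' );
$end   = ( $start === false ) ? false
       : strpos( $html, '<div class="printfooter"', $start );
if ( $start !== false && $end !== false ) {
    echo '<html><body>', substr( $html, $start, $end - $start ),
         '</body></html>';
} else {
    echo $html; // markers not found: fall back to the whole page
}
?>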
I might not want to try this on something like Wikipedia (all the
disk/memory action involved in the multiple conversions might push the
system load over the top), but on a fairly busy intranet wiki the load
is nominal, and it's a quick-and-useful hack. :D
Have a good one!
Steve
On Apr 8, 2005 3:29 AM, Aaron Macks <amacks(a)techtarget.com> wrote:
> Is there any way to export an entire wiki to some fixed format, like
> linked HTML or PDF?
Some of the tools listed on
http://meta.wikimedia.org/wiki/Alternative_parsers are designed for
this exact purpose; others could be adapted to it.
I've also added a note there about the dump-using-internals parser
> that Brion mentioned, because although it's not an alternative
parser, it could well be an alternative to using an alternative... :p
--
Rowan Collins BSc
[IMSoP]
Is there any way to export an entire wiki to some fixed format, like
linked HTML or PDF?
Aaron
--
____________________________________________
Aaron Macks amacks(a)techtarget.com
TechTarget PGP keyid: FBE946C5
117 Kendrick St, Suite 800 Phone: (781) 657-1519
Needham, MA 02494 Fax: (781) 657-1100
{{:Article}}
also seems to work. I guess the leading colon just means the default
(main) namespace?
al.
-----Original Message-----
From: Sérgio Ribeiro [mailto:sr.ribeiro@gmail.com]
Sent: Friday, 8 April 2005 12:41 a.m.
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] how to include a part of one page into
another?
I can't help with the technical side, but you can use the same thing we
use on Wikinews: templates. Like:
{{Namespace:Article}}
Note that {{Article}} doesn't work, since the default namespace for
{{}} is Template:
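For Fedya's case that would look something like this on the Main Page
(page names guessed): {{:News}}, with a leading colon to transclude the
article "News" itself rather than Template:News. On the News page,
anything that should not be copied across can be wrapped in
<noinclude>...</noinclude> tags, if your MediaWiki version supports
them.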
For a live example:
http://pt.wikinews.org/wiki/Predefini%C3%A7%C3%A3o:Not%C3%ADcias_do_dia
(Predefinição:Notícias do dia), and see the source of the template.
Hope this can help you.
Cheers,
On Apr 7, 2005 8:57 AM, Fedya Mosalov <fedya.mosalov(a)gmail.com> wrote:
>
> Hello!
>
> First, I have a News page with a list of events which looks like this:
>
> 03/04/2005 Event 3
> 01/02/2005 Event 2
> 01/01/2005 Event 1
>
> Second, I have the Main Page, and I want to include top N events from the
> News page into the Main Page.
>
> Are there any features in MediaWiki to do it in the simplest possible way?
> Please provide me with the section in the documentation or some example
> concerning this question.
>
> Thanks in advance.
>
> --
> fedya
>
--
Edit this page @ http://pt.wikipedia.org