Hi all!
This is my proposal announcement for GSoC 2014:
Abstract
Appearance is important for any site. Unfortunately, while MediaWiki is
very advanced in functionality, it lacks flexibility in terms of appearance.
The Vector skin is great, no doubt about it, but there is practically no
alternative: almost all wikis in the world look like Wikipedia. Of course,
one can apply custom CSS to change the site's layout, or even create a
whole new skin in PHP, but both solutions demand considerable knowledge and
effort, so in practice hardly anyone does it, except perhaps some big
commercial sites.
My proposal is the creation of a frontend that will help users with little
or no experience to easily produce all the CSS needed to change their
wiki's layout.
Full proposal wiki page:
https://www.mediawiki.org/wiki/User:Protnet/Frontend_for_Vector_skin_CSS_customizations
It's still a draft, but you can get most of the idea from it. I learnt about
GSoC only a few days ago, by chance, so I didn't have much time to prepare it
properly. Any last-minute help would be most appreciated! Most importantly,
I still haven't found any possible mentors. Please help me with this!
With regards,
Ioannis Protonotarios
Electrical & Computer Engineer MSc, now studying education
Athens, Greece
(P.S. I still hope that I will manage to apply with a second proposal as
well.)
Hi all,
This is to notify all project leads that they can apply to have booklets
published for their respective Wikimedia projects at Wikimania 2014. Booklets
help projects reach out better and help FOSS enthusiasts get involved more
quickly. This year Wikimania is supporting every project, no matter how
small, to help it grow, so visit the page below for details, queries, and to apply.
https://wikimania2014.wikimedia.org/wiki/Booklets
These names will be moved so that requests to them go to our server in
the eqiad data center. This should not cause any service interruptions,
but you may notice more current files available for download as the
switch goes into effect.
Time of switch: 10:00 to 12:00 UTC, Thursday March 27. Depending on your
ISP's nameserver, you may not be affected until some hours later.
Ariel
Hi! I would like to discuss an idea.
In MediaWiki it is not very convenient to do computing using the wiki
syntax. We have to combine several extensions such as Variables, Arrays,
ParserFunctions and others. When there is a lot of computation, such as
processing data received from Semantic MediaWiki, the speed of page
construction becomes unacceptable. To work around this, yet another
extension has to be written (e.g. Semantic Maps, which displays data from
SMW on maps). We end up with many such extensions; they don't work well
with each other and they are time-consuming to maintain.
I know about the Scribunto extension, but I think this problem can be
solved in another, more natural way. I suggest allowing PHP code in wiki
pages, in the same way as it is used in HTML files; see the hypothetical
sketch below. In this case extensions can be unified: for example, get the
data from DynamicPageList, process it if necessary, and pass it to other
extensions for display, such as Semantic Result Formats. This would give
users more freedom for creativity.
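To make the idea concrete, here is a hypothetical sketch of what a wiki page
could look like with embedded PHP (the <php> tag name and the data source are
assumptions for illustration, not Foxway's actual syntax):

    Most visited pages this week:
    <php>
      // imagine $pages came from a query extension such as DynamicPageList or SMW
      $pages = array( 'Foo' => 12, 'Bar' => 7, 'Baz' => 3 );
      arsort( $pages );                          // process the data in plain PHP
      foreach ( $pages as $title => $hits ) {
          echo "* [[$title]] - $hits views\n";   // emit ordinary wiki markup
      }
    </php>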
In order to execute PHP code safely I decided to try to build a
controlled environment. I wrote it in pure PHP; it is lightweight and in
the future it could be included in core. It can be seen in the extension
Foxway. The first version is in the master branch. It gives an idea of
what is possible in principle, and there is even something like a
debugger. It does not run very quickly, so I decided to try to fix that
in the develop branch. There I created two classes, Compiler and Runtime.
The first processes PHP source code and converts it into a set of
instructions that the Runtime class can execute very quickly. I took a
part of the code from PHPUnit tests to check the performance. On my
computer, pure PHP executes them on average in 0.0025 seconds, and the
Runtime class in 0.05 seconds; that is 20 times slower, but there is
still room to get better results. I do not count the time spent in the
Compiler class, because it only needs to run once, when the wiki page is
saved. The data returned from this class can be serialized and stored in
the database (see the sketch below). Also, if all dynamic data is handled
as PHP code, the wiki markup can be converted to HTML at save time and
stored in the database. Thus, when a wiki page is requested from the
server, it does not have to be rebuilt every time (I know about the
cache): just take the already prepared data (for Runtime and HTML) and
serve it. A cache is certainly still needed, but only for pages with
dynamic data, and the lifetime of the objects in it can be greatly
reduced, since performance will be higher.
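As a rough sketch of that compile-once / run-many flow (the class and method
names below are my own shorthand for illustration, not necessarily Foxway's
real API):

    // On page save: translate the embedded PHP into an instruction list
    // and keep the serialized result alongside the page (e.g. in a DB blob).
    $instructions = Compiler::compile( $embeddedPhpSource );
    $blob = serialize( $instructions );

    // On page view: skip compilation entirely and just execute the stored
    // instructions with the fast Runtime class.
    $runtime = new Runtime();
    $output = $runtime->run( unserialize( $blob ) );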
I also have other ideas associated with the features this implementation
provides. I have already made some steps in this direction and I think
that all of this is realistic and useful.
I am not saying that Foxway is ready for use. It shows that this idea can
work, and can work fast enough. It needs to be rewritten to make it
easier to maintain, and I believe that it can work even faster.
I did not invent anything new: we all use HTML + PHP. Wiki markup
replaces difficult HTML and provides security, but what can replace the
scripting language?
I would like to know your opinion: is this really useful, or am I wasting
my time?
Best wishes. Pavel Astakhov (pastakhov).
Now that MobileFrontend is using JSON for languages, I jumped on this to
create a script to make adding messages easier - basically a command
line interface called `make message` that edits the JSON files to add an
English message and its qqq documentation and maintains alphabetical
ordering [1].
Recently this was used, and some updates came from translatewiki.net.
I ran my `make message` script and noticed it made some changes to
the entries added by the translation updater bot [2].
I was wondering - what is the correct way to store these messages?
Do I need to update my script, or should the translation updater bot be
doing things differently?
"아라" or "\uc544\ub77c"
"\u003Ccode\u003E" or "<code>" ?
Thanks in advance for your opinions!
[1] https://gerrit.wikimedia.org/r/#/c/119637/
[2] https://gist.github.com/jdlrobson/9767604
Hi,
Please ignore my earlier mail; I have done some editing on my draft and
am mailing it again.
I am Rahul Mishra, a final-year undergraduate pursuing my B.Tech at
Netaji Subhash Engineering College, majoring in Computer Science &
Engineering.
I am very interested in the project "A system for reviewing funding
requests" and have proposed a draft with the same title.
Please review my draft and give your valuable advice/suggestions, so that
I can further improve my proposal and make it better.
Link to my user page:
https://www.mediawiki.org/wiki/User:Rahulmishra22
Link to the project:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_revi…
Link to the proposal:
https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#A_system_for_revi…
Thank you,
Rahul Mishra.
Dept. of CSE,
NSEC.
Hi! This is a quick heads-up about the status of the HHVM migration and what
the MediaWiki Core team is working on.
There are three challenges that we have to solve before we can run HHVM in
production:
* We need good packages. The packages provided by Facebook have some deep
issues that need to be fixed before they meet our packaging standards.
This is a good opportunity to recognize Faidon's leadership on this front:
he has been liaising with Facebook and Debian, working to resolve the
outstanding issues. Thanks, Faidon!
* We need to port a bunch of C extensions for the Zend PHP interpreter to
HHVM. The most complex by far is LuaSandbox. Tim has been working on that.
In the process, he has made substantial improvements to the Zend extension
compatibility layer provided by HHVM, which we are waiting to have merged
upstream: <https://github.com/facebook/hhvm/pull/1986>. Once they are
merged, they will be in the queue for the next release. Releases are cut
every eight weeks.
* I also want to recognize Max Semenik, who stepped up to port Wikidiff2,
producing a patch in short order.
* We need to adapt our app server configuration for HHVM. This includes
configuring HHVM itself as well as reconfiguring Apache to act as a fastcgi
reverse-proxy.
* We need to amend our deployment process so that it implements additional
requirements for HHVM. Specifically, we will need to add a build step to
produce a bytecode archive in advance of deployment. We are not working on
that piece yet, but I think that Bryan's work on scap is going to make this
a lot easier to implement once we do tackle it.
So far, we have used Facebook's packages in Labs and in MediaWiki-Vagrant,
configured Jenkins to run the unit tests under HHVM (Antoine), and set up a
Jenkins job to build HHVM from source hourly so we can test patches (Chad).
Aaron and I reasoned our way out of having to
port the igbinary extension, and Aaron is now working on porting
FastStringSearch. Along the way, we have been running into small
compatibility nits which we have fixed either by changing core's behavior
to be cross-compatible or by filing bugs and submitting patches upstream.
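For the cross-compatibility fixes, a common way to branch on the engine (a
generic idiom, not necessarily the exact check used in core) is the
HHVM_VERSION constant, which HHVM defines and Zend PHP does not:

    if ( defined( 'HHVM_VERSION' ) ) {
        $engine = 'HHVM ' . HHVM_VERSION;    // running under HHVM
    } else {
        $engine = 'Zend PHP ' . PHP_VERSION; // running under the Zend interpreter
    }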
As you can see, there are some hard blockers that stand between us and HHVM
in production, and the biggest ones are not entirely in our hands (i.e.,
they depend on upstream merging patches and fixing packages). At the same
time, there is a lot of useful work left to do that can continue without
being blocked by these things. For that reason, the Core MediaWiki team is
currently targeting the Beta cluster for HHVM work.
Our target for the current sprint is to make it possible for Apache to run
either the Zend interpreter or HHVM based on the presence of a magic
cookie. By default, visitors to the Beta cluster will be served pages
generated using the Zend interpreter, but when the cookie is set, Apache
will serve MediaWiki using HHVM instead. This is an idea we got from
Niklas, who has implemented something very similar for
<http://dev.translatewiki.net/>. Doing this would allow the Beta
cluster to continue to be faithful to production and thus continue to be a
good target for testing, while at the same time provide a way for people
working on HHVM specifically to test ported extensions and to identify and
fix integration points in a production-like environment. It also gives us a
way of making our progress visible to you.
We have benchmarked different workloads on different hardware and have
found the performance of HHVM to be impressively better than the Zend
interpreter in most cases, but we don't yet have numbers to share that
project the impact on users, because we don't have the means of simulating
the load patterns of production, and because some parts of the stack are
still in the process of being ported. We expect that having the option of
running HHVM on the Beta cluster with the complete set of extensions that
Wikimedia uses will make it possible for us to project how it will perform
in production. But we are optimistic, given what we've observed and given
the spate of independent evaluations of HHVM from different corners of the
PHP community.
We are using Bugzilla to track our progress. You can search for bugs with
the 'hiphop' keyword, or simply head to <https://www.mediawiki.org/wiki/HHVM>,
which aggregates the most recently touched items via RSS. If you'd like to
get involved, pick an open bug, or get in touch via the lists or IRC.
Regards, Core Platform.
Hi,
I've submitted a proposal to automate the process of uploading books from
libraries like Google Books to the Internet Archive, which could further be
used on Commons.
Please help me improve the proposal and suggest any other features that
might be of interest. Please leave your feedback:
https://www.mediawiki.org/wiki/User:8ohit.dua/GSoC_proposal_2014
Regards
Rohit Dua
(8ohit.dua)
8ohit.dua(a)gmail.com
I've been doing some work on the Lua/CSS/JS CodeEditor to make it and its toolbar a bit more usable, but I'm looking for some input on what YOU want.
I've listed some ideas here:
https://bugzilla.wikimedia.org/show_bug.cgi?id=59924
A list of key commands is here:
https://github.com/ajaxorg/ace/wiki/Default-Keyboard-Shortcuts
And ACE itself has a demo site that shows a few of the options as well:
http://ace.c9.io/build/kitchen-sink.html
I've now got a button to show invisible characters, and a button to show the find and replace dialog. Of course, most options are already available through key commands, but many people are not familiar with those.
If anyone has any specific desires/ideas/feedback etc, I'd love to hear it. Also, if someone can help with making icons for those toolbar buttons, that would also be appreciated.
DJ