[HipHop] Decent sized data set imported

Chad innocentkiller at gmail.com
Fri Jun 14 22:35:52 UTC 2013


On Fri, Jun 14, 2013 at 6:26 PM, Paul Tarjan <pt at fb.com> wrote:
>>
>>I managed to get mediawiki.org (92k pages, ~700k revisions) imported
>>to mysql last
>>night. It's in a new database called mediawikiwiki. This is a nice
>>mid-sized WMF wiki.
>
> Yay. Will it cover all the extensions that are needed for all your sites?
>

A good initial batch, at least. I tried installing ParserFunctions (probably
one of our most crucial extensions), but got 500 errors. Where do those get
logged? Or can we expose the errors to the web? (Setting display_errors in
LocalSettings didn't seem to do the trick.)
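
For reference, the error-display knobs I'd normally expect to work in
LocalSettings.php look roughly like this (standard MediaWiki/PHP settings,
nothing hhvm-specific):

    // LocalSettings.php -- surface PHP errors and MediaWiki exceptions in the browser
    error_reporting( -1 );
    ini_set( 'display_errors', 1 );
    $wgShowExceptionDetails = true; // include stack traces for MW exceptions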

>>I'll work on getting the configuration swapped over to point to this,
>>as well as running
>>the rebuild scripts to get the link tables consistent again.
>
> Yes please.
>

The config is swapped, and various maintenance scripts are running.
Will update when everything's sorted.
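
(For the curious, the link-table rebuild is basically the standard MediaWiki
maintenance scripts, run from the wiki root -- something along the lines of:

    php maintenance/refreshLinks.php
    php maintenance/rebuildrecentchanges.php

though the exact set of scripts we end up running may differ.)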

>>I think the enwiki import can be killed rather than waiting 3 more days.
>
> Cool, I killed it. If we want to restart, here's how far it got:
>
>  (committed 8560383 pages)
>
> We have the geoip extension done and I recompiled the hhvm binary with it.
> We aren't going to include it in the standard distro, but Sara is working
> on how we're going to ship non-default extensions.
>

Question: how would I run command-line scripts with hhvm? I figured it might
be nice to run some of our maintenance scripts with it, but the obvious
`hhvm path/to/script.php` didn't seem to work (it just hung with no output).
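
(Guessing blindly here -- maybe it wants an explicit mode flag or the JIT
enabled for CLI runs, something like:

    hhvm -m run -f path/to/script.php
    hhvm -v Eval.Jit=true -f path/to/script.php

but I haven't verified those are the right incantations, hence the question.)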

-Chad


