I think I need to learn some Linux commands first...
At least I know there is a method to get the wiki page hit counts :)
With best regards,
Grain
2009/1/29 Mathias Schindler <mathias.schindler(a)gmail.com>:
On Thu, Jan 29, 2009 at 4:31 PM, Albert Grain
<grainbackup(a)gmail.com> wrote:
Thank you, Mathias.
I have downloaded one file from the website you suggested, but I
found that the files are too large after decompression. What's more, I
could not locate the wiki page I need, which is in Chinese.
You can use zgrep to save some disk space. Non-Latin characters are
percent-encoded; the Chinese-language Wikipedia front page should be
something like:
%E9%A6%96%E9%A1%B5
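As an aside, one way to reproduce that percent-encoding yourself from the shell (a sketch that assumes xxd and sed are installed) is:

```shell
# Percent-encode the UTF-8 bytes of a page title (here: the Chinese front
# page title). xxd -p dumps the raw hex bytes, sed prefixes each byte pair
# with '%', and tr uppercases the hex digits to match the stats files.
printf '%s' '首页' | xxd -p | sed 's/../%&/g' | tr 'a-f' 'A-F'
# → %E9%A6%96%E9%A1%B5
```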
Hence, type in
mathias@lenovo-r60:~/wikipedia/stats/200901$ zgrep "^zh %E9%A6%96%E9%A1%B5 " pagecounts-20090102-100000.gz
zh %E9%A6%96%E9%A1%B5 2661 56018771
to get the number of page requests for the Chinese-language edition's
front page on 2 January 2009 between 10 am and 11 am UTC. There
were 2661 requests.
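For reference, each line in those pagecounts files has four space-separated fields: project code, URL-encoded page title, request count, and bytes transferred. A small awk sketch for pulling out the count from the sample line above:

```shell
# Pagecounts line format: <project> <encoded title> <requests> <bytes>
line='zh %E9%A6%96%E9%A1%B5 2661 56018771'
printf '%s\n' "$line" | awk '{print $1, "requests:", $3}'
# → zh requests: 2661
```

The same awk pattern (e.g. summing $3 across matching lines) can total up requests over many hourly files once you have them downloaded.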
Mathias
_______________________________________________
Commons-l mailing list
Commons-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/commons-l