Poor, Edmund W wrote:
Um, there is. I... (pause to remove foot from mouth)
Reducing time from .99 to .97 doesn't sound very important. But I still
think that efforts to reduce page delivery time are important. Any idea,
no matter how far-fetched, should be considered.
Naturally, the proof is in the pudding. So after we try a given
optimization, we ought to measure the results and see if it really gets
the pages to the users any faster.
Ed "Sheepish" Poor
_______________________________________________
Wikitech-l mailing list
Wikitech-l@Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
A dumb suggestion:
Temporarily, set the code to parse each page _twice_ (or N times)
instead of once, and see if it makes things any _slower_, and if so, how
much.
This should give you a good way of estimating the current slowdown,
without writing much code. Then you can estimate the possible speedup,
and whether it's worth the effort.
-- Neil
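Neil's double-parse idea can be sketched in a few lines. This is a hypothetical illustration, not MediaWiki code: `parse` is a stand-in for the real wiki parser, and the page text is made up. The point is just the arithmetic — time delivery with one parse, then with two, and treat the difference as an estimate of what a single parse costs, and therefore of the maximum speedup any parser optimization could buy.

```python
import time

def parse(page):
    # Stand-in for the real wiki parser (hypothetical); does enough
    # string work to be measurable.
    return page.upper() * 100

def timed_delivery(page, n_parses=1):
    """Simulate delivering a page, parsing it n_parses times."""
    start = time.perf_counter()
    for _ in range(n_parses):
        parse(page)
    return time.perf_counter() - start

page = "''some wikitext'' [[with links]]\n" * 500

t1 = timed_delivery(page, n_parses=1)  # normal delivery
t2 = timed_delivery(page, n_parses=2)  # deliberately parse twice

parse_cost = t2 - t1              # extra time for the extra parse
parse_fraction = parse_cost / t1  # share of delivery time spent parsing
print(f"one parse costs ~{parse_cost:.4f}s, "
      f"roughly {parse_fraction:.0%} of baseline delivery time")
```

If `parse_fraction` comes out small, parser optimizations can only shave a small slice off total delivery time, which is exactly the "estimate the possible speedup, and whether it's worth the effort" step above.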