[WikiEN-l] Re: Article size consistency 32k

John R. Owens jowens.wiki at ghiapet.homeip.net
Mon May 23 09:20:15 UTC 2005


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Stephen Bain wrote:
| John R. Owens wrote:
|
|>I don't know if this might have ever been suggested before already, but
|>perhaps a change in software could allow us to set a maximum image size
|>in our user preferences? Either in width/height, e.g. "always shrink
|>images to less than 200 pixels wide or 150 pixels high, whichever is
|>smaller", or in kB, e.g. "always shrink images to less than 10 kB".
|
| Unfortunately I don't think this would be a workable solution, unless
| a thumbnail duplicate of every image was created at the time of
| uploading. That would be fairly straightforward to do, although it
| would be very resource intensive on the image server(s). (Is there
| more than one yet?) To shrink each image after every request before
| sending to the browser would literally kill the servers.
|
| A better way would be to allow for a user preference to have images
| either on or off by default, but have a link on the page to view a
| version with images. So if a user browsing with images off wanted to
| see an image, they could then re-load the page with images included.
| (If it's good enough for Outlook, then it's good enough for
| Wikipedia.)

That's why I was suggesting it would probably be desirable to have a
limited number of options for maximum width/height/filesize. If that
were the case, then a limited number of different-sized copies could be
cached, the same way that (if I understand correctly) the thumbnail
images are cached after the first time they're viewed in a page. So,
for instance, you might have four width/height category choices, e.g.
100x75, 200x150, 400x300, and 600x400, and then you'd "just" need disk
space for four or fewer copies of each image (obviously, any image that
is, say, 250x200 would only need two extra copies, and rather small
ones at that). Note, though, that this would apply to pretty much all
images, instead of only those formatted to use thumbnails.
So what you might end up with, for the hypothetical picture and choices
given above (call the picture Example.jpg), would be something like
creating files corresponding to the following URLs (I haven't figured
out just which name-hash directory each should actually belong in,
especially since I notice the hash does take the "XXXpx-" part into
account):
http://upload.wikimedia.org/wikipedia/en/thumb/a/ab/Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/c/cd/200px-Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/e/ef/100px-Example.jpg
and perhaps something like
http://upload.wikimedia.org/wikipedia/en/thumb/1/12/5kb-Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/3/34/10kb-Example.jpg

And then, if you've set your preferences accordingly, you get the 200px
or 100px version, and if not, you get Example.jpg itself in the viewed
page (assuming it hasn't been thumbnailed, either).
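To make that selection rule concrete, here's a rough sketch in Python.
This is entirely hypothetical pseudo-logic on my part, not anything
from the MediaWiki code; the tier list and both function names are my
own invention:

```python
# Hypothetical sketch of the fixed-tier scheme described above.
# TIERS and both functions are my invention, not MediaWiki code.
TIERS = [(100, 75), (200, 150), (400, 300), (600, 400)]

def variants_needed(width, height):
    """Only tiers smaller than the original need a cached copy;
    a 250x200 image, for instance, needs just the first two."""
    return [(tw, th) for (tw, th) in TIERS
            if width > tw or height > th]

def served_size(width, height, pref):
    """Dimensions actually served to a user whose preference caps
    images at pref = (max_width, max_height)."""
    pw, ph = pref
    if width <= pw and height <= ph:
        return (width, height)              # original already fits
    scale = min(pw / width, ph / height)    # shrink to fit both limits
    return (int(width * scale), int(height * scale))
```

So variants_needed(250, 200) would report only the 100x75 and 200x150
copies, matching the two-extra-copies count above.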
If this were to be done, I'd expect there should be a script, run at
low priority, to create these thumbnails before the option was enabled,
so that the servers wouldn't bog down if many people using the option
started browsing image-laden pages. But I'd definitely want to see what
kind of demands it would make on disk space first.
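As a very rough way of eyeballing that disk-space question, one could
estimate the overhead per image, assuming (crudely) that file size
scales with pixel area. This is purely a back-of-envelope sketch of my
own, not a real measurement:

```python
# Back-of-envelope estimator for the extra disk space per image.
# Assumes file size scales with pixel area, which is only a crude
# approximation for JPEG; the tier list matches the example above.
TIERS = [(100, 75), (200, 150), (400, 300), (600, 400)]

def extra_bytes(width, height, size_bytes):
    """Estimated total bytes for all cached copies of one image."""
    total = 0
    for tw, th in TIERS:
        if width > tw or height > th:       # this tier needs a copy
            scale = min(tw / width, th / height)
            total += int(size_bytes * scale * scale)
    return total
```

Summed over every image (say, via a low-priority walk of the upload
directory), that would give at least a ballpark figure before
committing to the scheme.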

And mind you, I'm not really pushing for the idea myself, just trying
to present an optimal solution to a problem that was brought up. I'm on
broadband, so it doesn't affect me much personally. I just thought I'd
throw the idea out there.

- --
John R. Owens
ProofReading Markup Language : http://prml.sourceforge.net/
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.6 (GNU/Linux)
Comment: Using GnuPG with Thunderbird - http://enigmail.mozdev.org

iD8DBQFCkaBNEZJRR5c5RmQRAu7tAJoCcuPgYkIR61XdgOC2moJl+1skSACeKvJC
I2Dxy9gho04HAtY6KiT9+6k=
=Zt8Y
-----END PGP SIGNATURE-----


