On Thu, 15 Jul 2004 16:50:12 +1000, Tim Starling
<ts4294967296(a)hotmail.com> wrote:
> We've discussed all this before, a couple of
> times. It's not insurmountable.
When I read Emmanuel's first post, I thought the MAX_INCLUDE_REPEAT
limit was set solely in order to prevent infinite loops in inclusions.
Then you informed me that it was also intended to prevent attacks
based on including a large number of large files (or more accurately,
including a limited number of large files multiple times).
I didn't see any way to prevent that kind of attack without either:
1) Checking the size before inclusion.
2) Limiting the number of inclusions (or at least making it more
difficult by limiting the number of times the same file can be
included, thus forcing attackers to create multiple large templates,
which is easier to track and/or prevent).
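Approach (2) can be sketched as a simple per-render counter. This is a
hypothetical illustration in Python, not MediaWiki's actual code; the
class name, the `try_include` method, and the limit value are all made
up for the example.

```python
# Sketch of option (2): cap how many times the same template may be
# included while rendering one page. All names here are illustrative.
MAX_INCLUDE_REPEAT = 5  # assumed default limit


class IncludeTracker:
    def __init__(self, max_repeat=MAX_INCLUDE_REPEAT):
        self.max_repeat = max_repeat
        self.counts = {}  # template name -> times included so far

    def try_include(self, name):
        """Return True if this template may be included (again)."""
        n = self.counts.get(name, 0)
        if n >= self.max_repeat:
            return False  # block: same file included too many times
        self.counts[name] = n + 1
        return True


tracker = IncludeTracker(max_repeat=2)
assert tracker.try_include("Template:Bullet")
assert tracker.try_include("Template:Bullet")
assert not tracker.try_include("Template:Bullet")  # third repeat blocked
```

Note that this blocks repeats of the *same* template only, which is why
an attacker would have to create multiple large templates to abuse it.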
I didn't think (1) was feasible (but see below), so that meant we were
stuck with (2), which also prevents the inclusion of a large number of
small files (such as the fancy bullets you mentioned). That's why I
thought the problem was insurmountable; it seemed to be intrinsically
tied to the number of repeated includes. Block repeated includes for
large files, and you also have to block them for small files.
Maybe the type of approach in (1) isn't as bad as I thought though,
since we could just keep a running total of how much data has been
included, and block further inclusions once a limit has been reached.
MAX_CUMULATIVE_INCLUDE_SIZE?
Actually, that might not be a half bad idea, since it would solve the
infinite inclusion problem as well (though MAX_INCLUDE_DEPTH might
still be a better way to handle that problem, since it might take too
long for small looping templates to build up to the size limit).
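The combination suggested above (a running byte total plus a depth cap)
could look something like the following sketch. This is purely
illustrative: the class, method names, and limit values are assumptions
for the example, not anything MediaWiki actually implements.

```python
# Hypothetical combination of MAX_CUMULATIVE_INCLUDE_SIZE (running total
# of included bytes) and MAX_INCLUDE_DEPTH (cap on nesting, so small
# looping templates are cut off quickly instead of grinding toward the
# size limit). Constants and names are illustrative only.
MAX_CUMULATIVE_INCLUDE_SIZE = 1024 * 1024  # assumed 1 MiB total
MAX_INCLUDE_DEPTH = 16                     # assumed depth cap


class InclusionLimiter:
    def __init__(self, max_size=MAX_CUMULATIVE_INCLUDE_SIZE,
                 max_depth=MAX_INCLUDE_DEPTH):
        self.max_size = max_size
        self.max_depth = max_depth
        self.total_bytes = 0  # running total across the whole render

    def include(self, text, depth):
        """Return the included text, or an HTML comment if blocked."""
        if depth > self.max_depth:
            return "<!-- include depth limit exceeded -->"
        if self.total_bytes + len(text) > self.max_size:
            return "<!-- cumulative include size limit exceeded -->"
        self.total_bytes += len(text)
        return text


lim = InclusionLimiter(max_size=10, max_depth=2)
assert lim.include("hello", depth=1) == "hello"            # 5 bytes used
assert lim.include("world!", depth=1).startswith("<!--")   # size limit hit
assert lim.include("x", depth=3).startswith("<!--")        # depth limit hit
```

The depth check fires before any bytes are counted, which is the point
made above: a tight inclusion loop is stopped after a few levels rather
than after accumulating megabytes of repeated output.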
-Bill Clark