Reading through the email archives, I am forming a conjecture about
the alarmingly large size of caching proxies:
My conjecture is:
A caching proxy attempts to be transparent,
which means it must simulate the internet as a whole.
In general this cannot be done, and there is no short,
finite set of rules that, if followed, will yield
an acceptable simulation.
The writer of a caching proxy therefore encounters an unending
stream of unforeseen cases, which he must handle as best he can,
with the result that the code of a caching proxy tends to grow
without limit.
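
To make the kind of accretion I mean concrete, here is a minimal,
entirely hypothetical sketch in C (invented for illustration; it is
not code from Squid or any real proxy) of a request handler
collecting one workaround clause per newly discovered quirk:

  /* Hypothetical sketch, not Squid code: every name below is
   * invented for illustration. */
  #include <stdbool.h>
  #include <stdio.h>
  #include <string.h>

  struct request {
      const char *user_agent;
      const char *host;
      bool has_content_length;
  };

  /* Each clause stands for one unforeseen case; in a real proxy
   * the list only ever grows. */
  static bool needs_workaround(const struct request *r)
  {
      /* Some agents send malformed headers. */
      if (strstr(r->user_agent, "LegacyBrowser/1.0") != NULL)
          return true;
      /* Some origin servers mishandle persistent connections. */
      if (strcmp(r->host, "broken.example.com") == 0)
          return true;
      /* Some responses carry no Content-Length at all. */
      if (!r->has_content_length)
          return true;
      /* ...one new clause per newly discovered quirk. */
      return false;
  }

  int main(void)
  {
      struct request r = { "LegacyBrowser/1.0", "example.com", true };
      printf("workaround needed: %s\n",
             needs_workaround(&r) ? "yes" : "no");
      return 0;
  }

Each new quirk adds another clause, and nothing ever removes one.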
Does this conjecture seem to describe the code growth that you have
encountered?
Received on Wed Aug 11 1999 - 16:56:28 MDT