Evaluating a new approach to strong web cache consistency with snapshots of collected content
Proceedings of the twelfth international conference on World Wide Web - WWW '03
The problem of Web cache consistency continues to be an important one. Current Web caches use heuristic-based policies to determine the freshness of cached objects, often forcing content providers to unnecessarily mark their content as uncacheable simply to retain control over it. Server-driven invalidation has been proposed as a mechanism for providing strong cache consistency for Web objects, but it requires servers to maintain per-client state even for infrequently changing objects.
doi:10.1145/775236.775237
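To illustrate the per-client state cost that the abstract attributes to server-driven invalidation, here is a minimal sketch (not from the paper; all names are hypothetical): the origin server must remember, for every object, every client holding a cached copy, so it can notify them on change.

```python
from collections import defaultdict

class InvalidationServer:
    """Toy model of server-driven invalidation: the origin tracks,
    per object, the set of clients caching it, and notifies them
    when the object is updated."""

    def __init__(self):
        # object URL -> set of client ids holding a cached copy
        self.subscribers = defaultdict(set)

    def record_fetch(self, client_id, url):
        # Every cacheable response adds per-client state,
        # even for objects that rarely change.
        self.subscribers[url].add(client_id)

    def update_object(self, url):
        # On a change, invalidate every subscriber's copy
        # and drop the now-stale state for this object.
        notified = sorted(self.subscribers.pop(url, set()))
        for client in notified:
            self.send_invalidation(client, url)
        return notified

    def send_invalidation(self, client, url):
        pass  # placeholder for the network message to the client

server = InvalidationServer()
server.record_fetch("client-a", "/index.html")
server.record_fetch("client-b", "/index.html")
print(server.update_object("/index.html"))  # both clients are notified
```

The sketch makes the scaling problem concrete: state grows with (clients × objects) regardless of how often an object actually changes, which is the overhead the paper's snapshot-based approach aims to avoid.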