Updated
May 29th, 2015
First Posted
May 29th, 2015

Cache Aware Bandwidth Management

Emerging Technologies' bandwidth management technology is "cache-aware", transparently managing cached and non-cached flows with an integrated or external cache. Following is a brief discussion of how it all works.

How Does a Cache Work?

With all of the chatter on the 'net about caches, there is surprisingly little information on just how caches operate in a functional manner. We are quite surprised that virtually no one seems to understand how they work in terms of traffic flows. So here is how HTTP caching works in a nutshell:

When a user (the BROWSER) makes a request to a remote web server (the SERVER), the transparent proxy (the PROXY) snags the packets and passes them to the cache (the CACHE) on whatever port and address it's listening on. The CACHE then checks its local repository to see if it has a "recent-enough" copy of the page. If it does, it sends the page back to the BROWSER. This is the goal: since the CACHE sits between the BROWSER and the internet, a hit means less traffic on your internet access line.

If the CACHE does not have the page, or the page is not cacheable, the CACHE retrieves the page from the remote SERVER on its own behalf. That is, it uses the address of the CACHE to retrieve the page, not that of the BROWSER. This is important: the BROWSER's address is never seen between the CACHE and the remote SERVER for the transaction. Secondly, if the page must be retrieved from the SERVER, the traffic is passed back to the BROWSER as it is received. Some people think that the CACHE will retrieve the entire document, put it in its cache, and then send it to the user. But that would make the delay too long, so the CACHE will simply "pass on" the transaction, storing the page in the local repository as it is retrieved.
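The flow above can be sketched in a few lines of Python. This is a toy illustration only: the store, the freshness window, and the fetch function are hypothetical stand-ins, not any particular cache implementation.

```python
import time

store = {}      # url -> (time stored, list of body chunks)
MAX_AGE = 300   # "recent-enough" window in seconds (assumed value)

def fetch_from_origin(url):
    # Stand-in for the CACHE fetching on its own behalf; a real cache
    # opens its own connection, so the SERVER sees the cache's source
    # address here, not the browser's.
    yield b"<html>body of "
    yield url.encode()
    yield b"</html>"

def handle_request(url):
    entry = store.get(url)
    if entry and time.time() - entry[0] < MAX_AGE:
        yield from entry[1]             # hit: served locally, no WAN traffic
    else:
        chunks = []
        for chunk in fetch_from_origin(url):
            chunks.append(chunk)
            yield chunk                 # pass each chunk on as it arrives
        store[url] = (time.time(), chunks)  # stored only once fully retrieved
```

Note that on a miss the generator yields each chunk to the browser as it arrives, rather than buffering the whole document first, which mirrors the "pass on" behavior described above.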

ET/BWMGR's Cache-aware Technology

The ET/BWMGR software can recognize the distinction between cached and non-cached traffic, tagging cached traffic with the http-cached pseudo-protocol. When the traffic reaches the bandwidth management device, the cached and non-cached traffic can be managed as separate streams. Typically you will allow the cached traffic to flow freely and not be "counted" as internet traffic (since it didn't use any bandwidth on your internet access line). Thus, you can manage only the traffic that does not result in a cache hit, which is typically what you want to do.
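Conceptually, the accounting looks like the sketch below. The flow fields and tagging scheme are assumptions for illustration, not the actual ET/BWMGR implementation.

```python
# Classify flows as "http-cached" vs plain HTTP, and count only the
# cache misses against internet-access usage.

def classify(flow):
    # flow: dict with 'proto' and a 'cache_hit' flag set when the
    # cache answered from its local repository (hypothetical fields)
    if flow["proto"] == "http" and flow.get("cache_hit"):
        return "http-cached"   # flows freely, not counted as WAN usage
    return flow["proto"]       # shaped as normal internet traffic

flows = [
    {"proto": "http", "cache_hit": True,  "bytes": 5000},
    {"proto": "http", "cache_hit": False, "bytes": 7000},
]
wan_bytes = sum(f["bytes"] for f in flows if classify(f) != "http-cached")
```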

Using an External Cache - Setups that Don't Work

When using an external cache, you need to position your bandwidth management device a bit differently. Normally you would connect it directly to your internet router. With the setup below, however, you have a problem: the HTTP traffic passing through the bandwidth manager will not carry the IP addresses of your end users. Since the cache retrieves pages on its own behalf, the bandwidth manager cannot tell which end user the traffic is associated with. Although the bandwidth manager will still see all traffic bound for the internet, it is a very limited setup: unless you are doing general shaping or protocol shaping only, it will not let you manage your clients' traffic the way you probably want.
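The failure mode can be illustrated in a few lines: once the cache fetches on its own behalf, every miss seen downstream carries the cache's source IP, so per-client visibility collapses to a single address. The addresses here are examples only.

```python
CACHE_IP = "10.0.0.2"  # example address of the external cache

def flows_seen_by_bwmgr(misses):
    # misses: list of (browser_ip, url) that missed the cache. Downstream
    # of the cache, each request is re-originated from the cache's IP.
    return [(CACHE_IP, url) for _browser_ip, url in misses]

misses = [("192.168.1.10", "/a"), ("192.168.1.11", "/b")]
seen = flows_seen_by_bwmgr(misses)
src_ips = {src for src, _url in seen}
# only one source IP remains visible, so per-client shaping is impossible
```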

Proper Operation with an External Cache

The diagram below shows that by placing the cache between the router and the bandwidth manager, it can be made to function like the integrated solution above. The cache transparently returns traffic (both cached and uncached), and the bandwidth manager separates the traffic into two logical streams, depending on whether there was a cache "hit" or not.
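With this placement, the bandwidth manager sees each browser's real IP and can shape only the traffic that actually crossed the access line. A minimal sketch of that per-client accounting, with illustrative field names and addresses:

```python
from collections import defaultdict

def per_client_wan_bytes(flows):
    # Sum only the cache misses per client; hits never touched the
    # internet access line, so they are left out of the totals.
    usage = defaultdict(int)
    for f in flows:
        if not f["cache_hit"]:
            usage[f["client"]] += f["bytes"]
    return dict(usage)

flows = [
    {"client": "192.168.1.10", "cache_hit": True,  "bytes": 4096},
    {"client": "192.168.1.10", "cache_hit": False, "bytes": 2048},
    {"client": "192.168.1.11", "cache_hit": False, "bytes": 1024},
]
```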

Summary

As you can see, if you don't have a "cache-aware" bandwidth manager, you will not be able to manage HTTP traffic in the manner that you most likely need to. Most low-end bandwidth management devices cannot distinguish between cached and non-cached traffic. With the ET/BWMGR, it's as simple as it could be.
