An Overview Of Transparent Caching and Its Role In The CDN Market

Everyone reading this blog knows that Internet traffic continues to grow, and that an increasing amount of this traffic is being driven by video. Cisco VNI projects Internet traffic to grow 5 times between 2009 and 2013, with video constituting 90% of overall traffic. As a result of all this video traffic, one of the biggest buzzwords being used lately by telcos, as well as by vendors selling CDN platforms for use inside carrier networks, is transparent caching. A lot of folks are using the phrase these days, and as a result, a lot of confusion exists as to what it is, how it differs from regular caching, and the role it plays in the CDN industry.

Content caching technology used for network optimization has been available for many years, and today there are a couple of different caching approaches in use. Originally, caching technology focused on basic web pages, moving HTML files and web objects closer to the user to improve response time. Basic web caching became less necessary as network operators grew bandwidth capacity over the last decade, and today most people are familiar with the caching that provides application acceleration and scale for CDNs like Akamai. There is also service-specific content caching that addresses particular services, such as Google's cache.

What may (or may not) come as a surprise to some readers is the impact that all this traffic has on service provider networks. While CDNs make their money from content publishers, who typically pay based on volume, network service providers' money comes from their subscribers, who pay a fixed amount per month. So while CDNs (theoretically at least) stand to gain from this increase in video traffic, network service providers are stuck between the proverbial rock and hard place: they have to invest in their networks to scale to support this traffic, yet they receive little incremental revenue from it. Clearly, investment with no return is not a sustainable model, and service providers recognize this.

However, as video streaming and rich media downloads continue to flood operator networks with no end in sight, network operators are evaluating and deploying transparent Internet caching inside their networks to address a broader range of Internet content. The intent is twofold: first, to reduce the network infrastructure and bandwidth costs associated with over-the-top (OTT) content, and second, to differentiate their consumer broadband service and deliver better user performance. By eliminating potential delays associated with the Internet or even the content origin, caching allows the operator to highlight the investment being made in the access network and deliver more content at top speeds.

You may be asking, "don't CDNs help offload the service provider's network too?" The answer is, "only to a certain extent". First, CDNs only get the content so far: they do a good job distributing content regionally, but once the traffic leaves the CDN servers, it still needs to traverse the network service provider's access and edge networks. Also, CDNs don't collocate in every service provider network, so some traffic must be delivered via a peering or transit relationship. Additionally, there is a lot of content that is simply not delivered over CDNs.

Unlike traditional CDNs, which only store content based on business agreements, transparent caching can make intelligent decisions about which content can and should be cached locally to optimize the network. By deploying intelligent caches strategically throughout their networks, operators can cache and deliver popular content close to subscribers, thus reducing the amount of transit traffic across their networks. Hot content would be delivered very close to the edge, and content of decreasing popularity could be cached further back in the network.
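
To make this concrete, below is a minimal sketch, in Python, of the kind of popularity-based placement decision described above. The two tiers and the request-count thresholds are illustrative assumptions for this post, not any vendor's actual algorithm.

    # Hypothetical popularity-based placement: hot objects go to edge caches,
    # moderately popular objects to a regional cache, everything else to origin.
    from collections import Counter

    EDGE_THRESHOLD = 1000     # requests before an object is pushed to the edge (assumed)
    REGIONAL_THRESHOLD = 50   # requests before it is held regionally (assumed)

    request_counts = Counter()

    def record_request(url):
        """Count a subscriber request for a piece of content."""
        request_counts[url] += 1

    def placement(url):
        """Decide where this object should be cached, based on its popularity."""
        hits = request_counts[url]
        if hits >= EDGE_THRESHOLD:
            return "edge"      # hot content served as close to subscribers as possible
        if hits >= REGIONAL_THRESHOLD:
            return "regional"  # moderately popular content cached further back
        return "origin"        # long-tail content fetched over the Internet as usual

In a real deployment the thresholds would be tuned per link and per content type, but the basic idea is the same: the more popular an object, the closer to the subscriber it lives.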

But what is transparent caching, and how does it differ from an Akamai or Google caching service and other traditional forms of proxy or content caching? Most traditional content caching sits outside the network in a peering location, data center, or colocation space. It is usually managed by someone other than the network operator (like Akamai or Google). The network operator has little or no control over the cache servers, and as a result has little visibility into the actual productivity of those servers or what is being delivered from them.

In contrast, because transparent caching works across a much broader set of over-the-top content and traffic (as much as 75% of an operator's consumer broadband traffic is video streams and file downloads), it is embedded inside the carrier's network and provides the operator control over what to cache, when to cache, and how fast to accelerate the delivery. A transparent cache has the following characteristics:

  • Multiservice caching – Because a transparent cache needs to address as much Internet content as possible, today's platforms tend to support multiple services and protocols running across the network. The most common applications that consume the bulk of the bandwidth include HTTP Flash video, Netflix, HTTP-based file sharing, Silverlight, RTMP, and BitTorrent.
  • Automatically adapts to popular content – A transparent cache automatically ingests and serves content as it becomes popular and usually does not require operator intervention to continuously modify the network or the caching solution to support a new popular service or device.
  • Transparent to the subscriber and content origin – Transparent caching does not require modification of any system or browser settings. The performance benefits should be automatic; the only evidence of caching should be better end-to-end performance for the user. Likewise, it does not require any special HTML code or DNS redirection from a content source, and the content origin benefits automatically from better performance and less load on its servers.
  • Preserves all application logic – A transparent cache does not impact any application or service logic, meaning critical functions like authorization and click-through impressions are preserved so as not to impact Internet business models. In addition, a transparent cache must be careful not to serve stale content or content that has been removed from the Internet (see the sketch after this list).
  • Embedded in the network and controlled by the network operator – Because a transparent cache operates across such a broad range of traffic and protocols, it is embedded in the network and controlled by the operator. Control allows an operator to determine how fast content should be accelerated to the end user so as not to congest downstream points in the network. It also provides the operator with visibility into caching performance and into what types of content are being cached and accelerated.
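
To illustrate the freshness rule mentioned in the list above, here is a minimal sketch of how a cache might revalidate a stored object with the content origin before serving it, using standard HTTP validators. The in-memory store, the revalidation interval, and the use of the Python requests library are simplifications for this example, not how any particular product works.

    # Hypothetical revalidation logic: serve from cache only while fresh,
    # otherwise ask the origin whether our copy is still valid.
    import time
    import requests

    cache = {}               # url -> {"body": bytes, "etag": str, "fetched_at": float}
    REVALIDATE_AFTER = 300   # seconds before re-checking with the origin (assumed)

    def fetch(url):
        entry = cache.get(url)
        if entry and time.time() - entry["fetched_at"] < REVALIDATE_AFTER:
            return entry["body"]                      # still fresh, serve locally

        headers = {}
        if entry and entry.get("etag"):
            headers["If-None-Match"] = entry["etag"]  # conditional request to the origin

        resp = requests.get(url, headers=headers, timeout=10)
        if resp.status_code == 304 and entry:
            entry["fetched_at"] = time.time()         # origin says our copy is still valid
            return entry["body"]
        if resp.status_code == 200:
            cache[url] = {"body": resp.content,
                          "etag": resp.headers.get("ETag", ""),
                          "fetched_at": time.time()}
            return resp.content
        cache.pop(url, None)                          # removed or erroring upstream: drop our copy
        resp.raise_for_status()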

It should also be noted that some vendors increase the transparency of the caching solution by not giving the cache a public IP address. This has the added benefit of making the cache completely invisible to the user, with the exception of providing faster service, and it also makes the transparent cache far harder to hack, bypass, or interfere with.

Many Telcos, MSOs, and Mobile Operators are now looking at transparent Internet caching as a required element in their network to control “over-the-top” content consumption and to provide the best possible end-to-end user experience. It is a unique technology that simultaneously benefits a content owner, a network operator, and most importantly a broadband or wireless subscriber.

With all the benefits of intelligent caching, it raises the question of why it is only now beginning to gain traction. The reality is that caching in this manner is a very difficult challenge to solve technically. In order to cache content, the cache must intelligently and dynamically identify and adapt to shifting content access patterns. Of course, caches must also be legal, which is achieved by following the guidelines outlined within the Digital Millennium Copyright Act (DMCA).
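
One simple way to picture "adapting to shifting content access patterns" is an exponentially decayed hit counter: content that was hot yesterday but is cold today gradually ages out of the cacheable set. Below is a minimal sketch; the six-hour half-life is an assumed value, and real caching platforms use considerably more sophisticated heuristics.

    # Hypothetical decayed popularity counter: each hit adds 1, and scores
    # halve every HALF_LIFE seconds, so stale popularity fades away.
    import math
    import time

    HALF_LIFE = 6 * 3600  # popularity halves every six hours (assumed)

    class DecayedCounter:
        def __init__(self):
            self.scores = {}  # url -> (score, last_update_time)

        def hit(self, url, now=None):
            now = time.time() if now is None else now
            score, last = self.scores.get(url, (0.0, now))
            decay = math.exp(-math.log(2) * (now - last) / HALF_LIFE)
            self.scores[url] = (score * decay + 1.0, now)

        def score(self, url, now=None):
            now = time.time() if now is None else now
            score, last = self.scores.get(url, (0.0, now))
            return score * math.exp(-math.log(2) * (now - last) / HALF_LIFE)

A cache could combine a score like this with object size and link cost to decide what stays on disk and what gets evicted as viewing habits change.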

CDNs are good at what they do, serving content to subscribers quickly, and in the process alleviating peering costs for service providers. But there's an enormous amount of traffic that these CDNs do not serve, and that's where the value of intelligent caching comes in. Deployed throughout a carrier network, it will improve subscribers' experience and reduce carriers' peering costs, but only if it delivers the features and intelligence required to adapt to constantly changing user behavior and content patterns, and most importantly, scales economically to tens or hundreds of gigabits per second.

Transparent caching is one of the topics I'll be talking a lot more about at the Content Delivery Summit, May 2011 in NYC. I'll have preliminary details on the show in December.

  • Pepsi

    This doesn’t address the last mile, which is the real bottleneck for cable, DSL, and wireless (all flavors, including satellite) customers.

  • http://cisco.com jaak defour

    I think the big question for carriers is where to draw the line on traffic eligible for transparent caching. If the network cost savings from deep caching are far larger than CDN revenues, why not make life easy and cache as much as you can… YouTube, Hulu, Netflix, etc. Then there are no CDN services to develop, no marketing, no salesmen to train, no content providers to convince… just a simple network project to cache all popular content. OK, they miss out on the $0.02/GB that Netflix was maybe willing to pay… but they save a high multiple in local transport costs. This does mean that the bit-delivery CDN market could simply evaporate…

  • http://www.juniper.net anshu agarwal

    One thing which the article alluded to, but didn’t discuss in detail, is the future potential for transparent caches. The same content caching infrastructure, if architected correctly, can not only reduce costs today but can also be the foundation for new services. This is exactly what the Juniper Media Flow Solution offers, and we find this is what really gets the interest of service providers: they like that they can address their immediate pain points with the same solution that can be used to generate new revenue down the road.

  • Pepsi

    Out of curiosity, are there any legal implications with ISPs caching this content? Obviously they won’t have permission to store and deliver content that is copyrighted, but do they need permission?
    Whether justified or not, I could easily see lawsuits by content owners against the owners of the transparent cache.

  • http://profile.typepad.com/cbaker2 Charles Baker

    To answer the legal implications question – there is specific legislation written to protect transparent Internet caching. It is the Digital Millennium Copyright Act (DMCA) mentioned in the post. PeerApp’s UltraBand solution is fully compliant with this legislation.
    The evolution of the transparent Internet cache into the content delivery element in the network is a continuing topic across the industry. Deployment flexibility and scalability are key as operators migrate from a pure network optimization deployment to a revenue-generating platform.

  • http://www.netmon.ca Claire

    Good information, thank you.

  • Todd Culotta

    This is a hot topic with the MSOs, who are trying to figure out how to handle the YouTube and Netflix traffic hitting their networks and where they might want to consider deploying their own CDN hardware. Cisco and Juniper will eventually be the option here; companies like Bluecoat, who can do it now, are an option, but they are way too expensive at the moment compared to just adding bandwidth capacity from someone like Level 3. We will see this play out over the next 18 months. However, what about reporting? If I were a content owner and Time Warner decided to transparently cache my content on their network, I would certainly demand to know how many viewers I had, etc.

  • http://www.btisystems.com Glenn Thurston

    Caching deeper in the network, by its nature of being closer to a community of interest, will become tailored to the environment it's operating in. Think about caching near a university dorm (heavy P2P and software downloads) versus in a residential neighborhood (YouTube and OTT video) – the traffic needs are different, but both show high bandwidth growth. Using a transparent cache that adapts to its environment provides faster, tailored content delivery for fewer metro connectivity infrastructure dollars in each community of interest.

  • Jai-Jin Lim

    How does transparent caching work when the content is encrypted, with the digital rights reserved to the person who purchased the content?

  • http://www.leonrome.com Leonrome

    This article is very interesting but seems to leave too many open points from a carrier/ISP point of view. At the moment it is not so convenient for a carrier or an ISP to build its own CDN solution, since 90% of the content in the network (e.g., from Akamai, Google) is already served through a CDN solution. On the other side, the savings on the backbone from introducing partner CDN servers such as Akamai's or Google's do not compensate for the access costs and/or price decline towards subscribers.
