Because of substantial growth prospects in the content delivery industry, CDN providers are continuously looking for opportunities to broaden their market and offer new and unique value propositions to their customers. One of the more interesting ways some providers have diversified recently is by looking to get behind the firewall with enterprise deployments. Essentially, CDN vendors are eyeing the larger WAN market and introducing cloud networking solutions for global businesses.
Some analysts predict the demand for enterprise SaaS applications is set to exceed that for on-premises applications by a factor of five. Enterprises are increasingly adopting SaaS-based applications, even for mission-critical workloads, due to obvious cost and flexibility advantages. Alongside SaaS, businesses are also investing in IaaS and PaaS platforms such as Microsoft Azure, Amazon Web Services, and others.
Businesses are often unable to realize the full productivity gains of cloud adoption because of substandard network performance. Poor accessibility of SaaS applications and IaaS/PaaS workloads over the public Internet kills user experience. Globally dispersed end-users, especially those in remote locations like China, India and the Middle East, are subject to inconsistent latencies, congestion and packet loss. This often results in frustratingly slow performance, which is the most common complaint I hear from customers.
In the past, the enterprise WAN primarily served to connect geographically dispersed branch offices and enable organizational collaboration. But most enterprises now go over the public Internet to access mission-critical cloud workloads, and suffer from performance issues as a result. With the onset of the cloud era, legacy WAN providers have had to transform and adapt to the changing technology landscape, while a slew of new vendors have brought a variety of new approaches to the table. This has left the cloud networking market quite cluttered.
For instance, telcos like AT&T, Verizon, British Telecom and others have partnered with cloud service providers to offer private connectivity to a limited set of cloud platforms, using Multi-Protocol Label Switching (MPLS) technology. A few software-defined WAN (SD-WAN) vendors provide low-cost, appliance-based solutions that dynamically route traffic over multiple links based on traffic profiles and link quality. Other WAN providers own private global networks and use a cloud delivery model to provide easy-to-use, high-performance connectivity solutions. CDN providers have now hopped onto the cloud bandwagon and decided to make their presence felt, banking on proprietary routing technology over the Internet, along with technology partnerships, to offer optimized access to enterprise cloud applications. However, for most of them, the technology has a few limitations:
- The biggest problem is that these solutions, again, rely on a public Internet backbone. Hence, there is simply no way of avoiding the challenges of congestion, packet loss, jitter, and fluctuating latencies, which are inherent to the publicly shared medium.
- Traditional CDN players have based their network architecture on caching static content close to end users, and that’s what they’re good at. Their expertise does not lie in optimizing the performance of bi-directional, dynamic, cloud-hosted enterprise content.
- These solutions require organizations to invest in, integrate, and maintain a number of components, such as the network, optimization appliances, visibility software, etc. – something that demands the enterprise’s time and resources.
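To make the SD-WAN approach mentioned above more concrete, here is a minimal sketch of quality-based path selection: score each link by its measured loss, latency and jitter, and steer new flows over the best-scoring link. The link names, metrics, and weights are hypothetical, purely for illustration.

```python
# Sketch of SD-WAN-style path selection: score each link by measured
# latency, jitter, and packet loss, and route traffic over the best one.
# Link names, sample metrics, and scoring weights are all hypothetical.

def link_score(latency_ms, jitter_ms, loss_pct):
    """Lower is better: weight loss most heavily, then jitter, then latency."""
    return loss_pct * 100 + jitter_ms * 2 + latency_ms

def pick_link(links):
    """links: dict of name -> (latency_ms, jitter_ms, loss_pct)."""
    return min(links, key=lambda name: link_score(*links[name]))

links = {
    "mpls":     (40, 2, 0.0),   # private circuit: clean but pricey
    "internet": (25, 8, 1.5),   # broadband: low latency but lossy today
    "lte":      (60, 15, 0.5),  # cellular backup
}
print(pick_link(links))  # mpls: once loss is weighted in, it wins
```

The point of the sketch is that "best link" changes as conditions change: the broadband path has the lowest raw latency, but its packet loss makes it the worst choice under a loss-weighted score.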
It’s going to be interesting to see how Internet-based CDN vendors perform against incumbent cloud networking providers who are ahead in the race by several years and offer tightly integrated, enterprise-grade cloud connectivity solutions. I like how Aryaka Networks, for example, has generated market traction with their cloud networking platform, which combines private connectivity, optimization, cloud acceleration proxies, and visibility. And since they are focused only on the enterprise market, and aren’t also trying to provide CDN services for live streaming of M&E events or targeting broadcasters, their solution can stay focused and solve a specific problem.
The enterprise WAN market has always welcomed network service providers who bring innovation and value to the table to address prevalent use cases, and the rapid growth of SaaS adoption by global businesses has painted a large bullseye for CDN vendors. However, it is likely that most CDN providers will find it difficult to set foot into an industry that consists of established cloud networking providers who have built their solutions from the ground up to provide enhanced cloud connectivity to enterprises. I would be interested to hear in the comments section below from companies who are using such solutions and what their experience, both good and bad, has been to date.
Last month, Limelight Networks filed a lawsuit against Akamai and XO Communications claiming the companies are infringing on six of its patents. I’m tired of all these back-and-forth patent suits in the CDN space that, to date, have never amounted to anything but wasted time, money and focus by the companies involved. So I don’t plan to do a lot of coverage on the topic, but if you want to read the 46-page complaint, I’ve provided it here. The six patents in question are 7,715,324; 8,750,155; 8,856,263; 8,683,002; 9,015,348; and 8,615,577. They cover: systems and methods for acceleration of web pages; conditional protocol control; cache grouping; acceleration techniques; and policy-based processing of content objects.
Every day we’re bombarded with data that says how many times a video has been played or what’s most popular, but one of the things we don’t see much data on is engagement. Simply tracking how many times a video is watched is not the same as measuring the viewer’s engagement experience. And more importantly, content experiences are now more personalized, more custom, and as consumers we have more choices than ever for how we consume video.
The online video advertising industry likes to talk about engagement, but even they don’t address the methodology that should be used, considering we have so many a-la-carte and custom bundling options for online video. Today’s online TV experience is completely personalized, which is especially true on mobile, where content is updated in real time, presented in feeds, and engaged with through swipes and thumbscrolls.
Industry data shows that for e-commerce and media, 80% of viewers who are driven to video do not want to watch what is being presented. That begs the question: how much money are publishers leaving on the table by not effectively engaging even the remaining 20% who do? Amazon, for example, has long utilized advances in machine learning and data science to become the premier example of how to do customer relationship management and maximize ROI. Publishers should take a lesson from the e-commerce market and bring the data-driven approach of one-to-one marketing to video programming. By applying the best practices of customer relationship management to video, publishers could better engage audiences and grow viewership.
Today, viewers are presented with multiple entry points to content and have the upper hand in the buyer-supplier relationship. TV was a destination; video is a journey. The challenge for publishers is to engage viewers in such a way that they look forward to the trip. Data has shown that positive user experiences build consumption habits and fuel organic growth in viewership. So any opportunity content owners have to pass on incremental value to an engaged viewer should be taken, yet many publishers are still missing it.
The business of video, like TV, has for the most part been about servicing advertisers by providing them audiences at scale to watch their commercials. As publishers and content owners find less utility in traditional video value metrics, it may be time to look to familiar techniques used by one-to-one marketers. The foundation of CRM is to “sell more to fewer customers.” This sounds counterintuitive, but the greatest sensitivity in customer lifetime value is retention, i.e. churn. A small change in retention has a large impact on profits and losses. There is a higher return from selling, upselling, and cross-selling to an existing customer than from acquiring a new customer and selling to them.
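The retention sensitivity described above can be shown with the standard simple lifetime-value formula, LTV = margin × r / (1 + d − r), where r is the annual retention rate and d the discount rate. The margin and rates below are made-up numbers, purely for illustration.

```python
# Illustrative only: a standard simple customer-lifetime-value model,
# LTV = margin * r / (1 + d - r). All figures here are hypothetical.

def ltv(margin, retention, discount=0.10):
    """Expected discounted lifetime value of one retained customer."""
    return margin * retention / (1 + discount - retention)

margin = 100.0  # hypothetical annual margin per customer
for r in (0.70, 0.75, 0.80):
    print(f"retention {r:.0%}: LTV = ${ltv(margin, r):.2f}")
# retention 70%: LTV = $175.00
# retention 75%: LTV = $214.29
# retention 80%: LTV = $266.67
```

Note the leverage: moving retention from 70% to 80%, a ten-point change, lifts lifetime value by more than 50% in this toy model, which is exactly the "small change in retention, large impact on profits" effect.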
Video publishers have looked to curation to address this challenge, with mixed results. Manually curated playlisting is not scalable and does not offer the benefits of one-to-one engagement that are inherent to the digital experience. Native content recommendation widgets promote external discovery by sending viewers out to other sites. While there is positive audience development cycling through these systems, it comes at the expense of internal discovery, where the relationship between publisher and audience can flourish. It makes more sense to present viewers with relevant content from your own library than to send them to a third-party destination. Publishers need to utilize video personalization data to gain insight into user behavior and compare it against content categories, content length, content type, device, and a variety of other parameters. This informs not only content creation, but also video placement and distribution.
Publishers have utilized performance data to position video players on different parts of pages, as well as against different types of content. This has enabled some publishers to provide an optimal user experience for native sponsored content, but they need to go further, like automating social media postings to align with trending asset alerts. I spoke with a content owner the other day who relies on the IRIS.TV platform, a personalized video programming and decisioning system. It allows content owners to know which content leads to the most follow-on views. IRIS.TV calls these “anchor assets,” and the insight is analogous to brick-and-mortar retailers using consumption patterns to optimize the placement of products on shelves. Publishers can position new video content in ways that lead to viewing of relevant library content. It’s like two consumers going to the market to buy milk, but the market places products in such a way that consumer A buys an additional Snickers bar and a Maxim, and consumer B buys a Kind bar and a Runner’s World.
I asked IRIS.TV for some data from their system and the company said that for content publishers that use their plugin, consumption of video sees an average increase of 54% or more. Engagement views per session see a 62% increase. And when it comes to retention, the average bounce rate is reduced by 14%. Other interesting data they shared is the breakdown of consumption on mobile with Android leading the market with 53% versus iOS with 45%. On non-mobile, Windows leads with 63% share with OS X having 19% share.
The takeaway from all of this is that most publishers and content owners are not effectively engaging audiences, resulting in poor ROI from video and audience development strategies. I’d be interested to hear in the comments section below what key metrics publishers are using in their data-driven approach to video consumption.
On a recent StreamingMedia.com webinar about live streaming, David Kirk at Epiphan gave out some really good information during his presentation on No Fail Live Video Best Practices. I asked him to summarize his best tips and to answer some of the most popular questions we received. Whether you’re tasked with getting your CEO’s message to the company, or the president’s message to the country, use these suggestions to ensure a smooth live streaming event.
Prepare for input problems: Although everyone expects their video inputs to be available during their live stream, sometimes they are lost. Whether a laptop with slides loses power or the camera’s cable comes loose, it’s best to be ready for these problems-waiting-to-happen rather than be caught unprepared. The best approach is two-pronged: have a branded, custom no-signal image that automatically appears when signal is lost, and have secondary video layouts ready to switch to when a signal is lost.
A custom no-signal image that includes branding from the current event gives remote viewers the sense that you know there is an issue and are working to resolve it, while also maintaining the graphical style and flow of the event. When you see this no-signal image appear in your stream (because you’re monitoring it – see the tip below!), switch immediately to another layout that doesn’t include the lost signal. For example, if the problem is with a laptop showing slides and you were streaming a picture-in-picture that included the laptop and a camera on the presenter, switch as soon as possible to a layout that has only the presenter, or a different picture-in-picture that has the presenter and the local audience, or something else. You get the idea. Then use your local confidence monitors to determine when the problem input is resolved and switch back to your original layout.
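The fallback logic above – drop any layout whose inputs have lost signal, and fall back to the branded slate only when nothing usable remains – can be sketched as a simple selection rule. The layout and input names below are hypothetical; real switchers and encoders expose this behavior through their own configuration interfaces.

```python
# Sketch of no-signal fallback: pick the first preferred layout whose
# required inputs all have live signal; otherwise show a branded slate.
# Layout and input names are hypothetical.

LAYOUTS = [
    ("pip_slides_and_presenter", {"slides", "camera"}),  # preferred
    ("presenter_only",           {"camera"}),            # slides lost
    ("slides_only",              {"slides"}),            # camera lost
]

def choose_layout(live_inputs):
    for name, needed in LAYOUTS:
        if needed <= live_inputs:   # all required inputs have signal
            return name
    return "no_signal_slate"        # branded "we know, we're on it" image

print(choose_layout({"slides", "camera"}))  # pip_slides_and_presenter
print(choose_layout({"camera"}))            # presenter_only
print(choose_layout(set()))                 # no_signal_slate
```

Ordering the list by preference means recovery is automatic: as soon as the lost input returns, the next poll selects the original layout again.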
Test and test again: For live events on location, make the most of your time by testing your gear offsite before the show and again on-site. Start by creating configuration presets or changeable settings that let you test your expected inputs, create your layouts, and add event branding before you go. If you have them, enter your CDN settings and confirm everything works before you leave. Also create (or request from your client) pre- and post-show content that you can show in the stream before and after the event. For best results, use video with audio as your pre-show content.
When packing for the event, bring extra laptops or devices that provide the same outputs (HDMI, SDI, VGA, etc.) you’re planning to use in your live layouts. This way, when you arrive at the event, you can plug in these devices for testing even if the main cameras or presenter laptop feeds aren’t available. If your pre-show video content includes audio, use it to test your whole setup end-to-end, including through the CDN. If you’re not using a video for pre-show content, test audio by adding music to the static pre-show image. (Note: when streaming to some services, like YouTube, copy-protected music can cause your audio to be muted.) Don’t forget to also test audio coming from laptops (remember, some presentations include audio) or any other device that might feed you audio during the stream.
Fail-safe networking solution: Network and CDN failures during a live stream are a real possibility. Depending on the hardware you have available, make use of mobile tethering, a backup CDN stream, or even multiple CDNs to make sure a disruption in networking doesn’t affect your ability to get your content to the cloud. For remote locations and high-priority events that absolutely must be broadcast live, consider primary or backup networking with cellular bonding solutions that combine the throughput of multiple service providers to create a fat pipe to the cloud. If any one of these providers has a problem, these solutions automatically compensate by reshaping the traffic over the remaining providers.
Monitor end to end: For streaming your event, choose hardware that lets you monitor video and audio quality being captured to make sure you’re streaming exactly what you expect. But don’t stop there. Make sure you also have a computer connected as a client to your CDN or streaming server to view end-to-end content and quality. Keep an eye on both these places for your custom no signal image, audio issues, or other problems and adjust accordingly.
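One concrete way to monitor the CDN side of the chain, assuming your stream is delivered as HLS, is to poll the live playlist and alert if its media sequence stops advancing, which usually means the encoder or origin has stalled. Here is a minimal sketch; the playlist text is a hard-coded sample rather than a live HTTP fetch, and a real monitor would also watch for the no-signal image and audio levels.

```python
# Sketch of automated end-to-end stream monitoring for HLS delivery:
# poll the live playlist and flag a stall if #EXT-X-MEDIA-SEQUENCE stops
# advancing between polls. The playlist below is a hard-coded sample;
# a real monitor would fetch it from the CDN over HTTP on a timer.

def media_sequence(playlist_text):
    """Extract the media sequence number from an HLS playlist, or None."""
    for line in playlist_text.splitlines():
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            return int(line.split(":", 1)[1])
    return None

def is_stalled(previous_seq, current_seq):
    """True if the live playlist has not advanced since the last poll."""
    return previous_seq is not None and current_seq == previous_seq

sample = "#EXTM3U\n#EXT-X-MEDIA-SEQUENCE:42\n#EXTINF:6.0,\nseg42.ts\n"
seq = media_sequence(sample)
print(seq)                  # 42
print(is_stalled(41, seq))  # False: the playlist advanced
print(is_stalled(42, seq))  # True: same sequence twice in a row
```

Automating this complements, rather than replaces, the human with a client machine watching the actual stream: the script catches stalls quickly, while a person catches the subtler problems like wrong layouts or bad audio.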
Where possible, simplify your setup: Reduce the gear needed and the possible points of failure by using all-in-one hardware that both records and streams, eliminating multiple splitters and long mazes of cables. Simplify your event by having only the exact number of pre-configured layouts you need for live streaming. Too many layouts makes it easy to make a mistake, so be sure to delete any that aren’t needed or that were used for testing and don’t have the right inputs or branding. If your encoder supports it, make a single-camera production appear more dynamic by using hardware cropping to turn a wide camera view into a more personal close-up that you can switch to on the fly.
Indeed, no two live streaming events will be the same, but these tips can help you deliver a consistently smooth outcome.
Watch the full No Fail Live Streaming presentation here.