As Pay TV Flocks To Devices, Multi-DRM Can Make Or Break Service Success

[This is a guest post by my Frost & Sullivan colleague Avni Rambhia]

It’s no longer news that TVE and OTT have graduated from experimental endeavors to full-fledged service delivery channels. On metrics such as subscriber growth, hours viewed, and advertising revenue, OTT services are surpassing traditional Pay TV services. That is not to say that OTT services are fully monetized today. Revenue generation, whether ad-based, transactional or subscription, remains an ongoing challenge for TVE/OTT services despite growing uptake and aggressive infrastructure investments.

The quest to bring a consistent, managed-quality experience to an unruly stable of unmanaged devices is a formidable challenge. Maintaining support across all past and present devices in the face of changing streaming and delivery standards is an undertaking in its own right. Nonetheless, secure multimedia delivery holds the key to delivering premium content to subscribers. With competing services only a few clicks away, ongoing growth relies heavily on the ability to deliver a service transparently across the many underlying DRM systems, device platforms, browsers and streaming protocols currently in use and on the horizon.

HTML5, with its related EME (Encrypted Media Extensions) and CDM standards, was envisioned as a way to unify cross-platform fragmentation and simplify cross-platform app development and content delivery. Things didn’t quite materialize that way, with the result that content companies need to manage secure content delivery and handle back-end license and subscriber management across all major DRM platforms. While there is a perception that “DRM is free,” stemming primarily from the royalty-free nature of Widevine and the falling costs of PlayReady licensing, in reality the total cost of ownership is quite high. Moreover, DRM needs to be treated as a program rather than a project, often subject to unexpected spikes in R&D and testing overhead when a new operating system is released, a new device surges in popularity, old technology is deprecated, the DRM core itself is revised, or a new standard takes hold. While the client side of the problem is often the first concern, server-side components play an important role in service agility and scalability over the longer run.
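To make the client-side fragmentation concrete, here is a minimal sketch (my illustration, not material from the white paper) of how a browser-based player can discover which DRM key system a given device actually exposes, using the HTML5 EME API. The key-system identifiers are the commonly used ones for Widevine, PlayReady and FairPlay; the codec strings and configuration are simplified assumptions and would vary per service.

```typescript
// Probe which DRM key systems this browser/device exposes via EME.
// Identifiers and codec strings are common values used for illustration.
const KEY_SYSTEMS = [
  "com.widevine.alpha",      // Widevine (Chrome, Firefox, Android)
  "com.microsoft.playready", // PlayReady (Edge, many smart TVs)
  "com.apple.fps.1_0",       // FairPlay Streaming (newer Safari versions)
];

const config: MediaKeySystemConfiguration[] = [{
  initDataTypes: ["cenc"],
  videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
  audioCapabilities: [{ contentType: 'audio/mp4; codecs="mp4a.40.2"' }],
}];

async function detectSupportedKeySystems(): Promise<string[]> {
  const supported: string[] = [];
  for (const keySystem of KEY_SYSTEMS) {
    try {
      // Resolves only if a CDM for this key system is present and the
      // requested capabilities can be satisfied.
      await navigator.requestMediaKeySystemAccess(keySystem, config);
      supported.push(keySystem);
    } catch {
      // Key system not available on this browser/device.
    }
  }
  return supported;
}

detectSupportedKeySystems().then((list) =>
  console.log("Available key systems:", list));
```

Every key system that resolves here still needs its own packaging, license server integration and entitlement logic on the back end, which is where much of the ongoing cost described above actually accumulates.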

As part of our content protection research coverage at Frost & Sullivan, we took an in-depth look at factors affecting the total cost of ownership for content companies (including broadcasters, new media services and video service operators) as well as for the OVPs that are increasingly taking on OTT workflows on behalf of content companies. The findings from this research are reported in a new white paper sponsored by Verimatrix. We’ll be discussing many of these factors, and their real-life impact on customers, during a live webinar on Wednesday, September 28th at 10am ET. Divitel will be joining to discuss their experiences firsthand.

As we’ll discuss in the webinar, agility and scalability are crucial to OTT services as appointment TV fades away and customers continue to shift toward device-first viewing. While some companies may have the engineering talent and budget to build and maintain their own multi-DRM infrastructure, our best-practice recommendation in the majority of cases is to work with a security specialist vendor rather than going DIY. If you would like to share your own stories, whether as a vendor or a customer, or if you have any questions about DRM and available options, feel free to comment here or reach out to Avni Rambhia, Industry Principal, ICT/Digital Transformation at Frost & Sullivan.

Tuesday Webinar: Accelerating Your OTT Service

Tuesday at 2pm ET, I’ll be moderating a webinar on the topic of “Accelerating Your OTT Service.” The OTT market is expected to generate an estimated $15.6 billion in revenue globally through 2018. Join Brightcove’s Vice President of OTT Solutions Luke Gaydon for an interactive discussion about the state of the OTT landscape and the key opportunities and challenges facing media companies looking to capitalize on this thriving market.

Luke will review the latest data on the growth of OTT and discuss complexities including device fragmentation and how to address them. Then, he will showcase Brightcove OTT Flow – powered by Accedo, including key product features, and share how this innovative turnkey solution enables the seamless, rapid deployment of OTT services across multiple platforms. Join this webinar to learn:

  • The latest data on the growth of OTT across devices, platforms and audiences
  • The growing challenges, including device fragmentation and technical scope
  • Strategies for taking your content OTT
  • Key features, analytics and how OTT Flow provides a consistent user experience across devices

REGISTER NOW to attend this free live web event.

Correcting The Hype: Twitter’s NFL Stream Lacks Engagement and A Profitable Business Model

With two NFL games under Twitter’s belt now, I’m reading far too many headlines hyping what it means for Twitter to be in the live streaming business. Phrases like “game changing,” “milestone,” and “make-or-break moment” have all been used to describe Twitter’s NFL stream. Many commenting on the stream act as if live video is some kind of new technology breakthrough, with some even suggesting that “soon all NFL games will be broadcast this way.” While many want to talk about the quality of the stream, no one is talking about the user interface experience or the fact that Twitter can’t cover its licensing costs via any advertising model. What Twitter is doing with the NFL games is interesting, but it lacks engagement and is a loss leader for the company. There is nothing “game changing” about it.

The first NFL game on Twitter reached 243,000 simultaneous users and 2.3M total viewers. But looking further at the data, the average viewing time was only 22 minutes. Most who tuned into Twitter didn’t stick around. Many, like myself, tuned in only to see what the game would look like and how Twitter would handle live video within their platform. For week two, Twitter reached 2.2M total viewers and had 347,000 simultaneous users, but the NFL isn’t saying what the average viewing time was. Twitter and the NFL are also counting a viewer as anyone who watched a “minimum of three seconds with that video being 100% in view,” which is a very low bar for a metric.

Unfortunately, the whole NFL experience on Twitter was a failure in what Twitter is supposed to be about – engagement. Watching the stream in full screen, on any device, felt like I was watching the game via any other app. Twitter didn’t overlay tweets in any way, some commercial breaks had no ads shown at all and tweets weren’t curated. Far too many tweets added nothing to the game with comments like “Jets Stink.”

Streaming live content on the web has been around for 21 years now, and it’s a sad state of the industry when the most exciting part of the event was that people could not believe the video didn’t buffer or have widespread quality issues. It’s not hard to deliver a video stream to 243,000/347,000 simultaneous users, especially when Twitter hired MLBAM, who then used Limelight, Level 3 and Akamai to deliver the stream. Some suggested that the “NFL on Twitter opens an enticing lucrative new revenue stream,” which of course isn’t the case at all. We don’t know for sure what Twitter paid for the NFL games, but if the rumors of $1M per game are right, Twitter can’t make that back on advertising. They don’t have a large enough audience tuning into the stream and would never get a CPM rate high enough to cover their costs. Some have even suggested that the NFL stream on Twitter is a “model for other revenue-generating live stream events,” but of course that’s not the case. One-off live events can’t generate any substantial revenue as the audience size is too small and the length of the event too short.
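To see why the advertising math is so difficult, here is a purely illustrative back-of-envelope calculation. Every input below that isn’t the reported audience data is an assumption of mine (the ad load and CPM are not reported figures), and if Twitter only sells a portion of the in-game inventory the picture gets worse still; the point is simply the order of magnitude.

```typescript
// Back-of-envelope: can stream advertising cover a rumored $1M rights fee?
// All non-reported inputs are illustrative assumptions, not actual data.
const totalViewers = 2_300_000;   // week-one total viewers (reported)
const avgMinutesViewed = 22;      // week-one average view time (reported)
const adMinutesPerHour = 8;       // assumed digital ad load
const cpmDollars = 30;            // assumed (generous) CPM per thousand impressions

const adSpotsPerViewer = (avgMinutesViewed / 60) * adMinutesPerHour * 2; // 30-second spots
const totalImpressions = totalViewers * adSpotsPerViewer;
const grossRevenue = (totalImpressions / 1000) * cpmDollars;

console.log(
  `~${Math.round(adSpotsPerViewer)} spots per viewer, ` +
  `~${(totalImpressions / 1e6).toFixed(1)}M impressions, ` +
  `~$${Math.round(grossRevenue).toLocaleString()} gross ad revenue`
);
// Roughly 6 spots per viewer, ~13.5M impressions and ~$400K gross,
// well short of a rumored $1M-per-game rights fee.
```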

There is nothing wrong with Twitter using NFL content to try to attract more users to the service and grow their audience, but the NFL content itself isn’t a revenue generator for the company. Some, including NFL players, suggested that soon all NFL games will be broadcast online and that what Twitter and the NFL are doing is the future. That idea isn’t in touch with reality. The NFL is getting paid $28B from FOX, CBS and NBC over the course of their contracts, which end in 2022. That averages out to $3.1B per year the NFL is getting from just those three broadcasters. The NFL has no financial incentive to put more NFL games online, without restrictions, or they risk losing their big payday from the broadcasters. It’s not about what consumers want, it’s about what’s best for the NFL’s bottom line. Economics drives this business, not technology or platforms.

If Twitter has a game plan for all the live video they are licensing, it’s not clear what that is. In a recent interview with Twitter’s CFO, he commented that Twitter’s goal with the NFL games is to “help the NFL reach an audience it was not otherwise reaching.” How does Twitter know they are doing that? There were plenty like me who were watching the game on TV and Twitter at the same time. The NFL didn’t need Twitter to reach me. And when the CFO uses the term “high fidelity” to describe the stream, what does that mean? Twitter keeps saying they have the “mobile audience,” but they won’t break out numbers on what the usage was on mobile versus the web, or any data on average viewing time on mobile. Twitter also said “there was evidence that people who had not watched the NFL in a while were back engaged with it.” What kind of evidence is that exactly? Twitter can’t tell if I watched the day before, or if I was watching the game on TV today, so what kind of data are they referencing?

Twitter also says that they were “incredibly pleased with how easy it was for people to find the Thursday night game on Twitter.” Making a live link available isn’t hard. A WSJ article said there are other “live sports streaming technologies out there” but Twitter’s was “easy to use.” All the live linear video on the web uses the same underlying technology; Twitter isn’t doing anything special. They are using the same platform that MLB, ESPN, PlayStation Vue, WWE and others use, as they are all MLBAM customers. Many in the media made it sound like Twitter did something revolutionary with the video technology, which wasn’t the case at all.

Someone commented online that the reason Twitter’s NFL stream is so “successful” is because “advertisers love the mass that live sports delivers.” But it doesn’t deliver that mass audience online, only on TV. And that’s the rub. The NFL and every other major sports league isn’t going to do anything to mess with their core revenue stream. So for at least the next six years, until their contracts with the broadcasters come up for renewal, the NFL isn’t going to do anything more than experiment with live streaming. And there will always be another Twitter, Yahoo, Verizon, Google, or someone else who wants to give the NFL money, more for the publicity of the deal, than for anything that actually increases their bottom line.

Moderating NYME Session On Future Of Live Streaming Thursday, Looking For Speaker

Thursday at 12:20pm I am moderating a session on “The Future Of Live Streaming: Sports, Linear TV & Social” at the New York Media Festival in NYC. It’s a short 30-minute round table panel, with lots of Q&A from the audience. I am looking for one more speaker to join the panel, preferably a content owner/broadcaster/aggregator etc. Email me if interested.

The Future Of Live Streaming: Sports, Linear TV & Social
From NFL games on Twitter, to upcoming live linear services from Hulu and AT&T joining Sling TV, live streaming is exploding on the web. With rumors of Amazon wanting to license live sports content, Disney’s investment in MLBAM, and Twitch pumping out millions of live streams daily, consumers now have more live content choices than ever before. Attendees of this session will have the opportunity to participate in the discussion about the most important obstacles being faced when it comes to live streaming. Topics to be covered include content licensing costs for premium content, monetization business models, what consumers want to watch and the impact social services could have on live video. This will be an interactive session with plenty of Q&A from the audience.

How To Implement A QoS Video Strategy: Addressing The Challenges

While the term “quality” has been used in the online video industry for twenty years now, in most cases, the word isn’t defined with any real data and methodology behind it. All content owners are quick to say how good their OTT offering is, but most don’t have any real metrics to know how good or bad it truly is. Some of the big OTT services like Netflix and Amazon have built their own platforms and technology to measure QoS, but the typical OTT provider needs to use a third-party provider.

I’ve spent a lot of time over the past few months looking at solutions from Conviva, NPAW, Touchstream, Hola, Adobe, VMC, Interferex and others. It seems every quarter there is a new QoS vendor entering the market, and while choice is good for customers, more choices also mean more confusion about the best way to measure quality. I’ve talked to all the vendors and many content owners about QoE, and there are a lot of challenges when it comes to implementing a QoS video strategy. Here are some guidelines OTT providers can follow.

One of the major challenges in deploying QoS is the impact that the collection beacon itself has on the player and the user experience. Content owners can build these scripts themselves, but building them for an entire ecosystem of platforms, then developing dashboards, creating metrics and analyzing the data, is highly resource-intensive and time-consuming. Most choose to go with a third-party vendor that specifically offers this technology; however, choosing the right vendor can be another pain point. There are many things to consider when choosing a vendor, but with regard to implementation, content owners should look at the efficiency of the proposed integration process (for example, having standardized scripts for the platforms/devices/players you are using, and the average time it takes to integrate) and the vendor’s ability to adapt to your development schedule. [Diane Strutner from Nice People At Work (NPAW) had a good checklist on choosing a QoE solution in one of her presentations, which I have included with permission below.]

Another thing to consider is the technology behind the beacon itself. The heavier the plug-in, the longer the player load time will be. There are two types of beacons: those that process the data on the client (player-side), which tend to be heavier, and those that push information back to a server to be processed, which tend to be lighter.
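As a rough illustration of the lighter, server-side-processing approach, here is a minimal sketch of a beacon that simply forwards raw player events to a collector. The endpoint URL, event names and payload shape are hypothetical; a commercial SDK captures far more detail, but the overall structure is similar.

```typescript
// Minimal "light" beacon: push raw player events to a collector and let the
// server do the processing. Endpoint and event schema are hypothetical.
interface PlayerEvent {
  sessionId: string;
  event: "play" | "firstFrame" | "bufferStart" | "bufferEnd" | "ended";
  timestamp: number; // ms since epoch
}

const COLLECTOR_URL = "https://qos.example.com/collect"; // hypothetical endpoint

function sendEvent(evt: PlayerEvent): void {
  const payload = JSON.stringify(evt);
  // sendBeacon queues the request without blocking playback or page unload,
  // which keeps the client-side footprint small.
  if (!navigator.sendBeacon(COLLECTOR_URL, payload)) {
    fetch(COLLECTOR_URL, { method: "POST", body: payload, keepalive: true });
  }
}

// Example: instrumenting a plain HTML5 <video> element.
function instrument(video: HTMLVideoElement, sessionId: string): void {
  const emit = (event: PlayerEvent["event"]) =>
    sendEvent({ sessionId, event, timestamp: Date.now() });

  video.addEventListener("play", () => emit("play"));
  video.addEventListener("playing", () => emit("firstFrame")); // rough proxy for first frame
  video.addEventListener("waiting", () => emit("bufferStart"));
  video.addEventListener("canplay", () => emit("bufferEnd"));
  video.addEventListener("ended", () => emit("ended"));
}
```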

One of the biggest challenges, if not the biggest, in implementing QoS is that it forces content owners to accept a harsh reality: their services do not always work as they should. It can reveal that the CDN, DRM, ad server or player provider the content owner is using is not doing its job correctly. So the next logical question to ask is: what impact does this have? And the answer is that you won’t know (you can’t know) until you have the data. You have to gather that data by properly implementing a QoS monitoring, alerting and analysis platform, and then apply the insights it produces to your online video strategy.
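As a small example of what the alerting piece can look like, here is a sketch of a basic rule that compares a rolling window of buffer ratios per CDN against a threshold. The threshold and minimum sample count are assumed values for illustration, not recommendations.

```typescript
// Simple alerting rule: flag any CDN whose average buffer ratio over the
// current window exceeds a threshold. Threshold values are illustrative.
interface Sample { cdn: string; bufferRatio: number; }

function checkBufferRatioAlerts(
  samples: Sample[],
  threshold = 0.02,   // assumed: alert when >2% of playtime is spent buffering
  minSamples = 500,   // assumed: avoid alerting on thin data
): string[] {
  const byCdn = new Map<string, number[]>();
  for (const s of samples) {
    const list = byCdn.get(s.cdn) ?? [];
    list.push(s.bufferRatio);
    byCdn.set(s.cdn, list);
  }

  const alerts: string[] = [];
  for (const [cdn, ratios] of byCdn) {
    if (ratios.length < minSamples) continue;
    const avg = ratios.reduce((a, b) => a + b, 0) / ratios.length;
    if (avg > threshold) {
      alerts.push(`Buffer ratio for ${cdn} is ${(avg * 100).toFixed(1)}% over the last window`);
    }
  }
  return alerts;
}
```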

When it comes to collecting metrics, there are a few that matter most for ensuring broadcast-quality video looks good. The most important are buffer ratio (the amount of time spent buffering divided by playtime), join time (the time to the first frame of video delivered), and bitrate (the bits per second the network can deliver). Buffer ratio and join time have elements of the delivery process that can be controlled by the content owner, and others that cannot. For example, are you choosing a CDN that has POPs close to your customer base, has consistent and sufficient throughput to deliver the volume of streams being requested, and peers well with the ISPs your customers are using? Other elements, like the bitrate a viewer’s connection can sustain, are not something a content owner can control, but they should influence your delivery strategy, particularly when it comes to encoding.

For example, if you are encoding at HD bitrates but your user base streams at low bitrates, quality metrics like buffer ratio and join time will increase. One thing to remember is that you can’t look at just one metric to understand the experience of your user base; looking at one metric alone can be misleading. These metrics are all interconnected, and you must have the full scope of data in order to get the full picture needed to really understand your QoS and the impact it has on your user experience.
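To make the definitions above concrete, here is a sketch of how buffer ratio and join time could be computed server-side from the kind of session events a beacon collects. The event names follow the hypothetical beacon sketch earlier; real analytics platforms use their own schemas.

```typescript
// Compute per-session join time and buffer ratio from collected player events.
// Event names match the hypothetical beacon sketch above.
interface SessionEvent { event: string; timestamp: number; }

function joinTimeMs(events: SessionEvent[]): number | null {
  const play = events.find((e) => e.event === "play");
  const firstFrame = events.find((e) => e.event === "firstFrame");
  return play && firstFrame ? firstFrame.timestamp - play.timestamp : null;
}

function bufferRatio(events: SessionEvent[]): number {
  // Sum the time spent between bufferStart/bufferEnd pairs...
  let bufferingMs = 0;
  let lastStart: number | null = null;
  for (const e of events) {
    if (e.event === "bufferStart") lastStart = e.timestamp;
    if (e.event === "bufferEnd" && lastStart !== null) {
      bufferingMs += e.timestamp - lastStart;
      lastStart = null;
    }
  }
  // ...and divide by total playtime (first play event to last event in the session).
  const start = events.find((e) => e.event === "play");
  const end = events[events.length - 1];
  const playtimeMs = start && end ? end.timestamp - start.timestamp : 0;
  return playtimeMs > 0 ? bufferingMs / playtimeMs : 0;
}
```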

Content owners routinely ask how they can ensure a great QoE when there are so many variables (user bandwidth, user compute environment, access network congestion, and so on). They also want to know, once the data is collected, what industry benchmarks they should compare their data against. The important thing to remember is that such benchmarks can never be seen as anything more than a starting point. If everything you do is “wrong” (streaming from CDNs with POPs halfway across the world from your audience, encoding in only a few bitrates, and other industry mistakes) and yet your customer base and engagement grow (and you earn more on ad serving and/or improve retention), then who cares? And if you do everything “right” (streaming from CDNs with the best POP map and encoding in a vast array of bitrates) and yet your customers leave (and the connected subscription and ad revenue drops), then being “right” hasn’t helped you either.

When it comes to QoS metrics, the same logic applies. So what should content owners focus on? The metric (or combination of metrics) that is impacting their user base the most. How do content owners identify what those are? They need data to start. For one service it could be join time; for its competitor it could be buffer ratio. Do users care that one content owner paid more for CDN, or has a lower buffer ratio than their competitor? Sadly, no. The truth behind what matters to a content owner’s business (as it relates to what technologies to use, or which metrics to prioritize) is in their own numbers. And that truth may (will) change as the user base changes, viewing habits and consumption patterns change, and consumer and vendor technologies evolve. Content owners must have a system that provides continual monitoring to detect these changes at both a high level and a granular level.
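One simple way to let your own numbers do the talking, sketched below under the assumption that each session has already been tagged with its QoS metrics and watch time, is to rank metrics by how strongly they correlate with watch time. Correlation is a crude starting point rather than proof of causation, but it points you at the metric most worth investigating first.

```typescript
// Rank QoS metrics by how strongly they correlate with per-session watch time.
// The session shape and metric names are assumptions for illustration.
interface Session { bufferRatio: number; joinTimeMs: number; watchTimeMin: number; }

function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const mean = (v: number[]) => v.reduce((a, b) => a + b, 0) / n;
  const mx = mean(xs), my = mean(ys);
  let cov = 0, vx = 0, vy = 0;
  for (let i = 0; i < n; i++) {
    cov += (xs[i] - mx) * (ys[i] - my);
    vx += (xs[i] - mx) ** 2;
    vy += (ys[i] - my) ** 2;
  }
  const denom = Math.sqrt(vx * vy);
  return denom === 0 ? 0 : cov / denom;
}

function rankMetricsByImpact(sessions: Session[]): [string, number][] {
  const watch = sessions.map((s) => s.watchTimeMin);
  const metrics: Record<string, number[]> = {
    bufferRatio: sessions.map((s) => s.bufferRatio),
    joinTimeMs: sessions.map((s) => s.joinTimeMs),
  };
  return Object.entries(metrics)
    .map(([name, values]): [string, number] => [name, pearson(values, watch)])
    // The most negative correlation is the metric most strongly associated
    // with lost watch time, i.e. the one to investigate first.
    .sort((a, b) => a[1] - b[1]);
}
```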

There has already been widespread adoption of online video, but the contexts in which we use video, and the volume of video content that we’ll stream per capita, have a lot of runway. As Diane Strutner from Nice People At Work correctly pointed out, “The keys to the video kingdom go back to the two golden rules of following the money and doing what works. Content owners will need more data to know how, when, where and why their content is being streamed. These changes are incremental and a proper platform to analyze this data can detect these changes for you.” The technology used will need to evolve to improve consistency, and the cost structure associated with streaming video will need to continually adapt to fit greater economies of scale, which is what the industry as a whole is working towards.

And most importantly, the complexity that is currently involved in streaming online video content will need to decrease. I think there is a rather simple fix for this: vendors in our space must learn to work together and put customers (the content owner, and their customers, the end users) first. In this sense, as members of the streaming video industry, we are masters of our own fate. That’s one of the reasons why, last week, the Streaming Video Alliance and the Consumer Technology Association announced a partnership to establish and advance guidelines for streaming media Quality of Experience.

Additionally, building on the ongoing parallel efforts of the Alliance and CTA, the CTA R4 Video Systems Committee has formally established a QoE working group, WG 20, to bring visibility and accuracy to OTT video metrics, enabling OTT publishers to deliver improved QoE for their direct-to-consumer services. If you want more details on what the SVA is doing, reach out to our Executive Director Jason.