New Patent Pool Wants 0.5% Of Every Content Owner/Distributor’s Gross Revenue For Higher Quality Video

Updated: [Companies Should Reject Licensing Terms From HEVC Advance Patent Pool]

In March, a new group named HEVC Advance announced the formation of a new patent pool [see: New HEVC Patent Pool Launches Creating Confusion & Uncertainty In The Market] with the goal of compiling over 500 patents pertaining to HEVC technology. The pool of patent holders, which is “expected” to include GE, Technicolor, Dolby, Philips, and Mitsubishi Electric, has just announced its royalty rates and is going directly after content owners and CE manufacturers. HEVC Advance wants 0.5% of a content owner’s attributable gross revenue for each HEVC video type. To put into perspective how unjust and unfair their licensing terms are, they want 0.5% of Netflix, Apple, Facebook, Amazon and every other content owner/distributor’s revenue, as it pertains to HEVC usage. Considering that most content owners and distributors plan to convert all of their videos over time to the new High Efficiency Video Coding compression standard, companies like Facebook, Netflix and others would have to pay over $100M a year in licensing payments. The licensing terms apply to all content services that earn revenue from advertising, subscription and PPV, which covers pretty much every content owner, OTT provider, broadcaster, sports league, satellite broadcaster and cable provider you can think of.
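Because the fee is a flat percentage of attributable gross revenue, the exposure scales linearly with how much of a service's catalog moves to HEVC. A minimal sketch, where the revenue figure and the HEVC-attribution share are hypothetical assumptions, not numbers from HEVC Advance:

```python
# Sketch of HEVC Advance's announced content royalty: 0.5% of the gross
# revenue attributable to HEVC. All input figures below are hypothetical.
ROYALTY_RATE = 0.005  # 0.5%

def hevc_content_royalty(gross_revenue, hevc_share):
    """Annual royalty on the share of gross revenue attributable to HEVC."""
    return gross_revenue * hevc_share * ROYALTY_RATE

# A service with $6B in annual revenue whose entire catalog has moved to
# HEVC would owe $30M a year, with no cap to limit the exposure.
print(hevc_content_royalty(6_000_000_000, 1.0))  # 30000000.0
```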

Making matters worse, HEVC Advance says their licensing terms [listed in detail here] are “retroactive to date of 1st sale”, so companies would be required to make payments on content they have already distributed using HEVC. In addition to content owners, HEVC Advance is also going after CE manufacturers of TV, mobile and streaming devices. TV manufacturers would have to pay $1.50 per unit and mobile devices incur a cost of $0.80 per unit. Streaming boxes, cable set-top-boxes, game consoles, Blu-ray players, digital video recorders, digital video projectors, digital media storage devices, personal navigation devices and digital photo frames would cost a manufacturer $1.10 per unit.

While HEVC Advance is quick to say how “fair and reasonable” their terms are, they aren’t. The best way to describe their terms would be unreasonable and greedy. MPEG LA, another licensing body for HEVC patents, charges CE manufacturers $0.20 per unit after the first 100,000 units each year (no royalties are payable for the first 100,000 units) up to a current maximum annual amount of $25M. HEVC Advance’s rate for TV manufacturers is seven and a half times MPEG LA’s licensing fee. In addition, MPEG LA charges content owners nothing for utilizing HEVC technology in their business. (Updated 7/24: Removed reference to the licensing terms for AVC to make it easier to compare pricing)
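The gap between the two pools compounds at volume, because MPEG LA exempts the first 100,000 units and caps the annual total while HEVC Advance's announced rates do neither. A rough comparison, assuming the fee structures described above (the shipment volume is a hypothetical example):

```python
# Annual device royalties: MPEG LA's HEVC pool ($0.20/unit after the first
# 100,000 units each year, capped at $25M/yr) vs. HEVC Advance's announced
# flat per-unit rates ($1.50 TVs, $0.80 mobile, $1.10 other devices).
def mpeg_la_fee(units):
    return min(max(units - 100_000, 0) * 0.20, 25_000_000)

def hevc_advance_fee(units, per_unit_rate):
    return units * per_unit_rate  # no exemption or cap has been announced

units = 10_000_000  # hypothetical annual TV shipment volume
print(mpeg_la_fee(units))             # 1980000.0
print(hevc_advance_fee(units, 1.50))  # 15000000.0
```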

Licensing groups typically don’t go after content owners; instead they go to hardware and platform vendors in the market who are customers of content distributors. But in this case, HEVC Advance is going directly after content owners and isn’t asking CDNs, encoding vendors or others in the video ecosystem to license their patents. What HEVC Advance doesn’t grasp is that this approach of trying to get a share of content owners’ revenues has been tried in the past and failed miserably. MPEG-4 Part 2, the original MPEG-4 video compression standard that pre-dated AVC, failed in the market because of this licensing approach. Content owners are not willing to share their revenue, and this is a fatal flaw in HEVC Advance’s approach. The fact that they think someone like Facebook, Apple or Netflix is going to hand over tens if not hundreds of millions of dollars to them, each year, shows just how delusional they really are. While I don’t expect HEVC Advance to get any traction with content owners, their licensing terms could have some major impacts on the industry. Right now, content licensing deals around streaming media services do not account for the cost of royalty payments. So if more money is required to play back higher-quality video, content licensing costs will go up, and consumers are going to foot the bill for higher-priced streaming services.

Another big problem is that there are no caps on the proposed royalties. This creates an immense burden for Internet-based technologies and software applications looking to incorporate HEVC, since there is, in most cases, no realistic way of predicting what percentage of your content will be consumed using HEVC each month. Secondly, the 0.5% royalty on all revenues attributable to HEVC-based content is, to put it mildly, a loose cannon. The umbrella is intended to cover both direct revenue, such as from subscription and PPV services, and indirect revenue, such as advertising-supported business models. One could argue that the definition could be pushed so far as to cover the purchase price of merchandise that is advertised using HEVC-compressed video. If Zappos has a product video of a shoe on their website encoded using HEVC, Zappos would have to give a percentage of the sale of that shoe to HEVC Advance. Furthermore, this is 0.5% of gross revenues, not net profit, so it is effectively a “compression tax” layered on top of the content licensing fees and content delivery expenses a content service provider must cover in order to be profitable. Recall again that there is no cap on fees, and you can see how this one is almost inevitably headed to the courts.
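The “gross, not net” distinction matters most on thin margins. A quick hypothetical illustration (the price and margin below are invented for the example, not figures from HEVC Advance):

```python
# A 0.5% royalty on *gross* revenue consumes a much larger slice of the
# profit on a thin-margin sale. Price and margin below are hypothetical.
ROYALTY_RATE = 0.005

def royalty_share_of_profit(price, net_margin):
    """Fraction of net profit consumed by a gross-revenue royalty."""
    return (price * ROYALTY_RATE) / (price * net_margin)

# On a $100 item at a 5% net margin, the 0.5% gross royalty takes
# a full 10% of the profit on the sale.
print(royalty_share_of_profit(100.0, 0.05))  # 0.1
```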

Currently, these terms capture only B2C applications such as social media, streaming video and Pay TV, while licensing terms for B2B use cases such as video conferencing, video surveillance and enterprise video webcasting are still being considered by HEVC Advance. It is conceivable that those use cases will not be licensed on terms as onerous as the content-based fee, but the burden created by the current B2C licensing terms is almost certain to increase the risk enterprises perceive in adopting HEVC. This is a shame, because HEVC has the potential to transform the video delivery infrastructure, and was finally approaching an inflection point where shipments and usage were projected to rise to meaningful levels. 4K content is already reeling under unviable economics: between limited improvement in visual quality and storage and transmission costs that rise at a much higher multiple than monetization can, the financial argument for 4K was already weak for the vast majority of use cases. The curveball thrown by HEVC Advance further undermines this already risky value proposition.

Adding insult to injury, HEVC Advance has yet to provide any details on which patents they expect to be in the pool. The company has said they will have more details to share later in the year, yet they acknowledge that the patents have not yet gone through an independent patent evaluation process, which they expect to start next month. HEVC Advance’s CEO also mentioned that some of the patent owners “expected” to be in the pool are still finalizing their paperwork, so there is no confirmation yet of exactly which companies are officially in the pool. No patents have as yet been identified by HEVC Advance as being essential to HEVC, even though HEVC Advance claims many of them are. In other words, there is still room for patent holders to take the responsible road and monetize their intellectual property in a fairer, more scalable and more industry-friendly manner, on their own or in pools other than HEVC Advance.

While some content owners have told me they feel these rates don’t apply to them since they use cloud providers to encode and deliver their content, that’s not a valid argument. Their contracts do not protect them when the vendors are not required to license the technology themselves. If content owners band together and agree not to license from HEVC Advance, which is what I suggest they do, HEVC Advance will fail in the market and be forced to change strategy, or change their terms to be fair and reasonable. Since HEVC Advance is simply a licensing body, it can’t sue anyone. The actual patent holders would have to legally go after thousands of content owners and CE manufacturers, which I don’t see companies like Technicolor, Dolby, Philips, and Mitsubishi Electric having the time or energy to do. But make no mistake, there is a lot at stake here: 0.5% of a market worth over $100B a year is a lot of money for HEVC Advance to go after.

Another interesting impact this could have on the industry is the potential for content owners and CE manufacturers to move away from HEVC and adopt Google’s competing VP9 codec, which requires no licensing fees. This might be hard for many who have already tied their services to hardware, but nothing stops them from re-encoding their libraries into VP9 for playback via the web and apps, and using HEVC only for playback tied to TVs and other hardware devices. HEVC Advance’s licensing terms definitely put VP9 in the spotlight and will have many content owners running the numbers to see how much they would save, if they had to pay the royalties, versus the cost of re-encoding into VP9. The whole reason content owners are moving to HEVC is better quality with fewer bits, which equals a lower cost. If those savings are now erased by new royalty payments for HEVC technology, what’s the point of using HEVC over VP9? This is a question many content owners are going to start asking themselves.
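The numbers content owners would be running look roughly like this back-of-the-envelope sketch, where every figure (revenue, HEVC share, library size, per-hour encoding cost) is a hypothetical assumption for illustration only:

```python
# Back-of-the-envelope: recurring annual HEVC royalty vs. a one-time VP9
# re-encode of the library. All input figures here are hypothetical.
def annual_hevc_royalty(gross_revenue, hevc_share, rate=0.005):
    return gross_revenue * hevc_share * rate

def vp9_reencode_cost(library_hours, cost_per_hour):
    return library_hours * cost_per_hour

royalty = annual_hevc_royalty(1_000_000_000, 0.5)  # $2.5M, every year
reencode = vp9_reencode_cost(100_000, 10.0)        # $1M, paid once
print(royalty, reencode)  # 2500000.0 1000000.0
# Under these assumptions, the one-time re-encode pays for itself
# in under five months of avoided royalties.
```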

The bottom line is that HEVC Advance is bad for the industry, for consumers and for the growth of 4K, and in my two calls with the company it was clear that lawyers, not technology people, are driving the licensing. The company was under the impression that content owners deliver their own content, which most don’t, since they use third-party CDNs. So even if a content owner wanted to license patents from HEVC Advance, they don’t currently have the data needed to determine what they would pay. This means they would have to spend more money and time to set up a data collection process, on top of the added cost of the license. HEVC Advance thinks content owners and distributors track what percentage of their content is consumed via different codecs, which isn’t accurate, and they were dumbfounded when I told them so. Almost all of their answers to my questions were that they would make it “easy” for companies and “work with them,” but of course they don’t understand the basics and don’t have the skills to even know what kind of help content owners would need.

HEVC Advance excels at being vague, speaking in lawyer terms, avoiding specifics, and showcasing a complete lack of understanding of the market it is going after. Frankly, it’s insulting that their press release from today speaks of providing “efficient and transparent access to patents,” yet they have provided no details on any of the actual patents. The pool comes across as completely incompetent.

This won’t be the last we hear about patents pertaining to HEVC and higher-quality video/audio, as Dolby has already told content owners they should expect another royalty for HDR, in addition to HEVC. So don’t assume that HEVC and 4K are going to get the kind of traction many are predicting. (Updated 7/23: Edited post to reference Google’s VP9 codec, not VP10)

Frost & Sullivan Analyst Avni Rambhia contributed to this post.

Note: I am available to talk to any content owner, vendor, member of the media or anyone else who wants details on all of this. This is bad for everyone, and I am making it my job to try to educate everyone as much as possible. I have already had multiple calls with lawyers and intellectual property groups at content firms, MSOs and others potentially impacted by this. All calls are off-the-record and confidential. I can be reached anytime at 917-523-4562 or

Announcing The “Live Streaming Summit”: New Conference On The Live Video Ecosystem

Live streaming is one of the hottest topics in online video, so we’re announcing the Live Streaming Summit, a brand-new two-day event produced in conjunction with the Streaming Media West show, taking place November 17-18 in Huntington Beach, Calif. Live streaming has always been a big part of the program at each Streaming Media show, but now we’ll have two tracks dedicated to the subject and more speaking and presentation spots than before.

The Live Streaming Summit focuses entirely on the technologies and strategies required to take large-scale live event and live linear video and deliver it to viewers watching on all manner of devices—computers, tablets, mobile phones, set-top boxes, and smart TVs. This isn’t about cameras and video production, but rather all of the backend pieces that make up the live streaming workflow.

No matter what content area you might be working in—entertainment, news, sports, gaming, worship, or live linear channels—we’ve got you covered with two days of panel discussions and presentations focusing on best practices and insights in seven topic areas:

  • Best practices for backhaul, transmission, and ingest—satellite, fiber, cellular, and more
  • Encoding and transcoding—on-prem, cloud, and hybrid
  • Management—metadata, content protection, stream stitching, and preparation for syndication and post-event VOD viewing
  • User experience—video players, APIs, clip sharing, and social media integration
  • Monetization—advertising, subscription, and pay-per-view
  • Distribution—content delivery networks, real-time analytics, QoS, and QoE
  • Post-event evaluation—how to determine if your event was a success

We’ll also feature case studies from leading content publishers and service providers highlighting real-world success stories. If you are a technical or business decision-maker whose job depends on delivering successful large-scale live events, then the Live Streaming Summit is a must-attend conference.

If you’re interested in presenting or speaking on a panel in one of the above topic areas, submit a proposal via our Call for Speakers page before August 31. I’ve worked with Streaming Media’s editor, Eric, to create an outline of what the show will cover, and Eric is now organizing all of the speakers and chairing the program. So hit up the call for speakers page, or email Eric or myself with any ideas or questions.

Why I Bought Stock In Netflix; Company Could Have 100M Subs By 2018

When writing about public companies on my blog, I have said many times that I have never bought, sold or traded a single share of stock in any public company, ever. While my blog is not a site for financial advice or guidance, I still wanted to disclose that yesterday I purchased stock in Netflix. For me, it’s a long-term play, as Netflix really seems to be firing on all cylinders right now and I like their road map for international expansion.

Their subscriber growth outside of the U.S. is starting to see some nice gains, they continue to add new content to their catalog, and they are doing a good job of licensing content for the geographic regions they are expanding into. With $455M in revenue from international subscribers in Q2, Netflix is well on its way to becoming a company with 100M subs within a few years. Most on Wall Street estimate it will take Netflix until 2020 to reach 100M subscribers, but I think they will get there sooner. With just over 65M customers, Netflix only needs to add 3M new subs a quarter, over the next 13 quarters, to reach 100M subs by the end of 2018.
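That projection is simple linear arithmetic, sketched below (the starting figure and quarterly adds come from the paragraph above; assuming a constant rate of net adds is of course a simplification):

```python
# Linear subscriber projection: constant net adds per quarter.
def subs_after(start, net_adds_per_quarter, quarters):
    """Subscribers after a given number of quarters of constant growth."""
    return start + net_adds_per_quarter * quarters

# Just over 65M subs in mid-2015, 13 quarters until the end of 2018.
print(subs_after(65_000_000, 3_000_000, 13))  # 104000000, past the 100M mark
```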

Some argue that will be hard once Apple comes out with a streaming subscription service of their own, but until Apple actually has something in the market, if it ever happens, and we can judge whether or not it can compete with Netflix, there is no threat. And while Amazon competes with Netflix, to date they have not shown they can keep Netflix from expanding. Netflix likes to talk a lot about how they see HBO as their biggest competitor, but HBO’s catalog of content is very small, and the one thing consumers have shown us is that they want a lot of choice. Netflix’s content catalog offers real depth and breadth of choice, whereas HBO’s is very limited.

What I’d really like to know is what methodology Netflix is using to judge the impact of their original content strategy on their business. They keep saying it helps their business, but we have no details on the direct impact. Does original content produce new subscribers, or is it simply a way to reduce churn? I think the latter, but until Netflix gives out some viewing statistics, which I don’t see them doing any time soon, we have no real way to measure it. But, they would not be spending $100M to create one season of House Of Cards if they didn’t see positive results on their business, so one has to simply trust that they know how to properly measure the impact of original content creation, good and bad, on their business.

Netflix has gotten so big now, with a market cap of almost $49B, that it almost guarantees that they will remain independent. Many have speculated that Netflix’s long-term strategy was to be acquired, but considering how big they are now, who would acquire them? There is almost no one who could afford them, and the few that could, think Apple, Amazon, Microsoft, Facebook and Alibaba, have already missed the boat. Netflix is almost too big now for it to make sense for even any of them to acquire the company. And with Netflix only getting stronger, bigger and growing their business well outside the U.S., every day that goes by, Netflix’s value continues to rise. It really is amazing what Netflix has accomplished in just seven years and how fast they have grown.

Streaming Vendor News Recap For The Week Of July 6th

“Study” of ISP Slowdowns Conveniently Ignored Real Cause, ISPs Not At Fault

Last week, so-called consumer advocacy group Free Press announced that according to data collected via their BattlefortheNet website, major ISPs were “slowing” content services at key interconnection points. Free Press pitched their agenda to the media, and some news outlets wrote stories saying major Internet providers were slowing traffic speeds for thousands of consumers across North America. But as it turns out, Free Press came to the wrong conclusion when they accused the ISPs of being responsible. The main provider having the problem with the ISPs, GTT, confirmed that some ISPs gave them extra capacity more than six months ago, and that GTT simply hasn’t turned up that extra capacity yet. Updated Tues June 30: GTT and AT&T have entered into an interconnection agreement.

The data that Free Press is highlighting shows that GTT, a transit provider that connects to many ISPs, was having capacity problems with AT&T, Comcast and other ISPs in select cities. Naturally everyone assumed it must be the ISPs’ fault, and interestingly enough, GTT told me that not a single member of the media or of the Free Press contacted them for more details. I reached out to GTT and they were happy to set up a call and were very easy to talk to. While GTT could not disclose full details on their peering agreements and relationships, I did confirm that multiple ISPs provided GTT with extra capacity, which the company is still in the process of turning up. But it doesn’t stop there.

GTT actually turned down capacity at interconnection points because it is shifting its flow of traffic due to acquisitions and consolidating how it connects with ISPs. In the last six years, GTT has acquired five companies (WBS Connect, PacketExchange, nLayer Communications, the Tinet IP network from Inteliquent, UNSi) and a few months ago announced an agreement to acquire a sixth, MegaPath.

As a result of the acquisitions, the nLayer side of GTT has been shutting down its connections, specifically AS4436, and moving that traffic over to the Tinet connections, AS3257. To make it simple to understand, GTT is consolidating networks and shifting how it connects to ISPs through different connections, while terminating others. So the capacity issues that Free Press’s data shows are a result of GTT essentially shutting down those connections, not of any wrongdoing on the ISPs’ part. The M-Lab data that Free Press is using measures a problem that GTT owns, and as GTT told me, extra capacity was made available to them before M-Lab even started its measurements.
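The mechanics can be sketched as a toy model: sessions on the ASN being retired are torn down while the same ISPs remain connected via the surviving ASN, so a tool measuring only the retired sessions sees what looks like a capacity problem. The peer names below are purely illustrative, not GTT's actual interconnection partners; only the two ASNs come from the post:

```python
# Toy model of GTT's consolidation: nLayer's AS4436 sessions are being
# shut down and traffic is shifting to Tinet's AS3257.
as4436_peers_before = {"ISP-A", "ISP-B", "ISP-C"}
as3257_peers = {"ISP-A", "ISP-B", "ISP-C", "ISP-D"}

as4436_peers_after = set()  # AS4436 no longer accepts peering requests

# Every ISP that was reachable via AS4436 is still reachable via AS3257,
# so no connectivity is lost, even though measurements pinned to the old
# AS4436 sessions would show them degrading and then disappearing.
unreachable = as4436_peers_before - as3257_peers
print(sorted(unreachable))  # [] -- nothing actually lost
```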

Taking it a step further, public info that details GTT’s network shows that GTT/nLayer (AS4436) now has all traffic behind the SFI relationships of GTT/Tinet (AS3257) and is no longer connecting with other networks. When you look this ASN up in PeeringDB, GTT says, “We are no longer accepting Peering requests for this ASN”. GTT has a lot of AS numbers, and looking at all of them shows the consolidation taking place and the reason they are no longer accepting peering requests for AS4436: it is being shut down.

Data also shows that GTT/nLayer (AS4436) was once connected to multiple networks and likely paying for connections to Tier 1 networks. These paths still exist and the BGP information is still available, but will likely be gone soon. GTT/Tinet (AS3257) is a 1Tbps+ network with “balanced” traffic and GTT/nLayer (AS4436) is a 1Tbps+ traffic source with “mostly outbound” traffic. Of course none of this is info the average consumer would know how to look up or even understand, and that’s exactly what Free Press and others count on. It was not hard to find the cause of the performance issues if you simply asked GTT, looked at public network info and asked the ISPs.

The takeaway from all of this is that many are far too quick to judge who is at fault on network performance topics without talking to all the parties involved and having all the facts. GTT is making changes to its network, working closely with ISPs, already has the relationships in place and is working to solve any performance problems. While some like to say these networks can just “flip a switch” to fix issues, it does not work that way, especially when you are consolidating networks as GTT is. Many are quick to lay blame on ISPs because it is fashionable to hate service providers or to push an agenda, like Free Press.

It’s clear that the Free Press should not be trusted, as they drew the wrong conclusions from the data to push their agenda. Even if they didn’t do it on purpose, it shows the Free Press has a complete lack of understanding of the data being collected. They don’t understand that when a Tier 1 network or CDN makes changes to their infrastructure, it impacts the data that is being collected. Don’t point fingers unless you talk to all the parties involved and review ALL of the data available in the market, not just a slice of it.

It should also be noted that GTT told me no one from the Free Press ever contacted them before the Free Press laid blame on the ISPs. If they had, GTT would have been able to inform the Free Press of further details, which the Free Press neglected to even look into. I also find it interesting that while Free Press says they are “fighting” for consumers’ rights to “communicate”, the Free Press doesn’t allow any comments on any of the posts they publish on their website. To try to discredit me, the Free Press has also called me a “paid cable-industry operative”, which isn’t true and is laughable to suggest, considering that I have written two detailed blog posts calling out Verizon for “poor business practices”, just within the past three months. Apparently, sticking to the facts is not something the Free Press does very well.

The Free Press has no wiggle room on this and if they don’t edit their blog post to correct their accusations, then it only proves they care about their agenda, not the truth.

Note: For those that say I am standing up for the ISPs, I’m not. They can speak for themselves. This is about getting to the real root cause of the problem, not the “perceived” problem that someone like the Free Press likes to promote. Like many, I am tired of all the vagueness and double-speak around discussions involving interconnects, peering and transit topics. We need more clarity, not more politics.