I’ve read quite a few blog posts about NBC Sports’ live stream of the Super Bowl, and it’s clear that the vast majority of the media don’t understand what the workflow for a live event looks like, the pieces that are involved, and the various factors that determine the quality of the live stream. A post on DSLReports.com says the Super Bowl stream “Struggled Under Load,” yet provides no details of any kind to back up that claim. The fact is, capacity wasn’t an issue at all. [Update Tuesday 10:58am – DSLReports.com has changed the headline of their post to no longer reference any kind of capacity issue.]
NBC Sports used third-party CDN provider Akamai to deliver the stream and had Level 3’s CDN in backup mode in case they got more traffic than expected, but never had to use it. Media members who complained about the stream didn’t provide any technical details of how it worked, how it was delivered, or the companies involved, and didn’t speak to any of the third-party companies that were monitoring the stream in real time. They made no effort to learn what was actually going on with the live stream or to speak to the companies responsible for it, which is just lazy reporting. Cedexis data shows Akamai’s availability did drop during the game, to 98.81% in the Northeast, but not significantly.
NBC Sports said that the live stream peaked at 1.3M simultaneous streams, which isn’t a big number for Akamai. Six years ago, Akamai’s network peaked at 7.7M streams, 3.8M of which came from the Obama inauguration webcast. Akamai has plenty of capacity to handle the live stream of the Super Bowl and has done live events, including those for Apple, that make the Super Bowl look small in comparison. Slate Magazine’s post called the Super Bowl webcast a “disaster,” with the biggest complaint being that the live stream had a delay compared to cable TV. Clearly the author doesn’t understand how the Super Bowl stream worked, or he would realize that, based on the setup, the delay was unavoidable.
The video was encoded in the cloud using Microsoft’s Azure platform, which adds a delay. On top of that, using HLS adds an additional delay, and doing HLS over Akamai adds even more. Talk to any of Akamai’s largest live customers and they will tell you that the number one complaint about Akamai, when a live stream is using certain parameters, is the delay in delivering the stream. Akamai’s network requires a lot of buffering time for both HLS and RTMP; otherwise you can get audio drop-outs on bitrate switches. NBC Sports used both HDS (HTTP Dynamic Streaming) for desktop and HLS (HTTP Live Streaming) for devices. So before some members of the media start blaming NBC Sports for the delay, they should learn all the pieces in the live workflow and understand how it all works. Even twenty years later, there are limitations in the technology. I have simplified the workflow here; there were many more pieces involved in making the stream possible, and many of them can add a delay to the live stream.
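To see why the delay is structural rather than a failure, here’s a back-of-the-envelope sketch of segmented-streaming latency. The segment duration, buffer depth, and encode/CDN overheads below are illustrative assumptions on my part, not NBC’s or Akamai’s actual figures:

```python
# Rough glass-to-glass latency estimate for a segmented HLS-style stream.
# All numbers are illustrative assumptions, not actual production figures.

def stream_latency(segment_sec, buffered_segments, encode_sec, cdn_sec):
    """Players buffer several whole segments before starting playback,
    so latency grows with segment duration and buffer depth, on top of
    cloud encoding and CDN propagation time."""
    return encode_sec + cdn_sec + segment_sec * buffered_segments

# e.g. 10-second segments (a common HLS default at the time), a
# three-segment startup buffer, plus encoding and CDN overhead:
delay = stream_latency(segment_sec=10, buffered_segments=3,
                       encode_sec=8, cdn_sec=5)
print(f"~{delay} seconds behind the broadcast feed")  # ~43 seconds
```

Cut any one of those pieces and the stream gets closer to broadcast, but with segmented HTTP delivery you can never eliminate the buffer entirely without risking stalls and audio drop-outs on bitrate switches.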
For those who want more tech details on the encoding, the average bitrate for the stream was 2.5 Mbps, with an average viewing duration of 84.2 minutes. Also of note is that NBC Sports optimized their in-browser display for a target video bitrate of 2.2 Mbps, with a max bitrate of 3.5 Mbps in full-screen mode.
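Those reported averages, together with the 1.3M peak, are enough for some quick math on what the stream meant in raw data terms. This is a sketch using only the numbers above; it assumes the average viewer stayed at the average bitrate throughout:

```python
# Quick math from the reported figures: 2.5 Mbps average bitrate,
# 84.2 minutes average viewing time, 1.3M peak simultaneous streams.

avg_bitrate_mbps = 2.5
avg_minutes = 84.2
peak_streams = 1.3e6

megabits = avg_bitrate_mbps * avg_minutes * 60      # per-viewer total
gigabytes = megabits / 8 / 1000                     # Mb -> MB -> GB
peak_tbps = peak_streams * avg_bitrate_mbps / 1e6   # aggregate egress

print(f"~{gigabytes:.2f} GB delivered to the average viewer")
print(f"~{peak_tbps:.2f} Tbps aggregate at peak")
```

Multiple terabits per second is real traffic, but it’s well within what a large CDN provisions for, which is exactly why capacity was never the story here.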
The Slate piece also goes on to say that “NBC was dealing with huge traffic for its Super Bowl stream” and that the “traffic would be tremendous.” Again, these are statements meant to imply there were capacity issues, which simply wasn’t the case. A piece by Mashable called the stream “slow,” saying it was a “bit disconcerting for anyone who wants to keep up with up-to-the-second plays on social media”. Not all forms of content get delivered in real time at the same speed. If you want “up-to-the-second,” then the video stream is not for you. But that’s not the fault of the live stream, which has to be ingested, encoded, delivered, and then played back with a player/app. Compare that workflow to a tweet and it’s not even remotely similar.
As for my own experience with the Super Bowl stream, I did have some problems with the actual video quality. I worked with NBC Sports’ tech team, giving them specs on my setup; they looked up my IP and tracked me throughout the game, having me test various links and setups. While we still don’t know what my issue was, it only appeared when I was using Verizon and didn’t crop up when I used Optimum. So one thing people have to remember is that it’s not always the CDN’s fault. Many times, it’s something down at the ISP level. As an example, I was once having a lot of issues streaming YouTube, and when Google looked into it, they found a specific issue inside Verizon that was causing it.
During the live stream, NBC Sports was using multiple third-party quality measurement platforms, including those from Conviva and Cedexis. Conviva’s is real-time and can show everything from buffering times to failed stream requests. The media needs to learn more about these third-party platforms; as you’ll notice, they don’t know anything about them, nor do they ever seem to look at their data after a large event. Stop coming up with “theories” around capacity and dig into the real data. While NBC Sports isn’t going to give out all the data we want, any member of the media with connections could easily have talked to some of these third-party companies and gotten info or guidance on what they saw and any impact it might have had on performance. For the majority of users who tuned into the live stream, it worked and worked well. There were some, like myself and others, who did experience intermittent problems, but we were the minority, and in many cases problems down at the ISP and WiFi level cause quality issues with both live and on-demand video. Media members who considered the live webcast a failure because it wasn’t real-time, or because certain ads weren’t shown, should focus on the business aspects of the stream, not the technical ones.
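Conviva’s platform is proprietary, but the core quality metric it aggregates, how much of a session was spent stalled, is easy to illustrate. This is a minimal sketch with a made-up event format, not their actual API:

```python
# Sketch of a rebuffering-ratio metric, the kind of per-session number a
# QoE platform like Conviva rolls up across millions of players.
# The (state, duration) event format here is hypothetical.

def rebuffering_ratio(events):
    """Fraction of session time spent stalled.
    events: list of (state, duration_sec) tuples from the player."""
    total = sum(d for _, d in events)
    stalled = sum(d for s, d in events if s == "buffering")
    return stalled / total if total else 0.0

session = [("playing", 300), ("buffering", 4), ("playing", 600),
           ("buffering", 2), ("playing", 294)]
print(f"{rebuffering_ratio(session):.1%} of session time stalled")
```

Aggregated across every viewer, and broken out by ISP, region, and device, a number like this is what actually tells you whether a stream “struggled”, not anecdotes on Twitter.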
One last thing. For all the people reporting that the Super Bowl stream was a “record,” it wasn’t. The raw logs are not verified by any third-party company, there are many different ways to count streams (simultaneous, unique simultaneous, etc.), and if you look at just the sports vertical, there have been events by ESPN, eSports and others that did more than 1.3M simultaneous. Quantity is important, but it’s not the single biggest piece of methodology that should be used to determine the success or failure of a live webcast. There is no such thing as “the largest” when it comes to live events, as many times numbers aren’t even put out after the event. Just look at Apple’s live streams: we don’t know their stream count, and on the days they do a live webcast, Akamai takes down the real-time web metrics chart that shows the live stream count on its network, just so no one knows how many streams Apple is doing.
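To make concrete why the counting method matters, here is one common way to compute “peak simultaneous” from session logs. The session data is made up for illustration; a “unique simultaneous” count would first dedupe sessions by viewer, giving a different (lower) number from the same logs:

```python
# Peak concurrent streams via an event sweep over session start/end times.
# Session timestamps below are invented for illustration.

def peak_concurrent(sessions):
    """sessions: list of (start, end) timestamps in seconds."""
    events = []
    for start, end in sessions:
        events.append((start, 1))    # stream opens
        events.append((end, -1))     # stream closes
    # Process closes before opens at the same timestamp.
    events.sort(key=lambda e: (e[0], e[1]))
    peak = current = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

sessions = [(0, 100), (50, 150), (60, 200), (120, 180)]
print(peak_concurrent(sessions))  # 3
```

Four sessions, but a peak of three: whether a publisher reports sessions, peak concurrency, or unique viewers changes the headline number dramatically, which is why unverified “record” claims are meaningless.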
If there is one thing the Super Bowl stream did reinforce, it’s that streaming video technology can’t replace traditional TV distribution for quality or scale. Yes, I know some will want to argue that point, but if you talk to those smarter than me who are building out these networks to deliver content, not only are there many technical limitations, there are just as many business ones as well.