On Monday The Guardian ran a story with a headline stating that major Internet providers are slowing traffic speeds for thousands of consumers in North America. While that's a title that will get a lot of people's attention, it's not accurate. Even worse, other news outlets like Network World picked up on the story and rehashed everything The Guardian said, but then mentioned they could not find the "study" The Guardian was talking about. The reason they can't find the report is that it does not exist.
In an email exchange with M-Labs this morning, they confirmed for me that there has been no new report since their last one, published on October 28th, 2014. So The Guardian wrote a story about a "study released on Monday", referencing data from M-Labs, but provided no link to the so-called study. The Guardian does cite some data that appears to have been collected via the BattlefortheNet website, using M-Labs methodology, which relies on tests that end users initiate. Tim Karr of Free Press, one of the organizations that makes up BattlefortheNet, is quoted in The Guardian post as saying, "Data compiled using the Internet Health Test show us that there is widespread and systemic abuse across the network."
What The Guardian story neglects to mention is that the measurement methodology Free Press is highlighting was actually rejected by the FCC in its Measuring Broadband America report. The FCC rejected it because the data wasn't collected in a real-world fashion, taking into account all of the variables that determine the actual quality consumers receive, as others have shown. (one, two, three)

Updated 1:10 pm: M-Labs just put out a blog post about their data saying, "It is important to note that while we are able to observe and record these episodes of performance degradation, nothing in the data allows us to draw conclusions about who is responsible for the performance degradation." M-Labs did not include a link to any "study" since they didn't publish one, but you can see a Google Docs file of some of the data here. It's interesting to note that the document has no name on it, so we don't know who wrote it or published it to Google Docs.

Updated 2:28 pm: Timothy Karr from Free Press has deleted all of the data that was in the original Google Docs file in question and simply added two links. It's interesting to note that they published it without their name on it and only edited it once it was called into question.
Updated 2:07 pm: M-Labs has confirmed for me that they did not publish the Google Docs file in question. So the data and text that Free Press was showing the media, to get them to write a story, have now been erased. This is exactly why the media needs to check the facts and sources instead of believing anything they are told.
If Free Press is referencing some "study" they put out on Monday using M-Labs methodology, it's nowhere to be found on their website. So where is this "study"? Why can't anyone produce a link to it? Mainstream media outlets that picked up on The Guardian story should be ashamed of themselves for not looking at this "study" BEFORE they ran a story. It is sloppy reporting to reference data in a story when you haven't seen it yourself or even verified that a "study" exists.
Adding even more insult to injury, The Guardian piece shows no basic understanding of how traffic flows on the Internet, or of the difference between companies that offer CDN services and those that offer transit. The Guardian piece calls GTT a "CDN" provider when in fact they are nothing of the sort. GTT is an IP network provider; they offer no CDN services of any kind and don't use the word CDN anywhere on their website. At least one other news site that also mislabeled them has since corrected it and gotten the terminology right. But once again, some news outlets simply repeated what The Guardian wrote without getting the basics right or checking the facts. Others did a good job of looking past the hype.
The Guardian piece also says that, "Any site that becomes popular enough has to pay a CDN to carry its content on a network of servers around the country", but that's not entirely true. Netflix doesn't pay a third-party CDN; it built its own. So you don't "have" to pay to use a third-party CDN, as some content distributors choose to build and manage their own instead. The Guardian piece also uses words like "speed" and "download" interchangeably, but these terms have very different meanings. Speed is the rate at which packets get from one location to another. Throughput is the average rate of successful message delivery over a communication channel.
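The distinction can be made concrete with a minimal sketch. The function names and numbers below are illustrative only, not drawn from any study: one metric measures how long a packet takes to cross a distance, the other measures how much data actually arrives per second.

```python
# Illustrative sketch: "speed" (how fast packets travel) vs. throughput
# (average rate of successful delivery). All values are hypothetical.

def one_way_latency_ms(distance_km, propagation_km_per_ms=200):
    """Rough one-way travel time for a packet, ignoring queuing delays.
    ~200 km/ms approximates light speed in fiber."""
    return distance_km / propagation_km_per_ms

def throughput_mbps(bytes_delivered, seconds):
    """Average rate of successful message delivery over a channel."""
    return (bytes_delivered * 8) / (seconds * 1_000_000)

# A long-haul path can have low latency yet poor throughput if some
# hop along the way is congested:
print(one_way_latency_ms(4000))          # 20.0 (ms)
print(throughput_mbps(5_000_000, 10))    # 4.0 (Mbps)
```

This is why a connection can feel "slow" for video even when packets are crossing the country in tens of milliseconds: the two measurements capture different things.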
Even if The Guardian article was trying to use data collected via the BattlefortheNet website, it doesn't understand what data is actually being collected. That data is specific to problems at interconnection points, not inside last-mile networks. So if there isn't enough capacity at an interconnection point, saying ISPs are "slowing traffic speeds" is not accurate. No ISP is slowing down the speed of the consumer's connection to the Internet, as that all takes place inside the last mile, which sits beyond the interconnection points. Even Free Press isn't quoted as saying ISPs are "slowing" down access speed, but rather access to enough capacity at interconnection points.
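The point about interconnection versus the last mile comes down to a simple bottleneck effect, which can be sketched as follows. The path and capacity figures here are hypothetical, chosen only to show the principle:

```python
# Hypothetical sketch: end-to-end throughput is capped by the most
# congested hop on the path. A saturated interconnection point limits
# delivery even when the last-mile link itself is untouched.

def end_to_end_mbps(hop_capacities_mbps):
    """The narrowest hop on the path sets the achievable rate."""
    return min(hop_capacities_mbps)

# Transit backbone, congested interconnection point, last mile:
path = [1000, 2, 100]
print(end_to_end_mbps(path))  # 2
```

In this illustration the consumer's 100 Mbps last-mile connection is untouched, yet traffic arrives at only 2 Mbps because of the congested interconnect, which is exactly the distinction the BattlefortheNet data speaks to.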
It should be noted that while M-Labs tells me they had not intended to release an additional report, because of The Guardian post M-Labs will be putting out a blog post that broadly describes some of the noticeable trends in the M-Lab data and "clarifies a few other matters".
Look for that shortly. M-Labs' blog post is now live.