What Classifies Online Video As HD Quality?

While there continues to be a lot of talk in the industry about HD quality web video, I have yet to figure out what actually classifies web video as HD quality. As an industry we use the word HD to signify that video meets a certain level of quality, but these days many are calling their video HD, or their delivery offering HD-capable, when the video is not truly HD by TV standards.

Depending on who you ask, the HD standard for web video seems to be all over the map. While we know that for broadcast TV, HD quality is usually defined by a resolution of 1080i, 1080p or 720p, the codec used for web video seems to play more of a role in the definition of HD web video than it does in the traditional broadcast industry. Some also say that you have to take into account the bitrate the file is encoded at in order to classify it as HD or not.
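
To make the ambiguity concrete, here is a minimal sketch in Python of the two competing tests described above. The 720-line floor and the 2 Mbps cutoff are illustrative assumptions, not any published standard:

```python
# Two hypothetical "is it HD?" tests; thresholds are assumptions for
# illustration, not an industry standard.

def is_hd_by_resolution(width: int, height: int) -> bool:
    """Broadcast-style test: 720 or more vertical lines counts as HD."""
    return height >= 720

def is_hd_by_resolution_and_bitrate(width: int, height: int,
                                    bitrate_kbps: int) -> bool:
    """Stricter web test: an HD frame size AND enough bits to look good."""
    return height >= 720 and bitrate_kbps >= 2000  # assumed 2 Mbps floor

# The same 720p file can pass one test and fail the other, which is
# exactly the disagreement playing out in the market:
print(is_hd_by_resolution(1280, 720))                   # True
print(is_hd_by_resolution_and_bitrate(1280, 720, 800))  # False
```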

We’ve seen delivery networks say they can deliver HD video over regular 6 megabit connections, and others who say they can do it over connections at half that. Many delivery networks say they support "HD quality" but then don’t define what they classify HD quality to be or what their offering supports. The recent Operation MySpace webcast was touted as HD quality, but then I saw many in the industry commenting on how it really wasn’t HD quality due to the codec that was used.

So what is HD quality on the web? What are the classifications we need as an industry? While HD web video has very little traction today, with the term being used so often, we had better agree on some sort of standard. If we don’t, over time the word HD may no longer be associated with the level of quality we want it to convey.

  • HD is defined as “you can only see it on a trade show floor and in press releases, but not at your home.”
    Like the term “streaming”, “HD” is a vague marketing term that has no actual quantitative definition when used for internet media delivery.
    As we all should know by now, a skilled encoding tech with proper equipment, time, and knowledge of video engineering can get the same result at 2 Mbps as a poorly trained encoding person gets at 4 Mbps.
    So which is the HD? The better result at a lower bitrate or the worse result at a higher bit rate?
    Depends who is making the press release…

  • TMD

    “The recent Operation MySpace webcast was touted as HD quality but then I saw many in the industry commenting on how it really wasn’t HD quality due to the codec that was used.”
    That’s just H.264 supporters taking a dig at On2. It’s not the codec’s fault what resolution Kulabyte decided to broadcast in.

  • Rob Green

    HD is 1080i, 1080p, and maybe 720p. HD refers to the resolution, not the encode quality or codec used. In other words you can transmit in HD and still have crummy picture quality.
    Further, streaming HD is very, very hard. Assuming the appropriate number of bits are utilized (2-6 Mbps, depending on live or on-demand), decoding that into a good experience on the average user’s machine, with its unknown number of other programs loaded and running, is tough. Of course, monitors don’t display interlace anyway, so really we just want progressive source if at all possible.
    There are a lot of claims for <1 Mbps HD, but I've never seen it. The reality is most of the sources are already highly compressed, and even if they weren't, anything under 2 Mbps is dubious (a quick back-of-the-envelope check follows this comment).
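
Rob's skepticism about sub-1 Mbps HD is easy to sanity-check with bits-per-pixel arithmetic. A rough sketch; the ~0.1 bits/pixel rule of thumb for watchable H.264 is a common heuristic I'm assuming here, not a figure from his comment:

```python
# Back-of-the-envelope bits-per-pixel check. Roughly 0.1 bits/pixel is
# a common rule of thumb for watchable H.264; well below it, artifacts
# tend to dominate.

def bits_per_pixel(bitrate_bps: float, width: int, height: int,
                   fps: float) -> float:
    """Average bits available for each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

for mbps in (1.0, 2.0, 4.0):
    bpp = bits_per_pixel(mbps * 1_000_000, 1280, 720, 24)
    print(f"720p24 at {mbps} Mbps -> {bpp:.3f} bits/pixel")

# 1 Mbps yields ~0.045 bits/pixel, less than half the rule of thumb,
# while 2 Mbps lands right at the edge (~0.090), matching Rob's cutoff.
```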

  • You also have to look at what the end users can actually receive – there is more to HD than being able to push the streams. Any of the larger CDNs can push the bits; the bottleneck (especially in the US) is the DSL or cable modem at the end.
    I have tried most of the services that claim HD and have only been able to get a quality picture from one group – and that is because their player has the intelligence to auto-detect how much to push to the end user.
    Heck – try Akamai HD – more than likely you will get the error I received, “You must have 7 M connection” – very few homes in the US have a connection that size…
    HD is still a bit out, but I agree with Rob – H.264 is where I would set the “bottom” for consideration.

  • RS

    (note: comment edited by moderator to remove sales pitch)
    HD streaming is here, and everyone who has a broadband connection of 2+ Mbps can stream 720p clips.
    CDNs stream low-quality videos in Flash or using the Move player at under 2 Mbps and call that HD. The quality is marginal at best and nowhere near HD. Apple and Akamai have some HD clips, but you have to download them to watch, and even then these clips are not really HD, as the studios just took 480-line files, increased the bitrate and resolution, and called them HD.
    All the content owners are playing with different technologies, and over the next 12 – 18 months we are going to see real HD streaming come to life.
    Till then, just use the sites operated by the studios, as these are a tad better than VHS.
    And now you also have Netflix/Roku; but have you seen the quality! What is Reed Hastings thinking?

  • HankG

    IMHO H.264 and WMV9 are the internet HD codecs, although many other codecs can do HD as well.
    HD is 720p, 1080i or 1080p, or higher. 720p is very nice for the internet. At 2-5 Mbps, HD VOD can be stunning. HD needs 2-pass encoding, so live HD is hard to do (you will need hardware encoders; a sketch of a two-pass encode follows this comment).
    And I think Europe is ahead of the US. 5 Mbps is the minimum bandwidth for HD. For example, the average consumer bit rate in the Netherlands is 4-5 Mbps, and more and more ISPs offer 10 to 20 Mbps for 20 euros (30 dollars) per month.
    In Europe, streaming companies have been streaming in 720p HD for years, without the need for special hardware or software, not even edge servers. All modern media servers can handle HD; the disks and the networks just have to be fast. And Western Europe happens to sit on top of the three largest internet exchanges in the world 😉
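
For readers wondering what HankG's two-pass point looks like in practice, here is a rough sketch driving a two-pass H.264 encode with ffmpeg from Python. It assumes ffmpeg with libx264 is installed; the file names and the 4 Mbps target are illustrative, not from the comment:

```python
# Sketch of a two-pass H.264 VOD encode (assumes ffmpeg + libx264).
# Pass 1 analyzes the whole file first, which is exactly why this
# approach cannot be used for live streams.
import subprocess

SOURCE = "master_720p.mov"  # hypothetical mezzanine file
common = ["ffmpeg", "-y", "-i", SOURCE,
          "-c:v", "libx264", "-b:v", "4M", "-s", "1280x720"]

# Pass 1: gather complexity statistics; discard the video output
# ("/dev/null" is Unix; use "NUL" on Windows).
subprocess.run(common + ["-pass", "1", "-an", "-f", "null", "/dev/null"],
               check=True)

# Pass 2: use the statistics to spend bits where the picture needs them.
subprocess.run(common + ["-pass", "2", "-c:a", "aac", "out_720p.mp4"],
               check=True)
```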

  • High Definition refers to any video resolution greater than Standard Definition. Helpful, right? ;]
    SD is defined as any resolution equal to or lower than 704×480 (rectangular pixels) or 640×480 (square pixels). The term “HD” refers only to resolution, and doesn’t define minimum bitrate requirements. Theoretically, you could have a 1080p HD video at a very low bitrate, which could result in a net viewing experience inferior to that of a higher-bitrate video in SD resolution. (This boundary is sketched in code after this comment.)
    On2’s VP6-S video profile codec is actually designed specifically for High Definition video on the web, and it’s what we use at viddyou.com for our High Definition video sharing. We have 720p AND 1080p capabilities at roughly 3MB/s at the higher end (which is pretty much the max that even the most modern computers can handle when using Flash as the viewing method).
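
That SD ceiling translates directly into code. A minimal sketch using exactly the 704×480 and 640×480 thresholds given above; the function itself is illustrative, not a standards-body definition:

```python
# "HD is anything above SD" encoded literally, per the comment above.

def is_hd(width: int, height: int, square_pixels: bool = True) -> bool:
    """True if the frame exceeds the SD ceiling for its pixel shape."""
    sd_w, sd_h = (640, 480) if square_pixels else (704, 480)
    return width > sd_w or height > sd_h

print(is_hd(640, 480))   # False: exactly SD
print(is_hd(960, 540))   # True by this test, though many wouldn't call 540p HD
print(is_hd(1280, 720))  # True: 720p
```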

  • JimB

    Last mile is no doubt an issue. But don’t forget that even if an audience member has a 7+ Mbps sustained connection to initiate an HD ‘stream’, their desktop or laptop may not have the horsepower to permit viewing of the HD file (Mac users not included). As was said in a previous post, from the audience member’s perspective, choosing to “stream” HD (high bitrate/bandwidth) is a bad idea, knowing that progressive download (aka HTTP) has a better chance of maintaining a higher-quality and more consistent user experience.

  • I agree with the comments above that HD is really about lines of resolution (usually vertical). As TMD stated above, you can have a horrible HD picture. I would like to know everyone’s reaction to Hulu’s HD Gallery: http://www.hulu.com/hd/
    It’s in H.264 and at 720p. You can go fullscreen, put your face right up to your monitor, and still not see many artifacts…
    This is the main difference between web and TV high definition. Web has a harder sell because eyeballs will always be closer to the screen than the normal viewing distance (a worked example follows this comment). If you get that close to your HD broadcast at home, you will think the quality is bad! Even people watching nice Blu-ray discs won’t get that close to judge picture quality!
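
The viewing-distance point can be made quantitative. A rough sketch, with illustrative screen sizes and distances (a ~20-inch monitor at desk distance versus a ~42-inch TV across a living room, both showing the same 720p image), computing how many pixels fit into each degree of your field of view:

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_in: float,
                      viewing_distance_in: float) -> float:
    """Horizontal pixels packed into each degree of visual angle."""
    angle = math.degrees(2 * math.atan(screen_width_in /
                                       (2 * viewing_distance_in)))
    return h_pixels / angle

desk = pixels_per_degree(1280, 17.4, 24)   # ~20" monitor, 2 ft away
couch = pixels_per_degree(1280, 36.6, 96)  # ~42" TV, 8 ft away
print(f"desk: {desk:.0f} px/deg, couch: {couch:.0f} px/deg")
# desk: ~32 px/deg vs couch: ~59 px/deg. At the desk, every pixel (and
# every compression artifact) subtends roughly twice the visual angle.
```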

  • Try http://www.abarth.it/gpaesseesse and play it in fullscreen HD (you need Flash 9 to do that).
    That is real HD quality at 1.2 Mbps.
    Streaming.
    Isn’t it amazing?

  • I would have to agree that there is NO reason to lower our standards of what “HD” is for the web. If a company says they are broadcasting a webcast in HD, then it had better be 540p, 720p, 1080p, etc.
    Streaming HD video is not hard (at least for a CDN with the proper network setup); the end user needs three things:
    1) An Internet connection of at least 1.5Mbps for 540p, 2.5Mbps for 720p and 4Mbps for 1080p
    2) A dual-core processor, since HD video requires a lot of decoding power.
    3) An ISP (**cough** Time Warner, Speakeasy) that will not cap your download GBs per month.
    If you have those three things, then you can enjoy real HD video via the Internet directly on your computer.
    So to get back to your question, as an industry and as webcasters, we should only label our videos as “HD” if they are actually broadcast in “HD” resolution, those resolutions being:
    540p – 960×540
    720p – 1280×720
    1080p – 1920×1080
    Of course if your video is 4:3, then you would need to deliver these resolutions to call your video HD:
    540p – 960×720
    720p – 1280×960
    1080p – 1920×1440
    The actual bitrate (and codec) the video is delivered at doesn’t matter… as long as you don’t pull a DirecTV and over-compress your video (like 720p @ 0.5 Mbps), rendering the entire video a blurry mess. (These tiers are collected into a sketch below.)
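
Collecting this commenter's tiers into one lookup, a small sketch; the resolutions and minimum connection speeds are the commenter's numbers, while the packaging and the helper function are mine:

```python
# The HD tiers from the comment above, gathered into one table.
from dataclasses import dataclass

@dataclass
class HdTier:
    name: str
    res_16x9: tuple[int, int]  # widescreen frame size
    res_4x3: tuple[int, int]   # 4:3 frame size, per the comment
    min_mbps: float            # minimum end-user connection speed

TIERS = [
    HdTier("540p",  (960, 540),   (960, 720),   1.5),
    HdTier("720p",  (1280, 720),  (1280, 960),  2.5),
    HdTier("1080p", (1920, 1080), (1920, 1440), 4.0),
]

def highest_tier_for(connection_mbps: float) -> str:
    """Best tier a given connection can sustain, per the numbers above."""
    eligible = [t for t in TIERS if connection_mbps >= t.min_mbps]
    return eligible[-1].name if eligible else "SD only"

print(highest_tier_for(3.0))  # "720p": enough for 2.5 Mbps, not for 4.0
```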

  • 720p is good for the internet. HD is 720p, 1080i, or 1080p and higher, and I think that is good enough to classify an online video as HD quality. I also agree that HD is defined mainly by vertical resolution.