Microsoft, Adobe and the Industry Need To Agree On An HD Video Standard

For an industry that defines itself by the word "quality", there is still no agreed-upon standard for what qualifies as HD video on the web. Microsoft and Adobe have different views on what makes a video HD, and many content owners I speak with don't truly know themselves how to classify HD video on the web. If the industry wants to make progress with HD-quality video, we're going to have to agree on a standard, and fast.

This isn't the first time I have written about the HD video problem, and every time the subject comes up, you get a lot of comments from people who all have different opinions on how HD video should be classified. Some want the rate at which the video is encoded to be the deciding factor, others think it should be the size of the window (called aspect ratio) and some say it's the resolution that determines what is HD and what is not. While any of these could be the way to decide what is HD, the fact of the matter is that, to date, no one has agreed upon anything. We have content owners calling videos HD that, in my book and many others', are not truly HD. Simply scaling up the aspect ratio by itself does not mean you've achieved an HD stream.

To me, the term HD should refer to and be defined only by the resolution and not by a minimum bitrate requirement. Since you could have a 1080p HD video encoded at a very low bitrate, which could result in a poor viewing experience inferior to that of a higher-bitrate video in SD resolution, the resolution is the only way to define HD. One thing I did notice about the March Madness videos is that Microsoft and CBS are using the term "HQ" in the player instead of HD. I think this is a smart move on their part as a way to help define what is considered high-quality video: content encoded at a higher bitrate, but content that is not truly HD. It seems as if Microsoft is going out of its way to tell content owners to only call video HD if it is at least 1280 pixels wide.
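Put in concrete terms, and purely as an illustration on my part (the thresholds below are just the 1280-pixel width and the 720-line broadcast floor discussed here, not anything Microsoft or Adobe has published), a resolution-only rule could be as simple as this:

```python
# Hypothetical sketch only: classify a stream as "HD" purely by frame size.
# The 1280x720 floor mirrors the broadcast 720p standard discussed above;
# nothing here is an official Microsoft or Adobe definition.

def is_hd_by_resolution(width: int, height: int) -> bool:
    """Return True if the frame size meets a 720p-or-better floor."""
    return width >= 1280 and height >= 720

print(is_hd_by_resolution(1280, 720))   # True:  720p
print(is_hd_by_resolution(848, 480))    # False: 480p widescreen
print(is_hd_by_resolution(1920, 1080))  # True:  1080p
```

Whether a bitrate floor belongs in a rule like that as well is exactly the question the rest of this post, and the comments below, wrestle with.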

Adobe, on the other hand, is calling 480p HD, but I don't agree that 480p should be classified as an HD size for web content. If 480p is not defined as HD video quality for TV, why should it be for the PC? I think HD video needs to be defined using the broadcast standards of 720p, 1080i or 1080p. One thing that might make this a bit more complex is that there are more devices playing back web video than just the PC. When I download an HD show from iTunes and play it on a 50" TV, it looks great, but it does not look even close to Blu-ray. Is that a fair comparison? Where do you stop comparing the quality of the video to the device it is being played back on?

It's important to remember that an industry standard needs to be created not for those in the industry but rather for viewers. Consumers don't care what codec is being used, what the bitrate is or how the video is being delivered. But they do care about quality, and we can't expect them to want to adopt HD-quality video when the industry itself has not even defined what HD video is.

So, what is HD quality video on the web? What is the definition and, more importantly, what is it going to take to get both Microsoft and Adobe to agree to use the same standard so that content owners aren't confused? For all the competition between the two companies, some things need to be worked on together, with the understanding that it will help everyone in the industry if done correctly. I think Microsoft has started to do this with their definition, but without Adobe and others agreeing to all use the same metrics, it will only slow down the adoption of HD. So Microsoft, Adobe, what is it going to take for you guys to publish an agreed-upon HD web video standard? We're waiting.

  • http://www.streamingmedia.com Tim Siglin

    Another large source of confusion involves content owners who shoot in High Definition (720 or 1080, not 480 “enhanced”) and then edit in an HD system, such as Final Cut.
    I’ve heard several content owners claim to be doing HD progressive download or streaming, when in reality they shot and edited in HD but their content is being PDed or streamed at a resolution well below any HD standard. It almost becomes embarrassing to explain that “higher definition” video streaming (above 320×240 or 640×480) is not the same as High Definition.
    One point of clarification: when you say “others think it should be the size of the window (called aspect ratio)”, the aspect ratio is strictly the proportion. Many videos have been streamed at 180×120, instead of 160×120, and called “widescreen”, but I don’t think anyone is arguing that the aspect ratio makes something High Definition. The size of the window, yes, but not strictly the aspect ratio.

  • http://carnalnation.com John Pettitt

    Forget the web; let’s have a standard for TV. The “HD” you get on a heavily re-compressed satellite or cable channel is not the same as over-the-air HD or HD from Blu-ray. Similarly, the “HD” you get on iTunes is another standard entirely.
    We need an objective measure of quality that is codec independent – sadly I don’t think one exists.

  • http://alexzambelli.com/blog/ Alex Zambelli

    Dan,
    I absolutely agree that the Web needs a standard definition of HD video. I’m really glad you picked up on the fact that the Silverlight March Madness player is called “High Quality” player and not “High Definition” player. As described in my blog (linked), the live video streams offered there peak at 1.5 Mbps @ 784×432, just short of 480p (only due to size constraints of the player video window, otherwise we probably would’ve just made it a nice even 480p). We were very adamant about making sure the CBS Silverlight player didn’t get called an “HD player” when it wasn’t in fact offering HD video. But higher quality – absolutely!
    I do, however, disagree with you on the topic of how HD video should be defined. In fact, I was a little confused by your statement – it seemed somewhat contradictory when I read it:
    “To me, the term HD should refer to and be defined only by the resolution and not by a minimum bitrate requirement. Since you could have a 1080p HD video encoded at a very low bitrate, which could result in a poor viewing experience inferior to that of a higher-bitrate video in SD resolution, the resolution is the only way to define HD.”
    So you’re saying that HD video at a low bitrate could look worse than SD video at a high bitrate – which I totally agree with – but you’re still saying that bitrate should NOT be a factor? I either misunderstood or we’re in disagreement there. :) I absolutely think that 1280×720 video encoded at 500 kbps should not be considered HD video!
    Alex Zambelli
    Media Technology Evangelist
    Microsoft Corporation

  • http://www.BusinessOfVideo.com Dan Rayburn

    Hey Alex, I think I confused myself as well when I said that. Clearly we both agree, you have to take the bitrate into account as well. That’s what I get for writing this post at midnight.

  • http://www.hulu.com Hassan Wharton-Ali

    I disagree with the bitrate factor. What happens when someone has a breakthrough with an algorithm that has greater compression efficiency and allows an excellent viewing experience for 720p video at 500Kbps?
    Would that mean that the standard would then have to be re-written?
    It is not too far-fetched either; look how far we’ve come in just 10 years of codec evolution, where we can even get a good HD picture from 5Mbps.
    I would argue that the definition of HD video be “lines of resolution with ENOUGH of a bitrate that a viewer sees little to no degradation in picture quality when viewing at any distance.”

  • http://profile.typekey.com/benwaggoner/ Ben Waggoner

    I’ve been mulling this over while working on the 2nd edition of my compression book.
    The lowest resolutions I think we could reasonably agree are actually HD would be:
    1280x528p24 (720p square pixel, cropped to 2.4:1 aspect ratio) for movie content.
    960x720p30 (anamorphic 720p, in the native resolution of DVCPROHD, which is the lowest-resolution HD production format).
    Inside Silverlight, the hard-coded definition of “HD” is > 1024 wide or > 576 tall; 1024×576 is PAL 16:9 (720×576 anamorphic) stretched to square pixels, and thus the highest SD you’d be likely to see in the wild.
    As to Hassan’s point, I think we can also include that, given the bitrate, it has to look BETTER at that resolution than it would look at a lower resolution at the same bitrate. In other words, it can’t be HD-for-marketing-only, but HD with a reasonable bits-per-pixel for the content and codec.
    And that was your Codec Nerditry Minute.
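    As a rough sketch of the bits-per-pixel math (the figures and function below are illustrative only; nobody in this thread has proposed them as thresholds):

    ```python
    # Illustrative only: bits per pixel per frame = bitrate / (width * height * frame rate).
    # The example bitrates echo figures mentioned in this thread; they are not thresholds.

    def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
        return bitrate_bps / (width * height * fps)

    # 1280x720 at 30 fps, encoded at 500 kbps (the example called out above):
    print(bits_per_pixel(500_000, 1280, 720, 30))    # ~0.018 bits per pixel
    # The same frame size and frame rate at 3 Mbps:
    print(bits_per_pixel(3_000_000, 1280, 720, 30))  # ~0.109 bits per pixel
    ```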

  • http://www.highdefnow.com/ Kieran Kunhya

    We define HD as having come from a 720p or greater source and then displayed at 720p or at the source resolution. Certainly 480p shouldn’t be classed as HD.
    Bitrate is only one factor when it comes to picture quality. Low-bitrate 720p doesn’t necessarily have to look that bad. We use 700kbps for our “Medium Quality” setting, and on good sources like http://www.highdefnow.com/play/209 the picture quality is pretty good. I would class that video as being at the lower end of HD, with our “High Quality” setting being at the higher end of 720p Web HD. However, on other sources 700kbps may not come out as well; therefore we call it ‘HD-like’ as a compromise for people with slower connections who want more than Standard Definition.

  • http://profile.typepad.com/benwaggoner Ben Waggoner

    @Kieran,
    Yeah, having a particular number for bitrate as the HD threshold’s not going to work. Screen recordings may require only 10% as many bits per pixel as a basketball game.
    The 1.5 Mbps maximum bitrate we’re using for March Madness is SD for that very complex content in live streaming, but 1.5 Mbps could be ample for a VBR-encoded CGI movie trailer at full 1280×720.

  • http://www.global-mix.com Dom Robinson

    OK: off my core competency here, but here’s a view, preceded by a question:
    Can you present still images ‘in HD?’
    I think of SD as 720×576 resolution for PAL or 720×480 for NTSC. So images rendered for SD TV broadcast are traditionally encoded for that resolution, regardless of the carrier bitrate (typically MPEG-2 at around 4Mbps, because of satellite transponder segmentation, breaking 32Mbps analog transponders down into multiple 4Mbps digital carriers).
    When you buy a widescreen TV, the carrier signals have the same bitrates; the aspect ratio is changed, and the resolution is provided by the same density of pixels on the display.
    This is STILL an SD service.
    HD TVs promise to produce widescreen (16:9) images (I have not seen any 4:3 HD). The resolution is 1080i, 1080p or 1280 wide when you look at the pixel density across the display. The fact that MOST HD systems are used domestically for watching SD images still means that the 720×480 or 720×576 image is being rendered across an ‘excessive’ number of pixels, so it could be said you are HD-rendering an SD stream…
    If the carrier supports HD bitrates then an image with a higher resolution can be sent AND rendered to that ‘density’ and will have what I would consider HD characteristics: i.e. a 1:1 relationship between the density of the pixels on the display and the resolution of the image, and that would be at a higher resolution than SD at 720×576 or 720×480 or their widescreen cousins.
    So actually it seems to be a function of resolution at encoding and density of pixels on the display: the bitrate is merely an operator in the equation which needs to be there to measure the quality of service of the delivery of the HD. So in effect, if you introduce a lossy codec, such as the MPEGs, you can pretty much guarantee that there will NOT be a 1:1 relationship between the input HD to the encoder and the output ‘HD’ to the screen: a lossy codec will wander from HD to low definition as required by the codec’s algorithms. Only when ALL of the HD image captured at high resolution is delivered to the display at the same high resolution could the effect be truly HD in my mind.
    I agree that HQ is a better separation, since in variable bitrate encoding you may have moments of high definition followed by virtually nothing going on on the streaming transport. Hence my initial question: can you present still images as ‘HD’? (Once the initial image is sent as a keyframe, there will be little streaming activity at all until that image changes, and that stands for MPEG-2 and MPEG-4 and their derivatives.)
    It seems to me to be more of a marketing term driving innovation into the broadcast and production sector. After all, a lot of the broadcast tech journals are already losing interest in a sluggish take-off for HD and are now focusing on 3D (in fact, we are doing one of the world’s first 3D webcasts next week), and it’s easy to define 3D: those funky coloured specs are back ‘in’. You don’t need specs for HD!

  • Bryan Sarpad

    I would like to mention two things I noticed missing from this conversation. First, bitrate (as mentioned) is quite subjective as there are a number of factors that affect the quality yielded from certain bitrates. In particular, avc1 supports a number of ‘profiles’ allowing for more complex algorithms on hardware capable of decoding the stream using these features in realtime. Some of these more advanced profiles can offer extremely high quality results at lower bitrates for certain types of video, such as low color depth animations.
    Second, I know there is often little attention paid to this aspect, but I consider high-definition audio part of the qualification for high-definition content (video typically does little good without audio). I am a quality nut, and if I can’t get at least multichannel AC3, I hold out for a better source. A lot of smart people are banking on Blu-ray being the last physical medium for this content; without similar audio quality, the home theater enthusiast crowd will be slow or reluctant to adopt it.

  • http://on10.net/blogs/benwagg Ben Waggoner

    Okay, I put my codec where my mouth is (and added more of my mouth), and made some samples showing what I consider the low end of true HD contrasted with “fake” HD and pretty good SD here:
    http://on10.net/blogs/benwagg/Proposed-definition-of-HD-on-the-web-with-examples/
    I’m still working on getting the embedding working right with my blogging engine.
    @Dom,
    As you can see in my samples, a clip can be “HD Quality” overall, but still have sections that are less so. That’s true of broadcast ATSC as well. We could argue for a threshold of HD-ness, I suppose; 95% of frames?
    @Bryan,
    Yeah, bitrate is a bad criterion, since complexity of content and efficiency of codec implementations vary so widely.
    We’ve been mainly talking about video here. But you ask a good question: what *IS* “HD audio?”
    I’d say it has to be at least 4.0 (center, left, right, surround), a la the old decoded Dolby Pro Logic. In that regard, both Flash and Silverlight are behind in being stereo only. That said, most PCs don’t have more than stereo output, and most streaming doesn’t reach bitrates good enough for 5.1, so we’re a little ways away from that.

  • Mark East

    I’m having a little trouble keeping track of the point of discussion here. Are we debating what the standards for “HD” should be within our industry or are we debating how the expectations of our customers and internet video watchers at large should be set with regard to what is and is not “HD”?
    These are two very different discussions, methinks.

  • http://on10.net/blogs/benwagg Ben Waggoner

    @Mark,
    I’m talking about what appropriate standards should be, which would hopefully then become consumer expectations.
    In particular, I really, really want to keep anyone from being able to get away with calling 480p “Web HD” or some nonsense like that. So, I’m trying to define a minimum bar for what we can call HD for web applications.
    Which I think should be 720p, with anamorphic allowed for 30p anamorphic sources and cropping allowed for widescreen sources.
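    Spelled out as a sketch, one literal reading of that floor, using the 1280x528 widescreen-crop and 960x720 anamorphic cases from the earlier comment (this is one commenter's proposal, not any official definition):

    ```python
    # One literal reading of the proposed minimum bar for "Web HD" (hypothetical helper):
    #  - square-pixel 720p cropped for 2.4:1 movies: at least 1280x528
    #  - anamorphic 720p (DVCPROHD-style):           at least 960x720
    # These floors come from a comment earlier in this thread, not from any standards body.

    def meets_web_hd_floor(width: int, height: int) -> bool:
        cropped_widescreen = width >= 1280 and height >= 528
        anamorphic_720p = width >= 960 and height >= 720
        return cropped_widescreen or anamorphic_720p

    print(meets_web_hd_floor(1280, 720))  # True:  square-pixel 720p
    print(meets_web_hd_floor(1280, 528))  # True:  2.4:1 widescreen crop
    print(meets_web_hd_floor(960, 720))   # True:  anamorphic 720p
    print(meets_web_hd_floor(848, 480))   # False: 480p
    ```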

  • Andreas

    What bitrates are we talking about for these quality levels? From Ben Waggoner’s samples on the blog, it looks like we could get away with 1080p at 3Mbps, but what about the surround sound? When can that be solved?
    We don’t want to build our own player specifically for that!

  • Jeremy Noring

    Very interesting post. I wrote up some thoughts here,
    http://goldfishforthought.blogspot.com/2009/05/hd-video-standard.html
    …but the short version is: I think we can do better than specifying bitrate. I think an objective video quality metric, like PSNR/MSE or SSIM should be used as a baseline for quality rather than bitrate. There absolute are, as Alex Z. put it, “objective measure(s) of quality that (are) codec independent,” and this is a totally idea place to use them.

  • Jeremy Noring

    Apologies, apparently my fingers/keyboard staged a revolution:
    There absolutely are, as Alex Z. put it, “objective measure(s) of quality that (are) codec independent,” and this is a totally ideal place to use them.
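    As a rough sketch of what such a codec-independent metric looks like in practice, here is a minimal PSNR calculation (PSNR is the simplest of the metrics mentioned above; the frames and noise level are made up purely for illustration):

    ```python
    # Minimal PSNR sketch: compares a decoded frame against its source frame.
    # PSNR is codec independent because it only looks at pixel values, never the bitstream.
    import numpy as np

    def psnr(reference: np.ndarray, decoded: np.ndarray, max_value: float = 255.0) -> float:
        """Peak signal-to-noise ratio in dB; higher means closer to the source."""
        mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical frames
        return 10.0 * np.log10((max_value ** 2) / mse)

    # Toy example: a random 720p luma frame and a mildly "degraded" copy of it.
    rng = np.random.default_rng(0)
    source = rng.integers(0, 256, size=(720, 1280), dtype=np.uint8)
    decoded = np.clip(source.astype(np.int16) + rng.integers(-5, 6, size=source.shape), 0, 255)
    print(f"{psnr(source, decoded):.1f} dB")
    ```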

  • http://www.hdflvplayer.net FLV Player

    I agree with John Pettitt. Let’s consider TV.

  • http://www.surf-cam.net/ Surf Cam Bob

    It is nice to see that someone else agrees with me on the standardization issues.

  • http://action-movietrailer.blogspot.com/ reean

    It is now 2010… 3D technology will soon replace HD technology.

  • http://www.livingsound.com.au/ audio brisbane

    I think that they should agree, since we are already using HD products and more products are already supporting this format. What’s the sense of having an HD product if you don’t have any HD files to play, right?