The 2010 Google I/O Developer Conference Roundup
by Ganesh T S on May 28, 2010 2:00 AM EST
The recently concluded 2010 Google I/O Developer Conference saw three important developments from a consumer perspective: the introduction of Android 2.2 (Froyo), the announcement of the WebM initiative, and finally Google TV. First off, we will discuss the controversial WebM / VP8 project, and then talk about Google TV.
One of the most talked about announcements made during the 2010 Google I/O Developer Conference was the open sourcing of the VP8 codec, and the adoption of a Matroska-based container termed WebM. Much has been written online about how VP8 would save the world from the clutches of MPEG-LA. However, there have been rebuttals too, where people analyzed the licensing terms for VP8 and realized that Google offers no patent indemnification for potential users. Coupled with talk from the MPEG-LA CEO about the creation of a patent pool for VP8, things seem pretty murky, with a lot of FUD being spread around. Moving on to the technical side of things, one of x264's (the open source H264 encoder) main developers, Jason Garrett-Glaser, has published an in-depth analysis of VP8. In his post, he points out the various shortcomings of the codec as compared to H264. Detractors have pointed to his role as a developer of x264 as a source of bias in his views, but fail to realize that, as an open source developer, he is free to work on any technically sound spec that he wants. In the rest of this section, we will provide our perspective on the developments surrounding VP8 and WebM.
By open sourcing VP8, Google has fired a shot at MPEG-LA, indicating that it won't remain unchallenged. When one analyzes why VP8 was open sourced, it is easy to see that MPEG-LA's draconian licensing policy was to blame. While indicating that H264 would be free for web use till 2016, MPEG-LA made no forward-looking statements about the eventual licensing situation after that. This meant that open source based organizations such as Mozilla and Wikimedia (which intend to stay free of patent encumbrances in all countries) were hesitant to implement H264 technology in their products. Open source enthusiasts clamored for technology that could stand up to the mighty H264 in terms of technical prowess, particularly considering that Theora compared very poorly with it.
It would be for the good of H264 and its patent holders (and not just in monetary terms) if it were made more popular through use in web videos everywhere. The sooner MPEG-LA realizes this, the better for all concerned. There needed to be a shift in the status quo for MPEG-LA to rethink its stance, and Google just caused that shift by open sourcing its $124 million purchase. How MPEG-LA responds remains to be seen.
How can MPEG-LA satisfy the open source community while also protecting the financial interests of its members? One avenue could be the introduction of licensing terms under which software-based H264 decoders are exempt from any licensing fees. There are many open source H264 decoders already available (though they might not be completely legal in some countries right now), and this would mean that products such as Firefox could include a decoder in their browser bundle without any reservations. Right now, MPEG-LA doesn't earn any money from the usage of H264 for web video. To get some financial returns, it could start charging websites which encode videos in H264, but also specify a revenue threshold below which these websites are not liable at all. A sample scenario: a threshold of $5mn, with the licensing charge being 0.1% of any revenue above $5mn generated from the usage of H264 video on that website. Of course, these are just ideas which MPEG-LA could consider implementing, and they would ensure that low-traffic and hobby websites aren't affected in any way.
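To make the hypothetical scheme above concrete (the $5mn threshold and 0.1% rate are just this article's sample figures, not any actual MPEG-LA terms), the fee calculation would look something like this sketch:

```python
def h264_web_license_fee(h264_revenue, threshold=5_000_000, rate=0.001):
    """Hypothetical tiered fee: a percentage of the H264-derived
    revenue that exceeds a threshold. Sites earning less than the
    threshold from H264 video pay nothing at all."""
    return max(0.0, h264_revenue - threshold) * rate

# A hobby site earning $50k from H264 video pays nothing.
print(h264_web_license_fee(50_000))        # 0.0
# A large site earning $25mn pays 0.1% of the $20mn above the threshold.
print(h264_web_license_fee(25_000_000))    # 20000.0
```

The key design point is that the fee applies only to revenue above the threshold, so a site crossing $5mn doesn't suddenly owe a charge on its entire revenue.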
In a perfect world, we would have no software patents and everyone would be capable of using the best technology available. However, for now, we will have to put up with these types of laws and patents. The best outcome in the present scenario would be for MPEG-LA to announce that the current situation (free web use till 2016) will be extended in perpetuity.
VP8 - The Bad
As Jason's article shows, the VP8 specification is quite lacking in technical prowess. The workarounds to circumvent the H264 patents seem obvious, and presumably the original H264 designers had their reasons for not using (or patenting) those workarounds themselves. Also glaring is the absence of any technical rebuttal to Jason's analysis (at least that we are aware of) from Google engineers or other proponents of VP8. [ Update: User EmmetBrown has brought this recent official WebM blogpost to our attention. ] The introduction of VP8 may also lead to a demand from consumers that their camcorders and PVRs record video in this format. While companies are unlikely to yield to this (and Google itself wants to use VP8 for Internet videos only), there is a small likelihood of a drawn-out format war like Blu-Ray / HD-DVD or VHS / Betamax. In such situations, it is the consumer who becomes the ultimate loser.
Google presented a host of hardware companies which purportedly support hardware acceleration for VP8. It appears to be playing loosely with the term 'hardware acceleration' here. It is inconceivable that a specification which Google got its hands on only late last year could have been made available to the multiple partners mentioned in time for them to perform hardware design, verify it, and tape out a chip as a compliant decoder. What we will be seeing in the near future from chips supposedly capable of VP8 decode is software-based decode on ARM NEON engines or proprietary DSPs. Software-based decode is always going to be less power efficient than the full-fledged hardware decode available for H264 streams. In effect, expect battery life to be a little worse when playing back VP8 streams of the same bit rate as compared to H264 streams. It will likely take at least another 6 - 8 months for a full-fledged VP8 hardware decode engine to reach the hands of the consumer (thankfully, this will probably be accelerated by the large number of similarities between VP8 and H264, which could allow reuse of blocks already designed for accelerating H264 decode).
VP8 - The Ugly
Google is a software company, and its inexperience with codec development shows: it has rushed to market with a spec which has much scope for improvement. On top of that, it is not open to changing the core specifications. Unlike beta software, where bugs can be fixed with simple and sometimes transparent updates, issues with codec specifications may result in a huge loss of performance and/or quality if they need to be fixed. This problem will become much more apparent once there is an installed base of hardware acceleration platforms for this codec.
The MPEG-LA consortium has a number of companies working together to contribute know-how and improve future specifications of H264 in a professional manner. It is hard to imagine an army of open source developers working together to develop specifications while also avoiding patent issues. Future specifications will always have to skirt around the tried and tested (but patented) H264 algorithms. Even with a big backer like Google behind it, it will be difficult for the project to stay away from legal scrutiny of some sort. Another thing to note is that consumer electronics is probably never going to use VP8 or WebM. Even Google claims that VP8 is suited only for web video. Camcorders (professional or consumer) are unlikely to move away from H264, because quality and performance are of paramount importance in that space.
Do we really need two different codecs in the consumer / web space? Do companies want to make life confusing for the average consumer? These are questions for MPEG-LA to ponder. In our opinion, Google has done the right thing in forcing MPEG-LA to act, but, eventually, we hope MPEG-LA realizes the follies in its licensing terms. If VP8 is successful in the long run, it will have to co-exist with H264, and this is not a good situation for the consumer. Instead of creating a patent pool for VP8, MPEG-LA should work towards making the usage licenses for H264 more friendly for companies and consumers alike.
ganeshts - Friday, May 28, 2010
Thanks for bringing this to my attention. The post, I see, was made on Thursday morning, and the draft for this article was prepared much before that.
I will update the article and we will see how this pans out when other people join in to discuss the post you have linked.
iwodo - Friday, May 28, 2010
One thing we should consider when comparing x264 (the best H.264 encoder) to WebM (VP8) is to use only the Baseline profile for H.264, because one of the biggest advantages / arguments for H.264 is hardware acceleration. Where hardware acceleration matters most, the mobile sector, the hardware decoders only/mostly support the Baseline profile. So comparing High Profile H.264 quality against VP8, while assuming that most mobile gadgets can get hardware acceleration, is simply an unfair comparison.
VP8 doesn't have a spec. At least not in its current form. It can only be called a Commented / Documented Reference Encoder.
ganeshts - Friday, May 28, 2010
While your indication that hardware acceleration is available only for Baseline Profile H264 might hold true for yesteryear's devices, present generation chipsets support even High Profile 4.1, with just some bit rate restrictions.
Compare the Tegra 1 and Tegra 2 chipsets, and also take a look at the Chips&Media IP series on their website. The previous generation used to support only Baseline. The present generation can do L4.1 High Profile, with a 10 Mbps restriction for Tegra 2 and a 30 Mbps restriction for the Chips&Media IP (Coda series). (Chips&Media is used in the Shanzhai PMP chipsets such as those from Telechips, and can decode 1080p videos onto small PMP screens / output to HDMI too.)
VP8, as a spec, is said to be comparable only to Baseline H264. Personally, I feel we are regressing on the quality we could have, because H264 hardware decode is maturing fast. Unless a new version of the codec appears to catch up with High Profile, VP8 will always lag on quality, and it is going to take quite some time to catch up on that front.
iwodo - Friday, May 28, 2010
Yes, today's new chipsets may support High Profile. But how many of them have shipped?
Compare that to the millions of iPhone / iPod Touch devices that still only support the Baseline profile.
ajp_anton - Friday, May 28, 2010
One of the x264 developers wrote a blog post about VP8:
There are comparisons somewhere near the end with both screenshots and the whole video downloadable.
x264, high profile: http://doom10.org/compare/x264.png
x264, baseline profile: http://doom10.org/compare/x264baseline.png
CountDown_0 - Friday, May 28, 2010
I'm not saying that Froyo, Google TV and WebM are not interesting, but... one thing I noticed is that nobody seems to have spent a single word on Google Wave. Does this mean bad news for that project?
Casper42 - Friday, May 28, 2010
What about it? It's been out for a while now, so it's kinda old news.
Also, it doesn't tie in much with Anandtech, which is mainly a hardware site.
awaken688 - Saturday, May 29, 2010
"In a perfect world, we would have no software patents and everyone would be capable of using the best technology available. However, for now, we will have to put up with these types of laws and patents."
You need to keep obvious bias out of your articles. Not everyone in the world thinks everything should be open source and free for all. As much as patent trolling and patent squatting suck, the core fundamentals of the system are what drive a lot of innovation. Many of these companies wouldn't even bother putting in the R&D if someone else could just come in and steal it without having to pay that cost.
Long story short, keep it technical, objective, and less biased.
ganeshts - Sunday, May 30, 2010
Thanks for your comment.
This article is meant to be Anandtech's take / opinion on the introductions. So, there will definitely be a bias.
For an unbiased report, we have DailyTech's articles.
Also, many engineers who work in the industry believe that software patents are not that great. Patents that reflect actual hardware / system designs make more sense. Otherwise, we end up with patents like the one issued for 'Linked List' [ Check this out: http://www.patentstorm.us/patents/7028023.html ]
flatline403 - Saturday, May 29, 2010
I assume Ganesh is not a native English speaker. This is a good article with good technical details, but it's badly in need of careful editing.