Monthly Archives: September 2010

How Much H.264 In Each Encoder?

Thanks to my recent experiments with code coverage tools, I have a powerful new (admittedly somewhat specious) method of comparing programs. For example, I am certain that I have read on more than one occasion that Apple’s H.264 encoder sucks compared to x264 due, at least in part, to the Apple encoder’s alleged inability to exercise all of H.264’s features. How might I test that claim?

Experiment
Use code coverage tools to determine which H.264 encoder uses the most features.

Assumptions

  • Movie trailers hosted by Apple will all be encoded with the same settings using Apple’s encoder.
  • Similarly, Yahoo’s movie trailers will be encoded with consistent settings using an unknown encoder.
  • Encoding a video using FFmpeg’s libx264-slow setting will necessarily throw a bunch of H.264’s features into the mix (I really don’t think this assumption holds much water, but I also don’t know what “standard” x264 settings are).

Methodology

  • Grab a random Apple-hosted 1080p movie trailer and a random Yahoo-hosted 1080p movie trailer from Dave’s Trailer Page.
  • Use libx264/FFmpeg with the ‘slow’ preset to encode Big Buck Bunny 1080p from raw PNG files.
  • Build FFmpeg with code coverage enabled.
  • Decode each file to raw YUV (ignoring the audio), generate code coverage statistics with gcovr, and reset the stats between runs by deleting the *.gcda files (a rough sketch of this workflow follows the list).
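
Here’s a rough sketch of that workflow, assuming an FFmpeg build that has already been instrumented for gcov (the coverage configure flags appear in the gcovr post below) and using a placeholder name for the sample file:

  # decode the sample to raw YUV, skipping audio (-an) and discarding the output
  ./ffmpeg -i sample_trailer_1080p.mov -an -f rawvideo -y /dev/null

  # aggregate line coverage across the source tree
  gcovr -r .

  # reset the counters before the next sample
  find . -name '*.gcda' -delete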

Results

  • x264 1080p video: 9968 / 134203 lines
  • Apple 1080p trailer: 9968 / 134203 lines
  • Yahoo 1080p trailer: 9914 / 134203 lines

I also ran this old x264-encoded file (ImperishableNightStage6Low.mp4) through the same test. It demonstrated the most code coverage with 10671 / 134203 lines.

Conclusions
Conclusions? Ha! Go ahead and jump all over this test. I’m already fairly confident that it’s impossible (or maybe just very difficult) to build a single H.264-encoded video that exercises every feature that FFmpeg’s decoder supports. For example, is it possible for a file to use both CABAC and CAVLC entropy methods? If it’s possible, does any current encoder do that?

Using gcovr with FFmpeg

When I started investigating code coverage tools to analyze FFmpeg, I knew there had to be an easier way to do what I was trying to do (obtain code coverage statistics on a macro level for the entire project). I was hoping there was a way to ask the GNU gcov tool to do this directly. In the comments, John K pointed me to a tool called gcovr. Like my tool from the previous post, gcovr is a Python script that aggregates data collected by gcov. gcovr proves to be a little more competent than my tool.

Results
Here is the spreadsheet of results, reflecting FATE code coverage as of this writing. All FFmpeg source files are on the same sheet this time, including header files, sorted by percent covered (ascending), then total lines (descending).

Methodology
I couldn’t easily work with the default output from the gcovr tool, so I modified it into a tool called gcovr-csv, which produces data that spreadsheets can digest more easily. Here are the steps (a consolidated sketch of the whole sequence follows the list):

  • Build FFmpeg with '-fprofile-arcs -ftest-coverage' added to both the extra cflags and extra ldflags configure options
  • 'make'
  • 'make fate'
  • From build directory: 'gcovr-csv > output.csv'
  • Massage the data a bit, deleting information about system header files (assuming you don’t care how much of /usr/include/stdlib.h is covered — 66%, BTW)
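
Strung together, the whole sequence looks roughly like this (the grep filter at the end is just one way to drop the system header rows):

  # add coverage instrumentation to both the compile and link flags
  ./configure --extra-cflags='-fprofile-arcs -ftest-coverage' \
              --extra-ldflags='-fprofile-arcs -ftest-coverage'
  make
  make fate

  # from the build directory, dump per-file coverage as CSV
  gcovr-csv > output.csv

  # filter out rows for system headers, e.g. anything under /usr/include
  grep -v '/usr/include' output.csv > output-trimmed.csv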

Leftovers
I became aware of some spreadsheet limitations thanks to this tool.

Bye Bye FATE Machine

This is the computer that performed the lion’s share of FATE cycles for the past 1.5 years, until Mans put a new continuous integration system into service. I’ve now decided to let the machine go. I can’t get over how odd this feels, since this thing is technically the best machine I own.



It’s a small form factor Shuttle PC (SD37P2 v2); Core 2 Duo 2.13 GHz; 2 GB RAM; 400 GB SATA HD; equipped with the only consistently functional optical drive in my house (uh oh). I used it as my primary desktop from March 2007 – November 2008, at which point I repurposed it for FATE cycles.

As mentioned, the craziest part is that this is technically the best computer in my house. My new EeePC 1201PN isn’t at quite the same level; my old EeePC can’t touch it, of course; the Mac Mini has a little more RAM but falls short in nearly every other area. But the Shuttle just isn’t seeing that much use since the usurpation. I had it running automated backup duty for multimedia.cx but that’s easy enough to move to another, lower-powered system.

Maybe the prognosticators are correct and the PC industry has matured to the point where raw computing power simply doesn’t matter anymore. I fancy myself someone who knows how to put CPU power to work, but even I don’t know what to do with the computing capacity I purchased over 3 years ago.

Where will the Shuttle go? A good home, I trust: I know a family that just arrived in the country and could use a computer.

Linux Media Player Survey Circa 2001

Here’s a document I scavenged from my archives. It was dated September 1, 2001, and I’m publishing it now, 9 years later. It serves as sort of a time capsule for the state of media player programs at the time. Looking back on this list, I can’t understand why I couldn’t find MPlayer while I was conducting this survey, especially since MPlayer is the project I eventually started working on just a few months after writing this piece.

For a little context, I had been studying multimedia concepts and tech for a year and was itching to get my hands dirty with practical multimedia coding. But I wanted to tackle what I perceived as unsolved problems, like playback of proprietary codecs. I didn’t want to have to build a new media playback framework just to start working on my problems. So I surveyed the players available to see which ones I could plug into and use as a testbed for implementing new decoders.

Regarding Real Player, I wrote: “We’re trying to move away from the proprietary, closed-source ‘solutions’”. Heh. Was I really an insufferable open source idealist back in the day?

Anyway, here’s the text with some Where are they now? commentary [in brackets]:


Towards an All-Inclusive Media Playing Solution for Linux

I don’t feel that the media playing solutions for Linux set their sights high enough, even though they do tend to be quite ambitious.