
Optical Drive Value Proposition

I have the absolute worst luck in the optical drive department. Ever since I started building my own computers in 1995 — close to the beginning of the CD-ROM epoch — I have burned through a staggering number of optical drives. Seriously, between about 1995 and 1998 I was going through a new drive every 4-6 months or so. This was also during that CD-ROM speed race in which the drive packages kept advertising loftier ‘X’ speed ratings. I didn’t play a lot of CD-ROM games during that timeframe, though I did listen to quite a few audio CDs through the computer.



I use “optical drive” as a general term to describe CD-ROM drives, CD-R/RW drives, DVD-ROM drives, DVD-R/RW drives, and drives capable of doing any combination of reading and writing CDs and DVDs. In my observation, optical media seems to be falling out of favor somewhat, giving way to online digital distribution for things like games and software, and to flash drives and external hard drives (rather than recordable or rewritable media) for backup and sneakernet duty. Somewhere along the line, I started to buy computers that didn’t even have optical drives. That’s why I have purchased at least 2 external USB drives (seen in the picture above). I don’t have much confidence that either works correctly. My main desktop until recently, a Mac Mini, has an internal optical drive that grew flaky and unreliable a few months after the unit was purchased.

I just have really rotten luck with optical drives. The most reliable drive in my house is the one on the headless machine that, until recently, was the main workhorse on the FATE farm. The eject switch doesn’t work correctly, so I have to log in remotely, run 'sudo eject', walk to the other room, pop in the disc, walk back, and work with the disc.

Maybe optical media is on its way out, but I still have many hundreds of CD-ROMs. Perhaps I should move forward on this brainstorm to archive all of my optical discs on hard drives (and then think of some data mining experiments, just for the academic appeal), before it’s too late; optical discs don’t last forever.
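
If I ever do move forward, the archival pass would presumably be little more than imaging each disc and logging a checksum. A minimal sketch, assuming a /dev/sr0 drive node and invented filenames:

    # Hypothetical: image one disc and record a checksum for later verification
    dd if=/dev/sr0 of=disc-0001.iso bs=2048 conv=noerror,sync
    md5sum disc-0001.iso >> disc-checksums.md5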

So if I needed a good optical drive, what should I consider? I’ve always been the type to go cheap, I admit. Many of my optical drives were on the lower end of the cost spectrum, which might have played some role in their rapid replacement. However, I’m not sold on the idea that I’m getting quality just because I’m paying a higher price. That LG unit at the top of the pile up there was relatively pricey and still didn’t fare well in the long (or even medium) term.

Come to think of it, I used to have a ridiculous stockpile of castoff (but somehow still functional) optical drives. So many, in fact, that in 2004 I had a full-size PC tower that I filled with 4 working drives, just because I could. Okay, I admit that there was a period when I had some reliable drives.

That might be an idea, actually– throw together such a computer for heavy duty archival purposes. I visited Weird Stuff Warehouse today (needed some PC100 RAM for an old machine and they came through) and I think I could put together such a box rather cheaply.

It’s a dirty job, but… well, you know the rest.

Revisiting the Belco Alpha-400

Relieved of the primary FATE maintenance duties, I decided to dust off my MIPS-based Belco Alpha-400 and try to get it doing FATE cycles. And just as I was about to get FATE running, I saw that Mans already got his MIPS-based Popcorn Hour device to run FATE. But here are my notes anyway.



Getting A Prompt
For my own benefit, I made a PDF to remind me precisely how to get a root prompt on the Alpha-400. The ‘jailbreak’ expression strikes me as a little juvenile, but it seems to be in vogue right now.

alpha-400-jailbreak.pdf

Toolchain
When I last tinkered with the Alpha-400, I tried, to no avail, to build a toolchain that could produce binaries for the unit’s MIPS chip. Sometime last year, MichaelK put together x86_32-hosted toolchains that are able to build mipsel 32-bit binaries for Linux 2.4 and 2.6. The Alpha-400 uses a 2.4 kernel, and the corresponding toolchain works famously for building current FFmpeg (--disable-devices is necessary).
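
For the record, the invocation amounts to something like the following sketch; the exact --cross-prefix is an assumption, since it depends on how the toolchain names its binaries:

    # Cross-compile FFmpeg for the Alpha-400's mipsel/Linux 2.4 target
    ./configure --enable-cross-compile --cross-prefix=mipsel-linux- \
        --arch=mips --target-os=linux --disable-devices
    make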

FATE Samples
Next problem: making the FATE suite available to the Alpha-400. I copied all of the FATE suite samples onto a VFAT-formatted SD card. Filename case is not preserved for all files, which confounds me since it is preserved for others. I tried formatting the card for ext3, but the Alpha-400 would not mount it, even though /proc/filesystems lists ext3 (perhaps it only supports an older ext3 revision?).
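
For reference, the failing sequence is roughly the following; the SD card's device node is a guess and may differ on the unit:

    # ext3 appears in the kernel's list of supported filesystems...
    grep ext3 /proc/filesystems
    # ...yet this mount fails on the Alpha-400
    mount -t ext3 /dev/mmcblk0p1 /mnt/sdcard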

Alternative: Copy all of the FATE samples to the device’s rootfs. Space will be a little tight, though. Then again, there is over 600 MB of space free; I misread earlier and thought there were only 300 MB free.

Remote Execution
To perform FATE cycles on a remote device, it helps to be able to SSH into that remote device. I don’t even want to know how complicated it would be to build OpenSSH for the device. However, the last time I brought up this topic, I learned about a lighter weight SSH replacement called Dropbear. It turns out that Dropbear runs great on this MIPS computer.
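
Cross-building it with the same toolchain is refreshingly simple; a sketch, with the --host triplet being an assumption:

    # Cross-compile Dropbear plus its client utilities for mipsel
    ./configure --host=mipsel-linux --disable-zlib
    make PROGRAMS="dropbear dbclient dropbearkey scp"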

Running FATE Remotely
I thought all the pieces would be in place to run FATE at this point. However, there is one more issue: running FATE on a remote system requires that the host and the target share a filesystem somehow. My personal favorite remote filesystem method is sshfs, which is supposed to work wherever there is an SSH server. That’s not entirely true, though– sshfs also requires sftp-server to be installed on the server side, a program that Dropbear does not currently provide.
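
Concretely, the mount I would like to perform looks something like this (host name and paths invented), and it is exactly the step that falls over when the server side is Dropbear:

    # Fails against Dropbear: no sftp-server to back the SFTP subsystem
    sshfs root@alpha400:/mnt/sdcard /mnt/alpha400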

I’m not even going to think about getting Samba or NFS server software installed on the Alpha-400. According to the unit’s /proc/filesystems file, nfs is a supported filesystem. I hate setting up NFS but may see if I can get that working anyway.
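
If I do, the desktop would act as the NFS server and the Alpha-400 as the client; a minimal sketch with invented addresses and paths:

    # On the host: export the samples directory (/etc/exports), then exportfs -ra
    #   /home/fate/samples 192.168.1.0/24(ro,no_subtree_check)
    # On the Alpha-400: mount it (client support per /proc/filesystems)
    mount -t nfs 192.168.1.10:/home/fate/samples /mnt/samples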

Residual Weirdness
The unit comes with the venerable BusyBox program (BusyBox v1.4.1 (2007-06-01 20:37:18 CST) multi-call binary) for most of its standard command line utilities. I noticed a quirk where BusyBox’s md5sum emits weird hex characters. This might be a known/fixed issue.

Another item is that the Alpha-400’s /dev/null file only has rwxr-xr-x permissions by default. This caused trouble when I first tried to scp via Dropbear as a newly created, unprivileged user.
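
The fix is presumably the standard one, run as root (a one-liner I haven't verified on the unit):

    # /dev/null should be world-writable
    chmod 666 /dev/null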

ANSI FATE

The new FATE server is shaping up well. I think most of the old configurations have been migrated to the new server. I see one new compiler for x86_64– PathScale. It’s not faring particularly well at this point.

New Tests
As I write this, I notice that there are now an even 700 tests, twice as many as the last time I trumpeted such a milestone. (It should be noted that the new FATE system finally breaks down the master regression suite into individual tests.) Thankfully, it’s no longer necessary to wait for me to create or edit tests (anyone with FFmpeg privileges can do this), nor is it necessary to keep up with this blog to know exactly what tests are new. Now, you can simply inspect the file history on tests/fate.mak and tests/fate2.mak (I think these 2 files are going to merge in the near future).

Vitor, as of r24865: “Add FATE test for ANSI/ASCII animation and TTY demuxer.” Eh? What’s this about? I admit I was completely removed from FFmpeg development for much of June and July, so I could have missed a lot. Fortunately, I can check the file history to see which lines were added to make this test happen. And if FATE is exercising the test, you know exactly where the samples will live. Here’s this new decoder in action on the relevant sample:



The file history fingers Suxen drol/Peter Ross for this handiwork. I might have guessed– the only person who is arguably more enamored with old, weird formats than even I. Now we wait for the day that YouTube has support for this format. I’m sure there are huge archives of these animations out there (and I wager that Trixter and Jason Scott know where).



It’s an animation — it just keeps going
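
If you want to see it in motion yourself, the TTY demuxer picks the format up by file extension, so playback should be as simple as the following (filename hypothetical):

    # Play an ANSI/ASCII animation via the TTY demuxer + ANSI decoder
    ffplay some-animation.ans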

Meanwhile, the FATE suite now encompasses a bunch of perceptual audio formats, thanks to the 1-off testing method and a few other techniques. These formats include Bink audio, WMA Pro, WMA voice, Vorbis, ATRAC1, ATRAC3, MS-GSM, AC3, E-AC3, Nellymoser, TrueSpeech, Intel Music Coder, QDM2, RealAudio Cooker, QCELP (just going down the source control log here), and others, no doubt.

Then there’s this curious tidbit: “Add FATE test for WMV8 DRM”. The test spec is "fate-wmv8-drm: CMD = framecrc -cryptokey 137381538c84c068111902a59c5cf6c340247c39 -i $(SAMPLES)/wmv8/wmv_drm.wmv -an". I would still like to investigate FFmpeg’s cryptographic capabilities, which I suspect are moving toward functioning as a complete SSL stack one day.
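
Unrolled into a plain command line, that test amounts to something like this; the key comes straight from the test spec, while the sample path is abbreviated:

    # Decode the DRM'd WMV8 sample with the decryption key, ignoring audio
    ffmpeg -cryptokey 137381538c84c068111902a59c5cf6c340247c39 \
        -i wmv8/wmv_drm.wmv -an -f framecrc -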

New Platforms
As for new platforms, the new FATE system finally allows testing on OS/2 (remember that classic? It was “the totally cool way to run your computer”). Thanks to Dave Yeo for taking this on.

Further, a new MIPS-based platform recently appeared on the FATE list. This one reports itself as running on a 74kf CPU. Googling for this processor quickly brings up Mans’ post about the Popcorn Hour device. So, congratulations to him for getting the mundane box to serve a higher purpose. Perhaps one day, I’ll be able to do the same for that Belco Alpha-400 netbook.

FFmpeg and Code Coverage Tools

Code coverage tools likely occupy the same niche as profiling tools: tools that you’re supposed to use somewhere during the software engineering process but probably never quite get around to, usually because you’re too busy adding features or fixing bugs. But there may come a day when you wish to learn how much of your code is actually being exercised in normal production use. For example, the team charged with continuously testing the FFmpeg project would be curious to know how much code is being exercised, especially since many of the FATE test specs explicitly claim to be “exercising XYZ subsystem”.

The primary GNU code coverage tool is called gcov and is probably already on your GNU-based development system. I set out to determine how much FFmpeg source code is exercised while running the full FATE suite. I ran into some problems when trying to use gcov on a project-wide scale. I spackled around those holes with some very ad-hoc solutions. I’m sure I was just overlooking some more obvious solutions about which you all will be happy to enlighten me.
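
For orientation, the broad strokes are the usual gcov dance, sketched below; the FFmpeg-specific wrinkles are exactly where the ad-hoc solutions come in:

    # 1. Build with coverage instrumentation
    ./configure --extra-cflags="-fprofile-arcs -ftest-coverage" \
        --extra-ldflags="-fprofile-arcs -ftest-coverage"
    make
    # 2. Run the FATE suite to produce .gcda execution counters
    make fate
    # 3. Emit per-file reports (-o points gcov at the object/notes files)
    gcov -o libavcodec libavcodec/*.c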

Results
I’ve learned to cut to the chase earlier in blog posts (results first, methods second). With that, here are the results I produced from this experiment. This Google spreadsheet contains 3 sheets: the first contains code coverage stats for a bunch of FFmpeg C files, sorted first by percent coverage (ascending), then by number of lines (descending), thus highlighting which files have the most uncovered code (ffserver.c currently tops that chart). The second sheet has files for which no stats were generated. The third sheet has “problems”: files that were rejected by my ad-hoc script.

Here’s a link to the data in CSV if you want to play with it yourself.

Using gcov with FFmpeg