Monthly Archives: November 2008

New FATE Database Format

First, I should announce that I finally fixed the problem I introduced during last month’s big FATE rework, in which the stdout/stderr blobs were not making it all the way into the MySQL database. I don’t know how that escaped notice during my initial testing. It works again, and it only took 3 evenings of analysis and debugging.

Moving right along, I have also made good on my intention to move FATE’s test database format from a pickled Python blob to an SQLite database. You can download the current database of test specs at:

http://fate.multimedia.cx/fate-tests.sqlite.bz2

Run the file through bunzip2, install SQLite (either through your system’s package manager or from sqlite.org), and run:

  $ sqlite3 fate-tests.sqlite

Be sure to use ‘sqlite3’ rather than ‘sqlite’; the latter invokes v2 of the program.

First things first, study the schema:

sqlite> .schema
CREATE TABLE test_spec
        (
          id INTEGER,
          short_name TEXT,
          command TEXT,
          description TEXT,
          expected_status INTEGER,
          expected_stdout TEXT,
          timeout_seconds INTEGER,
          active INTEGER
        );

And then formulate queries. As an example, get a list of all the tests (short name and command) that are presently disabled:

sqlite> SELECT short_name, command
   ...> FROM test_spec
   ...> WHERE active=0;

You should find that the list matches up with the red boxes in the master FATE test list.
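
If you would rather query the file programmatically, here is a minimal sketch using Python’s built-in sqlite3 module; the filename and column names come from the schema above, while everything else is just illustration:

  #!/usr/bin/env python
  # list the short name and command of every disabled FATE test
  import sqlite3

  conn = sqlite3.connect("fate-tests.sqlite")
  cursor = conn.cursor()
  cursor.execute("SELECT short_name, command FROM test_spec WHERE active = 0")
  for short_name, command in cursor.fetchall():
      print("%s: %s" % (short_name, command))
  conn.close()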

I hope some people find this useful.

SD And Me

When I got my Eee PC last December, I was exposed to 2 computer things I had never dealt with before: wireless networking and SD memory (yeah, I’m behind the curve technologically; what of it?). The Eee PC (at least the 701) is not outfitted with a great deal of storage, only a 4 GB SSD. An easy way to expand the capacity is to add an SD card. I did my homework on the technology because I saw varying prices among the many brands. Plus, they all seemed to be rated using an “X” speed. It turns out that this is the same X rating used for compact discs: 1X = 150 kilobytes/sec, the minimum speed needed to play an audio CD.
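
To put the ratings in perspective, here is a trivial sketch of the arithmetic (my own illustration; the only fact taken from above is the 150 KB/sec base rate):

  # 1X = 150 kilobytes/sec, the same base rate as CD-ROM drives
  def x_rating_to_kb_per_sec(x):
      return x * 150

  # a 133X card nominally promises 19950 KB/sec, or roughly 19.5 MB/sec
  print("133X = %d KB/sec" % x_rating_to_kb_per_sec(133))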

I settled on a 133X PNY brand 2GB SD card for my Eee PC. The Xandros-based OS recognized it right away and it seemed to work just fine… except for one minor detail: its write speed was mind-bogglingly slow, about 350 kilobytes/sec, so about 2.5X. The read speed was fine, though; I clocked it at 120X or better. This colored my perception of SD cards for much of this year (really fast to read, ludicrously slow to write), and I mainly used the SD card for storing large multimedia samples and other data that didn’t need to change often. I wondered how SD could be so widely used with such an abysmal write speed; it seems to be quite popular with cameras, after all. I decided that maybe that’s why cameras pack absurd amounts of RAM: to cache pictures until they can finally be dumped to storage. I asked other people about their experiences with SD memory, but no one had thought to profile the stuff. Maybe they were fine with the slow write speeds?

I just bought a new digital camera along with a 150X Transcend brand 2GB SD card. I couldn’t get the card to work in my camera (though the piddly 32 MB card included with the camera works fine). Before returning the card, I decided to try it in my Eee PC. It worked quite handily there, so I proceeded to copy the data off of the PNY card and onto the Transcend card so that I could perhaps use the PNY card in the camera. Large files were written to the Transcend card in no time, so I profiled it a little more carefully and saw write speeds of over 7 MB/sec. Okay, so this card definitely gets to live in my Eee PC.

But do I really want to put this slow PNY card in my camera? I decided to try the PNY one last time in the Eee PC. Suddenly, I saw write speeds on the PNY that were outrageously high compared to before. What’s going on?

One thing has changed in the interim: I have moved from the Xandros-based Eee PC Linux to Ubuntu-Eee. The only explanation I have at this point, based on the available data, is that the original OS had a really substandard SD card driver.

What have I learned from this exercise? I don’t know, maybe that I shouldn’t have such low expectations. If anyone cares about my precise methodology:

 # create a file of random garbage that is roughly 768 MB large
 $ dd if=/dev/urandom of=randomfile bs=1024 count=768000
 # this Eee PC has 512 MB of RAM, so the file cannot be entirely cached in RAM;
 # GNU dd prints the elapsed time and transfer rate when it finishes
 $ dd if=randomfile of=sdcard/randomfile

The SD cards are formatted with ext2. This methodology is a little different from the one I found on ossguy’s blog, where he profiled by reading and writing raw sectors. He did an undeniably thorough job, though, testing 5 cards against 3 different interfaces.
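
For anyone who prefers a self-contained script, here is a rough Python sketch of the same copy-and-time idea (my own illustration, not the exact commands above; the paths are placeholders):

  #!/usr/bin/env python
  # time a large sequential write to the SD card and report throughput
  import os
  import time

  SOURCE = "randomfile"       # the ~768 MB file created above
  DEST = "sdcard/randomfile"  # path on the mounted SD card

  start = time.time()
  src = open(SOURCE, "rb")
  dst = open(DEST, "wb")
  while True:
      chunk = src.read(1024 * 1024)
      if not chunk:
          break
      dst.write(chunk)
  dst.flush()
  os.fsync(dst.fileno())  # make sure the data actually reaches the card
  src.close()
  dst.close()
  elapsed = time.time() - start

  size_mb = os.path.getsize(SOURCE) / (1024.0 * 1024.0)
  print("wrote %.1f MB in %.1f sec: %.2f MB/sec" % (size_mb, elapsed, size_mb / elapsed))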

ARM Netbook In The Works

Apparently, I won’t have to revise the entire architecture of FATE in order to test FFmpeg on ARM via the Beagle Board. I have been reading about how ARM will release chips suitable for netbook devices, and how Canonical has signed on to make sure that at least one Linux distribution runs competently on said devices.

I’m excited about this for 2 reasons: I like netbooks (whereas no conventional laptop has ever managed to interest me), and I retain an innate fascination with alternate (i.e., non-x86) CPU architectures, which are often difficult to work with due to the unavailability of hardware and appropriate tools.

Theora Is Now Officially Available

Wow, it seems like only yesterday that I downloaded the newly open-sourced On2 VpVision source code package and started reverse engineering the VP3 video coding algorithm into an English-language description. Well, actually, that was nearly 7 years ago. VP3 eventually formed the basis of Xiph’s Theora video codec. And today the Theora project is pleased to announce that the codec is finally, well… final. It’s out. No more alpha/beta phases. The codec is ready for primetime use and should be conquering the digital media frontier in short order.

You know, just like Vorbis.