
64-bit Mac On FATE

I really appreciate the flurry of SVN activity on FFmpeg today; it helped build confidence in my revised build/test script. FATE work items completed tonight:

  • set up a configuration to build and test 64-bit FFmpeg binaries for Mac OS X; thanks to David Conrad for schooling me on how to do this in the comments to the last post
  • reinstated the Linux/x86_32 and x86_64 configurations
  • upgraded all gcc-svn configurations to the latest version
  • disabled all gcc 4.3.1 configurations; replaced with latest and greatest 4.3.2 configurations

I still haven’t gotten Python 2.5.2 on Gentoo to import sqlite3, so the PowerPC machine is still cheerfully running the classic, less efficient build/test script.
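For the record, the usual workarounds seem to be either rebuilding Python with Gentoo’s sqlite USE flag enabled or falling back to the third-party pysqlite2 package, which exposes the same DB-API. A minimal sketch of the import fallback, assuming pysqlite2 is actually installed on the box (the database filename is just a placeholder):

    # prefer the standard-library sqlite3 module (new in Python 2.5); fall
    # back to the third-party pysqlite2 package, which provides the same API
    try:
        import sqlite3
    except ImportError:
        from pysqlite2 import dbapi2 as sqlite3

    conn = sqlite3.connect('fate-local.db')  # hypothetical database file
    print 'SQLite version:', sqlite3.sqlite_version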

I sat back and watched the various configurations rip through their build/test assignments. Then I noticed a result on the main FATE page that had passed 85/94 tests. That’s a bug (too few tests). So I did what came naturally, deleted the record, and am waiting to see if it happens again. I think I may know what the problem is, though. That’s the nice thing about being the sole developer of a system: it can actually be possible to juggle all the pieces in your head, understand all the design decisions behind them, and intuitively understand what might be wrong.

Mac OS X On FATE

Okay, FATE is back online, somewhat. First and foremost, you will notice that Mac OS X autobuilds are at the top of the page. I know it doesn’t look like much is different, but I changed a whole lot of stuff under the covers to get to that point. I have not reinstated the x86_32 or x86_64 build/test cycles for Linux yet because I would like to see this new script bake for a day or two before copying it to other systems. The PPC build/test cycles are still running because I can’t figure out how to make the new script run on that machine (hi, Gentoo!), so it gets to stick with the older, still-functional script for now.

If you will indulge a little back-patting, I’m rather pleased with how this new system is shaping up. I set out to solve one problem but wound up finding solutions and better approaches for a whole lot of other things. The new system is already faster and more resilient to intermittent network problems (which will never go away completely, no matter what availability and bandwidth guarantees we have). Going forward, I have new ideas about how to make the system easier to administer and how to allow co-administrators to help out as well. Look for more platforms on FATE in the near future, since it should be much easier for others to run the client program and automatically submit data back to the server. And it may even be possible to adapt the system for other projects.

I look forward to writing up more notes about the infrastructure changes. Most of them boil down to my new love affair with SQLite.
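To give a rough idea of why, here is a minimal sketch of the kind of local result caching that SQLite makes trivial. The table layout and column names are my own illustrative guesses, not FATE’s actual schema:

    import sqlite3

    conn = sqlite3.connect('fate-local.db')   # hypothetical local cache file
    conn.execute('''CREATE TABLE IF NOT EXISTS test_result
                    (build_id INTEGER, test_name TEXT, status INTEGER, output TEXT)''')

    def cache_result(build_id, test_name, status, output):
        # parameterized query: no manual quoting or escaping anywhere
        conn.execute('INSERT INTO test_result VALUES (?, ?, ?, ?)',
                     (build_id, test_name, status, output))
        conn.commit()

    cache_result(1, 'ffmpeg-regression-suite', 0, 'test output goes here')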

FATE Will Return

The FATE server started getting frustrating and dispiriting to maintain, so I decided to scrap it a little while ago. But I have since started to heavily revise the infrastructure so it can come back online. I have been sitting on a pile of brainstorms about how to make the system work better. Once I finally got down to implementing the changes, it sort of snowballed and I thought of even better ways for the various pieces to work together.

But this growth is not without its associated pains. I largely blame PHP for this. Whenever I have a bad day at work, I just remind myself that things could be a lot worse. For example, it could be my job to write PHP code full time. I already had plenty of gripes with the language, and this experience has added a few new ones.

There must be 7 different ways to interface with the one library I want to use, and I can’t get any of them to work. And since it all runs server-side, it’s incredibly difficult to diagnose why the server is having trouble.

PHP is hyper-paranoid about security. When you GET or POST data, the site-wide PHP setting (which I can’t change) escapes quotes and backslashes before it makes the data available to you, whether you like it or not. And I don’t like it. I really don’t want the data escaped, but I can’t turn it off. The manual states that the next version of PHP will remove this annoyance.

But there’s no point in complaining about PHP. As Jeff Atwood eloquently expressed, PHP Sucks, But It Doesn’t Matter. It still serves as the backbone of some of the most important sites on the internet. And I know I will eventually coerce PHP into being the backbone of FATE once more.

Python isn’t blameless in this either. I need a key feature that, for once, is not provided by the expansive Python standard library (even though the library handles everything else associated with this type of functionality). A few hackers around the net have attempted to fill in the missing piece but I haven’t successfully adapted their code yet.

On the plus side, I should mention that I have gotten FATE running on Mac OS X. It’s currently watching FFmpeg SVN and performing build/test cycles while saving the data locally. That was the easy part. Getting the data to the server is the troublesome part, and the issues described above are all components of that problem.
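To make the goal concrete, the general shape of that last step looks something like the sketch below: pull the cached results out of the local database and POST them to a server-side script. Everything here is a placeholder (the URL, the payload format, the bare-bones error handling), and the real thing needs exactly the missing standard-library feature mentioned above, so treat this as an outline rather than working submission code:

    import urllib
    import urllib2

    def submit_results(rows, url='http://fate.example.com/submit.php'):
        # serialize the cached rows however the server-side PHP script expects;
        # repr() is only a stand-in for the real wire format
        payload = urllib.urlencode({'results': repr(rows)})
        try:
            return urllib2.urlopen(url, payload).read()
        except urllib2.URLError:
            # leave the rows in the local SQLite cache and retry on the next cycle
            return None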

Baldur In Bulk

I got those Baldur’s Gate videos converted to something more modern. The problem turned out to be in the Interplay MVE demuxer code I wrote long ago for FFmpeg. Once upon a time, timestamps in FFmpeg were supposed to be in reference to a 90 kHz clock. Thanks to Pengvado for pointing out that my demuxer still made that assumption. Fixing the demuxer properly seems like a lot of work right now, so at this point in the exercise I opted to simply hard-code 15 fps as the framerate.
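For a sense of the numbers involved: with a 90 kHz reference clock and a 15 fps stream, consecutive frame timestamps should land 90000 / 15 = 6000 ticks apart. A throwaway calculation, with the 15 fps value being the number I hard-coded rather than anything read from the MVE files:

    # legacy FFmpeg convention: timestamps count ticks of a 90 kHz clock
    CLOCK_RATE = 90000    # ticks per second
    FRAME_RATE = 15       # the frame rate hard-coded for these MVE files
    print 'ticks per frame:', CLOCK_RATE / FRAME_RATE    # -> 6000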

So I got that transcoding process underway, finally. And I made an interesting discovery along the way. I have a colleague who has this quote on his office whiteboard:


[Image: Baldur's Gate Nietzsche quote]

I can only conclude that said colleague is a huge Baldur’s Gate fan.

Prerequisites for the transcoding operation (basic Kubuntu 8.04 virtual machine):

  • install the libfaac-dev package
  • download and manually compile YASM (x264 requires it, and the YASM package that Ubuntu ships is not bleeding edge enough)
  • download and compile the latest x264 snapshot; configure with --enable-shared
  • get the latest SVN of FFmpeg
  • configure and build FFmpeg with: configure --enable-gpl --enable-postproc --enable-avfilter --enable-avfilter-lavf --enable-swscale --enable-libx264 --enable-libfaac; I don’t really know if all the filter options are strictly necessary for this exercise but I’m used to them by now

So my process for transcoding in bulk after installing this software is:

  • use my Python script (parse-bif-graf.py, listed at the end of this post) to split the BIF resource into its constituent MVE files:
    $ parse-bif-graf.py MovieCD1.bif
    extracting file #0 at offset 132, 29654204 bytes, to 'MovieCD1.bif-0.mve'
    extracting file #1 at offset 29654336, 6530954 bytes, to 'MovieCD1.bif-1.mve'
    [...]
    
  • bulk transcode:
    # transcode every MVE file to H.264 + AAC in an MP4 container
    for mve in *.mve
    do
      ffmpeg -y -i "$mve" \
        -acodec libfaac -ab 128k \
        -vcodec libx264 -vpre hq -b 500k -bt 500k \
        "$(basename "$mve" .mve)".mp4
    done
    [...]
    

The resulting files are highly competitive, size-wise, against the original MVE files. At first, I was monkeying with the bitrate because there were some annoying artifacts in high-motion areas. But then I watched the original videos using ffplay and realized that those artifacts are present in the source material.
