Breaking Eggs And Making Omelettes

Topics On Multimedia Technology and Reverse Engineering


Archives:

Growing Pains Of FATE

February 12th, 2008 by Multimedia Mike

When I upgraded my web hosting plan last summer from 800 MB of online storage to 1/2 TB, I wondered what I could possibly use all that extra space for. The FATE Server is stepping up to the task and presently — somehow — occupies 1/2 GB of space. This is not a problem in and of itself since it’s only 1/1000 of my total allotment. However, it makes a regular, responsible backup schedule difficult to keep. I have toyed with the idea of hosting the database operation on my own hardware and bandwidth. I’m pretty sure I’m the primary user of this database anyway. Having the database under local authority would also likely allow for greater flexibility and configurability for the underlying engine.

As always, I have plans to add many, many, many more tests. There are various public MPEG conformance suites for different codecs, each consisting of tens or hundreds of samples. There is FFmpeg’s internal regression suite, which ought to be run and verified for each build. By my accounting, ‘./ffmpeg_g’ is invoked over 300 times when running ‘make test’; I suspect each of those invocations would be a separate test in the database. Whenever I think of getting down to it and starting to enter individual test specs into the database using my custom PHP tool, I always step back, glance at the magnitude of the task, and instead start outlining a script that will automatically process the test series for me, with fewer mistakes.
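The automation I have in mind might start as something like the following sketch: scrape the ‘./ffmpeg_g’ invocations out of a ‘make test’ log and turn each one into a candidate test spec. The ID naming scheme here is purely hypothetical; real specs would need descriptive names.

```python
def extract_test_specs(make_test_log):
    """Pull each './ffmpeg_g ...' invocation out of a 'make test' log
    and pair it with a generated test ID (hypothetical naming scheme)."""
    specs = []
    for line in make_test_log.splitlines():
        line = line.strip()
        if line.startswith('./ffmpeg_g'):
            specs.append(('regression-%03d' % (len(specs) + 1), line))
    return specs

# A stand-in log fragment; real 'make test' output is far noisier.
sample_log = """\
./ffmpeg_g -y -i tests/data/a.avi tests/data/out.mpg
some unrelated make output
./ffmpeg_g -y -i tests/data/b.wav tests/data/out.wav
"""

for test_id, command in extract_test_specs(sample_log):
    print(test_id, command)
```

Each resulting (ID, command) pair could then be fed to the same code path the PHP tool uses for manual entry.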

However, there are yet more problems. There are only 76 tests currently. Logging the 76 individual test result records nominally takes 10-15 seconds. Yes, I use one MySQL connection, but with 76 separate INSERT queries. It would probably be more efficient to concatenate them into one INSERT query with 76 records. However, it would probably be even better to parameterize the data, compress it, and POST it via HTTP to a custom CGI script on the server that could uncompress it and perform the INSERT locally, which ought to be more efficient still. This would solve the firewall and library problems outlined in a previous post and allow for more diverse platform expansion in the future.
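Both ideas can be sketched quickly. The first function below collapses the per-test rows into a single multi-row INSERT; the second serializes and compresses the same data for a single HTTP POST, leaving the decompression and INSERT to a script on the server. The table and column names are invented for illustration, and real code would use parameterized queries rather than string formatting.

```python
import json
import zlib

def build_bulk_insert(results):
    """Collapse (build, test, status) rows into one multi-row INSERT.
    Table/column names are hypothetical; real code should use
    parameterized queries, not string interpolation."""
    values = ", ".join(
        "(%d, %d, '%s')" % (build_id, test_id, status)
        for build_id, test_id, status in results
    )
    return "INSERT INTO test_result (build, test, status) VALUES %s;" % values

def pack_for_post(results):
    """Serialize and compress results into a payload suitable for a
    single HTTP POST to a server-side script."""
    return zlib.compress(json.dumps(results).encode())

# 76 pretend results for one build
results = [(101, test_id, 'pass') for test_id in range(1, 77)]
print(len(build_bulk_insert(results)), len(pack_for_post(results)))
```

Either way, the client makes one round trip instead of 76, which is where most of the 10-15 seconds presumably goes.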

Finally, I also need to be able to test tests before deploying them. That’s right — test tests. I.e., enter a new test, or a series of new tests, into a staging area and be able to run a special script to verify that I got all the basic details right, such as sample paths and FFmpeg command line parameters. None of this nonsense about dumping in a new test spec and waiting until the next SVN commit to see if I got it all correct. Or worse yet, artificially starting a new build/test cycle with a document update SVN commit. Out of all the problems examined in this post, this should be the easiest to take care of.
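A vetting script along these lines would catch the most common mistakes before a spec leaves staging. The spec layout (a dict with ‘sample’ and ‘command’ keys) is my own invention for this sketch, not the actual FATE schema.

```python
import os
import shlex

def vet_test_spec(spec, sample_root='.'):
    """Sanity-check a staged test spec before it goes live.
    Returns a list of problems; an empty list means the spec looks OK.
    The spec layout here is hypothetical."""
    problems = []
    # the command line must at least tokenize cleanly
    try:
        shlex.split(spec['command'])
    except ValueError as err:
        problems.append('unparseable command: %s' % err)
    # the referenced sample file must actually exist
    sample = os.path.join(sample_root, spec['sample'])
    if not os.path.exists(sample):
        problems.append('missing sample: ' + sample)
    # the command should actually operate on the declared sample
    if spec['sample'] not in spec['command']:
        problems.append('command does not reference the sample')
    return problems

spec = {'sample': 'samples/h264/test.264',
        'command': './ffmpeg_g -i samples/h264/test.264 -f framecrc -'}
for problem in vet_test_spec(spec):
    print(problem)
```

Running this against the staging area on every new spec would replace the current wait-for-the-next-SVN-commit feedback loop with an immediate answer.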

Thanks for putting up with yet another edition of Multimedia Mike’s research notepad.

Posted in FATE Server | 2 Comments »

2 Responses

  1. Ramiro Polla Says:

    Why is FATE not listed in the main multimedia.cx website?
    Intentional or forgotten?

  2. Multimedia Mike Says:

    A little of both, I suppose. I guess I don’t consider it a major feature yet. Plus, I never think of the main multimedia.cx page as a major gateway to the rest of the site. I’ll find a place for it, though.