
Implementing The RPC Idea

About that RPC-based distributed test staging idea I brainstormed yesterday, I’ll have you know that I successfully implemented the thing today. I used the fraps-v4 test spec for verification because it is known to work correctly right now, and because it only has 7 lines of stdout text. This is what the script looks like in action:

$ ./rpc-dist-test.py "FFMPEG -i 
  SAMPLES_PATH/fraps/WoW_2006-11-03_14-58-17-19-nosound-partial.avi 
  -f framecrc -" 
asking for results from 12 configurations...
testing config 0: Linux / x86_64 / gcc 4.0.4
testing config 1: Linux / x86_64 / gcc 4.1.2
testing config 2: Linux / x86_64 / gcc 4.2.4
testing config 3: Linux / x86_64 / gcc 4.3.2
testing config 4: Linux / x86_64 / gcc svn
testing config 5: Linux / PPC / gcc 4.0.4
testing config 6: Linux / PPC / gcc 4.1.2
testing config 7: Linux / PPC / gcc 4.2.4
testing config 8: Linux / PPC / gcc 4.3.2
testing config 9: Linux / PPC / gcc svn
testing config 10: Mac OS X / x86_32 / gcc 4.0.1
testing config 11: Mac OS X / x86_64 / gcc 4.0.1

1 configuration(s) failed
  configuration Mac OS X / x86_32 / gcc 4.0.1 returned status 133

There was 1 unique stdout blob collected
all successful configurations agreed on this stdout blob:
0, 0, 491520, 0x68ff12c0
0, 3000, 491520, 0x22d36f0d
0, 6000, 491520, 0xce6f877d
0, 9000, 491520, 0x85d6744c
0, 12000, 491520, 0x1aa85794
0, 15000, 491520, 0x528d1274
0, 18000, 491520, 0x357ec61c


RPC-Based Distributed Test Staging

FATE needs to have more tests. A lot more tests. It has a little over 200 test specs right now and that only covers a fraction of FFmpeg‘s total functionality, not nearly enough to establish confidence for an FFmpeg release.

Here’s the big problem: It’s a really tedious process to introduce a new test into the suite. Sure, I sometimes write special scripts that do the busywork for me for a large set of known conformance samples. But my record for entering tests manually seems to be a whopping 11 test specs in one evening.

The manual process works something like this:

  • Given a sample that I think is suitable to test a certain code path in FFmpeg, place the sample in a shared location where my various FATE installations can reach it.
  • Get the recent FFmpeg source from SVN (in repositories separate from where FATE keeps its code) and compile it on each platform, using whichever compiler I feel like for each.
  • On a platform that has SDL installed, run the sample through ffplay to verify that the data at least sort of looks and sounds correct (e.g., nothing obviously wrong like swapped color planes or static for audio).
  • Run a command which outputs CRC data per the ‘-f framecrc’ output target.
  • Visually compare the CRC data (at least the first and last lines) to verify that the output is consistent across a few platforms (say, PPC, x86_32, and x86_64).
  • Finally, go through the process of writing up the test in my FATE administration panel.

I’m constantly thinking about ways to improve processes, particularly processes as tortuously tedious as this. The process has already seen a good deal of improvement (before making a basic web admin form, I had to add and edit the test specs from a MySQL console). I intend to address the inadequacy of the basic web form at a later date when I hopefully revise the entire web presentation. What I want to do in the shorter term is address the pain of verifying consistent output across platforms.

I got the idea that it would be nice to be able to ask a FATE installation — remotely — to run a test and pass back the framecrc output. This way, I could have one computer ask several others to run a test and quickly determine if all the machines agree on the output. But would I have to write a special server to make this possible? Sounds like a moderate amount of work. Wait, what about just SSH’ing into a remote machine and running the test? Okay, but would I still have to recompile the source code to make sure the FFmpeg binary exists? No, if these are FATE installations, they are constantly building FFmpeg day and night. Just be sure to save off a copy of the ‘ffmpeg’ binary and its shared libraries in a safe place. But where would such saving take place? Should I implement a post-processing facility in fate-script.py to be executed after a build/test cycle? That shouldn’t be necessary; just copy off the relevant binaries at the end of a successful build mega-command.

So the pitch is to modify all of my FATE configurations to copy ‘ffmpeg’ and its 4 .so files to a safe place. As a bonus, I can store the latest builds for all configurations; e.g., my x86_32 installation will have 8 different copies, one for each of the supported compilers. The next piece of the plan is a Python script. Create a configuration file that is itself a Python file containing a data structure that maps out all the configurations, the machines they live on, the directory where their latest binaries live, and where they can find the shared samples. The main Python script takes an argument in the form of (with quotes) “FFMPEG_BIN -i SAMPLES_PATH/sample.avi -an -t 3 -f framecrc -“, iterates through the configurations, builds SSH remote calls by substituting the right paths into the command line, and processes the returned output.
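To make that shape concrete, here is a minimal sketch of what the configuration data and the SSH dispatch could look like. Everything in it (the field names, hostnames, and paths) is invented for illustration; it is not the actual contents of my config file:

  import subprocess

  # hypothetical configuration data; field names and paths are
  # illustrative, not FATE's actual schema
  CONFIGS = [
      {
          "name":    "Linux / x86_64 / gcc 4.3.2",
          "host":    "fate-x86-64",                  # reachable via SSH
          "bindir":  "/home/fate/latest/gcc-4.3.2",  # saved ffmpeg + .so files
          "samples": "/shared/samples",
      },
      # ... one entry per configuration ...
  ]

  def run_on_config(config, template):
      # substitute this machine's paths into the command template
      command = template.replace("FFMPEG_BIN", config["bindir"] + "/ffmpeg")
      command = command.replace("SAMPLES_PATH", config["samples"])
      # point LD_LIBRARY_PATH at the saved .so files so the saved
      # binary finds its matching shared libraries
      remote = "LD_LIBRARY_PATH=%s %s" % (config["bindir"], command)
      proc = subprocess.Popen(["ssh", config["host"], remote],
                              stdout=subprocess.PIPE)
      stdout, _ = proc.communicate()
      return (proc.returncode, stdout)

Iterating run_on_config() over CONFIGS would yield exactly the per-configuration status and stdout that the session at the top of this page reports.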

Simple! Well, one part that I’m not sure about is exactly how to parse the output. I think I might use the whole of the returned stdout string as a dictionary key that maps to an array of configurations. If the dictionary winds up with only one key in the end, that means that all the configurations agreed on the output; add a new test spec!
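In sketch form, with made-up sample results (the tuples below are purely illustrative):

  # stdout-keyed dictionary sketch; the sample results are made up
  results = [
      ("Linux / x86_64 / gcc 4.3.2",    0,   "0, 0, 491520, 0x68ff12c0\n"),
      ("Linux / PPC / gcc 4.3.2",       0,   "0, 0, 491520, 0x68ff12c0\n"),
      ("Mac OS X / x86_32 / gcc 4.0.1", 133, ""),  # a failed run
  ]

  blobs = {}  # whole stdout string -> configurations that produced it
  for (config, status, stdout) in results:
      if status != 0:
          continue  # failed configurations do not get a vote
      blobs.setdefault(stdout, []).append(config)

  if len(blobs) == 1:
      print("all successful configurations agreed on this stdout blob")
  else:
      print("%d unique stdout blobs collected; investigate" % len(blobs))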

Thanks for sitting through another of my brainstorming sessions.


32-bit Shuffling

I decided to put that cross-compiling effort on hold. I can’t get any cross compilers compiled and even if I could, it seems like a silly effort without any special hardware to test, or until I budget time to figure out how QEMU works. (Though I am still pondering a MIPS-based laptop; if anyone knows where to find an Alpha-400 or Razorbook or any of the other dozen names it’s marketed under, for cheaper than Geek’s.com sells them, let me know.)

However, I am always reorganizing, always shuffling things around. When I got a Mac Mini in the first month of this year, I only meant for it to run x86_64 FATE cycles in a VMware Fusion session, and maybe native Mac OS X cycles. Now, the little box serves as my primary home desktop; I have fully migrated off of my old Windows XP desktop. So every time there was new code in FFmpeg SVN, the Mac Mini would run FATE cycles for x86_32/Linux, x86_64/Linux, and Mac OS X, all in parallel — and there are only 2 CPU cores and 3 GB of memory in play here. Things could slow down during primetime.

After I migrated to the Mac Mini as my primary desktop, I completely decommissioned and stowed the old WinXP box (which has a little more overall processing power than the Mac Mini). It didn’t take long before I realized that I should slap 64-bit Linux on the thing and put it back into service as a dedicated FATE box.

I quickly migrated all the x86_64 configurations over to the new box, thus easing the load on the Mac Mini. However, I think it would be useful to migrate the x86_32 configurations over as well. x86_64 is alleged to be able to run x86_32 binaries as well, just as long as any dependent 32-bit libraries are installed.

So the goal is to get this x86_64 Linux box building 32-bit binaries. Things I have tried in order to achieve this end:

  • I installed the libc6-dev-i386 package on this Ubuntu-based machine; I understand that’s crucial to running basic 32-bit binaries.
  • As a baseline, I tried getting the native gcc compiler (4.3.2) to build a 32-bit binary.
  • I have a bunch of compilers already installed on the 64-bit machine that I copied wholesale from the old x86_64 VMware session. I tried to convince them to compile 32-bit binaries.
  • I also have a bunch of gcc versions sitting on the x86_32 VMware session. Armed with the knowledge that x86_64 machines allegedly run x86_32 binaries, I copied those directories wholesale to the new machine.

The short story is that nothing worked. At the very least, I figured out that x86_32 is not a suitable arch to specify to FFmpeg’s configure script; these are the suitable x86_32 strings: “i386|i486|i586|i686|i86pc|BePC”. But that only goes so far toward solving the problem. Running the 32-bit-compiled compilers makes ld segfault during FFmpeg’s detection. The transplanted 64-bit compilers failed during configuration due to a failure to locate a suitable libgcc.a. I likely explicitly disabled multilib (--disable-multilib) when building them because… probably because it’s the only way they would compile. I’m pretty sure that multilib in this context pertains to building a 64-bit compiler that can spit out both 32- and 64-bit binaries. But I can never get 64-bit gcc to build with multilib. And if you google for the error message in question — something about not finding gnu/stubs-32.h — you will just find pages upon pages of forum posts from people who are trying to compile gcc on 64-bit platforms and who eventually arrive at the solution to — you guessed it — configure with --disable-multilib. After all, who really wants to compile 32-bit binaries on a 64-bit machine?
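If nothing else, this mess could be scripted: a quick checker that asks each installed compiler whether it can produce a working 32-bit binary at all. A rough sketch, assuming gcc-style compilers that accept -m32 (the compiler names at the bottom are just examples):

  import os
  import subprocess
  import tempfile

  TEST_PROGRAM = "int main(void) { return 0; }\n"

  def can_build_32bit(cc):
      # write a trivial C program to a temporary file
      src = tempfile.NamedTemporaryFile(mode="w", suffix=".c", delete=False)
      src.write(TEST_PROGRAM)
      src.close()
      binary = src.name + ".bin"
      try:
          try:
              # compile with -m32; a missing gnu/stubs-32.h or 32-bit
              # libgcc.a shows up as a failure right here
              rc = subprocess.call([cc, "-m32", src.name, "-o", binary])
          except OSError:
              return False  # compiler binary not found
          if rc != 0:
              return False
          # also make sure the resulting 32-bit binary actually runs
          return subprocess.call([binary]) == 0
      finally:
          os.unlink(src.name)
          if os.path.exists(binary):
              os.unlink(binary)

  for cc in ["gcc", "gcc-4.2", "gcc-4.3"]:  # example compiler names
      print("%s: %s" % (cc, can_build_32bit(cc)))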

The native compiler solution got the farthest, but that bombed out on an inline assembly error related to H.264. This was another concern I had about the whole process: is the Makefile set up to compile raw ASM files correctly via YASM? (I know that YASM != inline ASM, but the 2 topics are tangentially related.)

Sometime back in the early days of FATE, someone asked why I didn’t run the 32- and 64-bit FATE configurations from the same 64-bit machine. Is this a good enough answer?

Unassuming Make

I made a few changes to the FATE script tonight: the {MAKETEST} substitution is now configurable from the fateconfig.py file. Can you believe that not all systems have GNU make as the default ‘make’ command, and that some dare to put it in locations other than what I originally assumed and hardcoded? The script now makes it easier to deal with those deviant outliers. Also, if you have been running the FATE script, wipe out your old source/ directory before running this new version. Otherwise, it will complain about a repository mismatch because I finally updated the SVN strings to point to ffmpeg.org rather than mplayerhq.hu.
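For the curious, the {MAKETEST} change amounts to an ordinary variable in the Python config file. A hypothetical sketch (the variable name is my own placeholder, not necessarily what fateconfig.py actually calls it):

  # in fateconfig.py: tell the {MAKETEST} substitution where GNU make
  # lives on this particular system (variable name is a placeholder)
  GNU_MAKE = "/usr/local/bin/gmake"  # e.g., on a BSD-flavored system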