Monthly Archives: December 2008

32-bit Shuffling

I decided to put that cross compiling effort on hold. I can't get any cross compilers compiled and, even if I could, it seems like a silly effort without any special hardware to test on, or until I budget time to figure out how qemu works. (Though I am still pondering a MIPS-based laptop; if anyone knows where to find an Alpha-400 or Razorbook or any of the other dozen names it's marketed under, for cheaper than Geeks.com sells them, let me know.)

However, I am always reorganizing, always shuffling things around. When I got a Mac Mini in the first month of this year, I only meant for it to run x86_64 FATE cycles in a VMware Fusion session, and maybe native Mac OS X cycles. Now, the little box serves as my primary home desktop; I have fully migrated off of my old Windows XP desktop. So every time there was new code in FFmpeg SVN, the Mac Mini would run FATE cycles for x86_32/Linux, x86_64/Linux, and Mac OS X, all in parallel — and there are only 2 CPU cores and 3 GB of memory in play here. Things could slow down during primetime.

After I migrated to the Mac Mini as my primary desktop, I completely decommissioned and stowed the old WinXP box (which has a little more overall processing power than the Mac Mini). It didn’t take long before I realized that I should slap 64-bit Linux on the thing and put it back into service as a dedicated FATE box.

I quickly migrated all the x86_64 configurations over to the new box, thus easing the load on the Mac Mini. However, I think it would be useful to migrate the x86_32 configurations over as well. x86_64 is alleged to be able to run x86_32 binaries as well, just as long as any dependent 32-bit libraries are installed.

So the goal is to get this x86_64 Linux box building 32-bit binaries. Things I have tried in order to achieve this end:

  • I installed the libc6-dev-i386 package on this Ubuntu-based machine; I understand that's a prerequisite for building (and running) basic 32-bit binaries.
  • As a baseline, I tried getting the native gcc compiler (4.3.2) to build a 32-bit binary (see the sanity-check sketch after this list).
  • I have a bunch of compilers already installed on the 64-bit machine that I copied wholesale from the old x86_64 VMware session. I tried to convince them to compile 32-bit binaries.
  • I also have a bunch of gcc versions sitting on the x86_32 VMware session. Armed with the knowledge that x86_64 machines allegedly run x86_32 binaries, I copied those directories wholesale to the new machine.
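
Before getting into FFmpeg proper, the baseline case amounts to asking whether the native gcc can emit and run a 32-bit binary at all. Here is a minimal probe, in the same Python spirit as fate-script.py but not part of it; the file names and structure are mine:

    # A quick sanity check (my own sketch, not part of fate-script.py): can
    # the native gcc produce a runnable 32-bit binary on this 64-bit box?
    import os
    import subprocess
    import tempfile

    C_SOURCE = 'int main(void) { return 0; }\n'

    def can_build_32bit(cc='gcc'):
        tmpdir = tempfile.mkdtemp()
        src = os.path.join(tmpdir, 'check.c')
        exe = os.path.join(tmpdir, 'check32')
        open(src, 'w').write(C_SOURCE)
        # -m32 asks gcc for 32-bit output; this only has a chance of working
        # if the 32-bit libc development files (libc6-dev-i386) are in place.
        if subprocess.call([cc, '-m32', src, '-o', exe]) != 0:
            return False
        # If it compiled, make sure the result actually runs on this kernel.
        return subprocess.call([exe]) == 0

    if __name__ == '__main__':
        print('32-bit build works: %s' % can_build_32bit())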

The short story is that nothing worked. At the very least, I figured out that x86_32 is not a suitable arch to specify to FFmpeg's configure script; these are the suitable x86_32 strings: "i386|i486|i586|i686|i86pc|BePC". But that only goes so far toward solving the problem. Running the compilers that were themselves compiled as 32-bit binaries makes ld segfault during FFmpeg's configure checks.

The transplanted 64-bit compilers failed during configuration due to a failure to locate a suitable libgcc.a. I likely disabled multilib explicitly (--disable-multilib) when building them because… probably because it's the only way they would compile. I'm pretty sure that multilib in this context pertains to building a 64-bit compiler that can spit out both 32- and 64-bit binaries. But I can never get 64-bit gcc to build with multilib. And if you google for the error message in question (something about not finding gnu/stubs-32.h), you will just find pages upon pages of forum posts from people who are trying to compile gcc on 64-bit platforms and who eventually arrive at the solution of, you guessed it, configuring with --disable-multilib. After all, who really needs or wants to compile 32-bit binaries on a 64-bit machine?
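
For the record, the 32-bit configuration I keep trying to drive from the 64-bit box looks roughly like this; the flag combination is my best guess at the moment, not a verified recipe:

    # Roughly the FFmpeg configure invocation for a 32-bit build on the
    # 64-bit box, as the FATE driver would issue it; the exact flags here
    # are my best guess and have not been proven to work yet.
    import subprocess

    CONFIGURE_CMD = [
        './configure',
        '--arch=i686',            # one of the strings configure accepts as x86_32
        '--cc=gcc -m32',          # ask the native 64-bit gcc for 32-bit output
        '--extra-ldflags=-m32',   # keep the link step 32-bit as well
    ]

    def configure_32bit(ffmpeg_source_dir):
        # Returns True if configure completes; the build and the FATE tests
        # would follow from here.
        return subprocess.call(CONFIGURE_CMD, cwd=ffmpeg_source_dir) == 0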

The native compiler solution got the farthest, but it bombed out on an inline assembly error related to H.264. That failure reminded me of another concern I had about the whole process: is the Makefile set up to compile raw ASM files correctly via YASM? (I know that YASM != inline ASM, but the two topics are tangentially related.)

Sometime back in the early days of FATE, someone asked why I didn’t run the 32- and 64-bit FATE configurations from the same 64-bit machine. Is this a good enough answer?

Unassuming Make

I made a few changes to the FATE script tonight. The {MAKETEST} substitution is now configurable from the fateconfig.py file; can you believe that not all systems have GNU make as the default 'make' command, and that some dare to put it in locations other than the one I originally assumed and hardcoded? The script now makes it easier to deal with those deviant outliers. Also, if you have been running the FATE script, wipe out your old source/ directory before running this new version. Otherwise, the script will complain about a repository mismatch, because I finally updated the SVN strings to point to ffmpeg.org rather than mplayerhq.hu.
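
Back on the make front: if your GNU make lives somewhere unusual, the relevant fateconfig.py entry ends up looking something like the following. The variable name below is only illustrative, so defer to fateconfig.py-example for the real spelling:

    # fateconfig.py (excerpt): the value that gets substituted for {MAKETEST}
    # in the command templates; the variable name shown here is illustrative.
    MAKE_TEST_COMMAND = '/usr/local/bin/gmake test'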

Cross Compiled FATE

I have been considering the idea of adding gcc cross compilers to FATE. At first, I just want to try compiling some binaries to make sure the builds stay working; testing may come later via qemu or physical hardware.

There was once a time when I was reasonably competent at setting up cross compiling toolchains, back when I was developing software for the Sega Dreamcast on a hobby basis (SH-4 and ARM toolchains). But I seem to have lost the skill somewhere along the line. Fundamentally, it involves configuring GNU binutils with a --target other than the default, native platform. The trouble is figuring out exactly what the target should be named. I recently tried to set up a toolchain for MIPS, just in case I should come into possession of a laptop with such a CPU. I couldn't figure out if I needed a mips-elf target, or a mips32-elf target, or perhaps a mips32-linux-elf target. Nothing I tried worked.
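
If I ever revisit this, the brute-force approach would be to let a small script try each candidate triplet against the binutils configure script and see which ones it accepts. The paths and the candidate list below are guesses, not a known-good recipe:

    # Probe a few candidate --target triplets against binutils' configure;
    # the source path and the candidate spellings are placeholders/guesses.
    import os
    import subprocess

    BINUTILS_SRC = '/path/to/binutils-2.19'
    CANDIDATES = ['mips-elf', 'mips32-elf', 'mips-linux-gnu', 'mipsel-linux-gnu']

    def try_target(target):
        build_dir = 'build-%s' % target
        if not os.path.isdir(build_dir):
            os.mkdir(build_dir)
        cmd = [os.path.join(BINUTILS_SRC, 'configure'),
               '--target=%s' % target,
               '--prefix=/opt/cross/%s' % target,
               '--disable-nls']
        return subprocess.call(cmd, cwd=build_dir) == 0

    for target in CANDIDATES:
        status = 'configure ok' if try_target(target) else 'failed'
        print('%s: %s' % (target, status))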

Maybe I just don't have the right targets. What would be some good, useful, cross-compiled targets to build continuously with FATE? At a minimum, I suspect the list should include all of the targets for which FFmpeg has SIMD optimizations: Alpha, ARM, Blackfin, PS2-MIPS, SH-4, and SPARC.

FATE’s Further Evolution

I have gotten a lot of good feedback about FATE since I released the core fate-script.py program last week. I have posted a new version of fate-script.py and its config file, fateconfig.py-example, that includes a few new features:

  • The config file now has a NICE_LEVEL option which, when set to a numeric value, will re-nice the script to a nicer level. This is in consideration of certain testers who are trying to obtain permission to run FATE continuously on shared systems.
  • Setting LD_LIBRARY_PATH used to be an explicit part of the script. It is now user-configurable (well, it's open source, so it's always configurable; it's just more easily configurable now) through the config file. This was added because Windows targets do not honor LD_LIBRARY_PATH. This is one more step on the path to getting Cygwin/MinGW configurations into FATE. (A sketch of both new options appears after this list.)
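
To give a concrete idea of what these look like, here is a sketch of the two settings and how the script might apply them. NICE_LEVEL is the real option name, while the environment setting shown is only my shorthand (check fateconfig.py-example for the exact one):

    # fateconfig.py (excerpt); NICE_LEVEL is from the new release, while the
    # environment dictionary is an illustrative stand-in for the real option.
    NICE_LEVEL = 10
    EXTRA_ENVIRONMENT = {'LD_LIBRARY_PATH': '/path/to/ffmpeg/install/lib'}

    # Inside the script, applying those settings could be as simple as:
    import os

    if NICE_LEVEL is not None:
        # os.nice takes an increment; a positive value lowers our priority,
        # which is the polite thing to do on shared machines.
        os.nice(NICE_LEVEL)
    # Environment changes are inherited by every build/test subprocess.
    os.environ.update(EXTRA_ENVIRONMENT)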

Further, I fixed a bug with the timeout killer in the FATE script. Well, “fix” is a strong word (“wrongheaded hack” is more accurate). But the end result is that FATE will honor the individual test spec timeouts in order to guard against infinite loops that may creep into SVN.
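
For anyone curious what the hack amounts to, the general shape is to launch each test, arm a timer for that test's timeout, and kill the whole process group if the timer fires first. This is a sketch of the idea rather than the literal code in fate-script.py:

    # Sketch of a per-test timeout guard (Unix-specific): run the test in its
    # own process group and SIGKILL the group if it outlives its timeout.
    import os
    import signal
    import subprocess
    import threading

    def run_with_timeout(command, timeout_seconds):
        proc = subprocess.Popen(command, shell=True, preexec_fn=os.setsid)

        def kill_it():
            try:
                # Kill the whole group so any child processes die too.
                os.killpg(proc.pid, signal.SIGKILL)
            except OSError:
                pass   # the test finished just before the timer fired

        timer = threading.Timer(timeout_seconds, kill_it)
        timer.start()
        returncode = proc.wait()
        timer.cancel()
        # A negative return code means the process died from a signal,
        # which is how a timed-out test shows up to the caller.
        return returncode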