Thanks to Daniel Verkamp for contributing a Kega video (KGV1) decoder to FFmpeg. I was about to demand samples for testing until I looked up what Kega is (a Sega game console emulator) and realized that it would be more fun to create my own (be advised that only the Windows version of Kega presently supports the AVI encoding option). Then I looked up the Wiki page and saw that there is, in fact, already one sample on record at the archive. Well, I went ahead and made my own sample anyway: the mountainside attract-mode scene from the Genesis game Strider, one of my favorite sequences in any video game. It’s in the samples directory.
I am holding off on adding a FATE test; there’s still an endian issue (PPC configs disagree with x86 configs). I’m also a little puzzled as to why FFplay insists on playing the video at 320×240 even though it is encoded as 640×480. For that matter, I’m bewildered trying to understand why Kega renders video at 640×480 by default; that’s not a native resolution for any of its emulated consoles.
I get the same output on PPC (Mans’ box) and x86 with the following command:
ffmpeg -i ../samples/strider-attract-mountain.avi -pix_fmt rgb24 -f framecrc - | md5sum
The “-pix_fmt rgb24” is necessary because the decoder outputs rgb555 in native endianness.
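To spell out why that matters: a 16-bit rgb555 pixel is stored as a different byte sequence on little-endian and big-endian machines, so a byte-level checksum of the raw frames can never agree between PPC and x86, while 3-byte-per-pixel rgb24 has no byte-order ambiguity. Here is a from-scratch sketch of the rgb555-to-rgb24 expansion (not FFmpeg’s actual converter, just the idea):

import struct

def rgb555_to_rgb24(frame):
    """Expand native-endian rgb555 pixels to byte-order-free rgb24."""
    out = bytearray()
    for (px,) in struct.iter_unpack('=H', frame):  # '=H' reads host byte order
        r = (px >> 10) & 0x1f
        g = (px >> 5) & 0x1f
        b = px & 0x1f
        # replicate the high bits downward to fill the full 8-bit range
        out += bytes(((r << 3) | (r >> 2),
                      (g << 3) | (g >> 2),
                      (b << 3) | (b >> 2)))
    return bytes(out)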
I know that pix_fmt option well; it’s the only way to get a variety of platforms to agree on the output for various RGB codecs. But it still doesn’t work. Using my tool to test all of my configs (a sketch of what the tool does follows the results):
$ rpc-dist-test.py "{MD5} FFMPEG -i SAMPLES_PATH/strider-attract-mountain.avi -pix_fmt rgb24 -f framecrc -"
There were 2 unique stdout blobs collected
The following configurations agreed on the same stdout:
Linux / PPC / gcc 4.0.4
Linux / PPC / gcc 4.1.2
Linux / PPC / gcc 4.2.4
Linux / PPC / gcc 4.3.4
Linux / PPC / gcc 4.4.2
Linux / PPC / gcc svn
stdout:
da9d1c8a4af2022c752fcfa3f6383cc8
The following configurations agreed on the same stdout:
Linux / x86_32 / llvm-svn
Linux / x86_32 / icc 11.1
Linux / x86_32 / icc 11.0
Linux / x86_32 / icc 10.1
Linux / x86_32 / gcc 2.95.3
Linux / x86_32 / gcc 3.4.6
Linux / x86_32 / gcc 4.0.4
Linux / x86_32 / gcc 4.1.2
Linux / x86_32 / gcc 4.2.4
Linux / x86_32 / gcc 4.3.4
Linux / x86_32 / gcc 4.4.2
Linux / x86_64 / llvm-svn
Linux / x86_64 / icc 11.1
Linux / x86_64 / icc 11.0
Linux / x86_64 / icc 10.1
Linux / x86_64 / gcc 4.0.4
Linux / x86_64 / gcc 4.1.2
Linux / x86_64 / gcc 4.2.4
Linux / x86_64 / gcc 4.3.4
Linux / x86_64 / gcc 4.4.3
Linux / x86_64 / gcc svn
stdout:
1b5fe728b8f0323339a8c84480cbf629
linux / m68k / gcc-4.3.3
linux / mipsel / gcc-4.1.2
linux / sparc-v8 / gcc-4.1.2
1b5fe728b8f0323339a8c84480cbf629 everywhere
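For reference, the core idea of a tool like rpc-dist-test.py is to run the same command on every configuration and bucket the configurations by a digest of their stdout; any disagreement shows up as more than one bucket. This is a simplified local sketch, not the actual script (which dispatches the command to remote boxes):

import hashlib
import subprocess
from collections import defaultdict

def bucket_by_stdout(commands):
    """Run each labeled command line and group the labels by the MD5
    of the bytes the command wrote to stdout."""
    buckets = defaultdict(list)
    for label, cmd in commands.items():
        stdout = subprocess.run(cmd, shell=True, capture_output=True).stdout
        buckets[hashlib.md5(stdout).hexdigest()].append(label)
    return buckets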
Just reproduced Mike’s output. The problem is that the AltiVec scaler is not bit-exact; “-sws_flags +bitexact” should fix it.
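Applied to the command from above, that would be:

ffmpeg -i ../samples/strider-attract-mountain.avi -sws_flags +bitexact -pix_fmt rgb24 -f framecrc - | md5sum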
I don’t have any PPC hardware to play around on currently, so I’m glad somebody could debug this.
For the terminally curious, my sample while testing was the opening clip of Zero Wing. :)
I have no idea why the original codec outputs 640×480; it seems to encode either 320×240 or one other resolution (don’t remember exactly at the moment) and scale the frames up to 640×480 internally before passing off the decoded image.
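To illustrate the kind of internal upscale being described, here is a from-scratch sketch (not the codec’s actual code) of a 2x point scale, which just repeats every pixel and every scanline:

def upscale_2x(rows):
    """Nearest-neighbor 2x upscale, e.g. 320x240 in, 640x480 out.
    `rows` is a list of scanlines, each a list of pixel values."""
    out = []
    for row in rows:
        doubled = [px for px in row for _ in range(2)]  # repeat each pixel
        out.append(doubled)
        out.append(list(doubled))                       # and each scanline
    return out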
About the scaling to 640×480: it’s done for numerous reasons. Double-density support is one (Sonic 2’s split-screen mode runs at 640×448), and better scaling for 256×224 games is another. And it’s technically correct, since 480 is the number of scanlines visible on a real TV too.