Category Archives: General

Running Windows XP In 2016

I have an interest in getting a 32-bit Windows XP machine up and running. I have a really good, if slightly dated and discarded, computer that seemed like a good candidate for dedicating to this task. So the question is: Can Windows XP still be installed from scratch on a computer, activated, and used in 2016? I wasn’t quite sure, since Microsoft formally ended support for Windows XP during the first half of 2014, and I wasn’t entirely clear on what that meant in practice.

Spoiler: It’s still possible to install and activate Windows XP as of this writing. It’s also possible to download and install all the updates published up until support ended.

The Candidate Computer
This computer was assembled either in late 2008 or early 2009. It was a beast at the time.


New old Windows XP computer

It was built around the newly-released NVIDIA GTX 280 video card. The case is a Thermaltake DH-101, a home theater PC enclosure. The motherboard is an Asus P5N32-SLI Premium with a Core 2 Duo X6800 2.93 GHz CPU on board, along with 2 GB of RAM and a 1.5 TB hard drive.

The original owner handed it off to me because their family didn’t have much use for it anymore (too many other machines in the house). Plus it was really, obnoxiously loud. The noisy culprit was the stock blue fan that came packaged with the Intel processor (seen in the photo) whining at around 65 dB. I replaced the fan and brought the noise level way down.

As for connectivity, the motherboard has dual gigabit NICs (with 2 different chipsets, for some reason) and onboard 802.11g wireless. I couldn’t make the latter work, and this project was taking place a significant distance from my wired network. Instead, I connected a USB 802.11ac dongle and antenna which is advertised to work in both Windows XP and Linux. It works great under Windows XP. Making the same adapter work under Linux, meanwhile, provided a retro-computing adventure in which I had to modify C code to get the driver working.

So, score 1 for Windows XP over Linux here.

The Simple Joy of Retro-computing
One thing you have to watch out for when you get into retro-computing is the urge to rant about the good old days of computing. Most long-time computer users know the frustration well: computers keep getting faster by orders of magnitude, yet using them somehow feels slower and slower over successive software generations.
Continue reading

Saying Goodbye To Old Machines

I recently sent a couple of old machines off for recycling. Both had relevance to the early days of the FATE testing effort. As is my custom, I photographed them (poorly, of course).

First, there’s the PowerPC-based Mac Mini I procured thanks to a Craigslist ad in late 2006. I had plans to develop automated FFmpeg building and testing and was already looking ahead toward testing multiple CPU architectures. Again, this was 2006, and PowerPC wasn’t completely on the outs yet: although Apple’s MacTel transition was in full swing, the entire new generation of video game consoles was based on PowerPC.


PPC Mac Mini pieces



I remember trying to find a Mac Mini PPC on Craigslist. Many were to be found, but all asked more than the price of even a new Intel Mac Mini, usually because the seller was including last year’s applications and perhaps a monitor, neither of which I needed. Fortunately, I found this bare Mac Mini. Also fortunate was the fact that it was far easier to install Linux on it than on the first PowerPC machine I owned.

After FATE operation transitioned away from me, I still kept the machine in service as an edge server and automated backup machine. That is, until the hard drive failed on reboot one day. Thus, when it was finally time to recycle the computer, I felt it necessary to disassemble the machine and remove the hard drive, first for possible data salvage and ultimately for destruction.
Continue reading

Evolution of Multimedia Fiefdoms

I want to examine how multimedia fiefdoms have risen and fallen through the years.


Medieval Castle

Back in the day, the multimedia fiefdoms were built around the formats put forth by competing companies: there was Microsoft/WMV, Apple/MOV, and Real/RM as the big contenders. On2 always wanted to be a player in this arena but could never quite catch a break. A few brave contenders held the line for open source and also for the power users who desired one application that could handle everything (my original motivation for wanting to get into multimedia hacking).

The computer desktop was the battleground for internet-based media streaming. Whatever happened to those days? Actually, if memory serves, Flash-based video streaming stepped on all of them.

Over the last 6-7 years, the battleground has expanded to cover mobile devices, where Flash’s impact has… lessened. During this time, multimedia technology pretty well standardized on a particular stack, namely, the MPEG (MP4/H.264/AAC) stack.

The belligerents in this war tried for years to effectively penetrate new territory: the living room, where the television lived. This had been slow going for years due to various user interface and content issues, but steadily improved.

Last April, Amazon announced their entry into the set-top box market with the Fire TV. That was when it suddenly crystallized for me that the multimedia ecosystem has radically shifted. Now, the multimedia fiefdoms revolve around access to content via streaming services.

Off the top of my head, here are some of the fiefdoms these days (ones I have experience using):

  • Netflix (subscription streaming)
  • Amazon (subscription, rental, and purchased streaming)
  • Hulu Plus (subscription streaming)
  • Apple (rental and purchased media)

I checked some results on Can I Stream.It? (which I refer to often) and found a bunch more streaming fiefdoms, such as Google (both Play and YouTube, which are separate services), Sony, Xbox 360, Crackle, Redbox Instant, Vudu, Target Ticket, Epix, SnagFilms, and XFINITY StreamPix. And these are just the services available in the United States; I know other geographical regions have their own fiefdoms.

What happened?

When I got into multimedia hacking, there were all these disparate, competing ecosystems. As a consumer, I didn’t care where the media came from, I just wanted to play it. That’s what inspired me to work on open source multimedia projects. Now I realize that I have the same problem 10-15 years later: there are multiple competing ecosystems. I might subscribe to fiefdoms X and Y, but am frustrated to learn that something I’d like to watch is only available through fiefdom Z. Very few of these fiefdoms can be penetrated using open source technology.

I’m not really sure about the point of this whole post. Multimedia technology seems quite standardized these days. But that’s probably just my perspective, because I have spent too long focusing on a few areas of multimedia technology, such as audio and video coding. It’s interesting that all these services probably leverage the same limited set of codecs. Their differentiation comes from the catalog of content each is able to license for streaming. There are different problems to solve in the multimedia arena now.

Visualizing Call Graphs Using Gephi

When I was at university studying computer science, I took a basic chemistry course. During an accompanying lab, the teaching assistant chatted me up and asked about my major. He then said, “Computer science? Well, that’s just typing stuff, right?”

My impulsive retort: “Sure, and chemistry is just about mixing together liquids and coming up with different colored liquids, as seen on the cover of my high school chemistry textbook, right?”


Chemistry fun

In fact, pure computer science has precious little to do with typing (as is joked in CS circles, computer science is about computers in the same way that astronomy is about telescopes). However, people who study computer science often pursue careers as programmers, or to put it in fancier professional language, software engineers.

So, what’s a software engineer’s job? Isn’t it just typing? That’s where I’ve been going with this overly long setup. After thinking about it for long enough, I like to say that a software engineer’s trade is managing complexity.

A few years ago, I discovered Gephi, an open source tool for graph and data visualization. It looked neat but I didn’t have much use for it at the time. Recently, however, I was trying to get a better handle on a large codebase. I.e., I was trying to manage the project’s complexity. And then I thought of Gephi again.

Prior Work
One way to get a grip on a large C codebase is to instrument it for profiling and extract details from the profiler. On Linux systems, this means compiling and linking the code using the -pg flag. After the instrumented executable runs, it leaves behind a gmon.out file, which is post-processed using the gprof command.

GNU software development tools have a reputation for being rather powerful and flexible, but also extremely raw. This first hit home when I was learning how to use gcov, the GNU tool for code coverage, which outputs very raw data that you need to massage with other tools in order to get really useful intelligence.

And so it is with gprof output. The output gives you a list of functions sorted by the amount of processing time spent in each. Then it gives you a flattened call tree, arranged as: “during the profiled execution, function c was called by functions a and b and called functions d, e, and f; function d was called by function c and called functions g and h”.
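
Since that text layout is regular, it can be scraped mechanically. Below is a rough Python sketch of one way to pull (caller, callee, call count) edges out of those dash-delimited call graph blocks. This is a simplified illustration, not robust tooling: it assumes the layout described above (gprof’s -q option limits output to just the call graph) and ignores recursion and cycle annotations.

import re
import sys

# A callee line below a block's primary line looks roughly like:
#     0.00    0.03    18774/18774    some_function [12]
# Capture the call count (the first number of "calls/total") and the name.
CALLEE = re.compile(r'^\s+[\d.]+\s+[\d.]+\s+(\d+)(?:/\d+)?\s+(\S+)\s+\[\d+\]')

def parse_call_graph(lines):
    """Yield (caller, callee, count) edges from gprof call graph text."""
    caller = None
    for line in lines:
        if line.startswith('-----'):
            caller = None  # a dashed separator ends the current block
        elif caller is None and line.startswith('['):
            # The primary line starts with the block's own index, "[n]",
            # and ends with "function_name [n]"; grab the function name.
            caller = line.split()[-2]
        elif caller is not None:
            m = CALLEE.match(line)
            if m:  # lines below the primary line are this function's callees
                yield caller, m.group(2), int(m.group(1))

if __name__ == '__main__':
    for caller, callee, count in parse_call_graph(open(sys.argv[1])):
        print('%s -> %s (%d calls)' % (caller, callee, count))

Point it at a file containing gprof’s call graph section and it prints one edge per line.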

How can this call tree data be represented in a more instructive manner that is easier to navigate? My first impulse (and I don’t think I’m alone in this) is to convert the gprof call tree into a representation suitable for interpretation by Graphviz. Unfortunately, doing so tends to generate some enormous and unwieldy static images.

Feeding gprof Data To Gephi
I learned of Gephi a few years ago and recalled it when I developed an interest in gaining better perspective on a large base of alien C code. The plan for understanding what that codebase does for a particular use case: instrument it with gprof, gather execution data, and then study the code paths.

How could I feed the gprof data into Gephi? Gephi supports numerous graphing formats including an XML-based format named GEXF.
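
As an aside, if you just want a tiny GEXF file to confirm that Gephi ingests your data, the Python networkx library (a separate install, and not part of my toolchain here) can write the format directly. Something like this should do:

import networkx as nx

# Build a trivial directed graph with weighted edges standing in for
# call counts; the node names here are made up for illustration.
G = nx.DiGraph()
G.add_edge('a', 'c', weight=2)
G.add_edge('b', 'c', weight=1)
nx.write_gexf(G, 'tiny.gexf')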

Thus, the challenge becomes converting gprof output to GEXF.

Which I did.
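
I won’t reproduce the whole script here, but to give a flavor of the output side, here is a minimal, hypothetical sketch of the GEXF-writing half, pairing with the parsing sketch earlier. GEXF just wraps the functions as nodes and the calls as weighted, directed edges in XML:

import sys
from xml.sax.saxutils import quoteattr

def write_gexf(edges, out=sys.stdout):
    """Write (caller, callee, count) edges as a directed GEXF 1.2 graph."""
    ids = {}
    for caller, callee, _ in edges:
        for name in (caller, callee):
            ids.setdefault(name, len(ids))  # assign node ids in order seen
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<gexf xmlns="http://www.gexf.net/1.2draft" version="1.2">\n')
    out.write('  <graph defaultedgetype="directed">\n    <nodes>\n')
    for name, nid in sorted(ids.items(), key=lambda kv: kv[1]):
        out.write('      <node id="%d" label=%s/>\n' % (nid, quoteattr(name)))
    out.write('    </nodes>\n    <edges>\n')
    for eid, (caller, callee, count) in enumerate(edges):
        # The edge weight carries the call count; Gephi can map it to thickness.
        out.write('      <edge id="%d" source="%d" target="%d" weight="%d"/>\n'
                  % (eid, ids[caller], ids[callee], count))
    out.write('    </edges>\n  </graph>\n</gexf>\n')

if __name__ == '__main__':
    # One hypothetical edge, borrowed from the decode run discussed below.
    write_gexf([('decode_coeffs_b', 'iwht_iwht_4x4_add_c', 18774)])

Real gprof output contains repeated edges and cycle annotations, so an actual converter needs a bit more care; alternatively, networkx (shown above) can accumulate the graph and handle the GEXF serialization for you.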

Demonstration
I have been absent from FFmpeg development for a long time, which is a pity because a lot of interesting development has occurred over the last 2-3 years after a troubling period of stagnation. I know that 2 big video codec developments have been HEVC (next in the line of MPEG codecs) and VP9 (heir to VP8’s throne). FFmpeg implements them both now.

I decided I wanted to study the code flow of VP9. So I got the latest FFmpeg code from git and built it using the options "--extra-cflags=-pg --extra-ldflags=-pg". Annoyingly, I also needed to specify "--disable-asm" because gcc complains of some register allocation snafus when compiling inline ASM in profiling mode (and this is on x86_64). No matter; ASM isn’t necessary for understanding overall code flow.

After compiling, the binary ‘ffmpeg_g’ will have symbols and be instrumented for profiling. I grabbed a sample from this VP9 test vector set and went to work.

./ffmpeg_g -i vp90-2-00-quantizer-00.webm -f null /dev/null
gprof ./ffmpeg_g > vp9decode.txt
convert-gprof-to-gexf.py vp9decode.txt > ~/bigdisk/vp9decode.gexf

Gephi loads vp9decode.gexf with no problem. Using Gephi, however, can be a bit challenging if one is not versed in data exploration jargon. I recommend this Gephi getting started guide in slide deck form. Here’s what the default graph looks like:


gprof-ffmpeg-gephi-1

Not very pretty or helpful. BTW, that beefy arrow running from mid-top to lower-right is the call from decode_coeffs_b -> iwht_iwht_4x4_add_c. There were 18774 calls from the former to the latter in this execution. Right now, the edge thicknesses correlate to the number of calls between the nodes, which I’m not sure is the best representation.

Continue reading