Apple prefers the term "Universal Binaries", but "Fat Binary" is really more descriptive: this is a binary that contains code for two or more CPUs, which of course makes it larger, and thus "fat". As such binaries obviously wouldn't contain code for every CPU that exists, the "universal" tag really doesn't fit. These will become very important as Apple changes to Intel CPUs.
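You can see the "fat" structure right in the file format: a fat binary starts with a small header that says how many architectures it holds, followed by one record per architecture pointing at that CPU's slice of the file. Here's a minimal Python sketch that reads such a header; it assumes the standard Mach-O fat layout (big-endian magic 0xcafebabe, then 20-byte per-architecture records), and the little CPU-type lookup table is just an illustrative subset.

```python
import struct

FAT_MAGIC = 0xcafebabe  # big-endian magic number marking a fat (multi-arch) binary

# Illustrative subset of Mach-O CPU type codes
CPU_TYPES = {7: "i386", 18: "ppc"}

def list_architectures(data: bytes):
    """Return the CPU architectures named in a fat binary's header."""
    magic, nfat_arch = struct.unpack(">II", data[:8])
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat_arch):
        # Each record holds: cputype, cpusubtype, file offset, size, alignment.
        # We only need the cputype here.
        cputype, _, _, _, _ = struct.unpack(">5I", data[8 + 20 * i : 28 + 20 * i])
        archs.append(CPU_TYPES.get(cputype, hex(cputype)))
    return archs
```

Point it at the first few hundred bytes of a Universal Binary and you get back something like `["ppc", "i386"]`; the rest of the file is simply both compiled programs laid end to end, which is exactly why the result is fat.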
This is not the first time Apple has faced the problem of moving from one hardware platform to another: they moved from 68K processors to PowerPC in the mid-1990s, and fat binaries were used then too.
As you might suspect, fat binaries aren't necessarily a matter of just turning on a compiler switch. Apple has a Developer Transition Resource Center that covers all the twists and turns.
It is interesting that this is the method chosen to handle the transition. There has to be pain somewhere: fat binaries, separate compilations, or a virtual machine. None of these is ideal, but you have to face the music somewhere.
Got something to add? Send me email.
More Articles by Tony Lawrence © 2009-11-07
One of the main causes of the fall of the Roman Empire was that, lacking zero, they had no way to indicate successful termination of their C programs. (Robert Firth)