Seriously, chill

by Chris Adamson

Related link:… They said it first, so they're the ones wiping egg off their faces if it's wrong. But let's assume that Steve comes out to the WWDC crowd on Monday and says "hey kids, you're all writing Intel code now".

Not... I repeat... not a problem. Not a cause to "be surprised, amazed and concerned", as the analyst in the article says. Does it make sense? Jason Snell doesn't think so. I'm more optimistic.

Everyone talks about Apple's great transition from 680x0 to PowerPC in the '90s and credits the remarkably stable 680x0 emulator. But have people already forgotten "fat binaries"? These were applications that contained both 680x0 and PowerPC code, so they could run on either architecture. Surely, with the well-thought-out structure of the bundle, there's a place for PowerPC code and a place for x86 code. If Xcode starts supporting cross-compilation for the two architectures on Monday, we could all be shipping compatible applications long before an x86-based Mac hits the market, and not even know it.

Consider this too: with the 680x0 to PowerPC transition, there were a lot of applications that were basically "done" - they were widely deployed but not undergoing active development. These had to be emulated, because they couldn't/wouldn't be recompiled. But widespread interest in Mac OS X only really picked up a few years ago with the launch of Mac OS X 10.2 (Jaguar)... a pretty short time for a codebase to be abandoned. It's hard to think of anyone who's shipped OS X apps and then dumped them. OK, Samsung printer drivers, but their stuff sucks anyway, and they won't be missed. Anyway, my point is that the apps we all use are still under active development, and with cross-compilation tools made available early, it could be trivial to support a new architecture. It would obviously behoove Apple to see to it that this is the case.

And who said it had to be x86? Maybe it's something else. If there's one issue I'm interested in seeing answered, it's endianness: the Mac CPUs have always been big-endian, x86 is little-endian. Or is it? Many modern processors are flexible about endianness (before the G5, PowerPCs could run little-endian too, which made VirtualPC work well until the G5), so maybe whatever goes in this new hardware will appear big-endian to the developer.
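The difference between the two conventions is easy to see in code. Here's a sketch using Java's java.nio classes (which let you pick a byte order explicitly) to dump the same 32-bit value under each layout:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteLayout {
    public static void main(String[] args) {
        int word = 0x11223344;
        for (ByteOrder order : new ByteOrder[] {
                ByteOrder.BIG_ENDIAN, ByteOrder.LITTLE_ENDIAN }) {
            // Write the word into a buffer using the chosen byte order,
            // then dump the individual bytes in memory order.
            ByteBuffer buf = ByteBuffer.allocate(4).order(order);
            buf.putInt(word);
            StringBuilder line = new StringBuilder(order.toString() + ":");
            for (int i = 0; i < 4; i++) {
                line.append(String.format(" %02X", buf.get(i)));
            }
            System.out.println(line);
        }
        // Prints:
        // BIG_ENDIAN: 11 22 33 44
        // LITTLE_ENDIAN: 44 33 22 11
    }
}
```

A CPU that supports both modes can present either layout; ByteOrder.nativeOrder() reports which one the hardware underneath is actually using.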

Who knows - maybe they'll keep PowerPC in the portables, where its low heat and modest power consumption are a big win, and just use x86 in the desktops, where it has long since humiliated PowerPC's performance. This strategy makes sense: with a processor-agnostic OS, Apple can further optimize hardware for the best user experience.

Blind optimism, or does fortune favor the bold?


2005-06-04 20:32:39
Apple guesses
I'm guessing the opposite. I expect that the Mac mini and PowerBooks/iBooks will be the first to use Centrino technology such as the Pentium M. Intel has done a pretty good job with the Pentium M of delivering good speed and low power consumption. The G5 seems to run too hot for notebooks. Maybe it won't be anything like this at all. I guess we'll see in a couple of days.


2005-06-04 23:27:21
Pentium M?
The Pentium M has been mentioned as a possible replacement for the G4 in low-power applications because it is supposedly a low-power, high-performance processor.

I wonder how many of those who say this is such a great alternative to the G4 have actually ever used a computer based on it.

I have. I use a 1.6GHz Pentium M based laptop just about every day. I also use a 1.25GHz G4 based laptop. They have equivalent memory (though the Pentium M uses both clock transitions with the DDR RAM), video cards (ATI 9600) and hard drives (both 80 GB, but the one in the Pentium M machine is faster).

All that I can say is that in a few cases the Pentium M seems a bit faster, in many cases the G4 seems faster, and in many cases they seem about the same. In terms of battery life (both in-use and standby) they seem to be about the same (except that the Windows battery indicator overstates the remaining power in the battery).

Basically, I don't perceive the Pentium M to have any advantages over the G4 (other than the ability to use both clock transitions in the DDR interface). The G4 has a number of advantages over the Pentium M, real and potential. First, although a true DDR interface isn't part of the current G4 as used by Apple, Freescale does have such support on other G4 processors such as the 8641. Second, Freescale does have a dual-core G4 in the 8641D. These enhancements should make it to the chips used by Apple. The power dissipation of the G4 is lower than the Pentium M's (8641D at 15W according to Freescale versus 21W for the Pentium M according to Intel). The G4 e600 core can also be embedded in SOC solutions such as the 8641/8641D. There are no Pentium-based SOC solutions.

Freescale also has a 64-bit e700 core in development (real 64-bit, as opposed to Intel's 64-bit marketing smoke and mirrors). And since the e600-based processors are available now, the e700-based processors should not be far behind.

In terms of technology, the Pentium M has no advantages over the G4. In terms of marketing, Intel is tough to beat.

2005-06-06 02:41:36
Life after Altivec?
I think we might see all sorts of things, but I don't think we'll see an IA32-based Mac. What with the NEXTSTEP fat-binary heritage and all, there's probably never been an easier time to build an IA32-based Mac, and build Mac OS X and all the applications for IA32. But there are still bits of Mac OS 9 here and there which are stubbornly big-endian, and you'd break document binary compatibility if you made the new OS completely little-endian. And it would cost you engineer time to do this, and there are only so many engineers with only so many hours in the day; what would you gain?

You'd gain hardware with a higher clock rate but which gets slightly less done per clock cycle, which has a well-known, consistent function call stack format that the VXers know how to abuse, and which doesn't have a compatible on-chip vector processing unit -- Altivec is good for graphics and audio processing, but it has other uses, e.g. (I'm told) in CPU-intensive parts of the TCP stack.

So I think you *could* build an IA32-based Mac quite easily (and much more easily than when this was looked at in 1992), but the game just isn't worth the candle. An IA64-based Mac makes slightly more sense, but not much more. Steve could be playing hardball with IBM over the G5 roadmap; or he's interested in BIOS chipsets, or low-power radio networking chipsets, or chip foundry capacity.

Intel did make DSP chips -- or at least, chips that NeXT used as DSPs -- and if I remember correctly, their endianness was configurable. Maybe Intel is about to become the latest member of the PowerPC consortium; but I can't see that as very likely either.

2005-06-06 07:15:59
Pentium M?
OK, for the record, I have no desire to see an x86-64 architecture in the Apple product line. I like the PowerPC chip and have since it was first introduced along with the concept of Taligent. One survived.

However, your snide comment about '64 bit marketing smoke and mirrors' is patently false. EM64T and AMD64 use full 64-bit general-purpose registers, and these are truly 64-bit chips. If the 64-bit Xeon is 'smoke and mirrors', then so are 32-bit and 16-bit, because they were done the same way.

2005-06-06 10:38:22
I'll be damned.
It's true. I really thought this was just a case of the media screwing up, as they did with the "Apple to buy Universal Music" headlines. Wow.
2005-06-06 10:46:58
Re: I'll be damned.

What site are you following the SteveNote on? MacRumors' vaunted AJAX effort has crashed and burned into unresponsiveness, and I'm only getting reasonable updates from MacWorld/MacCentral.


2005-06-06 11:14:42
Re: I'll be damned.
Of course I'm too late answering your question, but I got my updates from

How's this for news: For $1,499 (for a $500 Select membership and $999 for the hardware and software), you can get an Intel-based Mac. Order today, receive in about two weeks.

Now I'm just trying to figure out how to get my Dashboard to show the weather in Hell.

2005-06-06 13:45:41
Details now available

Details for developers are now available in the form of the Universal Binary Programming Guidelines (PDF, 1.5 MB). And yes, developers will have to be careful about endianness.

There are other interesting hazards to note, like getting burned by endianness in Java (a language where we almost never have to think about byte order) if you're using the java.nio I/O package.
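To make the java.nio hazard concrete, here's a small sketch: nio buffers default to big-endian on every platform, but the moment you ask for the native order, the bytes you read and write depend on the hardware under the JVM:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class NioEndianness {
    public static void main(String[] args) {
        int value = 0x0A0B0C0D;

        // java.nio buffers are big-endian by default on every platform,
        // so this layout is safe for a cross-platform file format.
        ByteBuffer portable = ByteBuffer.allocate(4);
        portable.putInt(value);

        // Switching to the native order makes the byte layout depend on
        // the CPU: 0x0A comes first on PowerPC, 0x0D first on x86.
        ByteBuffer platform = ByteBuffer.allocate(4)
                .order(ByteOrder.nativeOrder());
        platform.putInt(value);

        System.out.println("default order: " + portable.order());
        System.out.printf("first byte, default order: 0x%02X%n",
                portable.get(0));
        System.out.printf("first byte, native order:  0x%02X%n",
                platform.get(0));
    }
}
```

If a document format were written with native-order buffers on PowerPC and read back with native-order buffers on x86, every multi-byte field would be silently scrambled; sticking with the default (or an explicit ByteOrder.BIG_ENDIAN) sidesteps the problem.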


2005-06-06 18:05:41
Objective-C still the choice?
With Intel creating C/C++ compilers, do you think we're likely to see Apple shift away from Objective-C as the language of choice? Who knows, maybe Microsoft will even officially port .NET (as opposed to Mono). This, I would think, is less likely, however.