Essentially, yes. But really they don't even have to do that. I'm sure their build process already automatically creates both versions.
Apple isn't switching to PPC; they already have both versions, and they've had PPC longer than Intel, so it is, if anything, even more stable. I am a software engineer; I have worked on Unix OS development for 28 years, and I currently work for a company that produces a version of Unix for two different architectures (not Apple, but you might be able to guess which). It works this way: there is a small amount of code that is architecture-dependent (PPC vs. x86, in Apple's case) and a much, much larger code base that is common, a little of which is platform-dependent (like MacBook vs. iMac vs. Mac Pro). The code that is specific to PPC has been around for ages, longer than the x86 code, and by now it is thoroughly debugged. All of the new features are implemented in common code. There will be some differences based on platform, but those will probably affect the Intel platforms more than PPC.
Testing is the major resource that has to be committed to continuing the PPC line. But it's not like they have to ramp that up from scratch; it just means continuing their current process. I would guess that Apple's main manufacturing and burn-in and systems testing is done overseas, and those assembly lines are already in place.
There are no technical reasons, as far as I can see, for Apple to drop PPC at the present time. If they do, it is driven by marketing: simply the desire to force customers to purchase new hardware in order to get the latest OS features. (I know the rumor says that the next OS will not introduce new features, but we'll have to wait and see about that.)
I think the point is that Apple is *NOT* merely maintaining a build of the OS; they are essentially rewriting much of it to be Cocoa-native and optimized for Intel. For that reason, there is little benefit for Apple in writing whole new parts of the OS for PowerPC compliance, and then optimizing those new parts, for a platform that is deprecated. There are no *NEW* PowerPC machines coming out, and there is little desire to write new hunks of code for the old machines when your focus is to move forward. They have maintained code... that is exactly what Leopard is. It's code that is PowerPC-capable and that has been maintained and optimized. With the dawn of 10.6, though, that apparently won't be the case. Rather than get bent out of shape about 10.6 and its direction... assuming the rumors are true (and they make sense to me), just hope that any niggling issues get fixed by the time the last build of 10.5.x is released. In other words, get your bug reports in to Apple.
The whole point people are missing is that Apple is *NOT* going to just up and shelve compatibility with Carbon apps.
More importantly... they're *NOT* going to ship an update to 10.4+ or 10.5+ that *bricks* the legacy machines. People are already crying foul about their PowerPC machines not getting the latest bleeding-edge code, when the reality is that Microsoft and every other software company deprecates hardware as they consider it valid to do so. If the amount of work to rewrite and then maintain code for a platform that is no longer for sale is deemed inefficient, it's only logical to scrap it.
Those machines will operate as usual with the current OS that is on them. Apple has been doing this for quite a while: some beige hardware couldn't run OS X without hacks, and some newer colored PPC hardware (i.e. the G3) couldn't run newer versions of OS X even if it could run older versions. You may call this Apple leaving people high 'n' dry; I consider it a matter of Apple doing what they need to do. I got burned by Copland, Gershwin, and Rhapsody giving way to OS X and its lack of support for any of my beige hardware. I was able to deal with it...
Moral of the story... this isn't a case of being orphaned. I know everyone would love to be able to stick with older hardware for years upon years upon years, but in Apple's defense, even if Motorola and IBM had been on their A-game, the odds of that happening would have been just as slim; the platform simply would have stayed on PPC. Face it, both companies took years to make "minor" improvements. Intel, supposedly on an inferior platform, was able to eat Motorola's and IBM's lunch, and eventually both got kicked to the curb. Intel is delivering where the others failed.
Had the G4 and G5 scaled as well as the G3 did for quite a while, we'd probably still be on PowerPC. The fact that we're not is because the industry passed them by. I had a PowerPC mini, and from the moment I got it up until the end it was a solid machine, but it wasn't anything I'd remotely call "speedy". Even with 1 GB of RAM, the machines felt merely adequate. By contrast, my dad's new Intel mini (Core 2 Duo) is pretty darn fast. Other than the most strenuous of tasks, it feels about on par with my 24" iMac. Even there, for what he does... it's very solid and an above-average performer for his needs.
Moral of the story... if you don't need a new machine, don't buy one. Yet at some point Apple is going to have to draw the line, deprecate what needs deprecating, and do what they can to improve the APIs and move programmers in the right direction. As was even noted at Google I/O, one of their execs said that programmers tend to be "lazy": they build on new features rather than do what is necessary to revamp the core foundations and weed things out themselves. John Nack of Adobe had previously stated that if Apple hadn't pushed 64-bit Cocoa on them as their only solution for the future, he doubted Adobe would ever have ported their apps to a completely Cocoa framework, which would have left Apple supporting and maintaining two separate APIs concurrently when they're better off with one streamlined API. Otherwise, you end up in the situation Windows developers are in: ancient API calls tied to applications that are still relevant circa 2008, multiple ways of doing the same thing via the same API sets, and having to debug why things work better in one path than another, or how one application's use of a specific call can conflict with others. It's a freaking mess, and a big part of why Microsoft has a team looking to streamline the APIs.
This doesn't render your machines obsolete; it merely means that going forward there are going to be improvements to hardware and software that you might not be able to use. That is inevitable anyhow: people with AGP Macs can't use PCI-E cards, people with machines that don't support 4+ GB of RAM can't fully leverage a 64-bit app, and people with Penryn Macs today are not going to be able to use Nehalem's DDR3 RAM. Yet... if you truly *NEED* those things, you will save for and buy them as you need them. I ran on beige hardware for years, using OS 9 daily long after OS X shipped. I don't miss OS 9 in the least, but... I used what I had 'til I could get something new.
As far as what Apple will do with 10.6... I wager that they will probably release a copy of 10.6 to developers at WWDC. I wager that the build shown at WWDC will have few new features, and will mainly be a major rewrite of the non-Cocoa pieces of the OS and a considerable overhaul for speed and performance, as per the rumor mill. I also wager, though, that by the time the system launches in January there'll be a bunch of features and functionality added. I could even see Snow Leopard being the codename for the developer release, while the final 10.6 release could be called something else.