Universal Binaries

Discussion in 'macOS' started by mkrishnan, Jun 6, 2005.

  1. mkrishnan Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #1
    I know there are a lot of threads on the Intel switch, but this topic is really interesting to me. I am not at WWDC (obviously), but it seems like even within the limited scope of the keynote, which is not under NDA, a lot more information about Transitive's technology and Rosetta has become available.

    Did anyone understand, does Apple plan to encourage developers to provide Universal Binaries more or less indefinitely at this point? Would that mean that (with corollary speed hits) PPC-based Macs would be able to run Apple software for the foreseeable future?

    It sounds like the Universal Binary is not exactly like a fat binary as previously discussed, in that rather than having two processor-specific embedded binaries, it has a processor-specific one and some kind of bytecode one, much in the way of Java, that can run on Transitive's translation layer (aka Rosetta).

    Anyway, anyone care to join in the speculation on this specific little pebble in the keynote? :D
     
  2. slb macrumors 6502

    Joined:
    Apr 15, 2005
    Location:
    New Mexico
    #2
    I think you're confusing Rosetta and fat binaries. They're separate things. Fat binaries were supported way back on NeXTSTEP and include multiple compiled binaries for different processors. You can have one Xcode project and create a single application that runs on either x86 or PPC without issue, freely moving between them.

    The JIT you're talking about is the Rosetta layer for emulating older PPC apps that don't provide an Intel binary, i.e. non-universal apps compiled only for PPC. With a fat binary, Rosetta wouldn't be needed, as you'd already have native code.

    With fat binaries (guess I need to learn to call them "universal binaries"), today's PPC chips can be supported for years and years to come, invisibly to the user. I always wondered when Apple would take advantage of this ability.
     
  3. mkrishnan thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #3
    SLB, so when Jobs was talking about Universal Binaries, did he mean fat binaries that contain just a PPC and an Intel implementation? Is that what Apple seems to be pushing developers to do, alongside Rosetta on Intel only, to allow for running PPC apps? It sounds fine as long as developers use intelligent development platforms. Like you said, under this condition, PPC machines might be preternaturally slow, but they wouldn't be completely obsoleted anytime soon...that would be nice! :)

    But, what concerns me is all the stuff that was ported God-knows-how... I seriously doubt SPSS/Mac was done in Xcode, for instance. And these are usually the same developers who act like it is the biggest nuisance in the world to even just recompile their software under a Tiger-compatible version of the compiler they used.... :(
     
  4. slb macrumors 6502

    Joined:
    Apr 15, 2005
    Location:
    New Mexico
    #4
    With universal binaries, PPC will probably be supported for years and years. OS X is a very portable development system. We just never saw it because everything has been PPC only. :)

    All "universal binary" means is that instead of compiling just a PPC app like we do today, we'll be compiling an app that is for PPC and Intel at the same time. OS X apps are really application bundles (folders) containing all their resources, and inside that bundle will be two executables instead of just one. When you run the app, OS X will run the correct binary executable based on the chip architecture you're running on. This is all derived from NeXTSTEP, which did this stuff over a decade ago to support the multiple chipsets it ran on. Way ahead of its time.
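    To make the "one source, two slices" idea concrete, here's a minimal C sketch (nothing Apple-specific in it beyond the `__ppc__` and `__i386__` macros that GCC defines for its target architecture). The same source file is compiled once per slice, and exactly one branch ends up in each compiled executable:

    ```c
    #include <stdio.h>

    int main(void) {
        /* One source file, built once per architecture slice of a
           universal binary. GCC defines __ppc__ or __i386__ depending
           on the target, so each slice contains only one branch. */
    #if defined(__ppc__)
        printf("running the PowerPC slice\n");
    #elif defined(__i386__)
        printf("running the Intel slice\n");
    #else
        printf("running on some other architecture\n");
    #endif
        return 0;
    }
    ```

    On the Apple toolchain, building with both `-arch ppc` and `-arch i386` glues the two compiled slices into one Mach-O file; the loader picks the matching slice at launch.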

    As you can see, OS X is pretty much capable of running on any chip architecture Apple wants it to. That's why this announcement of moving to Intel isn't so surprising to me.

    Unfortunately, some developers need to learn to update or die off. Fortunately, Rosetta will emulate older PPC-only apps that don't supply an Intel binary in their bundles. With the faster speed of the Intel chips, you might not even notice a speed difference in those older apps. But honestly, I can't think of any OS X apps that would or should remain PPC-only, so I doubt it will even be an issue.
     
  5. mkrishnan thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #5
    Ahhh, thank you for the education. :) I was familiar with most of what you said, and with what fat binaries are, but when the diction changed to *universal* binaries, I thought they were universal in some greater sense than Intel/PPC... à la bytecode. For any developer using a good development system and Cocoa, I understand that the transition will be very easy, and that the end product will be satisfactory to Mac users, PPC and Intel alike. And apparently it isn't even so bad for Carbon apps.

    With respect to developers learning to "update or die off," I agree in principle, but the problem is that in some cases, app usage is dictated by forces outside of the Mac community's control. In the SPSS case, there *are* other applications to use, but SPSS is broadly recognized to an extent that the others are not in my field. Another example is law schools only supporting Wintel for law school exam software. Of course it could be written for Mac OS, and of course someone could come up with an alternative that works just as well, but until that alternative has academic support....
     
  6. Westside guy macrumors 601

    Westside guy

    Joined:
    Oct 15, 2003
    Location:
    The soggy side of the Pacific NW
    #6
    I'm not sure why people seem to think getting a program to compile for multiple architectures is a trivial issue. Why do you think what's available for Linux PPC tends to lag behind what you can get for Linux x86? Or for that matter, why are the version numbers of fink packages lagging significantly behind what's current, in many cases? Sure there are a number of programs that will not be difficult to get working for both x86 and PPC, but many will require significant work to compile for both architectures. Software houses will only put the time in to do this for as long as it makes economic sense - and I'd guesstimate that'd mean 3-4 years, tops (I freely admit I don't know what's a typical length of time people hold on to their Macs though).

    I have no doubt that Apple's apps will support both platforms for quite a while, though - to do otherwise would be suicidal.
     
  7. RacerX macrumors 65832

    Joined:
    Aug 2, 2004
    #7
    As long as Apple has a way for current PowerPC apps to run on future Intel based hardware, I don't see that big a deal with the transition.

    As for what the future may look like... most likely not too different from how things looked in the past.

    This image is of a Rhapsody application package being installed on one of my systems (actually, I wasn't installing it as I already have this app on my systems, but this is what the dialog looks like). You are given a choice as to what hardware platform binaries you want installed.

    [IMG: Rhapsody installer dialog offering a choice of hardware platform binaries]



    As for telling the difference between the platforms when running, here are two shots of my systems. One is based on the PowerPC 604e and the other an Intel Pentium. And it is the same application (Create) copied to both systems.

    [IMG] [IMG]

    Which is which?

    As for which platform I prefer... PowerPC.

    Sadly, IBM seems to have other priorities. And they were never all that hot on the idea of AltiVec (it was added to the PowerPC 970 at Apple's request; it wasn't originally part of the design). And with Microsoft making moves towards the PowerPC, Apple's business wasn't able to keep IBM's attention.
     
  8. mkrishnan thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #8
    This is another reason why I'm very curious to see how universal binaries play out. If Apple can really convince enough developers to develop on Cocoa and publish universal binaries, then the door really still isn't closed on IBM. Unless Apple signed an exclusivity deal with Intel (it has obviously happened in the past with Intel pushing Wintel majors to not use AMD chips :eek: ), I don't see why the dual binary scenario really prevents Apple from selling a mix of processors on their computers, should IBM come to the table and produce some interesting new chips.

    No content file cares which system it runs on, right, just executables? And as long as all executables are either PPC or Universal, then the only thing holding IBM out of the Apple market is Apple's perception of IBM...and the "switch" to Intel is not a one way street. :cool:
     
  9. slb macrumors 6502

    Joined:
    Apr 15, 2005
    Location:
    New Mexico
    #9
    OS X apps rely on API calls. The underlying processor is irrelevant if the APIs are ported to the new platform. The source code is still making the same API calls as before. You write a Cocoa app exactly the same way as before. There is nothing dependent on the PowerPC chip.

    I'm sure there are some apps that rely directly on the PowerPC for something, perhaps some random bits of assembly code, but for the most part everybody is using C/C++/Objective-C and calling Cocoa and Carbon functions.

    Any number of factors: smaller developer support, Linux software sitting closer to the low-level details of the system, and so on.

    The work will only be difficult if your application is for some reason making direct calls to the processor à la assembly code, or making some assumption about the processor. Some Adobe Photoshop filters might be written in assembly language, for instance, as might Logic software instruments that rely on assembly for speed.
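    The classic "assumption about the processor" is byte order: PowerPC is big-endian and x86 is little-endian, so code that dumps raw integers or structs straight to disk reads back garbage after a recompile for the other chip. A minimal C sketch of the difference:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* The same 32-bit value is laid out in memory differently on
           big-endian (PowerPC) and little-endian (x86) processors.
           Code that writes raw memory to files and assumes one layout
           breaks when recompiled for the other architecture. */
        uint32_t value = 0x01020304;
        const unsigned char *bytes = (const unsigned char *)&value;

        if (bytes[0] == 0x01)
            printf("big-endian layout (PowerPC-style)\n");
        else
            printf("little-endian layout (x86-style)\n");
        return 0;
    }
    ```

    This is why porting guides steer developers toward explicit, byte-order-aware serialization instead of raw memory dumps.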

    But take an app like NetNewsWire. It's only going to be calling Cocoa objects for all its code. The processor is irrelevant because the app is dependent on the APIs, not the processor.

    OS X is derived from OpenStep, and the original goal for OpenStep was to be ported to any architecture and run fat binaries compiled for all of them. It was even available for Windows NT for a time.

    Steve Jobs is already going around saying this isn't as big an announcement as some are saying it is. OS X is a very portable platform. We've just been PPC only so far. The APIs are what matters, not the chip doing the math.
     
  10. unfaded macrumors 6502

    Joined:
    Dec 12, 2002
    Location:
    Seattle, WA
    #10
    ...cos it's an operating system, not something running within the operating system. Wow, that's so... so stupid.
     
  11. RacerX macrumors 65832

    Joined:
    Aug 2, 2004
    #11
    I think that is a little off the deep end.

    Even with the OpenStep APIs, not all applications ran on all hardware platforms effortlessly. I know of tons of applications that only work on one hardware platform or another.

    And more to the point, applications written for NEXTSTEP or OPENSTEP still had to be heavily modified to run on Windows NT.

    :rolleyes:

    It does help to have not only first-hand experience with this, but daily experience with this. I still use this stuff daily, so these pitfalls are current issues for me rather than historical points to be cited.
     
  12. mkrishnan thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #12
    Do you think this is true for apps written in Cocoa and Objective-C, in Xcode 1.x, and outside the realm of system enhancements, or more for carbonizations and so on?
     
  13. RacerX macrumors 65832

    Joined:
    Aug 2, 2004
    #13
    Well, as OpenStep was renamed Yellow Box for Rhapsody, and then renamed (again) as Cocoa for Mac OS X... yes, this is true for Cocoa/Objective-C applications.

    Some apps are going to be easy, others are going to be harder. Some things that look like you should just be able to recompile to get them to run... just end up not working.

    This isn't going to be trivial. And that is why the developers are getting this stuff now, well ahead of the rest of us.
     
  14. csubear macrumors 6502a

    csubear

    Joined:
    Aug 22, 2003
    #14
    I have a big question: how are developers going to address existing customers? I have an older copy of Photoshop, Office X, WoW, etc. What happens when I move to an Intel Mac? Will I be forced to rebuy these apps if I want to run native code? Okay, so maybe Photoshop and Office can run with Rosetta, but what about WoW? Or maybe some other game? Or how about an app that does not have a G3 version? How about iDVD in iLife '05? What if I don't want iLife '06? I hope companies put in place some sort of patch system so that we can get Intel binaries....

    One more big question: will Intel create an Objective-C compiler for x86, or will Cocoa be stuck with GCC?
     
  15. mkrishnan thread starter Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #15
    csubear, that's a good question... it's really hard to say. The MS rep's words were that "future" versions of Office would ship universal, not that current ones will... I guess it'll really depend on the company. I'd bet it's more likely that things that do not involve paid upgrades (maybe like WoW?) will get free universal binaries than things that are normally upgraded; Office is definitely likely to be upgraded around the time Intel rolls out, and even Photoshop wouldn't be that far off the path to see CS3 at that time.... Sadly. :(

    Another question. Anyone know, for the big companies who develop carbon apps, esp. MS and Adobe... what is their development platform? Is it really true that pretty much everyone developing for Mac is using Metrowerks or XCode, or do you think that MS and Adobe are using custom environments that are integrated into their mainstream development environments (such as, perhaps, in MS' case, some kind of Apple API add-on to Visual Studio)?
     
  16. Cooknn macrumors 68020

    Cooknn

    Joined:
    Aug 23, 2003
    Location:
    Fort Myers, FL
    #16
    Amen brother :p We are very fortunate to have OS X. Computing is fun again! Wait a minute, not sure if it ever was... Except for when I was playing on that NeXT box at Comdex back in '93 - I was grinning from ear to ear :D
     
  17. Applespider macrumors G4

    Applespider

    Joined:
    Jan 20, 2004
    Location:
    looking through rose-tinted spectacles...
    #17
    Well, if you buy a new Intel Mac, you'll get the new current version of iLife so whether iLife 05 runs will be largely irrelevant (so long as it can read your existing projects which I'm sure it will).

    I expect companies will start announcing their upgrade/patch systems once the amount of work becomes obvious and they've seen how their software currently runs under Rosetta. Or when they/Apple start their marketing plan to stop current sales tanking. :) Would you order CS2 without knowing there will be a patch for Intel Macs or decent performance under Rosetta, unless you expected to get your money back in the next year or you knew you'd be upgrading to CS3 in two years' time?

    I guess the question is how long you expected to get high performance out of those applications - it's rarely going to be 'forever' and I suppose you have to ask whether you'd have updated them in a couple of years to newer versions. Will WoW 2 be out then... and running at a higher FPS on an Intel Mac...
     
  18. RacerX macrumors 65832

    Joined:
    Aug 2, 2004
    #18
    What happened when you moved to Mac OS X from Mac OS 8/9?

    This transition actually looks to be easier than that.

    As with the transition to Mac OS X, you'll eventually need to upgrade apps.

    And as for games, those have a short shelf life. By the time the transition is complete, WoW and other games are going to be overshadowed by newer games.

    I never got a Mac OS X native version of Rainbow Six or Rogue Spear... I loved those games. But I did get Ghost Recon and we now have Rainbow Six 3.

    Those are Apple apps. There are already native versions of them. I would imagine that a small patch may work, but by the time this happens, you'll most likely need to upgrade to run them anyway.

    Honestly, none of this is all that complex. If you did either of the previous transitions, then you should have some idea of what you can look forward to.

    And no one has a gun to your head forcing you to buy one of these as soon as they come out. You can keep on using your current Mac for years. And you'll be able to buy (at least) higher-end Macs with G5s until mid-2007. You could put this transition off until around 2010 if you wanted. It is not a today thing.
     
  19. slb macrumors 6502

    Joined:
    Apr 15, 2005
    Location:
    New Mexico
    #19
    I said that was the goal of OpenStep. It didn't always accomplish it at the time. :)

    The technology has matured a lot since then. OS X was designed from the beginning to be portable. People should take comfort in the fact that a chip switch was always kept in the back of the minds of OS X devs since 10.0.
     
  20. Cooknn macrumors 68020

    Cooknn

    Joined:
    Aug 23, 2003
    Location:
    Fort Myers, FL
    #20
    Just got an e-mail from the president of the company that wrote the software for my panoramic photos (sig) - they're at WWDC right now with their source code. Apple is helping them get it compiled into universal binaries as we speak :D Now that's customer service! They'll have that program running on Intel Macs way before I buy one :p
     
