Has Apple given up on Q2D Extreme?

Discussion in 'macOS' started by zakatov, Jun 16, 2005.

  1. zakatov macrumors 6502a

    zakatov

    Joined:
    Mar 8, 2005
    Location:
    South Florida
    #1
    Now that the next (and probably the largest) Tiger update is on the horizon and there's still no word on Q2DE implementation, it leaves me wondering if Apple has abandoned this technology altogether. Does anyone know whether Q2DE is capable of running on the Mactel version of OS X? Its inability to run on the future OS X platform is the only possible reason I can see for it not being developed further.

    Any ideas?
     
  2. therevolution macrumors 6502

    Joined:
    May 12, 2003
    #2
    It's implemented - just not turned on by default. Here are the instructions to enable it.

    As to whether it can run on Intel... I have no idea.
     
  3. zakatov thread starter macrumors 6502a

    zakatov

    Joined:
    Mar 8, 2005
    Location:
    South Florida
    #3
    If it's turned off by default, then for all practical purposes it can be considered not present. Something stopped Apple from enabling it when Tiger was first released, and now, almost two updates later, there's still no word on its progress. That's what's worrying me.
     
  4. therevolution macrumors 6502

    Joined:
    May 12, 2003
    #4
    I got the impression that Tiger shipped with Q2DE disabled because there were still a few bugs left to iron out. I wouldn't assume it has anything to do with the Intel stuff. I guess we just get to wait and see what happens.
     
  5. stcanard macrumors 65816

    stcanard

    Joined:
    Oct 19, 2003
    Location:
    Vancouver
    #5
    There are odd display flickers when doing some things. For instance, whenever I rotate an image in iPhoto I get little checkerboards flickering.

    But whatever they're doing amounts to small bugfixes, so I wouldn't expect a lot of updates on progress; because it's already there and implemented, people won't know whether it's shipping enabled until the gold master comes out with the flag turned on.
     
  6. freiheit macrumors 6502a

    Joined:
    Jul 20, 2004
    Location:
    California
    #6
    I believe many versions of MacOS (and virtually every other OS on the planet) have shipped with certain "turned off" features that developers could tinker with to be ready for future OS releases. For instance, I recall that before Tiger was released there was talk of a development "resolution independent" display setting that would be put into Tiger; the eventual goal being to allow MacOS X to always keep objects sized properly while altering their display quality (clarity) by manipulating the resolution. I suspect Quartz 2D Extreme could be very similar -- it may be there simply to give developers a taste of a future feature, but disabled so that Average Joe isn't affected by its pre-release quirks.
     
  7. daveL macrumors 68020

    daveL

    Joined:
    Jun 18, 2003
    Location:
    Montana
    #7
    Given that Q2D Extreme is all about moving one more large piece of the GUI drawing subsystem directly to the GPU, and that the GPUs Macs support exist in the PC world now and going forward, I see no impact at all with regard to the move to Intel CPUs.
     
  8. devman macrumors 65816

    devman

    Joined:
    Apr 19, 2004
    Location:
    AU
    #8
    It's off by default, and with good reason. But you're no worse off now than you were before. In fact, you're better off, because you have Core Image, which runs on the GPU if you do things right. Where last year's CI examples had their views subclassing NSView (because Q2D Extreme was presumed), now you want your CI view to subclass NSOpenGLView instead, as in the sketch below.
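
    Here's a rough sketch of that pattern, written in Swift purely to illustrate the idea (the class name and details are mine, not Apple sample code): the view subclasses NSOpenGLView and draws its CIImage through a CIContext bound to the view's OpenGL context, so the filter chain gets evaluated on the GPU.

        import Cocoa          // AppKit: NSOpenGLView, NSRect, ...
        import CoreImage      // CIImage, CIContext
        import OpenGL.GL      // glClear and friends

        // Hypothetical Core Image view: it subclasses NSOpenGLView, so the
        // CIImage it displays is rendered by the GPU instead of in software.
        final class GPUImageView: NSOpenGLView {
            var image: CIImage?               // the (possibly filtered) image to show
            private var ciContext: CIContext?

            override func prepareOpenGL() {
                super.prepareOpenGL()
                guard let cglContext = openGLContext?.cglContextObj,
                      let cglFormat  = pixelFormat?.cglPixelFormatObj else { return }
                // A CIContext tied to this view's OpenGL context does its work on the GPU.
                ciContext = CIContext(cglContext: cglContext,
                                      pixelFormat: cglFormat,
                                      colorSpace: CGColorSpaceCreateDeviceRGB(),
                                      options: nil)
            }

            override func draw(_ dirtyRect: NSRect) {
                openGLContext?.makeCurrentContext()
                glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
                if let image = image, let ciContext = ciContext {
                    // Evaluating the filter chain happens here, on the GPU.
                    ciContext.draw(image, in: bounds, from: image.extent)
                }
                openGLContext?.flushBuffer()
            }
        }

    The point is simply that the programmer explicitly picks the OpenGL-backed path; Q2D Extreme would have made ordinary Quartz drawing GPU-accelerated without that choice.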

    If you don't follow all of that, don't worry about it. All that matters is that software developers make use of CI correctly, and you still get the benefits of it running on the GPU.
     
  9. zakatov thread starter macrumors 6502a

    zakatov

    Joined:
    Mar 8, 2005
    Location:
    South Florida
    #9
    I have no background in programming whatsoever, but from what you just said it seems that "last year" CI would let the OS handle the GPU stuff without special calls, whereas now you have to specifically get the GPU involved. Am I understanding this correctly?
     
  10. devman macrumors 65816

    devman

    Joined:
    Apr 19, 2004
    Location:
    AU
    #10
    Not quite.

    Disclaimer 1: I'm a newbie to the platform, but here's my understanding of it.

    Disclaimer 2: I'm taking a bit of license with some details to make this understandable to non-programmers.

    Core Image is presented as an API (application programming interface). Think of that as a set of functions that programmers can make use of. It is a new API (set of functions) available in Tiger. There's a good overview of it here:

    http://www.apple.com/macosx/features/coreimage/

    Now, there are several layers of API available for graphics and media. At the highest level are the Cocoa APIs (NS); then there's Core Graphics (CG), which is for 2D; there's also now Core Image (CI), for image processing that runs on the GPU; and there's OpenGL (GL), which is for 3D and also runs on the GPU. (There are others too, such as CM.)
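
    To make those layers a bit more concrete, here's a tiny Swift sketch (the view is hypothetical, purely for illustration) that fills one square through the high-level Cocoa API and another by dropping down to Core Graphics (Quartz 2D) directly:

        import Cocoa   // the Cocoa layer (NS*) sits on top of Core Graphics (CG*)

        // Illustration only: the same kind of 2D fill done at two of the layers.
        final class LayersDemoView: NSView {
            override func draw(_ dirtyRect: NSRect) {
                // Cocoa layer (NS): NSColor and NSBezierPath do the work.
                NSColor.systemBlue.setFill()
                NSBezierPath(rect: NSRect(x: 20, y: 20, width: 80, height: 80)).fill()

                // Core Graphics layer (CG): talk to the Quartz 2D context directly.
                if let cg = NSGraphicsContext.current?.cgContext {
                    cg.setFillColor(NSColor.systemRed.cgColor)
                    cg.fill(CGRect(x: 120, y: 20, width: 80, height: 80))
                }
            }
        }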

    3D stuff (GL) runs on the GPU. You see this in the genie effect when you minimize and maximize an app, or when you use Exposé (for example). This hardware acceleration (3D on the GPU) has been marketed as Quartz Extreme.

    Core Image is a new API for image processing that runs on the GPU. It ships with a bunch of pre-supplied functions such as filters and transitions, plus a plugin architecture called Image Units. It's a much simpler, higher-level API for hardware-accelerated (GPU-based) image processing.
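
    To give a feel for it, here's roughly what using one of those pre-supplied filters looks like (Swift, with a made-up file path, strictly as a sketch):

        import Foundation
        import CoreImage

        // Sketch only: load an image, run a built-in Core Image filter over it.
        // The work is deferred until the result is actually rendered; a
        // GPU-backed CIContext is what puts that work on the graphics card.
        let input = CIImage(contentsOf: URL(fileURLWithPath: "/tmp/photo.jpg"))!

        let sepia = CIFilter(name: "CISepiaTone")!            // one of the stock filters
        sepia.setValue(input, forKey: kCIInputImageKey)
        sepia.setValue(0.8,   forKey: kCIInputIntensityKey)   // 0.0 ... 1.0

        let recipe = sepia.outputImage!                        // still just a description

        // Rendering forces evaluation; the CIContext decides where it runs.
        let context = CIContext()
        let rendered = context.createCGImage(recipe, from: recipe.extent)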

    We didn't have this before, and apps that make use of it mean we're better off than we were, even without Quartz 2D Extreme. Since not all 2D is hardware accelerated (that's what Q2D Extreme would add), programmers using CI simply need to be aware of what runs on the GPU and what doesn't.
     
