Aperture 3 - performance bottlenecks?

Discussion in 'Digital Photography' started by HHarm, Jun 23, 2010.

  1. HHarm macrumors regular

    Joined:
    Mar 4, 2009
    #1
I have a 2.66 GHz quad-core Mac Pro with an ATI 4870 and 6 GB of RAM. I also have a boot drive made of two Intel X25-M SSDs in RAID 0, a regular 7200 rpm drive, and a NAS on a gigabit connection. My camera is a Canon 5D Mark II. I shoot RAW.


    Aperture 3 isn't too snappy although it's completely usable.

    1. Would extra memory help, maybe 12 or 24 GB? Currently Aperture doesn't fill the 6 GB I have, but if it saw that there's more, would it take advantage of it?

    2. Does the library location have a lot to do with performance? The initial load isn't that important, but does it matter when manipulating images or switching between images inside a library? Also, would extra RAM lower the need to hit the hard drive?


    Or is the graphics card the only real bottleneck for my setup? (I have a 30" display.)
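    One way to answer question 1 empirically is to watch OS X's page-out counter while Aperture is working: if `vm_stat` shows the pageout count climbing, the machine is swapping and more RAM should help; if it stays flat, extra RAM likely won't change much. A minimal sketch of parsing `vm_stat`-style output (the sample numbers below are made up for illustration):

```python
# Parse the cumulative "Pageouts" counter from vm_stat-style output
# (Mac OS X). A climbing pageout count while Aperture runs suggests
# the system is swapping and more RAM would help.

def parse_pageouts(vm_stat_output: str) -> int:
    """Return the cumulative pageout count from `vm_stat` text output."""
    for line in vm_stat_output.splitlines():
        if line.strip().lower().startswith("pageouts"):
            # Line looks like: "Pageouts:                 4321."
            return int(line.split(":")[1].strip().rstrip("."))
    raise ValueError("no Pageouts line found")

# Illustrative sample output (numbers invented):
sample = """\
Pages free:               12345.
Pages active:             67890.
Pageouts:                  4321.
"""
print(parse_pageouts(sample))  # -> 4321
```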
     
  2. VirtualRain macrumors 603

    VirtualRain

    Joined:
    Aug 1, 2008
    Location:
    Vancouver, BC
    #2
    I have a similar setup... 2.93GHz Quad, with 3xX25 in RAID0 and 6GB of RAM. Looking at iStat, the issue for me seems to be CPU speed. Aperture doesn't use all cores, I see it peak around 400-500% CPU utilization and it's not maxing out my memory. So I think a faster processor is the key to better Aperture performance. 3.33GHz here I come. :)
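    For context, top-style utilization counts 100% per logical core, so on an 8-thread Nehalem quad a 400-500% peak means only around half the machine is busy; a quick illustration (core counts assume a 2008/2009 quad-core Mac Pro with hyper-threading):

```python
# top/iStat-style CPU% counts 100% per logical core; a quad-core
# Nehalem Mac Pro with hyper-threading exposes 8 logical cores.
logical_cores = 8
for cpu_pct in (400, 500):
    busy = cpu_pct / 100                      # logical cores in use
    util = cpu_pct / (logical_cores * 100)    # fraction of the machine
    print(f"{cpu_pct}%: {busy:.0f}/{logical_cores} logical cores busy")
```

So even at peak, Aperture is leaving several threads idle, which is why raw clock speed matters more than core count here.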
     
  3. G.T. macrumors 6502a

    Joined:
    Jul 12, 2008
    #3
    Would 3.33GHz really bring much benefit, especially in terms of price-to-performance?
     
  4. davegregory macrumors regular

    Joined:
    Jul 7, 2009
    Location:
    Burlington, Ontario
    #4
    I'm surprised that with this setup you find it "not too snappy". I have an i7 iMac with 16 GB of RAM and I find it very fast since 3.0.2 came out. I know that it's subjective to rate the performance like that. Do you have a lot of files in your library? I've noticed from using Aperture 2 that if you have more than 1000 photos in your library it really starts to bog down. I have taken to making libraries for each shoot I do, or one per month for things that don't fall into any real category. If you don't have a lot of photos in your library, then I'm baffled as to why it wouldn't be running faster.
     
  5. VirtualRain macrumors 603

    VirtualRain

    Joined:
    Aug 1, 2008
    Location:
    Vancouver, BC
    #5
    For a 2.66GHz user, an upgrade to a 3.33GHz is a 25% performance increase for what may be a net $500 investment (assuming an $800 CPU is purchased from eBay and the old one is sold for $300).

    There's a thread in the Mac Pro forum about this very topic now where one person sourced a 3.33GHz Xeon for $559 or something to that effect.

    Given that Apple charges $1200 for the same upgrade, it's highly likely the $500 investment can be recovered in eventual resale value as well.

    And a 25% performance improvement is nothing to scoff at.
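    The 25% figure is just the clock ratio, and it's a best case: it assumes Aperture's workload scales linearly with frequency and is never waiting on the disk or GPU. A quick check of the arithmetic:

```python
# Best-case speedup from the clock bump alone, assuming a fully
# CPU-bound workload that scales linearly with frequency.
old_ghz, new_ghz = 2.66, 3.33
speedup = new_ghz / old_ghz - 1
print(f"best-case speedup: {speedup:.1%}")  # roughly 25%
```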
     
  6. cosmokanga2 macrumors 6502a

    cosmokanga2

    Joined:
    Jan 7, 2008
    Location:
    Canada, where we live in igloos.
    #6
    It could be your graphics card, as Aperture is very GPU intensive, plus you're powering a 30" screen. Also, make sure the library is either on your main HD or at the very least on a FireWire 800 or eSATA connected drive.

    Though this might not affect you, another photographer who shoots with a 5D Mk II has had major problems with Aperture and his RAW files. His posts might help.
     
  7. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #7
    Those are good links, but from that blog post there is another link, provided by a reader, to an 11-page Apple support discussion that goes really in-depth on optimising Aperture, dealing with performance by looking at how the Aperture library fragments over time and thus loses performance the more it's used.

    I'm seeing it right now with my Aperture library: it's heavily fragmented, and transferring 230GB of data to the backup drive is taking me 7 hours, while a folder half that size takes just over 30 minutes because it isn't as fragmented.
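    Taking those figures at face value (230 GB in 7 hours versus roughly half that size, assumed here to be ~115 GB, in about 30 minutes), the effective throughput gap is roughly sevenfold; a back-of-the-envelope check:

```python
# Rough effective throughput from the transfer times quoted above.
# Assumes "half that size" means ~115 GB moved in ~30 minutes.
fragmented_mb_s = 230 * 1024 / (7 * 3600)   # 230 GB over 7 hours
contiguous_mb_s = 115 * 1024 / (30 * 60)    # 115 GB over 30 minutes
print(f"fragmented: {fragmented_mb_s:.1f} MB/s")   # ≈9.3 MB/s
print(f"contiguous: {contiguous_mb_s:.1f} MB/s")   # ≈65.4 MB/s
```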

    Tonight I'll either be buying iDefrag or formatting and reinstalling my Snow Leopard installation. The discussion is worth a read if anybody is interested in improving their system's performance.

    Here is the link to the truly insightful discussion: http://discussions.apple.com/thread.jspa?threadID=2343039&start=0&tstart=0
     
  8. OreoCookie macrumors 68030

    Joined:
    Apr 14, 2001
    Location:
    Sendai, Japan
    #8
    In my case it was RAM: I went from a 2 GHz Core Duo with 2 GB to a 2.33 GHz Core 2 Duo with 2 GB to a 2.5 GHz Core i5 machine with 8 GB. I didn't feel a thing going from 2 to 2.33 GHz, but ever since upgrading to my new MacBook Pro, Aperture has been running like a dream. It feels like it was always supposed to feel like this. I rarely wait. The first time I browsed the library, I thought: wow, this is fast. Then I went to switch from Preview mode to RAW, and it turned out I had been in RAW the whole time! BTW, in all three cases I kept my 640 GB hard drive, so the hard drive is not a factor in my case.

    Judging by the CPU meter, the biggest factor seems to be having enough RAM.
     
  9. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #9
    From page 4 of the supplied link:

    If our average user needs more disk space, he simply replaces the smaller disk with a bigger one, and when he does...YOU GUESSED IT...the restore process building the new drive does a basic defrag of the data from the old drive due to the way restore works, PLUS gave him plenty of new space...DOUBLE BONUS, reset the trouble timer, marketing guy is proved right again!! The funniest thing here is that the user will attribute the great increase in speed to his cool larger drive, when in reality the majority of the performance gain was due to the defrag.

    OK, but what if you do not run out of space on your drive? Well the length of time it takes with an average user with average use and small files to see a serious slowdown is gonna translate into 2-3 years and then it typically coincides with a trip to the Apple store, and "Jeez these new machines are so much faster than my old one...time to get a new computer..." End of life for our average computer no more risk of slowdowns here, a happy customer, AND that marketing guy was right again....I am telling you that is one smart guy, heck, simply brilliant. I am glad I am an Apple shareholder, lol.


    Not so fast Max ;-)
     
  10. OreoCookie macrumors 68030

    Joined:
    Apr 14, 2001
    Location:
    Sendai, Japan
    #10
    @CrackedButter
    Well, I replaced the hard drive way before I got the new machine (the hard drive replacement was in early December, the upgrade to the interim machine in late December, and the new machine came in May). So yes, I benefited from my data being contiguous, but I kept the hard drive when updating the machines.

    On my old machines, I did not see a significant speed increase, because those machines were running out of RAM all the time. Perhaps I would have seen a speed increase if I had had more RAM. As for comparing the old and new hard drives: on paper the newer one should be a lot faster, but in practice I didn't see or feel any benefit (other than having boatloads of free hard drive space).
     
  11. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #11
    Maybe your drive hasn't fragmented to such a degree that would impact performance yet.
     
  12. OreoCookie macrumors 68030

    Joined:
    Apr 14, 2001
    Location:
    Sendai, Japan
    #12
    I did a block copy from the old drive, which was filled to the brim, to the new one. Perhaps OS X's built-in defragmentation has taken care of that, but I doubt it. However, it surely helps that the data is now located on the faster parts of the drive.

    In any case, given that I really, really have to try to get any page outs, RAM is a much bigger factor in my specific configuration. I probably need only 6 GB most of the time, but I only had the option of 4 (too little) or 8 (currently more than enough :)).
     
  13. termina3 macrumors 65816

    Joined:
    Jul 16, 2007
    Location:
    TX
    #13
    How big is the library? I've found my bottlenecks have to do with R/W speeds to the HDDs…not RAM or processing.

    EDIT: Just followed crackedbutter's link. It explains all my problems with Aperture.

    The ultimate upgrade in this case (unfortunately) is switching to Lightroom.
     
  14. OreoCookie macrumors 68030

    Joined:
    Apr 14, 2001
    Location:
    Sendai, Japan
    #14
    It's about 78 GB in size. Typical projects contain 100~300 photos (in RAW, of course).
    I've tried all incarnations of Lightroom on my old machines, and I didn't find it faster than Aperture at all -- even though I was using a very small test library of only 50 images or so. If the bottleneck is the hard drive, then I don't think you can expect a speed-up from switching: the hard drive is just as fast/slow under Aperture as under Lightroom.
     
  15. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #15
    No; it might be easier to switch to LR, but judging from that link, Aperture just takes more time to manage, and it's worth that time if managed properly.
     
  16. termina3 macrumors 65816

    Joined:
    Jul 16, 2007
    Location:
    TX
    #16
    I went ahead and installed iDefrag, ran it on the RAID array that holds my Aperture files, and boy, did it speed things up. I'm not exaggerating when I say loading full-size files takes half the time it did previously, or less. Lightroom is not faster compared to my 'new' Aperture.

    Great product. Best third-party utility I've ever bought.
     
  17. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #17
    Can you run iDefrag on the master boot drive while it's being used?
     
  18. CrackedButter macrumors 68040

    CrackedButter

    Joined:
    Jan 15, 2003
    Location:
    51st State of America
    #18
    For the moment, I've deleted all my cache files and this has significantly improved performance. I would have done a full format and reinstall, but my clients have pushed back their dates and I don't want to compromise my only working OS X system at the moment.

    Exports still take quite a while though.
     
  19. termina3 macrumors 65816

    Joined:
    Jul 16, 2007
    Location:
    TX
    #19
    No; iDefrag lets you reboot into a special mode that will run iDefrag on the boot drive. You cannot access the rest of the OS or any applications while running the defrag.

    I have had little success, however, using this mode; I've started using target disk mode instead.
     
