Core 2 Duo, Speeding up the compilation

Discussion in 'Mac Programming' started by magneto2007, Jun 7, 2008.

  1. magneto2007 macrumors newbie

    Sep 25, 2007

    Is there a way to speed up the compilation process on a Core 2 Duo? I wonder if there are some GCC flags that'll allow the compiler to use both cores at the same time.

  2. Cromulent macrumors 603


    Oct 2, 2006
    The Land of Hope and Glory
    You can write a makefile and GNU make can compile one file per core, but as far as I know GCC itself is single-threaded.
  3. rastersize macrumors member

    Apr 9, 2008
    Not for GCC, but for make:
    make -jN
    where N is a number. 5 is a rather good one for 2 cores.
    Also see:

    Another (huge) speedup is to mount the directory in which GCC works when compiling your app as a RAM filesystem:
    mount -t tmpfs tmpfs /path/to/working/dir
    Just don't forget to unmount it afterwards ;)
    umount /path/to/working/dir
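A small sketch of picking N automatically instead of hard-coding it (getconf should work on both OS X and Linux; on older systems `sysctl -n hw.ncpu` gives the same number):

```shell
# Rule of thumb from above: jobs = cores plus a little headroom.
CORES=$(getconf _NPROCESSORS_ONLN)
JOBS=$((CORES + 1))
echo "make -j$JOBS"
# make -j"$JOBS"   # the real parallel build
```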
  4. lee1210 macrumors 68040


    Jan 10, 2005
    Dallas, TX
    As the article linked by rastersize notes, not all build chains have an "independent" order like that. I might break my chain into two (or more) steps. The first would build any libraries (non-executables you reference); these should be independent of each other. Then, in a second step, you can build all of your binaries. If you build objects outside of libraries that aren't immediately linked into an executable, those are fine in the first step too, since they don't have to be linked.

    Others may have more interesting input on this, I haven't tried it yet, but probably will now.
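A minimal sketch of that two-step idea, with made-up target names: the library targets are independent of each other, so each phase can safely be run with -j on its own:

```shell
# Hypothetical Makefile: the "libs" targets are independent of each
# other; "binaries" links against them. (Recipe lines need a tab.)
cat > /tmp/phases.mk <<'EOF'
libs: liba libb
liba:
	@echo built liba
libb:
	@echo built libb
binaries: app
app:
	@echo linked app
EOF
# First phase in parallel, then the second.
make -j5 -f /tmp/phases.mk libs && make -j5 -f /tmp/phases.mk binaries
```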

  5. rastersize macrumors member

    Apr 9, 2008
    Just remembered, in XCode you can also check the project setting "Build independent targets in parallel".
  6. gnasher729 macrumors P6


    Nov 25, 2005
    XCode does that out of the box. It will automatically use all available cores. Each core can only compile one source file at a time, so putting everything into one huge source file is a bad idea (it is a bad idea anyway).

    If you have more than one Mac, XCode lets you use distributed builds. So for huge builds, adding a MacMini could almost double your build speeds.

    Use a build target that builds only for your local platform (if you start with the XCode templates, the debug target does that).

    Use plain C or Objective-C instead of C++. A lot of time is spent compiling header files for the C++ STL.

    Make sure you use pre-compiled headers.
    Use predictive compilation and zero-link.

    Don't create hundreds of tiny little source files. There is quite a bit of overhead for each source file.

    Compiling is almost completely CPU-bound. Faster CPU makes a difference, faster hard drive or using a RAM disk makes very little difference.
  7. operator207 macrumors 6502

    Jul 24, 2007
    The -j option can be good and bad. I have had "finished" step 17 of 50 before step 16 was complete. Step 17 required step 16 to be finished before starting, as it depended on data from step 16 to complete successfully. So that step failed, which failed the entire compile. You restart the compile and get farther, potentially failing at another step. It does not happen often, but it does happen. This happened to me on FreeBSD, not OS X, but close enough. I never really found a significant enough increase in speed (a few minutes here and there over a 10-hour compile) to warrant using the -j option much. Maybe it's my old dual P3 box that just does not take advantage of it as well as a C2D would, but the -j option has been around in FreeBSD since those procs were almost new, if not new. At least, that was when I started playing around with the option in FreeBSD's make world.

    The only time I have seen a real increase in speed (the few minutes over a many-hour compile I mention above) was a make world in FreeBSD.
  8. gnasher729 macrumors P6


    Nov 25, 2005
    That has never happened to me in about two years of using XCode. You'd think that if XCode had problems with that, one of the hundreds of thousands of XCode developers would have complained to Apple, especially one of those who bought an eight-core Mac Pro to make things faster.

    On an iMac, it makes things about twice as fast. And you'd have to look very, very hard indeed to find how to switch it off anyway.
  9. magneto2007 thread starter macrumors newbie

    Sep 25, 2007
    Thank you very much. I will try adding the flags to my makefile and see the result.
  10. operator207 macrumors 6502

    Jul 24, 2007
    That's wonderful. I am sure you read my entire post and saw that I never had this problem in OS X, only FreeBSD. I was relating the two, as Darwin and FreeBSD are very similar. I was stating that it *could* happen, not that it *did* happen.

  11. laprej macrumors regular

    Oct 14, 2005
    Troy, NY
    If step 17 depended on step 16, that should have been specified in the Makefile. That is the point of Makefiles: to specify an explicit ordering of dependencies. Using the -j option is very useful when, say, you're building the Linux kernel, which has a lot of independent parts, but even it has some dependencies. make is a tool that does what you tell it, and if you don't write a Makefile correctly then you can't very well expect the tool to parallelize your build for you.
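A toy version of the step 16/17 situation (file and target names invented): once step17 declares its dependency on step16, make -j will not start step17 early, no matter how many jobs run in parallel:

```shell
# Without the "step17: step16" line, -j could run both at once and
# step17 would read a file that does not exist yet. (Recipe lines
# must be indented with a tab.)
cat > /tmp/steps.mk <<'EOF'
all: step17
step16:
	@sleep 1
	@echo "data from 16" > /tmp/step16.out
step17: step16
	@cat /tmp/step16.out
EOF
make -j8 -f /tmp/steps.mk   # prints "data from 16"
```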
  12. operator207 macrumors 6502

    Jul 24, 2007
    That may be true on Linux, but as I said, and just repeated, this was on FreeBSD doing a make world (that link has obviously changed from what I used and read back in the late '90s). Rarely, unless you had made some significant changes to the "world", did you change anything in the Makefile for making world. This was also back in FreeBSD 3.x and early 4.x (the recent release is 7.x). As you can imagine, it's been a while since I last did a make world. I normally do not do a make world anymore, as it has not been needed for what I do. Stability is key in my environment, not being "cutting edge", so to speak.

    I know the reason for Makefiles, their purpose, and the purpose of the make command. I have since I started compiling from source 10+ years ago. :)

    Just because I do not go into depth on every comment I make, does not mean I am clueless. :eek: It usually means I believe, because of previous comments by other posters, that we are all on a certain knowledge level.

    When I was using make world, it was cutting edge; it was when that option to update the system had just come out for FreeBSD. It was there in an infant form before, but had become more stable. There were a few times when, depending on what day you decided to do a make world, you could literally bork your system and have to spend most of the day fixing it, because someone had committed some bad code in a Makefile, or somewhere else. One way to keep this from happening was to do a cvsup on Monday, and then wait till the next Monday to make sure there were no issues. When you're admining many machines that are production, but need to be upgraded, and your company will not pay for backup servers to sit in their place while you're upgrading, you tend to be a bit more cautious. :)

    Regardless, you are essentially correct about what a Makefile is and how make works. However, if you had used make to rebuild FreeBSD back in the 'day', you would have found it to be not quite as mature as it is now.

    note: NOTHING I am stating can be taken as gospel for Linux, as I am NOT talking about Linux. True, it may be similar, but you CANNOT take everything you learn on Linux and directly apply it to FreeBSD. You can take that knowledge, learn from it, and learn how to use FreeBSD, but it is definitely different. I know this; I have administered both for many years. I just prefer FreeBSD. To me FreeBSD is like that really comfortable pair of jeans you love to wear, and your wife loves to hide from you because of the holes in the crotch. ;)
  13. Sander macrumors 6502

    Apr 24, 2008
    That surprises me. Isn't this a sign that the disk cache subsystem is not doing its job?
