This can cause problems in cross-platform development. On 64-bit targets GCC follows the LP64 model, where long is 8 bytes, whereas Visual Studio's compiler follows LLP64 and uses 4 bytes for long and 8 bytes for long long, even when targeting 64-bit Windows. GCC defines long long as 8 bytes as well. (On 32-bit targets, GCC's long is 4 bytes too.)
Most of the time there is no need for a specific type. According to all the relevant standards (C, C++, Objective-C, Objective-C++), long is a signed type with at least 32 usable bits. So you use long when you need 32 bits and don't mind getting a few more, and you don't really care about sizeof(long) (which is 1 on some rather strange implementations of the C language, where a byte is 32 bits or more).
There are also the types size_t (big enough to hold the size of any object, or the number of items in any array), intptr_t (big enough to hold the value of any pointer converted to an integer, so that you can convert it back without loss), and ptrdiff_t (big enough to hold the difference between any two pointers into the same array).
(Running 10.6 on 2010 MacBook Pro 13", not sure if that matters.)[/QUOTE]
It matters immensely. That model is 64-bit capable, and 10.6 defaults to 64-bit when the CPU is 64-bit capable.
Worth adding: Most likely you are building a Universal Binary, possibly PPC + x86 + x86_64. In that case you actually have three applications in one package, and they can behave differently. For example, an app might display "long = 8 byte" on a 64-bit capable Intel processor, and "long = 4 byte" on a Core Duo (not Core 2 Duo) or PowerPC processor.