PDA

View Full Version : sizeof(long) gets me 8 bytes? not 4?




zippyfly
Sep 23, 2010, 03:45 AM
NSLog(@"Long is %zu bytes", sizeof(long));
NSLog(@"Unsigned long is %zu bytes", sizeof(unsigned long));


gets

Long is 8 bytes
Unsigned long is 8 bytes

I thought long is 4 bytes and long long is 8 bytes?

(Running 10.6 on 2010 MacBook Pro 13", not sure if that matters.)



Cromulent
Sep 23, 2010, 03:57 AM
NSLog(@"Long is %zu bytes", sizeof(long));
NSLog(@"Unsigned long is %zu bytes", sizeof(unsigned long));


gets

Long is 8 bytes
Unsigned long is 8 bytes

I thought long is 4 bytes and long long is 8 bytes?

(Running 10.6 on 2010 MacBook Pro 13", not sure if that matters.)

All the standard requires is that a long be at least as large as an int and no larger than a long long.

The rest is implementation defined.

For reference, an int must be at least 16 bits.

ShortCutMan
Sep 23, 2010, 04:15 AM
This can cause problems in cross-platform development. GCC on 64-bit targets defines long as 8 bytes, whereas Visual Studio's compiler uses 4 bytes for a long and 8 bytes for a long long. I do believe that GCC defines long long as 8 bytes as well.

foidulus
Sep 23, 2010, 04:32 AM
That's because you are running it on a 64-bit platform, and gcc defaults to that architecture when you are compiling.

Here is a little test you can run, enter the following into a file:


#include <stdio.h>

int main(int argc, char *argv[]) {
    /* sizeof yields a size_t, so the matching format specifier is %zu */
    printf("size is %zu\n", sizeof(long));
    return 0;
}



now compile it like this:

gcc -arch x86_64 bob.c

run it

now compile the same file with

gcc -arch i386 bob.c

run it again

You will notice that the results are different. Microsoft's compiler keeps long at 4 bytes even when targeting 64-bit, which is why you will see it as being 4 bytes in their world.

If you want fixed-size int values in C, use int32_t and int64_t (from <stdint.h>).

They will be the same width regardless of platform.

Cromulent
Sep 23, 2010, 05:15 AM
This is why it is always safer to use the C99 types if you want a specific integer size.

Such as uint8_t and int64_t for instance.

gnasher729
Sep 23, 2010, 07:02 AM
This is why it is always safer to use the C99 types if you want a specific integer size.

Such as uint8_t and int64_t for instance.

Most of the time there is no need for a specific type. According to all the relevant standards (C, C++, Objective-C, Objective-C++), "long" is a signed type with at least 32 usable bits. So you use long when you need 32 bits and don't mind getting a few more. And you don't really care about the size (which is 1 on some rather strange implementations of the C language).

There are also the types size_t (big enough to hold the size of any object, or the number of items in any array), intptr_t (big enough to hold the value of any pointer converted to an integer so that you can convert it back without loss), and ptrdiff_t (big enough to hold the difference between any pointers into the same array).

chown33
Sep 23, 2010, 10:49 AM
(Running 10.6 on 2010 MacBook Pro 13", not sure if that matters.)

It matters immensely. That model is 64-bit capable, and 10.6 defaults to 64-bit when the CPU is 64-bit capable.

gnasher729
Sep 23, 2010, 11:41 AM
It matters immensely. That model is 64-bit capable, and 10.6 defaults to 64-bit when the CPU is 64-bit capable.

Worth adding: Most likely you are building a Universal Binary, possibly PPC + x86 + x86_64. In that case, you actually have three applications in one package, and they can behave differently. For example, an app might display "long = 8 byte" on a 64-bit capable Intel processor, and "long = 4 byte" on a Core Duo (not Core 2 Duo) or PowerPC processor.

SidBala
Sep 23, 2010, 12:58 PM
Just specify exactly what type you want with the types someone already mentioned.

It's much safer.

zippyfly
Sep 23, 2010, 06:26 PM
Wow. Many thanks for so many replies. I didn't realize the topic was so complex. But anyway, you guys cleared it up for me. Thanks a ton!

robvas
Sep 24, 2010, 07:23 AM
The C faq has a section on data type sizes, 64-bit variables, etc

http://c-faq.com/~scs/cgi-bin/faqcat.cgi?sec=decl