In the Xcode debugger, floating-point variables are rounded to a few significant digits. Versions of Xcode earlier than 4.5.1 showed them with more significant digits. I needed to look at some of these values with greater precision, and since I couldn't find a way to do it through the Xcode preferences, I decided to print them to the console using printf(). As a test I printed M_PI, which is defined in math.h as a literal.
long double pi = M_PI;
printf("%.48Lf", pi); //zeros after 48 places
printf logs a different value than M_PI.
In math.h:
3.14159265358979323846264338327950288
With printf:
3.141592653589793115997963468544185161590576171875
Only the first 16 significant digits match. Yes, I know that this is plenty of precision and that I shouldn't worry about it. Why would this be?