Decimals and int

Discussion in 'Mac Programming' started by JonnyFrond, Feb 28, 2011.

  1. JonnyFrond macrumors newbie

    Joined:
    Jan 15, 2011
    #1
    Hi there,

Can anyone out there help me with this? I have just written a small program in C++ to add two numbers together and find the square root.

The problem is that I have declared my two variables as int, yet Xcode accepts this. In Visual Studio it comes out, quite correctly, with an error, as the variables should be declared as doubles.

    ????

    Any input on this would be greatly appreciated.

    Kind regards

    Jonny
     
  2. robbieduncan Moderator emeritus

    robbieduncan

    Joined:
    Jul 24, 2002
    Location:
    London
  3. JonnyFrond thread starter macrumors newbie

    Joined:
    Jan 15, 2011
    #3
Here's the code. As you can see it is very simple, yet it doesn't behave as I would expect. This was given to us as a demo of an error, yet on my Mac I didn't get one.

It's going to make it hard to learn if Xcode has its own version of C++ that is not cross-platform. I am expecting to be using UNIX when I finish my degree, not OS X.

    #include <iostream>
    #include <cmath>
    #include <iomanip>
    using namespace std;

    int main ()
    {
        int a, b;
        cout << "Enter values a and b " << endl;
        cin >> a >> b;                    // read user input
        cout << fixed << setprecision(2); // set the format before printing
        cout << "The total is " << sqrt(a+b) << endl;

    cin.ignore();
    return 0;
    }
     
  4. lee1210, Feb 28, 2011
    Last edited: Feb 28, 2011

    lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #4
Using Visual Studio as the cross-platform baseline is more likely to cause problems.
    OS X is UNIX.

    Are you hoping to get a complaint about the coercion of a+b to match one of the candidates of sqrt (float, double, or long double)?

    -Lee
     
  5. subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #5
Your assumption is wrong: errors or warnings at compile time come down to the compiler, not the language. OS X does use standard C++, and OS X is UNIX: http://www.opengroup.org/openbrand/register/xy.htm The compiler used for C++ is GCC, the GNU Compiler Collection.
     
  6. holmesf, Feb 28, 2011
    Last edited: Feb 28, 2011

    holmesf macrumors 6502a

    Joined:
    Sep 30, 2001
    #6
Visual Studio is actually wrong to throw the error. If your professor told you it's an error, then he doesn't know the language and isn't worth his weight in carbon. Moreover, the conversion won't cause any problems, because doubles can represent ints without loss of precision.

    http://www.cplusplus.com/doc/tutorial/typecasting/

     
  7. kuwisdelu macrumors 65816

    Joined:
    Jan 13, 2008
    #7
If your end-goal is UNIX compatibility, don't trust Visual Studio. Trust OS X's version of gcc. My code from OS X always compiles on my university's Linux/UNIX servers. Some of my classmates' VS code? Not so much.

    And as others have said, there's nothing wrong with that code. Standard C/C++ will convert the int to a double internally, since sqrt() doesn't do ints.
     
  8. JonnyFrond thread starter macrumors newbie

    Joined:
    Jan 15, 2011
    #8
    Thank you guys, all your answers have helped me understand this. I am at Uni, and they use Visual Studio, I have a mac and have decided to use xcode, for a few reasons, some of which you guys have outlined here.

    As a mature student, I have come to learning from a slightly different place from others and I am trying to set myself up more for industry as opposed to just passing my degree.

    In my mind, using Visual studio will lead to bad programming practices and windows only libraries. Using Xcode will lead to possibly writing programs that will require a lot of fiddling with on the uni computers in order to submit something that will be markable.

    Thanks for your input.

    Jonny
     
  9. Sander, Mar 1, 2011
    Last edited: Mar 1, 2011

    Sander macrumors 6502

    Joined:
    Apr 24, 2008
    #9
    The error is probably not about the type coercion; it's about ambiguity. Visual Studio is right to complain. If you try this code in Comeau (regarded as the most standards-compliant compiler there is), you'll get the same error. Note that you can try this online.

    I don't have a GCC at hand here, but I'm surprised it would accept this.
     
  10. Sander macrumors 6502

    Joined:
    Apr 24, 2008
    #10
Hmm... I was able to access GCC in the meantime, and the reason it accepts this is that in its cmath header file you'll find this:

    Code:
      using ::sqrt;
    
      inline float
      sqrt(float __x)
      { return __builtin_sqrtf(__x); }
    
      inline long double
      sqrt(long double __x)
      { return __builtin_sqrtl(__x); }
    
      template<typename _Tp>
        inline typename __gnu_cxx::__enable_if<__is_integer<_Tp>::__value,
                                               double>::__type
        sqrt(_Tp __x)
        { return __builtin_sqrt(__x); }
    Your integer sqrt-call triggers the template version.

    Apparently the other compilers use libraries without this trick.

    Regarding standards-compliance: C++ adds float and long double versions of the functions in math.h (in C, these functions have different names since you can't overload functions in C; in this case they would be sqrtf() and sqrtl()).

    An integer-taking sqrt() function is not part of the standard library.

    In short, GCC provides more than is strictly required.
     
  11. Hansr macrumors 6502a

    Joined:
    Apr 1, 2007
    #11
Ah, you're going to be completely useless anyway when you get out of school and we have to retrain you :D

You'll encounter plenty of sources of bad coding practices; I doubt VS, which is a fairly good IDE (one of my favorites, actually), will be a significant contributor to that :)
     
  12. balamw Moderator

    balamw

    Staff Member

    Joined:
    Aug 16, 2005
    Location:
    New England
    #12
    It could be if they are teaching managed code (C++/CLI) instead of "regular" C++. Then of course you'd have to use Visual Studio. (NOTE: I'm not saying it's "bad", just non-standard and non-portable.)


    B
     
  13. ChrisA, Mar 1, 2011
    Last edited: Mar 1, 2011

    ChrisA macrumors G4

    Joined:
    Jan 5, 2006
    Location:
    Redondo Beach, California
    #13
About the error: it's always best to use explicit conversion. Not just in C, but always. It never does any harm except using a few more bytes of disk space, and it lets the reader, maybe years later, know that you thought about types. It documents your intent.

Others have already pointed out that if the goal is to move to UNIX, then (1) the compiler inside Xcode is the most common compiler used on UNIX/Linux systems, and (2) Mac OS X is UNIX (plus a bit more added on top).

Why is Visual Studio different? Microsoft has a vested business interest in getting people locked into their software. It's not that they hire stupid engineers who can't write a compiler; the differences are intentional.
     
  14. notjustjay macrumors 603

    notjustjay

    Joined:
    Sep 19, 2003
    Location:
    Canada, eh?
    #14
It's not necessarily simply a marketing conspiracy that dictates differences in compiler warnings. One compiler might make some assumptions ("ah, I'm sure he really meant to cast these to doubles") while another insists that you do it yourself, throwing a warning or error. The one that makes assumptions is more convenient for the programmer, but this convenience comes at a cost: there is added ambiguity. This can lead to subtle errors where you assumed the compiler "should" be doing something while it is in fact doing something entirely different. For example, you might write code where you think it should convert your ints to doubles and give you a decimal result, while it assumes you intended to keep everything in the integer domain and throws away the decimal part. I've run into this many times.

    Forcing you to cast explicitly can be a pain, but it helps you to always remember which operators require what types, and it avoids these sorts of errors. The cost is a bit of extra work for the programmer, but this may not be a bad thing. Some languages, such as Ada, take it to an extreme, forcing you to be aware of ALL the different types you use in your program, and when it is OK and not OK to convert between them. A huge pain, but it makes for fewer bugs in the code!
     

Share This Page