Why doesn't Swift automatically convert number types?

Discussion in 'iOS Programming' started by AlecZ, Mar 23, 2015.

  1. AlecZ macrumors 65816
    Joined:
    Sep 11, 2014
    Location:
    Berkeley, CA
    #1
ObjC had the annoyance of having to convert between NSNumber and primitive number types when putting numbers into data structure objects. Swift doesn't have that problem, but it does require us to keep using constructors to make copies of numbers so they match the rest of the numbers in an arithmetic operation or variable assignment.

    The most common example is when you want to multiply a CGFloat by an Int somewhere when setting up view layout. It happens all the time. You'd do something like
    Code:
var height: CGFloat = CGFloat(someInteger) * someFloat
Also, it seems like a waste of memory to keep allocating new values just to convert numbers. This code is calling the CGFloat constructor every time, right? Or does multiplying two different number types do this anyway in ObjC?

    Why is it like this? Could this change in later versions of Swift?
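A minimal, compilable sketch of the pattern above — using Double to stand in for CGFloat so it builds without CoreGraphics, with someInteger and someFloat as placeholder names:

```swift
let someInteger = 3
let someFloat = 2.5  // Double here; CGFloat in actual layout code

// let height = someInteger * someFloat
// error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'

let height = Double(someInteger) * someFloat  // 7.5
```

Worth noting on the memory question: Double and CGFloat are value types, so Double(someInteger) is a register-level numeric conversion, not a heap allocation like boxing an NSNumber was in ObjC.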
     
  2. firewood macrumors 604

    Joined:
    Jul 29, 2003
    Location:
    Silicon Valley
    #2
The CPU (both ARM and x86) only does either an integer multiply or a floating-point multiply. So all programming languages have to convert one of the input values when the types are mixed, either manually in code or auto-magically by the compiler.

Since certain numeric type conversions can fail, and Swift does not have error trapping for them, they want you to know (or check) that the conversion will absolutely work, and then do it manually. That avoids a whole class of potential errors common in typical C code.
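A quick sketch of what "can fail" means in practice. Note the Int(exactly:) initializer shown here was added to the standard library after this thread was written; it's one way Swift surfaces the check explicitly:

```swift
let big = 1e300                  // a Double far larger than Int.max
// let bad = Int(big)            // compiles, but traps at runtime: overflows Int

// The failable initializer makes the check explicit instead of trapping:
let exact = Int(exactly: 3.0)    // Optional(3) -- 3.0 converts losslessly
let inexact = Int(exactly: 3.5)  // nil -- 3.5 can't be represented as an Int
```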
     