
AlecZ

macrumors 65816
Original poster
Sep 11, 2014
1,173
123
Berkeley, CA
ObjC had the annoyance of having to convert between NSNumber and primitive number types when putting numbers into data structure objects. Swift doesn't have that problem, but it does require us to keep using constructors to make copies of numbers so they match the rest of the numbers in an arithmetic operation or variable assignment.

The most common example is when you want to multiply a CGFloat by an Int somewhere when setting up view layout. It happens all the time. You'd do something like
Code:
var height: CGFloat = CGFloat(someInteger) * someFloat
Also, it seems like a waste of memory to keep allocating new values every time you want to convert a number. This code calls the CGFloat constructor every time, right? Or does multiplying two different number types do this anyway in ObjC?
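
To make this concrete, here's the workaround I end up using (a hypothetical sketch, with made-up values): convert the integer once up front so the multiplication doesn't repeat the constructor call.
Code:
import CoreGraphics

let someInteger = 3
let someFloat: CGFloat = 2.5

// Convert once and reuse. CGFloat is a value type wrapping a Double
// (or Float on 32-bit platforms), so this is a numeric conversion,
// not an object allocation.
let someIntegerAsCGFloat = CGFloat(someInteger)

var height: CGFloat = someIntegerAsCGFloat * someFloat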

Why is it like this? Could this change in later versions of Swift?
 

firewood

macrumors G3
Jul 29, 2003
8,108
1,345
Silicon Valley
The CPU (both ARM and x86) can only do an integer multiply or a floating-point multiply; there is no mixed-type multiply instruction. So every programming language has to convert one of the input values when the types are mixed, either manually in code or auto-magically by the compiler.
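
A minimal sketch of the contrast (hypothetical values): in C the compiler inserts the int-to-double conversion for you; in Swift you write it yourself.
Code:
let count: Int = 4
let scale = 1.5                   // inferred as Double

// let area = count * scale      // compile error: Swift never converts implicitly
let area = Double(count) * scale // the conversion a C compiler would insert for you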

Since certain numeric type conversions can fail, and Swift has no error trapping to catch them at runtime, the language wants you to know or check that the conversion will absolutely work, and then do it explicitly. That avoids a whole class of potential errors that are common in typical C code.
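
For example, a narrowing conversion can overflow, and Swift will crash rather than silently truncate. A sketch of the explicit check, using init(exactly:) from later Swift versions:
Code:
let big = 300

// let small = Int8(big)            // traps at runtime: 300 doesn't fit in Int8

if let small = Int8(exactly: big) { // returns nil instead of trapping
    print("fits: \(small)")
} else {
    print("\(big) doesn't fit in Int8")
}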
 