ObjC had the annoyance of having to convert between NSNumber and primitive number types when putting numbers into collection objects. Swift doesn't have that problem, but it does require us to keep calling initializers to convert numbers so they match the other operands in an arithmetic expression, or the type of the variable being assigned.
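A minimal sketch of what I mean (reusing the `someInteger` / `someFloat` names from the example below; the concrete values are just placeholders):

```swift
import CoreGraphics  // for CGFloat

let someInteger = 3            // inferred as Int
let someFloat: CGFloat = 2.5

// This won't compile — Swift has no implicit numeric conversion:
// let height = someInteger * someFloat
// error: binary operator '*' cannot be applied to operands of type 'Int' and 'CGFloat'

// You have to convert explicitly so both operands are the same type:
let height = CGFloat(someInteger) * someFloat
```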
It also seems wasteful, memory-wise, to keep allocating new values just to convert between number types. Code like the example below calls the CGFloat initializer on every conversion, right? Or does multiplying two different number types do that under the hood in ObjC anyway?
The most common example is when you want to multiply a CGFloat by an Int somewhere when setting up view layout. It happens all the time. You'd do something like
Code:
var height: CGFloat = CGFloat(someInteger) * someFloat
Why is it like this? Could this change in later versions of Swift?