I'm still learning Swift and I'm currently working through a CloudKit-based database tutorial. I have it figured out for the most part. The CloudKit record includes a String attribute, a Location attribute, and an image Asset.
I'm currently focusing on the image asset. In tinkering with the code, I've gotten to the point where I can view the image asset, remove it from the view controller, load in a new photo (from the camera roll) to replace it, and re-upload it to CloudKit (which overwrites the existing image). That all works great.
I'd like to try something a little more challenging but I'm unable to locate any source material to help.
What I want to do is take the image asset's GPS data (lat/lon), which was embedded when the photo was taken on an iPhone with Location Services enabled, compare it to the Location attribute of the CloudKit record that holds the image asset, and make sure the two are very close to each other.
I don't really need CoreLocation for tracking, since I don't care about the iPhone's current location. But I don't know how to query the image data's lat/lon GPS coordinates and compare them to the record's Location attribute.
Looking over the Apple documentation, I think the answer might be related to the CGImageProperties constants, specifically kCGImagePropertyGPSLongitude and kCGImagePropertyGPSLatitude. Unfortunately, my Google-fu is only turning up older Objective-C examples, which I don't understand. So I'm trying to find some sort of example in Swift so I can get myself moving forward.
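Here's roughly what I've pieced together so far from the docs: it looks like ImageIO's CGImageSource APIs can read the EXIF GPS dictionary, and CLLocation's distance(from:) could do the comparison. This is just a sketch of what I think the Swift version would look like; the record field name "Location" and the 100 m tolerance are my own placeholders, not from any tutorial:

```swift
import ImageIO
import CoreLocation

// Pull the GPS coordinate out of raw image bytes (e.g. the contents
// of the CKAsset's fileURL) by reading the EXIF GPS dictionary.
func imageCoordinate(from imageData: Data) -> CLLocationCoordinate2D? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let gps = properties[kCGImagePropertyGPSDictionary] as? [CFString: Any],
          var latitude = gps[kCGImagePropertyGPSLatitude] as? CLLocationDegrees,
          var longitude = gps[kCGImagePropertyGPSLongitude] as? CLLocationDegrees,
          let latRef = gps[kCGImagePropertyGPSLatitudeRef] as? String,
          let lonRef = gps[kCGImagePropertyGPSLongitudeRef] as? String
    else { return nil }

    // EXIF stores unsigned degrees; the Ref values give the hemisphere.
    if latRef == "S" { latitude = -latitude }
    if lonRef == "W" { longitude = -longitude }
    return CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
}

// Compare the photo's coordinate to the record's stored CLLocation.
func photoMatchesRecord(imageData: Data,
                        recordLocation: CLLocation,
                        toleranceInMeters: Double = 100) -> Bool {
    guard let coordinate = imageCoordinate(from: imageData) else { return false }
    let photoLocation = CLLocation(latitude: coordinate.latitude,
                                   longitude: coordinate.longitude)
    return photoLocation.distance(from: recordLocation) <= toleranceInMeters
}
```

If I understand correctly, CoreLocation is still needed here even without tracking the device, because the record's Location attribute comes back as a CLLocation and distance(from:) lives on that class.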
If anyone could give me a push in the right direction, that would be great.