I'm hoping someone can point me in the right direction or suggest how to approach this problem. What is the relationship between the dimensions and coordinates of a UIImageView and the image a user captures?

For example, suppose the user is viewing a physical object through the rear camera, I draw a rectangle around that object on the UIImageView, and the user then takes a photo. How will the rectangle's x/y coordinates and dimensions compare with those of the actual captured image? Is it possible to replicate the exact position and size of the rectangle drawn on the UIImageView on the photo itself? I apologize in advance if this sounds confusing.
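To make the question concrete, here is a sketch of the kind of conversion I imagine is needed. It assumes the image view uses `contentMode = .scaleAspectFill` (the usual mode for a camera preview), and all the names here are my own, not from any API:

```swift
import Foundation

/// Map a rect drawn in the image view's coordinate space into the
/// captured image's pixel space, assuming .scaleAspectFill layout.
/// This is just the geometry I think applies; it ignores device
/// orientation and any cropping the capture pipeline might do.
func imageRect(forViewRect viewRect: CGRect,
               viewSize: CGSize,
               imageSize: CGSize) -> CGRect {
    // Aspect fill scales the image until it covers the view,
    // so the larger of the two ratios wins.
    let scale = max(viewSize.width / imageSize.width,
                    viewSize.height / imageSize.height)
    // The scaled image overflows the view on one axis; the overflow
    // is centered, which shows up as a (possibly negative) offset.
    let xOffset = (viewSize.width - imageSize.width * scale) / 2
    let yOffset = (viewSize.height - imageSize.height * scale) / 2
    // Undo the offset, then undo the scale.
    return CGRect(x: (viewRect.origin.x - xOffset) / scale,
                  y: (viewRect.origin.y - yOffset) / scale,
                  width: viewRect.size.width / scale,
                  height: viewRect.size.height / scale)
}
```

For instance, with a 100×100 view showing a 200×200 image, the scale is 0.5 with no offset, so a rect drawn at (10, 10, 50, 50) in the view should correspond to (20, 20, 100, 100) in image pixels. My question is whether this mapping (or something like it) is actually valid for a photo captured from the camera, or whether the capture output differs from what the preview shows.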