I have an app that shows a live feed from the camera (via a UIImagePickerController) with a cameraOverlay on top of it. I now want to take a screenshot of the screen, save it to a UIImage, and display that image in a view over the live feed so that the camera input appears paused. I have everything else in place, but I can't figure out how to grab the screenshot.
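For reference, this is the kind of snapshot code I've been experimenting with (a rough, untested sketch; `snapshot(of:)` is just my own helper name). I don't know whether this approach can actually capture the live camera preview, which is part of what I'm asking:

```swift
import UIKit

// Untested sketch: render the current view hierarchy (including my
// cameraOverlay) into a UIImage. I'm unsure whether this captures
// the live camera preview underneath, or just my overlay views.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    view.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```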
Also, I'm using AVFoundation in another class to turn the flash torch on and off. Unfortunately, doing so freezes the camera input from the UIImagePickerController for the rest of the program's lifetime, until I kill the app. Does anyone know why this happens?
EDIT: It seems I might have to migrate all my camera code to AVFoundation. Can I still have a live video feed if I use AVFoundation instead of UIImagePickerController?
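From what I've read, this is roughly what the AVFoundation replacement for the picker's live feed would look like (an untested sketch; `makePreviewLayer(in:)` is just a name I made up). I'd appreciate confirmation that this gives an equivalent live preview:

```swift
import AVFoundation
import UIKit

// Untested sketch: set up a capture session and attach its preview
// layer to a view, which I believe replaces the live feed that
// UIImagePickerController currently gives me.
func makePreviewLayer(in view: UIView) -> AVCaptureVideoPreviewLayer? {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)

    session.startRunning()
    return previewLayer
}
```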